Sep 30 12:21:47 crc systemd[1]: Starting Kubernetes Kubelet...
Sep 30 12:21:47 crc restorecon[4666]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by
admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Sep 30 12:21:47 crc restorecon[4666]: 
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:47 crc restorecon[4666]:
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:47 crc restorecon[4666]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:47 crc restorecon[4666]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:47 crc restorecon[4666]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:47 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c377,c642 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 30 12:21:48 crc restorecon[4666]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 30 12:21:48 crc restorecon[4666]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c0,c25 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Sep 30 12:21:48 crc restorecon[4666]: 
/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Sep 30 12:21:48 crc restorecon[4666]: 
/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 30 12:21:48 crc restorecon[4666]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 30 12:21:48 crc restorecon[4666]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Sep 30 12:21:48 crc restorecon[4666]: 
/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 
12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 12:21:48 crc 
restorecon[4666]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 12:21:48 crc restorecon[4666]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Sep 30 12:21:48 crc restorecon[4666]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Sep 30 12:21:48 crc restorecon[4666]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c37,c572 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Sep 30 12:21:48 crc restorecon[4666]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 
12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]:
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c133,c223 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 12:21:48 crc restorecon[4666]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c682,c947 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to system_u:object_r:container_file_t:s0 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Sep 30 12:21:48 crc restorecon[4666]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Sep 30 12:21:48 crc restorecon[4666]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Sep 30 12:21:49 crc kubenswrapper[4672]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Sep 30 12:21:49 crc kubenswrapper[4672]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Sep 30 12:21:49 crc kubenswrapper[4672]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Sep 30 12:21:49 crc kubenswrapper[4672]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Sep 30 12:21:49 crc kubenswrapper[4672]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Sep 30 12:21:49 crc kubenswrapper[4672]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.142347 4672 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.147792 4672 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.147824 4672 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.147833 4672 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.147844 4672 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.147852 4672 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.147861 4672 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.147869 4672 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.147877 4672 feature_gate.go:330] unrecognized feature gate: Example Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.147885 4672 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.147896 4672 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.147907 4672 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.147917 4672 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.147925 4672 feature_gate.go:330] unrecognized feature gate: PinnedImages Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.147934 4672 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.147941 4672 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.147950 4672 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.147958 4672 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.147966 4672 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.147974 4672 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.147981 4672 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.147989 4672 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.147996 4672 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.148004 4672 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.148012 4672 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.148021 4672 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.148028 4672 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.148036 4672 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.148051 4672 feature_gate.go:330] unrecognized feature gate: InsightsConfig Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.148060 4672 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.148067 4672 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.148075 4672 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.148083 4672 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.148091 4672 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.148099 4672 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.148107 4672 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.148114 4672 feature_gate.go:330] unrecognized feature gate: 
PersistentIPsForVirtualization Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.148122 4672 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.148134 4672 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.148145 4672 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.148153 4672 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.148161 4672 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.148171 4672 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.148179 4672 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.148187 4672 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.148195 4672 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.148204 4672 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.148214 4672 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.148223 4672 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.148233 4672 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.148241 4672 feature_gate.go:330] unrecognized feature gate: NewOLM Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.148255 4672 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
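[Note] The long run of "unrecognized feature gate" warnings here (it continues below) is expected on OpenShift: the cluster-level feature-gate set carries OpenShift-specific gates (ImageStreamImportMode, GatewayAPI, NewOLM, and so on) that the upstream kubelet does not register, so it logs each one and ignores it. Only the feature_gate.go:353/351 entries (CloudDualStackNodeIPs, ValidatingAdmissionPolicy, KMSv1) actually change kubelet behavior. A hedged sketch, again assuming a kubelet.log capture, for separating the two classes:

    import re

    UNRECOGNIZED = re.compile(r"unrecognized feature gate: (\w+)")
    APPLIED = re.compile(r"Setting (?:GA|deprecated) feature gate (\w+)=(\w+)")

    unknown, applied = set(), {}
    with open("kubelet.log", encoding="utf-8") as fh:
        for line in fh:
            for m in UNRECOGNIZED.finditer(line):
                unknown.add(m.group(1))
            for m in APPLIED.finditer(line):
                applied[m.group(1)] = m.group(2) == "true"

    print(f"{len(unknown)} gates ignored by the kubelet")
    print("gates actually applied:", applied)
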
Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.148288 4672 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.148297 4672 feature_gate.go:330] unrecognized feature gate: OVNObservability Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.148308 4672 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.148317 4672 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.148327 4672 feature_gate.go:330] unrecognized feature gate: GatewayAPI Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.148335 4672 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.148342 4672 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.148350 4672 feature_gate.go:330] unrecognized feature gate: SignatureStores Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.148358 4672 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.148366 4672 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.148374 4672 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.148382 4672 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.148390 4672 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.148398 4672 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.148406 4672 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.148414 4672 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.148421 4672 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.148429 4672 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.148437 4672 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.148445 4672 feature_gate.go:330] unrecognized feature gate: PlatformOperators Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.148582 4672 flags.go:64] FLAG: --address="0.0.0.0" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.148598 4672 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.148614 4672 flags.go:64] FLAG: --anonymous-auth="true" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.148626 4672 flags.go:64] FLAG: --application-metrics-count-limit="100" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.148638 4672 flags.go:64] FLAG: --authentication-token-webhook="false" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.148647 4672 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.148659 4672 flags.go:64] FLAG: 
--authorization-mode="AlwaysAllow" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.148677 4672 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.148688 4672 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.148698 4672 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.148708 4672 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.148718 4672 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.148727 4672 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.148736 4672 flags.go:64] FLAG: --cgroup-root="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.148746 4672 flags.go:64] FLAG: --cgroups-per-qos="true" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.148756 4672 flags.go:64] FLAG: --client-ca-file="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.148765 4672 flags.go:64] FLAG: --cloud-config="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.148774 4672 flags.go:64] FLAG: --cloud-provider="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.148783 4672 flags.go:64] FLAG: --cluster-dns="[]" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.148796 4672 flags.go:64] FLAG: --cluster-domain="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.148804 4672 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.148814 4672 flags.go:64] FLAG: --config-dir="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.148823 4672 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.148833 4672 flags.go:64] FLAG: --container-log-max-files="5" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.148854 4672 flags.go:64] FLAG: --container-log-max-size="10Mi" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.148864 4672 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.148874 4672 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.148883 4672 flags.go:64] FLAG: --containerd-namespace="k8s.io" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.148892 4672 flags.go:64] FLAG: --contention-profiling="false" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.148901 4672 flags.go:64] FLAG: --cpu-cfs-quota="true" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.148910 4672 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.148919 4672 flags.go:64] FLAG: --cpu-manager-policy="none" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.148929 4672 flags.go:64] FLAG: --cpu-manager-policy-options="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.148940 4672 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.148949 4672 flags.go:64] FLAG: --enable-controller-attach-detach="true" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.148958 4672 flags.go:64] FLAG: --enable-debugging-handlers="true" Sep 30 12:21:49 crc 
kubenswrapper[4672]: I0930 12:21:49.148967 4672 flags.go:64] FLAG: --enable-load-reader="false" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.148976 4672 flags.go:64] FLAG: --enable-server="true" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.148985 4672 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.148998 4672 flags.go:64] FLAG: --event-burst="100" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.149008 4672 flags.go:64] FLAG: --event-qps="50" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.149017 4672 flags.go:64] FLAG: --event-storage-age-limit="default=0" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.149026 4672 flags.go:64] FLAG: --event-storage-event-limit="default=0" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.149036 4672 flags.go:64] FLAG: --eviction-hard="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.149047 4672 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.149056 4672 flags.go:64] FLAG: --eviction-minimum-reclaim="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.149066 4672 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.149076 4672 flags.go:64] FLAG: --eviction-soft="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.149085 4672 flags.go:64] FLAG: --eviction-soft-grace-period="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.149093 4672 flags.go:64] FLAG: --exit-on-lock-contention="false" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.149103 4672 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.149112 4672 flags.go:64] FLAG: --experimental-mounter-path="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.149121 4672 flags.go:64] FLAG: --fail-cgroupv1="false" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.149130 4672 flags.go:64] FLAG: --fail-swap-on="true" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.149139 4672 flags.go:64] FLAG: --feature-gates="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.149149 4672 flags.go:64] FLAG: --file-check-frequency="20s" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.149159 4672 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.149168 4672 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.149177 4672 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.149186 4672 flags.go:64] FLAG: --healthz-port="10248" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.149196 4672 flags.go:64] FLAG: --help="false" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.149206 4672 flags.go:64] FLAG: --hostname-override="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.149217 4672 flags.go:64] FLAG: --housekeeping-interval="10s" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.149233 4672 flags.go:64] FLAG: --http-check-frequency="20s" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.149258 4672 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.149304 4672 flags.go:64] FLAG: --image-credential-provider-config="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.149316 
4672 flags.go:64] FLAG: --image-gc-high-threshold="85" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.149327 4672 flags.go:64] FLAG: --image-gc-low-threshold="80" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.149338 4672 flags.go:64] FLAG: --image-service-endpoint="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.149349 4672 flags.go:64] FLAG: --kernel-memcg-notification="false" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.149360 4672 flags.go:64] FLAG: --kube-api-burst="100" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.149374 4672 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.149387 4672 flags.go:64] FLAG: --kube-api-qps="50" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.149395 4672 flags.go:64] FLAG: --kube-reserved="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.149405 4672 flags.go:64] FLAG: --kube-reserved-cgroup="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.149413 4672 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.149422 4672 flags.go:64] FLAG: --kubelet-cgroups="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.149431 4672 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.149442 4672 flags.go:64] FLAG: --lock-file="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.149455 4672 flags.go:64] FLAG: --log-cadvisor-usage="false" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.149465 4672 flags.go:64] FLAG: --log-flush-frequency="5s" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.149475 4672 flags.go:64] FLAG: --log-json-info-buffer-size="0" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.149489 4672 flags.go:64] FLAG: --log-json-split-stream="false" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.149499 4672 flags.go:64] FLAG: --log-text-info-buffer-size="0" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.149508 4672 flags.go:64] FLAG: --log-text-split-stream="false" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.149517 4672 flags.go:64] FLAG: --logging-format="text" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.149526 4672 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.149536 4672 flags.go:64] FLAG: --make-iptables-util-chains="true" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.149545 4672 flags.go:64] FLAG: --manifest-url="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.149554 4672 flags.go:64] FLAG: --manifest-url-header="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.149566 4672 flags.go:64] FLAG: --max-housekeeping-interval="15s" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.149576 4672 flags.go:64] FLAG: --max-open-files="1000000" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.149587 4672 flags.go:64] FLAG: --max-pods="110" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.149597 4672 flags.go:64] FLAG: --maximum-dead-containers="-1" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.149606 4672 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.149615 4672 flags.go:64] FLAG: --memory-manager-policy="None" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.149624 4672 flags.go:64] 
FLAG: --minimum-container-ttl-duration="6m0s" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.149633 4672 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.149644 4672 flags.go:64] FLAG: --node-ip="192.168.126.11" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.149653 4672 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.149673 4672 flags.go:64] FLAG: --node-status-max-images="50" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.149682 4672 flags.go:64] FLAG: --node-status-update-frequency="10s" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.149691 4672 flags.go:64] FLAG: --oom-score-adj="-999" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.149706 4672 flags.go:64] FLAG: --pod-cidr="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.149715 4672 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.149730 4672 flags.go:64] FLAG: --pod-manifest-path="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.149739 4672 flags.go:64] FLAG: --pod-max-pids="-1" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.149748 4672 flags.go:64] FLAG: --pods-per-core="0" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.149757 4672 flags.go:64] FLAG: --port="10250" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.149766 4672 flags.go:64] FLAG: --protect-kernel-defaults="false" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.149775 4672 flags.go:64] FLAG: --provider-id="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.149788 4672 flags.go:64] FLAG: --qos-reserved="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.149797 4672 flags.go:64] FLAG: --read-only-port="10255" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.149807 4672 flags.go:64] FLAG: --register-node="true" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.149816 4672 flags.go:64] FLAG: --register-schedulable="true" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.149826 4672 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.149841 4672 flags.go:64] FLAG: --registry-burst="10" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.149850 4672 flags.go:64] FLAG: --registry-qps="5" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.149859 4672 flags.go:64] FLAG: --reserved-cpus="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.149868 4672 flags.go:64] FLAG: --reserved-memory="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.149880 4672 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.149889 4672 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.149897 4672 flags.go:64] FLAG: --rotate-certificates="false" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.149906 4672 flags.go:64] FLAG: --rotate-server-certificates="false" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.149915 4672 flags.go:64] FLAG: --runonce="false" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.149924 4672 flags.go:64] FLAG: 
--runtime-cgroups="/system.slice/crio.service" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.149933 4672 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.149942 4672 flags.go:64] FLAG: --seccomp-default="false" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.149951 4672 flags.go:64] FLAG: --serialize-image-pulls="true" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.149960 4672 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.149969 4672 flags.go:64] FLAG: --storage-driver-db="cadvisor" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.149978 4672 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.149987 4672 flags.go:64] FLAG: --storage-driver-password="root" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.149996 4672 flags.go:64] FLAG: --storage-driver-secure="false" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.150006 4672 flags.go:64] FLAG: --storage-driver-table="stats" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.150019 4672 flags.go:64] FLAG: --storage-driver-user="root" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.150028 4672 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.150037 4672 flags.go:64] FLAG: --sync-frequency="1m0s" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.150046 4672 flags.go:64] FLAG: --system-cgroups="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.150055 4672 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.150068 4672 flags.go:64] FLAG: --system-reserved-cgroup="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.150077 4672 flags.go:64] FLAG: --tls-cert-file="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.150085 4672 flags.go:64] FLAG: --tls-cipher-suites="[]" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.150100 4672 flags.go:64] FLAG: --tls-min-version="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.150109 4672 flags.go:64] FLAG: --tls-private-key-file="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.150117 4672 flags.go:64] FLAG: --topology-manager-policy="none" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.150126 4672 flags.go:64] FLAG: --topology-manager-policy-options="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.150134 4672 flags.go:64] FLAG: --topology-manager-scope="container" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.150143 4672 flags.go:64] FLAG: --v="2" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.150155 4672 flags.go:64] FLAG: --version="false" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.150166 4672 flags.go:64] FLAG: --vmodule="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.150179 4672 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.150188 4672 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.150430 4672 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.150441 4672 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Sep 30 12:21:49 crc kubenswrapper[4672]: 
W0930 12:21:49.150450 4672 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.150458 4672 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.150466 4672 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.150474 4672 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.150482 4672 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.150490 4672 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.150497 4672 feature_gate.go:330] unrecognized feature gate: NewOLM Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.150505 4672 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.150512 4672 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.150520 4672 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.150529 4672 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.150537 4672 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.150548 4672 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.150556 4672 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.150563 4672 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.150572 4672 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.150579 4672 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.150587 4672 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.150594 4672 feature_gate.go:330] unrecognized feature gate: PinnedImages Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.150602 4672 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.150614 4672 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.150621 4672 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.150629 4672 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.150636 4672 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.150644 4672 feature_gate.go:330] unrecognized feature gate: Example Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.150653 4672 feature_gate.go:330] unrecognized feature gate: SignatureStores Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.150661 4672 
feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.150669 4672 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.150676 4672 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.150684 4672 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.150691 4672 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.150702 4672 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.150714 4672 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.150723 4672 feature_gate.go:330] unrecognized feature gate: OVNObservability Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.150731 4672 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.150739 4672 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.150747 4672 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.150755 4672 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.150762 4672 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.150770 4672 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.150777 4672 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.150786 4672 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.150793 4672 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.150801 4672 feature_gate.go:330] unrecognized feature gate: InsightsConfig Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.150811 4672 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.150818 4672 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.150826 4672 feature_gate.go:330] unrecognized feature gate: GatewayAPI Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.150834 4672 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.150842 4672 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.150852 4672 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
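[Note] The flags.go:64 dump above is the kubelet logging every command-line flag value before the config file is applied, so it doubles as a record of the exact invocation (note --config=/etc/kubernetes/kubelet.conf, --node-ip=192.168.126.11, and the control-plane taint in --register-with-taints). A sketch for folding that dump back into a dictionary, under the same hypothetical kubelet.log assumption:

    import re

    # Non-greedy value match stops at the closing quote of each FLAG entry.
    FLAG = re.compile(r'FLAG: (--[\w-]+)="(.*?)"')

    flags = {}
    with open("kubelet.log", encoding="utf-8") as fh:
        for line in fh:
            for m in FLAG.finditer(line):
                flags[m.group(1)] = m.group(2)

    print(flags.get("--config"))                 # /etc/kubernetes/kubelet.conf
    print(flags.get("--register-with-taints"))   # node-role.kubernetes.io/master=:NoSchedule

Note also that the same block of feature-gate warnings repeats: the kubelet assembles its gate set more than once during startup (flag parsing, then server construction), and each pass re-logs every unrecognized gate under a fresh timestamp.
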
Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.150861 4672 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.150870 4672 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.150886 4672 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.150895 4672 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.150902 4672 feature_gate.go:330] unrecognized feature gate: PlatformOperators Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.150912 4672 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.150921 4672 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.150929 4672 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.150937 4672 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.150944 4672 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.150954 4672 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.150963 4672 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.150971 4672 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.150979 4672 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.150987 4672 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.150994 4672 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.151002 4672 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.151010 4672 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.151018 4672 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.151044 4672 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.168677 4672 server.go:491] "Kubelet version" kubeletVersion="v1.31.5" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.169226 4672 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" 
GOTRACEBACK="" Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.169450 4672 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.169481 4672 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.169494 4672 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.169506 4672 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.169517 4672 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.169528 4672 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.169543 4672 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.169556 4672 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.169569 4672 feature_gate.go:330] unrecognized feature gate: Example Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.169580 4672 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.169591 4672 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.169601 4672 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.169611 4672 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.169622 4672 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.169632 4672 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.169645 4672 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.169661 4672 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.169672 4672 feature_gate.go:330] unrecognized feature gate: PlatformOperators Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.169682 4672 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.169692 4672 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.169702 4672 feature_gate.go:330] unrecognized feature gate: PinnedImages Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.169712 4672 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.169722 4672 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.169731 4672 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.169745 4672 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. 
It will be removed in a future release. Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.169755 4672 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.169765 4672 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.169774 4672 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.169785 4672 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.169795 4672 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.169805 4672 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.169815 4672 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.169825 4672 feature_gate.go:330] unrecognized feature gate: OVNObservability Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.169834 4672 feature_gate.go:330] unrecognized feature gate: GatewayAPI Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.169862 4672 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.169872 4672 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.169883 4672 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.169892 4672 feature_gate.go:330] unrecognized feature gate: NewOLM Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.169902 4672 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.169912 4672 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.169922 4672 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.169933 4672 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.169942 4672 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.169952 4672 feature_gate.go:330] unrecognized feature gate: InsightsConfig Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.169962 4672 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.169972 4672 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.169981 4672 feature_gate.go:330] unrecognized feature gate: SignatureStores Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.169991 4672 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.170002 4672 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.170011 4672 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.170021 4672 feature_gate.go:330] unrecognized feature gate: 
UpgradeStatus Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.170031 4672 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.170040 4672 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.170050 4672 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.170060 4672 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.170073 4672 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.170085 4672 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.170096 4672 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.170106 4672 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.170116 4672 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.170127 4672 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.170137 4672 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.170147 4672 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.170157 4672 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.170168 4672 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.170177 4672 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.170188 4672 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.170198 4672 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.170208 4672 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.170217 4672 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.170244 4672 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.170291 4672 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.170637 4672 feature_gate.go:330] unrecognized feature gate: 
VSphereMultiVCenters Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.170673 4672 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.170685 4672 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.170695 4672 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.170706 4672 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.170718 4672 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.170730 4672 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.170740 4672 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.170751 4672 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.170778 4672 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.170788 4672 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.170798 4672 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.170808 4672 feature_gate.go:330] unrecognized feature gate: InsightsConfig Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.170817 4672 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.170827 4672 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.170837 4672 feature_gate.go:330] unrecognized feature gate: PlatformOperators Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.170846 4672 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.170856 4672 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.170866 4672 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.170876 4672 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.170885 4672 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.170896 4672 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.170906 4672 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.170918 4672 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.170928 4672 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.170938 4672 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.170947 4672 
feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.170957 4672 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.170966 4672 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.170977 4672 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.170987 4672 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.170997 4672 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.171006 4672 feature_gate.go:330] unrecognized feature gate: OVNObservability Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.171016 4672 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.171055 4672 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.171071 4672 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.171082 4672 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.171092 4672 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.171102 4672 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.171114 4672 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.171126 4672 feature_gate.go:330] unrecognized feature gate: GatewayAPI Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.171136 4672 feature_gate.go:330] unrecognized feature gate: Example Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.171145 4672 feature_gate.go:330] unrecognized feature gate: SignatureStores Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.171156 4672 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.171166 4672 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.171176 4672 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.171185 4672 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.171195 4672 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.171245 4672 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.171257 4672 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.171297 4672 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.171307 4672 feature_gate.go:330] unrecognized feature gate: 
SetEIPForNLBIngressController Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.171317 4672 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.171327 4672 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.171338 4672 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.171347 4672 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.171357 4672 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.171367 4672 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.171377 4672 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.171386 4672 feature_gate.go:330] unrecognized feature gate: NewOLM Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.171395 4672 feature_gate.go:330] unrecognized feature gate: PinnedImages Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.171405 4672 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.171418 4672 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.171431 4672 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.171443 4672 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
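[Note] Each pass closes with a feature_gate.go:386 summary, and the Go map dump it contains ({map[CloudDualStackNodeIPs:true ... ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}) is the effective gate set the kubelet runs with; all three passes in this boot print the identical map. A minimal sketch for turning that one line into structured data:

    import re

    SUMMARY = re.compile(r"feature gates: \{map\[([^\]]*)\]\}")

    def parse_gates(line: str) -> dict:
        """Parse a klog 'feature gates: {map[Name:bool ...]}' summary line."""
        m = SUMMARY.search(line)
        if not m:
            raise ValueError("not a feature-gate summary line")
        return {name: value == "true"
                for name, value in (pair.split(":") for pair in m.group(1).split())}

    sample = ("I0930 12:21:49.171547 4672 feature_gate.go:386] feature gates: "
              "{map[CloudDualStackNodeIPs:true KMSv1:true NodeSwap:false]}")
    assert parse_gates(sample) == {"CloudDualStackNodeIPs": True, "KMSv1": True, "NodeSwap": False}
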
Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.171456 4672 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.171469 4672 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.171479 4672 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.171491 4672 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.171501 4672 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.171530 4672 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.171547 4672 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.172769 4672 server.go:940] "Client rotation is on, will bootstrap in background" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.178795 4672 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.178956 4672 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". 
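The long run of "unrecognized feature gate" warnings above is typically benign on an OpenShift/CRC node: the rendered kubelet config carries cluster-level gate names that the kubelet's own registry does not know, so they are warned about and skipped, while registry-known GA or deprecated gates that are still set explicitly (CloudDualStackNodeIPs, KMSv1, ValidatingAdmissionPolicy) produce the "will be removed in a future release" warnings before the final resolved map is logged. A minimal Go sketch of that warn-on-unknown resolution pattern; the names here (gateSpec, knownGates, apply) are hypothetical, not the kubelet's actual feature_gate.go internals:

```go
package main

import "log"

// gateSpec marks how a known gate is registered (hypothetical shape).
type gateSpec struct {
	def        bool
	ga         bool // GA gates warn when set explicitly
	deprecated bool // deprecated gates warn too
}

var knownGates = map[string]gateSpec{
	"CloudDualStackNodeIPs":                  {def: true, ga: true},
	"DisableKubeletCloudCredentialProviders": {def: true, ga: true},
	"KMSv1":                                  {deprecated: true},
	"NodeSwap":                               {},
}

// apply resolves a requested gate map against the registry, warning on
// unknown names and on explicit GA/deprecated settings, as in the log.
func apply(requested map[string]bool) map[string]bool {
	resolved := map[string]bool{}
	for name, spec := range knownGates {
		resolved[name] = spec.def
	}
	for name, val := range requested {
		spec, ok := knownGates[name]
		if !ok {
			log.Printf("W: unrecognized feature gate: %s", name)
			continue
		}
		switch {
		case spec.ga:
			log.Printf("W: Setting GA feature gate %s=%t. It will be removed in a future release.", name, val)
		case spec.deprecated:
			log.Printf("W: Setting deprecated feature gate %s=%t. It will be removed in a future release.", name, val)
		}
		resolved[name] = val
	}
	return resolved
}

func main() {
	gates := apply(map[string]bool{
		"GatewayAPI":            true, // not in this kubelet's registry -> warning, ignored
		"CloudDualStackNodeIPs": true, // GA -> warning, still applied
	})
	log.Printf("I: feature gates: %v", gates)
}
```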
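The client-certificate entries that follow ("Starting client certificate rotation", a rotation deadline months before the 2026-02-24 expiry, "Waiting 1244h...") show how the certificate manager schedules renewal: it picks a jittered point inside the certificate's validity window (upstream client-go uses roughly 70-90% of the total lifetime) and sleeps until then. A sketch of that computation under those assumptions; only the expiration below comes from the log, the issue time and the exact jitter fractions are assumed:

```go
package main

import (
	"fmt"
	"math/rand"
	"time"
)

// nextRotationDeadline picks a uniformly random point between 70% and
// 90% of the certificate's total lifetime, measured from NotBefore.
// The 0.7-0.9 window mirrors upstream behavior but is an assumption here.
func nextRotationDeadline(notBefore, notAfter time.Time) time.Time {
	total := notAfter.Sub(notBefore)
	jittered := time.Duration(float64(total) * (0.7 + 0.2*rand.Float64()))
	return notBefore.Add(jittered)
}

func main() {
	notBefore := time.Date(2025, 2, 24, 5, 52, 8, 0, time.UTC) // assumed issue time
	notAfter := time.Date(2026, 2, 24, 5, 52, 8, 0, time.UTC)  // expiration from the log
	deadline := nextRotationDeadline(notBefore, notAfter)
	fmt.Printf("rotation deadline is %s\n", deadline)
	fmt.Printf("Waiting %s for next certificate rotation\n", time.Until(deadline))
}
```

The jitter is why two certs with identical lifetimes rotate at different wall-clock times: it spreads renewal load on the signer instead of stampeding it.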
Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.180791 4672 server.go:997] "Starting client certificate rotation" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.180833 4672 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.182612 4672 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2025-11-21 08:37:18.38734193 +0000 UTC Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.182698 4672 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 1244h15m29.204648263s for next certificate rotation Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.210480 4672 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.213563 4672 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.232423 4672 log.go:25] "Validated CRI v1 runtime API" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.276901 4672 log.go:25] "Validated CRI v1 image API" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.279763 4672 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.284416 4672 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2025-09-30-12-16-58-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.284450 4672 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}] Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.315988 4672 manager.go:217] Machine: {Timestamp:2025-09-30 12:21:49.312565006 +0000 UTC m=+0.581802732 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654128640 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:9545f671-f742-45e6-b9f7-3b3404b22825 BootID:a240ac94-c7cd-47b9-85fc-82a9db2c4d67 Filesystems:[{Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108170 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 
Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365412864 Type:vfs Inodes:821634 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:47:4e:1a Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:47:4e:1a Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:eb:fd:bc Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:37:22:83 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:60:cf:a7 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:fe:b1:3d Speed:-1 Mtu:1496} {Name:eth10 MacAddress:56:ed:f7:6e:27:bd Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:5e:e6:54:b2:d5:4a Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654128640 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: 
DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.316810 4672 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.317176 4672 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.317675 4672 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.318016 4672 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.318128 4672 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.318550 4672 topology_manager.go:138] "Creating topology manager with none policy" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.318628 4672 container_manager_linux.go:303] "Creating device plugin manager" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.320011 4672 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.320111 4672 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Sep 30 
12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.320506 4672 state_mem.go:36] "Initialized new in-memory state store"
Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.320683 4672 server.go:1245] "Using root directory" path="/var/lib/kubelet"
Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.323907 4672 kubelet.go:418] "Attempting to sync node with API server"
Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.324008 4672 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests"
Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.324093 4672 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.324167 4672 kubelet.go:324] "Adding apiserver pod source"
Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.324237 4672 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.328456 4672 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1"
Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.329536 4672 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem".
Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.331821 4672 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.241:6443: connect: connection refused
Sep 30 12:21:49 crc kubenswrapper[4672]: E0930 12:21:49.331967 4672 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.241:6443: connect: connection refused" logger="UnhandledError"
Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.332044 4672 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.241:6443: connect: connection refused
Sep 30 12:21:49 crc kubenswrapper[4672]: E0930 12:21:49.332142 4672 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.241:6443: connect: connection refused" logger="UnhandledError"
Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.332571 4672 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.334860 4672 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.334965 4672 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.335044 4672 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.335110 4672 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.335182 4672 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.335259 4672 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.335351 4672 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.335423 4672 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.335501 4672 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.335573 4672 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.335642 4672 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.335712 4672 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.336687 4672 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.337301 4672 server.go:1280] "Started kubelet"
Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.338498 4672 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.338491 4672 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.338977 4672 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.241:6443: connect: connection refused
Sep 30 12:21:49 crc systemd[1]: Started Kubernetes Kubelet.
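Every reflector and CSINode failure above ends in "dial tcp 38.102.83.241:6443: connect: connection refused": the kubelet is up before the kube-apiserver static pod is listening on :6443, and client-go keeps retrying with backoff until the socket opens, so these errors are transient during boot. A stdlib-only sketch of the same wait-for-listener pattern; the helper name and backoff constants are illustrative, not client-go's actual values:

```go
package main

import (
	"fmt"
	"net"
	"time"
)

// waitForListener dials addr until something accepts the TCP connection
// or the deadline passes. "connection refused" just means no process is
// listening yet (here: the kube-apiserver has not started serving).
func waitForListener(addr string, timeout time.Duration) error {
	deadline := time.Now().Add(timeout)
	backoff := 200 * time.Millisecond
	for {
		conn, err := net.DialTimeout("tcp", addr, 2*time.Second)
		if err == nil {
			conn.Close()
			return nil
		}
		if time.Now().After(deadline) {
			return fmt.Errorf("giving up on %s: %w", addr, err)
		}
		fmt.Printf("retrying %s in %s: %v\n", addr, backoff, err)
		time.Sleep(backoff)
		if backoff < 5*time.Second {
			backoff *= 2 // exponential backoff, capped
		}
	}
}

func main() {
	if err := waitForListener("api-int.crc.testing:6443", time.Minute); err != nil {
		fmt.Println(err)
	}
}
```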
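The long run of reconstruct.go:130 entries a little further below ("Volume is marked as uncertain and added into the actual state") is the volume manager rebuilding its view after the restart: it walks the pod directories under /var/lib/kubelet/pods and re-registers every volume it finds as "uncertain" until a later sync verifies the mount. A rough sketch of that shape; the types and names here are hypothetical stand-ins, not the kubelet's actual reconstruction code:

```go
package main

import "fmt"

type volumeState int

const (
	stateUncertain volumeState = iota // found on disk, not yet verified
	stateMounted                      // verified by a later sync
)

type reconstructedVolume struct {
	podUID     string
	volumeName string
	state      volumeState
}

// reconstruct re-populates the actual state of world from volume
// directories found on disk; nothing is trusted as mounted until
// re-verified, hence every entry starts out uncertain.
func reconstruct(found map[string][]string) []reconstructedVolume {
	var actual []reconstructedVolume
	for podUID, volumes := range found {
		for _, v := range volumes {
			fmt.Printf("Volume is marked as uncertain and added into the actual state: pod=%s volume=%s\n", podUID, v)
			actual = append(actual, reconstructedVolume{podUID, v, stateUncertain})
		}
	}
	return actual
}

func main() {
	// One pod UID taken from the log; the volume name is abbreviated.
	reconstruct(map[string][]string{
		"fda69060-fa79-4696-b1a6-7980f124bf7c": {"kubernetes.io/configmap/...-mcd-auth-proxy-config"},
	})
}
```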
Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.340206 4672 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.346331 4672 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.346407 4672 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.347195 4672 volume_manager.go:287] "The desired_state_of_world populator starts" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.347288 4672 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-23 17:04:11.046795253 +0000 UTC Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.347430 4672 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 1300h42m21.699370491s for next certificate rotation Sep 30 12:21:49 crc kubenswrapper[4672]: E0930 12:21:49.347427 4672 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.347362 4672 volume_manager.go:289] "Starting Kubelet Volume Manager" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.347654 4672 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Sep 30 12:21:49 crc kubenswrapper[4672]: E0930 12:21:49.348149 4672 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.241:6443: connect: connection refused" interval="200ms" Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.349708 4672 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.241:6443: connect: connection refused Sep 30 12:21:49 crc kubenswrapper[4672]: E0930 12:21:49.349855 4672 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.241:6443: connect: connection refused" logger="UnhandledError" Sep 30 12:21:49 crc kubenswrapper[4672]: E0930 12:21:49.352945 4672 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.241:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.186a0ed447836a71 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-09-30 12:21:49.337250417 +0000 UTC m=+0.606488083,LastTimestamp:2025-09-30 12:21:49.337250417 +0000 UTC m=+0.606488083,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.355145 4672 factory.go:219] Registration of the containerd container factory failed: unable to create 
containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.355197 4672 factory.go:55] Registering systemd factory Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.355213 4672 factory.go:221] Registration of the systemd container factory successfully Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.355249 4672 server.go:460] "Adding debug handlers to kubelet server" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.355807 4672 factory.go:153] Registering CRI-O factory Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.355871 4672 factory.go:221] Registration of the crio container factory successfully Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.355954 4672 factory.go:103] Registering Raw factory Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.355992 4672 manager.go:1196] Started watching for new ooms in manager Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.357755 4672 manager.go:319] Starting recovery of all containers Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.368250 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.368345 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.368365 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.368384 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.368401 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.370295 4672 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.370336 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 
12:21:49.370389 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.370412 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.370482 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.370507 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.370529 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.370575 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.370593 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.370672 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.370691 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.370739 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.370774 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.370819 4672 reconstruct.go:130] "Volume 
is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.370839 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.370857 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.370927 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.370955 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.371000 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.371018 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.371036 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.371079 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.371102 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.371122 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.371168 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.371190 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.371208 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.371253 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.371302 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.371319 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.371336 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.371381 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.371403 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.371420 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.371469 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.371493 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" 
volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.371513 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.371559 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.371582 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.371599 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.371647 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.371666 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.371682 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.371731 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.371751 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.371769 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.371785 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" 
volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.371832 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.371856 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.371905 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.371929 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.371948 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.371964 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.372011 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.372028 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.372044 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.372086 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.372104 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" 
volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.372120 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.372138 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.372183 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.372203 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.372219 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.372258 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.372293 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.372310 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.372357 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.372375 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.372393 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" 
volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.372436 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.372454 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.372479 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.372522 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.372548 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.372580 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.372631 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.372652 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.372669 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.372715 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.372733 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" 
volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.372750 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.372797 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.372818 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.372835 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.372880 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.372899 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.372916 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.372934 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.372984 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.373000 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.373018 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" 
volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.373109 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.373125 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.373174 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.373192 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.373208 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.373250 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.373301 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.373320 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.373342 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.373394 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.373417 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" 
volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.373435 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.373485 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.373507 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.373554 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.373577 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.373600 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.373646 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.373668 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.373686 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.373734 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.373753 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" 
volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.373769 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.373841 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.373861 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.373909 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.373930 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.373946 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.373992 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.374014 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.374031 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.374073 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.374097 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" 
volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.374114 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.374134 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.374151 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.374194 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.374212 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.374234 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.374252 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.374304 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.374323 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.374342 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.374359 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" 
volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.374378 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.374397 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.374444 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.374463 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.374485 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.374539 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.374558 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.374602 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.374621 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.374642 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.374690 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" 
volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.374708 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.374726 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.374744 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.374794 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.374811 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.374830 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.374880 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.374897 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.374943 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.374967 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.374986 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" 
volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.375031 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.375050 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.375067 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.375083 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.375135 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.375154 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.375198 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.375222 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.375253 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.375305 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.375326 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" 
volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.375344 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.375389 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.375412 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.375433 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.375496 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.375520 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.375537 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.375583 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.375602 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.375620 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.375637 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.375683 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.375704 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.375721 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.375766 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.375783 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.375798 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.375841 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.375860 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.375880 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.375933 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.375952 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.375970 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.375987 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.376033 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.376052 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.376068 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.376116 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.376139 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.376157 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.376203 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.376225 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.376243 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.376483 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.376513 4672 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.376532 4672 reconstruct.go:97] "Volume reconstruction finished" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.376664 4672 reconciler.go:26] "Reconciler: start to sync state" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.383780 4672 manager.go:324] Recovery completed Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.398574 4672 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.402695 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.402942 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.403052 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.406069 4672 cpu_manager.go:225] "Starting CPU manager" policy="none" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.406098 4672 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.406121 4672 state_mem.go:36] "Initialized new in-memory state store" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.413681 4672 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.415553 4672 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.415615 4672 status_manager.go:217] "Starting to sync pod status with apiserver" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.415732 4672 kubelet.go:2335] "Starting kubelet main sync loop" Sep 30 12:21:49 crc kubenswrapper[4672]: E0930 12:21:49.415809 4672 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.417547 4672 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.241:6443: connect: connection refused Sep 30 12:21:49 crc kubenswrapper[4672]: E0930 12:21:49.417603 4672 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.241:6443: connect: connection refused" logger="UnhandledError" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.425216 4672 policy_none.go:49] "None policy: Start" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.426255 4672 memory_manager.go:170] "Starting memorymanager" policy="None" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.426316 4672 state_mem.go:35] "Initializing new in-memory state store" Sep 30 12:21:49 crc kubenswrapper[4672]: E0930 12:21:49.448367 4672 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.482738 4672 manager.go:334] "Starting Device Plugin manager" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.482835 4672 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.482859 4672 server.go:79] "Starting device plugin registration server" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.483629 4672 eviction_manager.go:189] "Eviction manager: starting control loop" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.483730 4672 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.484208 4672 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.484391 4672 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.484428 4672 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Sep 30 12:21:49 crc kubenswrapper[4672]: E0930 12:21:49.497089 4672 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.516343 4672 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc","openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc"] Sep 30 12:21:49 crc kubenswrapper[4672]: 
Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.516457 4672 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.517826 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.517853 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.517915 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.518012 4672 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.518175 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.518217 4672 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.518787 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.518816 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.518827 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.518959 4672 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.518995 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.519041 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.519067 4672 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.519042 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.519150 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.520007 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.520036 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.520047 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.520014 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.520094 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.520110 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.520318 4672 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.520562 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc"
Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.520602 4672 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.521574 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.521609 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.521625 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.521787 4672 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.521876 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.521911 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.521922 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.522077 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.522217 4672 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.523297 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.523337 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.523353 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.523549 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.523580 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.523599 4672 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.523606 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.523651 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.524490 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.524546 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.524566 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 12:21:49 crc kubenswrapper[4672]: E0930 12:21:49.549126 4672 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.241:6443: connect: connection refused" interval="400ms"
Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.580255 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.580344 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.580393 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
\"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.580413 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.580488 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.580628 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.580726 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.580792 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.580849 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.580887 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.580974 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.581048 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") 
" pod="openshift-etcd/etcd-crc" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.581099 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.581158 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.581218 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.585103 4672 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.587334 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.587397 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.587419 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.587467 4672 kubelet_node_status.go:76] "Attempting to register node" node="crc" Sep 30 12:21:49 crc kubenswrapper[4672]: E0930 12:21:49.588114 4672 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.241:6443: connect: connection refused" node="crc" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.683161 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.683323 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.683488 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.683520 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.683507 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.683581 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.683627 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.683626 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.683672 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.683706 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.683774 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.683816 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.683923 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.683851 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") 
pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.683925 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.683833 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.684025 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.684065 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.684094 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.684126 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.684131 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.684164 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.684194 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.684193 4672 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.684231 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.684183 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.684300 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.684303 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.684377 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.684417 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.788888 4672 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.790855 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.790916 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.790929 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.790964 4672 kubelet_node_status.go:76] "Attempting to register node" node="crc" Sep 30 12:21:49 crc kubenswrapper[4672]: E0930 12:21:49.791760 4672 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.241:6443: connect: connection 
refused" node="crc" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.844408 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.850411 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.867823 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.892963 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 30 12:21:49 crc kubenswrapper[4672]: I0930 12:21:49.896280 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.906921 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-c8caeac1f85f49fcafe94d5a80e27c11af6ec463692027e324720a83d737684d WatchSource:0}: Error finding container c8caeac1f85f49fcafe94d5a80e27c11af6ec463692027e324720a83d737684d: Status 404 returned error can't find the container with id c8caeac1f85f49fcafe94d5a80e27c11af6ec463692027e324720a83d737684d Sep 30 12:21:49 crc kubenswrapper[4672]: W0930 12:21:49.911004 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-7d3889746046bed9347728d73f97f8bd75e26edc814842b0df5826766d0ea219 WatchSource:0}: Error finding container 7d3889746046bed9347728d73f97f8bd75e26edc814842b0df5826766d0ea219: Status 404 returned error can't find the container with id 7d3889746046bed9347728d73f97f8bd75e26edc814842b0df5826766d0ea219 Sep 30 12:21:49 crc kubenswrapper[4672]: E0930 12:21:49.950890 4672 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.241:6443: connect: connection refused" interval="800ms" Sep 30 12:21:50 crc kubenswrapper[4672]: I0930 12:21:50.192031 4672 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 12:21:50 crc kubenswrapper[4672]: I0930 12:21:50.194115 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:21:50 crc kubenswrapper[4672]: I0930 12:21:50.194171 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:21:50 crc kubenswrapper[4672]: I0930 12:21:50.194189 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:21:50 crc kubenswrapper[4672]: I0930 12:21:50.194233 4672 kubelet_node_status.go:76] "Attempting to register node" node="crc" Sep 30 12:21:50 crc kubenswrapper[4672]: E0930 12:21:50.194629 4672 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.241:6443: connect: connection refused" node="crc" Sep 30 12:21:50 crc kubenswrapper[4672]: W0930 12:21:50.290194 4672 
reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.241:6443: connect: connection refused Sep 30 12:21:50 crc kubenswrapper[4672]: E0930 12:21:50.290364 4672 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.241:6443: connect: connection refused" logger="UnhandledError" Sep 30 12:21:50 crc kubenswrapper[4672]: I0930 12:21:50.340679 4672 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.241:6443: connect: connection refused Sep 30 12:21:50 crc kubenswrapper[4672]: I0930 12:21:50.427436 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"ca1b3d80f72fc473f6f53cc1efcd845ad9b031d7162f2b7dd461149d9860097b"} Sep 30 12:21:50 crc kubenswrapper[4672]: I0930 12:21:50.429078 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"805ed095385b798d73ae30f16db3b7112c96b1054e6593608725b14ceeb94c38"} Sep 30 12:21:50 crc kubenswrapper[4672]: I0930 12:21:50.430515 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"7d3889746046bed9347728d73f97f8bd75e26edc814842b0df5826766d0ea219"} Sep 30 12:21:50 crc kubenswrapper[4672]: W0930 12:21:50.431569 4672 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.241:6443: connect: connection refused Sep 30 12:21:50 crc kubenswrapper[4672]: E0930 12:21:50.431713 4672 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.241:6443: connect: connection refused" logger="UnhandledError" Sep 30 12:21:50 crc kubenswrapper[4672]: I0930 12:21:50.432782 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"c8caeac1f85f49fcafe94d5a80e27c11af6ec463692027e324720a83d737684d"} Sep 30 12:21:50 crc kubenswrapper[4672]: I0930 12:21:50.435553 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"dce65bdbc0a84b1db4109c6f5a124b2b724743e6fb051483cef46dfd5c63c015"} Sep 30 12:21:50 crc kubenswrapper[4672]: E0930 12:21:50.752103 4672 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 
38.102.83.241:6443: connect: connection refused" interval="1.6s" Sep 30 12:21:50 crc kubenswrapper[4672]: W0930 12:21:50.793406 4672 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.241:6443: connect: connection refused Sep 30 12:21:50 crc kubenswrapper[4672]: E0930 12:21:50.793538 4672 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.241:6443: connect: connection refused" logger="UnhandledError" Sep 30 12:21:50 crc kubenswrapper[4672]: I0930 12:21:50.995113 4672 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 12:21:50 crc kubenswrapper[4672]: I0930 12:21:50.996473 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:21:50 crc kubenswrapper[4672]: I0930 12:21:50.996535 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:21:50 crc kubenswrapper[4672]: I0930 12:21:50.996555 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:21:50 crc kubenswrapper[4672]: I0930 12:21:50.996607 4672 kubelet_node_status.go:76] "Attempting to register node" node="crc" Sep 30 12:21:50 crc kubenswrapper[4672]: E0930 12:21:50.997337 4672 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.241:6443: connect: connection refused" node="crc" Sep 30 12:21:51 crc kubenswrapper[4672]: W0930 12:21:51.008889 4672 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.241:6443: connect: connection refused Sep 30 12:21:51 crc kubenswrapper[4672]: E0930 12:21:51.008976 4672 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.241:6443: connect: connection refused" logger="UnhandledError" Sep 30 12:21:51 crc kubenswrapper[4672]: I0930 12:21:51.340510 4672 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.241:6443: connect: connection refused Sep 30 12:21:51 crc kubenswrapper[4672]: I0930 12:21:51.441131 4672 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="184b9352d3463790085ac81457e5ad33d08ea4663346d96f4c4034dcca6a7dd5" exitCode=0 Sep 30 12:21:51 crc kubenswrapper[4672]: I0930 12:21:51.441224 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"184b9352d3463790085ac81457e5ad33d08ea4663346d96f4c4034dcca6a7dd5"} Sep 30 12:21:51 crc kubenswrapper[4672]: 
I0930 12:21:51.441358 4672 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 12:21:51 crc kubenswrapper[4672]: I0930 12:21:51.443120 4672 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="0a9e2b46f485ed7a73177e89960925d6b0f9945bf0b7b5604a5b5e84d40f74e3" exitCode=0 Sep 30 12:21:51 crc kubenswrapper[4672]: I0930 12:21:51.443196 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"0a9e2b46f485ed7a73177e89960925d6b0f9945bf0b7b5604a5b5e84d40f74e3"} Sep 30 12:21:51 crc kubenswrapper[4672]: I0930 12:21:51.443359 4672 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 12:21:51 crc kubenswrapper[4672]: I0930 12:21:51.444090 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:21:51 crc kubenswrapper[4672]: I0930 12:21:51.444128 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:21:51 crc kubenswrapper[4672]: I0930 12:21:51.444146 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:21:51 crc kubenswrapper[4672]: I0930 12:21:51.445851 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:21:51 crc kubenswrapper[4672]: I0930 12:21:51.445933 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:21:51 crc kubenswrapper[4672]: I0930 12:21:51.445962 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:21:51 crc kubenswrapper[4672]: I0930 12:21:51.448552 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"5082c3fa78648584ab9a29b4ab239ce0d52411666ca3fa598e9c6745b8ea067e"} Sep 30 12:21:51 crc kubenswrapper[4672]: I0930 12:21:51.448594 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"b656dc1ad8af099828eaaeb654e6d52dbbd057023dd07583a6114316670aa334"} Sep 30 12:21:51 crc kubenswrapper[4672]: I0930 12:21:51.448610 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"ab6412d7fdf1976c75ccfc72d05f00e2997d72f3971c73cd0887e038d1cbc059"} Sep 30 12:21:51 crc kubenswrapper[4672]: I0930 12:21:51.448630 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"5575589d0517d5a1c285c3e9e566fb5cd9b9591a1b311a4bb80e595e44044c33"} Sep 30 12:21:51 crc kubenswrapper[4672]: I0930 12:21:51.448567 4672 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 12:21:51 crc kubenswrapper[4672]: I0930 12:21:51.450465 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Sep 30 12:21:51 crc kubenswrapper[4672]: I0930 12:21:51.450504 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:21:51 crc kubenswrapper[4672]: I0930 12:21:51.450520 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:21:51 crc kubenswrapper[4672]: I0930 12:21:51.452956 4672 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="089787496c46eb514e279677b1853010b21eac240fb059e43866714d2a6920e1" exitCode=0 Sep 30 12:21:51 crc kubenswrapper[4672]: I0930 12:21:51.453079 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"089787496c46eb514e279677b1853010b21eac240fb059e43866714d2a6920e1"} Sep 30 12:21:51 crc kubenswrapper[4672]: I0930 12:21:51.453236 4672 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 12:21:51 crc kubenswrapper[4672]: I0930 12:21:51.454658 4672 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="27513f0766801d5155c6e5c376a2a52241030eca2171b3218710ca6ee006a0b0" exitCode=0 Sep 30 12:21:51 crc kubenswrapper[4672]: I0930 12:21:51.454704 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"27513f0766801d5155c6e5c376a2a52241030eca2171b3218710ca6ee006a0b0"} Sep 30 12:21:51 crc kubenswrapper[4672]: I0930 12:21:51.454901 4672 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 12:21:51 crc kubenswrapper[4672]: I0930 12:21:51.454950 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:21:51 crc kubenswrapper[4672]: I0930 12:21:51.455006 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:21:51 crc kubenswrapper[4672]: I0930 12:21:51.455032 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:21:51 crc kubenswrapper[4672]: I0930 12:21:51.455601 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:21:51 crc kubenswrapper[4672]: I0930 12:21:51.455624 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:21:51 crc kubenswrapper[4672]: I0930 12:21:51.455634 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:21:51 crc kubenswrapper[4672]: I0930 12:21:51.459444 4672 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 12:21:51 crc kubenswrapper[4672]: I0930 12:21:51.460134 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:21:51 crc kubenswrapper[4672]: I0930 12:21:51.460156 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:21:51 crc kubenswrapper[4672]: I0930 12:21:51.460166 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:21:52 crc 
kubenswrapper[4672]: W0930 12:21:52.287068 4672 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.241:6443: connect: connection refused Sep 30 12:21:52 crc kubenswrapper[4672]: E0930 12:21:52.287564 4672 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.241:6443: connect: connection refused" logger="UnhandledError" Sep 30 12:21:52 crc kubenswrapper[4672]: I0930 12:21:52.340404 4672 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.241:6443: connect: connection refused Sep 30 12:21:52 crc kubenswrapper[4672]: E0930 12:21:52.352767 4672 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.241:6443: connect: connection refused" interval="3.2s" Sep 30 12:21:52 crc kubenswrapper[4672]: I0930 12:21:52.459461 4672 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="0afdc82c21601c37c7d9b4897273ade1e8e35bd093d63df9a63e93a9c0c6228c" exitCode=0 Sep 30 12:21:52 crc kubenswrapper[4672]: I0930 12:21:52.459521 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"0afdc82c21601c37c7d9b4897273ade1e8e35bd093d63df9a63e93a9c0c6228c"} Sep 30 12:21:52 crc kubenswrapper[4672]: I0930 12:21:52.459685 4672 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 12:21:52 crc kubenswrapper[4672]: I0930 12:21:52.461158 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:21:52 crc kubenswrapper[4672]: I0930 12:21:52.461204 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:21:52 crc kubenswrapper[4672]: I0930 12:21:52.461225 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:21:52 crc kubenswrapper[4672]: I0930 12:21:52.461569 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"f2424d1bc8a625cebd192f83733dc4bc01056e6166c918d21ded15aa3843b34d"} Sep 30 12:21:52 crc kubenswrapper[4672]: I0930 12:21:52.461674 4672 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 12:21:52 crc kubenswrapper[4672]: I0930 12:21:52.462688 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:21:52 crc kubenswrapper[4672]: I0930 12:21:52.462717 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:21:52 crc kubenswrapper[4672]: I0930 12:21:52.462729 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Sep 30 12:21:52 crc kubenswrapper[4672]: I0930 12:21:52.464837 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"fb1c08c9fe886c57b5abe5091fa9845022bb33eefd97da1bf595d4b32016dfc4"} Sep 30 12:21:52 crc kubenswrapper[4672]: I0930 12:21:52.464895 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"c905db0424e34f487d2db657676de97a3e323ef4c9fbed5d25929164476f62bf"} Sep 30 12:21:52 crc kubenswrapper[4672]: I0930 12:21:52.464914 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"4f437372ebc379d35a2232ab47422ed6127cd29b0d752488dca512f67407d222"} Sep 30 12:21:52 crc kubenswrapper[4672]: I0930 12:21:52.464864 4672 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 12:21:52 crc kubenswrapper[4672]: I0930 12:21:52.466124 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:21:52 crc kubenswrapper[4672]: I0930 12:21:52.466156 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:21:52 crc kubenswrapper[4672]: I0930 12:21:52.466168 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:21:52 crc kubenswrapper[4672]: I0930 12:21:52.468256 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"2786191730ceba9bbcf35f5355098bc52b09c37d20b923f7c6c65fd1b584cea0"} Sep 30 12:21:52 crc kubenswrapper[4672]: I0930 12:21:52.468366 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"1e5b3d468173222d5ae851529dbc318b7d727ec4af468e4094ab08b4f29bf100"} Sep 30 12:21:52 crc kubenswrapper[4672]: I0930 12:21:52.468389 4672 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 12:21:52 crc kubenswrapper[4672]: I0930 12:21:52.468389 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"824f50fb36d4c4aee43cc92af10bc43c441f2cee0ee736431dfdc52d254b819a"} Sep 30 12:21:52 crc kubenswrapper[4672]: I0930 12:21:52.469474 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:21:52 crc kubenswrapper[4672]: I0930 12:21:52.469578 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:21:52 crc kubenswrapper[4672]: I0930 12:21:52.469740 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:21:52 crc kubenswrapper[4672]: I0930 12:21:52.597420 4672 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 12:21:52 crc kubenswrapper[4672]: I0930 12:21:52.598554 4672 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:21:52 crc kubenswrapper[4672]: I0930 12:21:52.598589 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:21:52 crc kubenswrapper[4672]: I0930 12:21:52.598599 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:21:52 crc kubenswrapper[4672]: I0930 12:21:52.598620 4672 kubelet_node_status.go:76] "Attempting to register node" node="crc" Sep 30 12:21:52 crc kubenswrapper[4672]: E0930 12:21:52.598986 4672 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.241:6443: connect: connection refused" node="crc" Sep 30 12:21:52 crc kubenswrapper[4672]: I0930 12:21:52.967548 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 30 12:21:53 crc kubenswrapper[4672]: I0930 12:21:53.339968 4672 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.241:6443: connect: connection refused Sep 30 12:21:53 crc kubenswrapper[4672]: W0930 12:21:53.382664 4672 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.241:6443: connect: connection refused Sep 30 12:21:53 crc kubenswrapper[4672]: E0930 12:21:53.382966 4672 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.241:6443: connect: connection refused" logger="UnhandledError" Sep 30 12:21:53 crc kubenswrapper[4672]: I0930 12:21:53.473207 4672 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="52fa5ee34bc8e9e381d4288d30fce5db76ed5bde6f199ec13fe443481ed4c8e8" exitCode=0 Sep 30 12:21:53 crc kubenswrapper[4672]: I0930 12:21:53.473318 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"52fa5ee34bc8e9e381d4288d30fce5db76ed5bde6f199ec13fe443481ed4c8e8"} Sep 30 12:21:53 crc kubenswrapper[4672]: I0930 12:21:53.473370 4672 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 12:21:53 crc kubenswrapper[4672]: I0930 12:21:53.474325 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:21:53 crc kubenswrapper[4672]: I0930 12:21:53.474361 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:21:53 crc kubenswrapper[4672]: I0930 12:21:53.474378 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:21:53 crc kubenswrapper[4672]: I0930 12:21:53.478853 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"20f71b069b915573d5cfcadeddd0be9f7c196c0aeafe8dbfe73424930432fa0c"} Sep 30 12:21:53 crc kubenswrapper[4672]: I0930 12:21:53.478889 4672 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 12:21:53 crc kubenswrapper[4672]: I0930 12:21:53.478970 4672 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 30 12:21:53 crc kubenswrapper[4672]: I0930 12:21:53.479009 4672 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 12:21:53 crc kubenswrapper[4672]: I0930 12:21:53.479034 4672 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 12:21:53 crc kubenswrapper[4672]: I0930 12:21:53.478889 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"8c3f5020d6f8fba5f79d7659bedba02479794acb1f5f3d219802966c781fd10d"} Sep 30 12:21:53 crc kubenswrapper[4672]: I0930 12:21:53.479034 4672 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 12:21:53 crc kubenswrapper[4672]: I0930 12:21:53.480704 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:21:53 crc kubenswrapper[4672]: I0930 12:21:53.480736 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:21:53 crc kubenswrapper[4672]: I0930 12:21:53.480747 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:21:53 crc kubenswrapper[4672]: I0930 12:21:53.480759 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:21:53 crc kubenswrapper[4672]: I0930 12:21:53.480784 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:21:53 crc kubenswrapper[4672]: I0930 12:21:53.480798 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:21:53 crc kubenswrapper[4672]: I0930 12:21:53.481021 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:21:53 crc kubenswrapper[4672]: I0930 12:21:53.481040 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:21:53 crc kubenswrapper[4672]: I0930 12:21:53.481048 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:21:53 crc kubenswrapper[4672]: I0930 12:21:53.481099 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:21:53 crc kubenswrapper[4672]: I0930 12:21:53.481129 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:21:53 crc kubenswrapper[4672]: I0930 12:21:53.481145 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:21:53 crc kubenswrapper[4672]: W0930 12:21:53.799802 4672 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get 
"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.241:6443: connect: connection refused Sep 30 12:21:53 crc kubenswrapper[4672]: E0930 12:21:53.799910 4672 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.241:6443: connect: connection refused" logger="UnhandledError" Sep 30 12:21:53 crc kubenswrapper[4672]: W0930 12:21:53.912180 4672 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.241:6443: connect: connection refused Sep 30 12:21:53 crc kubenswrapper[4672]: E0930 12:21:53.912350 4672 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.241:6443: connect: connection refused" logger="UnhandledError" Sep 30 12:21:54 crc kubenswrapper[4672]: I0930 12:21:54.482754 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Sep 30 12:21:54 crc kubenswrapper[4672]: I0930 12:21:54.487045 4672 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="20f71b069b915573d5cfcadeddd0be9f7c196c0aeafe8dbfe73424930432fa0c" exitCode=255 Sep 30 12:21:54 crc kubenswrapper[4672]: I0930 12:21:54.487098 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"20f71b069b915573d5cfcadeddd0be9f7c196c0aeafe8dbfe73424930432fa0c"} Sep 30 12:21:54 crc kubenswrapper[4672]: I0930 12:21:54.487242 4672 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 12:21:54 crc kubenswrapper[4672]: I0930 12:21:54.489458 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:21:54 crc kubenswrapper[4672]: I0930 12:21:54.489500 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:21:54 crc kubenswrapper[4672]: I0930 12:21:54.489517 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:21:54 crc kubenswrapper[4672]: I0930 12:21:54.490110 4672 scope.go:117] "RemoveContainer" containerID="20f71b069b915573d5cfcadeddd0be9f7c196c0aeafe8dbfe73424930432fa0c" Sep 30 12:21:54 crc kubenswrapper[4672]: I0930 12:21:54.494750 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"7db3193c2afadb91d3220e67720ff33bc959f9c50cec6f243144a6e2a2e57e04"} Sep 30 12:21:54 crc kubenswrapper[4672]: I0930 12:21:54.494793 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" 
event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"775be30891ac2947ef160e3fe75abad42b78fc39e35414f27756ce086fed4f7f"} Sep 30 12:21:55 crc kubenswrapper[4672]: I0930 12:21:55.395133 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 30 12:21:55 crc kubenswrapper[4672]: I0930 12:21:55.499101 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Sep 30 12:21:55 crc kubenswrapper[4672]: I0930 12:21:55.501490 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"a8d2b013e19f7ff8f3eb9c42338191b2da273ce85db8bcc6be8dd1f69e83c3be"} Sep 30 12:21:55 crc kubenswrapper[4672]: I0930 12:21:55.501597 4672 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 12:21:55 crc kubenswrapper[4672]: I0930 12:21:55.501691 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 30 12:21:55 crc kubenswrapper[4672]: I0930 12:21:55.502839 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:21:55 crc kubenswrapper[4672]: I0930 12:21:55.502923 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:21:55 crc kubenswrapper[4672]: I0930 12:21:55.502939 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:21:55 crc kubenswrapper[4672]: I0930 12:21:55.505906 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"c5f54820ce67d6e8f397d068e1fadcb4577d6685406812c4e29fe0274612697d"} Sep 30 12:21:55 crc kubenswrapper[4672]: I0930 12:21:55.505943 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"3b9741c5edaae364733b400bebf21a4674ec206194908ec6a942317235c62a14"} Sep 30 12:21:55 crc kubenswrapper[4672]: I0930 12:21:55.505958 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"1351a4e8380fea9edb834239053b57384a50add562afe59e67bbd7159bf1bf6a"} Sep 30 12:21:55 crc kubenswrapper[4672]: I0930 12:21:55.506365 4672 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 12:21:55 crc kubenswrapper[4672]: I0930 12:21:55.507509 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:21:55 crc kubenswrapper[4672]: I0930 12:21:55.507545 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:21:55 crc kubenswrapper[4672]: I0930 12:21:55.507560 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:21:55 crc kubenswrapper[4672]: I0930 12:21:55.799816 4672 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 12:21:55 crc kubenswrapper[4672]: I0930 12:21:55.802536 4672 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:21:55 crc kubenswrapper[4672]: I0930 12:21:55.802607 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:21:55 crc kubenswrapper[4672]: I0930 12:21:55.802651 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:21:55 crc kubenswrapper[4672]: I0930 12:21:55.802703 4672 kubelet_node_status.go:76] "Attempting to register node" node="crc" Sep 30 12:21:55 crc kubenswrapper[4672]: I0930 12:21:55.968653 4672 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Sep 30 12:21:55 crc kubenswrapper[4672]: I0930 12:21:55.968781 4672 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Sep 30 12:21:56 crc kubenswrapper[4672]: I0930 12:21:56.510516 4672 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 12:21:56 crc kubenswrapper[4672]: I0930 12:21:56.510643 4672 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 12:21:56 crc kubenswrapper[4672]: I0930 12:21:56.510815 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 30 12:21:56 crc kubenswrapper[4672]: I0930 12:21:56.512445 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:21:56 crc kubenswrapper[4672]: I0930 12:21:56.512498 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:21:56 crc kubenswrapper[4672]: I0930 12:21:56.512517 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:21:56 crc kubenswrapper[4672]: I0930 12:21:56.513091 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:21:56 crc kubenswrapper[4672]: I0930 12:21:56.513164 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:21:56 crc kubenswrapper[4672]: I0930 12:21:56.513184 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:21:56 crc kubenswrapper[4672]: I0930 12:21:56.687801 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Sep 30 12:21:56 crc kubenswrapper[4672]: I0930 12:21:56.699082 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 30 12:21:56 crc kubenswrapper[4672]: I0930 12:21:56.699460 4672 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 12:21:56 crc kubenswrapper[4672]: I0930 12:21:56.702485 4672 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:21:56 crc kubenswrapper[4672]: I0930 12:21:56.702563 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:21:56 crc kubenswrapper[4672]: I0930 12:21:56.702586 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:21:57 crc kubenswrapper[4672]: I0930 12:21:57.514236 4672 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 12:21:57 crc kubenswrapper[4672]: I0930 12:21:57.514412 4672 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 12:21:57 crc kubenswrapper[4672]: I0930 12:21:57.516662 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:21:57 crc kubenswrapper[4672]: I0930 12:21:57.516701 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:21:57 crc kubenswrapper[4672]: I0930 12:21:57.516710 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:21:57 crc kubenswrapper[4672]: I0930 12:21:57.517708 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:21:57 crc kubenswrapper[4672]: I0930 12:21:57.517962 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:21:57 crc kubenswrapper[4672]: I0930 12:21:57.518110 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:21:57 crc kubenswrapper[4672]: I0930 12:21:57.663037 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 30 12:21:58 crc kubenswrapper[4672]: I0930 12:21:58.517867 4672 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 12:21:58 crc kubenswrapper[4672]: I0930 12:21:58.519484 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:21:58 crc kubenswrapper[4672]: I0930 12:21:58.519554 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:21:58 crc kubenswrapper[4672]: I0930 12:21:58.519577 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:21:59 crc kubenswrapper[4672]: I0930 12:21:59.065932 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 30 12:21:59 crc kubenswrapper[4672]: I0930 12:21:59.066119 4672 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 12:21:59 crc kubenswrapper[4672]: I0930 12:21:59.067492 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:21:59 crc kubenswrapper[4672]: I0930 12:21:59.067529 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:21:59 crc kubenswrapper[4672]: I0930 12:21:59.067543 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Sep 30 12:21:59 crc kubenswrapper[4672]: E0930 12:21:59.497244 4672 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Sep 30 12:21:59 crc kubenswrapper[4672]: I0930 12:21:59.506925 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Sep 30 12:21:59 crc kubenswrapper[4672]: I0930 12:21:59.507195 4672 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 12:21:59 crc kubenswrapper[4672]: I0930 12:21:59.508597 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:21:59 crc kubenswrapper[4672]: I0930 12:21:59.508671 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:21:59 crc kubenswrapper[4672]: I0930 12:21:59.508693 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:21:59 crc kubenswrapper[4672]: I0930 12:21:59.744500 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 30 12:21:59 crc kubenswrapper[4672]: I0930 12:21:59.744826 4672 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 12:21:59 crc kubenswrapper[4672]: I0930 12:21:59.747471 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:21:59 crc kubenswrapper[4672]: I0930 12:21:59.747546 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:21:59 crc kubenswrapper[4672]: I0930 12:21:59.747564 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:21:59 crc kubenswrapper[4672]: I0930 12:21:59.752627 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 30 12:22:00 crc kubenswrapper[4672]: I0930 12:22:00.523127 4672 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 12:22:00 crc kubenswrapper[4672]: I0930 12:22:00.524955 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:00 crc kubenswrapper[4672]: I0930 12:22:00.525013 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:00 crc kubenswrapper[4672]: I0930 12:22:00.525041 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:00 crc kubenswrapper[4672]: I0930 12:22:00.529442 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 30 12:22:01 crc kubenswrapper[4672]: I0930 12:22:01.525800 4672 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 12:22:01 crc kubenswrapper[4672]: I0930 12:22:01.528369 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:01 crc kubenswrapper[4672]: I0930 12:22:01.528442 4672 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:01 crc kubenswrapper[4672]: I0930 12:22:01.528471 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:02 crc kubenswrapper[4672]: I0930 12:22:02.960034 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Sep 30 12:22:02 crc kubenswrapper[4672]: I0930 12:22:02.961163 4672 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 12:22:02 crc kubenswrapper[4672]: I0930 12:22:02.962925 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:02 crc kubenswrapper[4672]: I0930 12:22:02.962988 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:02 crc kubenswrapper[4672]: I0930 12:22:02.963011 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:04 crc kubenswrapper[4672]: I0930 12:22:04.341312 4672 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Sep 30 12:22:05 crc kubenswrapper[4672]: I0930 12:22:05.498176 4672 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Sep 30 12:22:05 crc kubenswrapper[4672]: I0930 12:22:05.498267 4672 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Sep 30 12:22:05 crc kubenswrapper[4672]: I0930 12:22:05.503190 4672 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Sep 30 12:22:05 crc kubenswrapper[4672]: I0930 12:22:05.503269 4672 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Sep 30 12:22:05 crc kubenswrapper[4672]: I0930 12:22:05.969081 4672 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Sep 30 12:22:05 crc kubenswrapper[4672]: I0930 12:22:05.969309 4672 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" 
containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Sep 30 12:22:07 crc kubenswrapper[4672]: I0930 12:22:07.666603 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 30 12:22:07 crc kubenswrapper[4672]: I0930 12:22:07.666775 4672 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 12:22:07 crc kubenswrapper[4672]: I0930 12:22:07.668036 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:07 crc kubenswrapper[4672]: I0930 12:22:07.668119 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:07 crc kubenswrapper[4672]: I0930 12:22:07.668139 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:07 crc kubenswrapper[4672]: I0930 12:22:07.671267 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 30 12:22:08 crc kubenswrapper[4672]: I0930 12:22:08.544052 4672 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 12:22:08 crc kubenswrapper[4672]: I0930 12:22:08.545194 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:08 crc kubenswrapper[4672]: I0930 12:22:08.545231 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:08 crc kubenswrapper[4672]: I0930 12:22:08.545252 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:09 crc kubenswrapper[4672]: E0930 12:22:09.498362 4672 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Sep 30 12:22:10 crc kubenswrapper[4672]: E0930 12:22:10.495088 4672 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": context deadline exceeded" interval="6.4s" Sep 30 12:22:10 crc kubenswrapper[4672]: I0930 12:22:10.498517 4672 trace.go:236] Trace[1928085251]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (30-Sep-2025 12:21:59.234) (total time: 11264ms): Sep 30 12:22:10 crc kubenswrapper[4672]: Trace[1928085251]: ---"Objects listed" error: 11264ms (12:22:10.498) Sep 30 12:22:10 crc kubenswrapper[4672]: Trace[1928085251]: [11.264116083s] [11.264116083s] END Sep 30 12:22:10 crc kubenswrapper[4672]: I0930 12:22:10.499154 4672 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Sep 30 12:22:10 crc kubenswrapper[4672]: I0930 12:22:10.498855 4672 trace.go:236] Trace[1467572773]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (30-Sep-2025 12:21:56.027) (total time: 14470ms): Sep 30 12:22:10 crc kubenswrapper[4672]: Trace[1467572773]: ---"Objects listed" error: 14470ms (12:22:10.498) Sep 30 12:22:10 crc kubenswrapper[4672]: Trace[1467572773]: [14.470935325s] [14.470935325s] END Sep 30 12:22:10 crc kubenswrapper[4672]: I0930 12:22:10.499398 4672 
reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Sep 30 12:22:10 crc kubenswrapper[4672]: I0930 12:22:10.499109 4672 trace.go:236] Trace[1632972129]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (30-Sep-2025 12:21:57.857) (total time: 12642ms): Sep 30 12:22:10 crc kubenswrapper[4672]: Trace[1632972129]: ---"Objects listed" error: 12641ms (12:22:10.498) Sep 30 12:22:10 crc kubenswrapper[4672]: Trace[1632972129]: [12.642009697s] [12.642009697s] END Sep 30 12:22:10 crc kubenswrapper[4672]: I0930 12:22:10.499505 4672 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Sep 30 12:22:10 crc kubenswrapper[4672]: E0930 12:22:10.499693 4672 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc" Sep 30 12:22:10 crc kubenswrapper[4672]: I0930 12:22:10.500149 4672 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Sep 30 12:22:10 crc kubenswrapper[4672]: I0930 12:22:10.500889 4672 trace.go:236] Trace[1329424747]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (30-Sep-2025 12:21:58.835) (total time: 11665ms): Sep 30 12:22:10 crc kubenswrapper[4672]: Trace[1329424747]: ---"Objects listed" error: 11665ms (12:22:10.500) Sep 30 12:22:10 crc kubenswrapper[4672]: Trace[1329424747]: [11.665818637s] [11.665818637s] END Sep 30 12:22:10 crc kubenswrapper[4672]: I0930 12:22:10.501044 4672 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Sep 30 12:22:10 crc kubenswrapper[4672]: I0930 12:22:10.682954 4672 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:38316->192.168.126.11:17697: read: connection reset by peer" start-of-body= Sep 30 12:22:10 crc kubenswrapper[4672]: I0930 12:22:10.683042 4672 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:38316->192.168.126.11:17697: read: connection reset by peer" Sep 30 12:22:10 crc kubenswrapper[4672]: I0930 12:22:10.683585 4672 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Sep 30 12:22:10 crc kubenswrapper[4672]: I0930 12:22:10.683663 4672 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.163998 4672 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: 
connect: connection refused" start-of-body= Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.164087 4672 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.337388 4672 apiserver.go:52] "Watching apiserver" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.341987 4672 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.342260 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb"] Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.342659 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.342830 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 12:22:11 crc kubenswrapper[4672]: E0930 12:22:11.342889 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.342970 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.343067 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.343108 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.343189 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Sep 30 12:22:11 crc kubenswrapper[4672]: E0930 12:22:11.343197 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 12:22:11 crc kubenswrapper[4672]: E0930 12:22:11.343246 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.344988 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.345168 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.345323 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.345492 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.345624 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.345768 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.346995 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.346998 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.347358 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.348684 4672 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.373301 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.386448 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.405735 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.405789 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.405818 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.405843 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.405901 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.405921 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.405942 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.405958 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.405982 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.406008 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.406026 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.406048 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.406068 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.406087 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.406109 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.406129 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.406151 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.406180 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.406200 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.406226 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.406249 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.406286 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.406307 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.406330 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.406355 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.406374 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.406397 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.406417 4672 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.406441 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.406461 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.406479 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.406497 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.406516 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.406532 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.406555 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.406577 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.406601 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Sep 30 12:22:11 crc kubenswrapper[4672]: 
I0930 12:22:11.406648 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.406672 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.406693 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.406716 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.406735 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.406753 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.406773 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.406795 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.406816 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.406836 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: 
\"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.406861 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.406881 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.406898 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.406914 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.406207 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.406303 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.406424 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.406500 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.406551 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). 
InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.406587 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.406764 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.406760 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.406920 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.406931 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.407097 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.407124 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.407117 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.407145 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.407173 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.407198 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.407218 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.407236 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.407253 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.407293 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.407283 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.407325 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.407314 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.407415 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.407423 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.407450 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.407483 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.407499 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.407511 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.407539 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.407559 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). 
InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.407566 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.407599 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.407624 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.407648 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.407663 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.407670 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.407733 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.407763 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.407792 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.407803 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.407818 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.407846 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.407873 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.407900 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.407925 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.407952 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.407978 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.408002 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.408025 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.408051 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod 
\"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.408075 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.408097 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.408123 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.408146 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.408177 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.408200 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.408224 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.408248 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.408297 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.408323 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: 
\"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.408346 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.408368 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.408402 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.408439 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.408466 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.408495 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.408541 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.408565 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.408598 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.408624 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.408650 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.408677 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.408703 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.408725 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.408752 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.408781 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.408812 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.408836 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.408867 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.408891 4672 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.408916 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.408941 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.408965 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.408989 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.409012 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.409038 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.409062 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.409086 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.409109 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.409133 
4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.409156 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.409182 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.409207 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.409230 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.409257 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.409304 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.409329 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.409355 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.409379 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Sep 30 12:22:11 crc kubenswrapper[4672]: 
I0930 12:22:11.409422 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.409447 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.409471 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.409495 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.409521 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.409545 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.409571 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.409594 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.409617 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.409641 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 12:22:11 crc 
kubenswrapper[4672]: I0930 12:22:11.409665 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.409693 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.409715 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.409739 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.409763 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.409786 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.409809 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.409834 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.409857 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.409889 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Sep 30 12:22:11 crc kubenswrapper[4672]: 
I0930 12:22:11.409915 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.409941 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.409963 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.409987 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.410011 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.410037 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.410130 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.410159 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.410184 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.410209 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.410231 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.410256 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.411162 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.411202 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.411229 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.411256 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.411302 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.411327 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.411353 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.411377 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: 
\"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.411411 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.411445 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.411474 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.411501 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.411527 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.411554 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.411579 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.411607 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.411633 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.411659 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.411686 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.411715 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.411743 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.411768 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.411794 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.411823 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.411850 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.411877 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.411905 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.411934 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" 
(UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.411963 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.412023 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.412055 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.412087 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.412122 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.412154 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.412187 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.412216 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 12:22:11 crc 
kubenswrapper[4672]: I0930 12:22:11.412245 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.412301 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.412334 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.412361 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.412390 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.412420 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.412445 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.412535 4672 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.412579 4672 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.412597 4672 reconciler_common.go:293] "Volume 
detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.412612 4672 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.412626 4672 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.412641 4672 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.412656 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.412670 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.412684 4672 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.412699 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.412714 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.412729 4672 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.412745 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.412761 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.412776 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.412790 4672 reconciler_common.go:293] "Volume 
detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.412803 4672 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.415320 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.407873 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.407913 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). 
InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.422348 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.407921 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.408144 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.408444 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.408470 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.408564 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.408676 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.408735 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.408771 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.408850 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.408875 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.408918 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.409125 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.409142 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.409425 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.409501 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). 
InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.409849 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.410227 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.410513 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.410571 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.410753 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.410743 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.411003 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.411695 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.411910 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:22:11 crc kubenswrapper[4672]: E0930 12:22:11.412903 4672 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.414481 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.414769 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.415028 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.415863 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.416063 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.416981 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.417070 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.417082 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.417152 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.417610 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.417628 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.417900 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.417962 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.418021 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.418042 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.418987 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.419347 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.422638 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.419399 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.419731 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.419850 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.419868 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.420022 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.420158 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.420178 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.421672 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.421683 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.421902 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.422039 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.422142 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.422393 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.422406 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.422533 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.419538 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.423016 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.423154 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.423505 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.423536 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.423837 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.424605 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.424956 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.425790 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.427525 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.428164 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.428546 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.428889 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.429144 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.430448 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.433941 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.435041 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.435237 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:22:11 crc kubenswrapper[4672]: E0930 12:22:11.435400 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-30 12:22:11.935354734 +0000 UTC m=+23.204592380 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.435597 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.436181 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.436360 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.436606 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.436695 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.438415 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.438519 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.438876 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.438791 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.439141 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.439307 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.439455 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.439850 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.439926 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.439951 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.440051 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.440133 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.440487 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.440493 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.440624 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.440749 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.440868 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.441033 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.441126 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.441179 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 12:22:11 crc kubenswrapper[4672]: E0930 12:22:11.444369 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 12:22:11.943986111 +0000 UTC m=+23.213223747 (durationBeforeRetry 500ms). 
Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.448647 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.448999 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.449446 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.450179 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.450719 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.450815 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.451227 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.451681 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.451698 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.451675 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.452040 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.452292 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.452558 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.452675 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.453089 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.453239 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.453234 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.453322 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes"
Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.453704 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.453943 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.454144 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.454356 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.454453 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes"
Sep 30 12:22:11 crc kubenswrapper[4672]: E0930 12:22:11.454684 4672 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered
Sep 30 12:22:11 crc kubenswrapper[4672]: E0930 12:22:11.454798 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-30 12:22:11.954765802 +0000 UTC m=+23.224003638 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.455186 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes"
Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.456751 4672 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.456891 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.457637 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.457784 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes"
Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.459381 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.459454 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.459642 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.459731 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.459762 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.459969 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.461287 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.465979 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.467370 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.467716 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.469773 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.471139 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.471831 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.472365 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 12:22:11 crc kubenswrapper[4672]: E0930 12:22:11.472611 4672 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Sep 30 12:22:11 crc kubenswrapper[4672]: E0930 12:22:11.472835 4672 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Sep 30 12:22:11 crc kubenswrapper[4672]: E0930 12:22:11.472934 4672 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Sep 30 12:22:11 crc kubenswrapper[4672]: E0930 12:22:11.473128 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-09-30 12:22:11.973094612 +0000 UTC m=+23.242332448 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
No retries permitted until 2025-09-30 12:22:11.973094612 +0000 UTC m=+23.242332448 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.472672 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.473681 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.474496 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.474584 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.475149 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.475313 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.476181 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.477727 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.478049 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.478900 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.479519 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.479924 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.480041 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 12:22:11 crc kubenswrapper[4672]: E0930 12:22:11.480664 4672 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 30 12:22:11 crc kubenswrapper[4672]: E0930 12:22:11.480699 4672 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 30 12:22:11 crc kubenswrapper[4672]: E0930 12:22:11.480713 4672 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.480748 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Sep 30 12:22:11 crc kubenswrapper[4672]: E0930 12:22:11.480768 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-09-30 12:22:11.980750135 +0000 UTC m=+23.249987781 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.480815 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.480761 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.480839 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.481207 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.481351 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.481405 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.481429 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.481576 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.481654 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.482137 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.482819 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.483927 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.484208 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.484248 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.484481 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.484636 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.485123 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.485480 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.485234 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). 
InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.486448 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.486883 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.488779 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.490729 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.491544 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.498790 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.499822 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.500156 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.500896 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.502960 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.511538 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.513606 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.513675 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.514073 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.514251 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.514339 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.514389 4672 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.514396 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.514405 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.514448 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.514461 4672 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.514478 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.514494 4672 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.514507 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.514520 4672 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.514532 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.514544 4672 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.514555 4672 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.514566 4672 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.514577 4672 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.514587 4672 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.514597 4672 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.514606 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.514616 4672 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.514626 4672 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.514636 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.514647 4672 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.514658 4672 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.514668 4672 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.514505 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.514678 4672 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.514739 4672 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.514758 4672 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Sep 30 
12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.514773 4672 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.514787 4672 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.514801 4672 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.514815 4672 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.514828 4672 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.514842 4672 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.514856 4672 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.514869 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.514883 4672 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.514896 4672 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.514908 4672 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.514921 4672 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.514935 4672 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Sep 30 12:22:11 crc 
kubenswrapper[4672]: I0930 12:22:11.514948 4672 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.514962 4672 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.514975 4672 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.515029 4672 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.515043 4672 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.515057 4672 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.515069 4672 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.515082 4672 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.515095 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.515108 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.515120 4672 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.515132 4672 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.515145 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.515157 4672 
reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.515170 4672 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.515183 4672 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.515197 4672 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.515209 4672 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.515227 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.515241 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.515254 4672 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.515298 4672 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.515311 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.515324 4672 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.515338 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.515350 4672 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.515363 4672 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.515376 4672 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.515389 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.515403 4672 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.515417 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.515430 4672 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.515443 4672 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.515457 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.515473 4672 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.515484 4672 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.515497 4672 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.515509 4672 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.515529 4672 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.515565 4672 reconciler_common.go:293] "Volume detached 
for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.515581 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.515602 4672 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.515624 4672 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.515654 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.515668 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.515693 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.515725 4672 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.515744 4672 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.515760 4672 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.515787 4672 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.515814 4672 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.515826 4672 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.515840 4672 reconciler_common.go:293] "Volume detached for 
volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.515862 4672 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.515883 4672 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.515904 4672 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.515915 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.515939 4672 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.515966 4672 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.515983 4672 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.515996 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.516010 4672 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.516022 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.516035 4672 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.516050 4672 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.516064 4672 reconciler_common.go:293] "Volume detached for 
volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.516076 4672 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.516089 4672 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.516103 4672 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.516116 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.516130 4672 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.516145 4672 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.516159 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.516172 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.516189 4672 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.516202 4672 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.516217 4672 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.516229 4672 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.516241 4672 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.516254 4672 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.516289 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.516305 4672 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.516319 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.516332 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.516343 4672 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.516358 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.516371 4672 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.516384 4672 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.516398 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.516412 4672 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.516425 4672 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath 
\"\"" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.516437 4672 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.516451 4672 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.516463 4672 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.516478 4672 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.516492 4672 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.516521 4672 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.516535 4672 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.516549 4672 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.516561 4672 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.516574 4672 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.516587 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.516602 4672 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.516616 4672 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 
12:22:11.516630 4672 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.516645 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.516657 4672 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.516670 4672 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.516685 4672 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.516698 4672 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.516710 4672 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.516724 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.516737 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.516750 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.516767 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.516780 4672 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.516792 4672 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 
12:22:11.516807 4672 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.516823 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.516837 4672 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.516850 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.516863 4672 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.516876 4672 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.516889 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.517059 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.517722 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.517764 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.520544 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). 
InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.522083 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.522899 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.524226 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.524738 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.525233 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.531829 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.535365 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.535808 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.536096 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.537198 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.537950 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.538028 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.538782 4672 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.538949 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.543989 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.545810 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.546528 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.548068 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.551236 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.552744 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.553232 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.553435 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.553830 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.553996 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.555108 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.555826 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.556747 4672 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="a8d2b013e19f7ff8f3eb9c42338191b2da273ce85db8bcc6be8dd1f69e83c3be" exitCode=255 Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.556900 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.557563 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.558615 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.559215 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.560076 
4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.560645 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.561635 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.562498 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.563466 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.563951 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.564830 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.565375 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.565986 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.566929 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.567435 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"a8d2b013e19f7ff8f3eb9c42338191b2da273ce85db8bcc6be8dd1f69e83c3be"} Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.567503 4672 scope.go:117] "RemoveContainer" containerID="20f71b069b915573d5cfcadeddd0be9f7c196c0aeafe8dbfe73424930432fa0c" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.580095 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.581577 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.582568 4672 scope.go:117] "RemoveContainer" containerID="a8d2b013e19f7ff8f3eb9c42338191b2da273ce85db8bcc6be8dd1f69e83c3be" Sep 30 12:22:11 crc kubenswrapper[4672]: E0930 12:22:11.582823 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.593407 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.604883 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.614469 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.617410 4672 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.617450 4672 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.617461 4672 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.617473 4672 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.617483 4672 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.617492 4672 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.617501 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.617524 4672 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.617533 4672 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.617541 4672 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.617550 4672 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.617558 4672 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.617566 4672 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.617575 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.617583 4672 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.617606 4672 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.626530 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.636930 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.656357 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.662340 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Sep 30 12:22:11 crc kubenswrapper[4672]: W0930 12:22:11.669599 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37a5e44f_9a88_4405_be8a_b645485e7312.slice/crio-14ca5b66e47ec59784bebc2b270276bf3b8102f048f1b0a6683119e0f5fd0e09 WatchSource:0}: Error finding container 14ca5b66e47ec59784bebc2b270276bf3b8102f048f1b0a6683119e0f5fd0e09: Status 404 returned error can't find the container with id 14ca5b66e47ec59784bebc2b270276bf3b8102f048f1b0a6683119e0f5fd0e09 Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.671880 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Sep 30 12:22:11 crc kubenswrapper[4672]: W0930 12:22:11.675780 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-75f45a0d112cd4c85750be381b8265cb418a77435dbecb989ce23304d121cebf WatchSource:0}: Error finding container 75f45a0d112cd4c85750be381b8265cb418a77435dbecb989ce23304d121cebf: Status 404 returned error can't find the container with id 75f45a0d112cd4c85750be381b8265cb418a77435dbecb989ce23304d121cebf Sep 30 12:22:11 crc kubenswrapper[4672]: W0930 12:22:11.694206 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-d798eb841f1090e6f15d91770c9cae1f06f2ae52e0cc152bea7fb286454fd91e WatchSource:0}: Error finding container d798eb841f1090e6f15d91770c9cae1f06f2ae52e0cc152bea7fb286454fd91e: Status 404 returned error can't find the container with id d798eb841f1090e6f15d91770c9cae1f06f2ae52e0cc152bea7fb286454fd91e Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.694230 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-bh5lq"] Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.694610 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-dpqrd"] Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.694788 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-bh5lq" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.694833 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-8q82q"] Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.695020 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.695222 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-8q82q" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.698849 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.699247 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.699325 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.699609 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.699700 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.699783 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.699796 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.699815 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.700074 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.700206 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.700331 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.703026 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.703967 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.710108 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.721671 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.734008 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.744285 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.751973 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bh5lq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96949d92-2365-41eb-8f62-c264c8328c02\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkb7v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bh5lq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: 
connection refused" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.764201 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e650bb6-f3f0-48f4-b169-1e56a907e88c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://824f50fb36d4c4aee43cc92af10bc43c441f2cee0ee736431dfdc52d254b819a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2786191730ceba9bbcf35f5355098bc52b09c37d20b923f7c6c65fd1b584cea0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e5b3d468173222d5ae851529dbc318b7d727ec4af468e4094ab08b4f29bf100\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"st
arted\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8d2b013e19f7ff8f3eb9c42338191b2da273ce85db8bcc6be8dd1f69e83c3be\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://20f71b069b915573d5cfcadeddd0be9f7c196c0aeafe8dbfe73424930432fa0c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T12:21:53Z\\\",\\\"message\\\":\\\"W0930 12:21:53.138223 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0930 12:21:53.139223 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759234913 cert, and key in /tmp/serving-cert-529706298/serving-signer.crt, /tmp/serving-cert-529706298/serving-signer.key\\\\nI0930 12:21:53.505845 1 observer_polling.go:159] Starting file observer\\\\nW0930 12:21:53.510913 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0930 12:21:53.511041 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 12:21:53.511745 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-529706298/tls.crt::/tmp/serving-cert-529706298/tls.key\\\\\\\"\\\\nF0930 12:21:53.766381 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T12:21:52Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8d2b013e19f7ff8f3eb9c42338191b2da273ce85db8bcc6be8dd1f69e83c3be\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T12:22:10Z\\\",\\\"message\\\":\\\"falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0930 12:22:05.219065 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 12:22:05.219861 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-72394925/tls.crt::/tmp/serving-cert-72394925/tls.key\\\\\\\"\\\\nI0930 12:22:10.657058 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 12:22:10.659952 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 12:22:10.659972 1 maxinflight.go:145] 
\\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 12:22:10.659997 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 12:22:10.660003 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 12:22:10.667742 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0930 12:22:10.667775 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 12:22:10.667797 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 12:22:10.667801 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 12:22:10.667808 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 12:22:10.667812 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 12:22:10.667814 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 12:22:10.667817 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0930 12:22:10.670654 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T12:21:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c3f5020d6f8fba5f79d7659bedba02479794acb1f5f3d219802966c781fd10d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:52Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://089787496c46eb514e279677b1853010b21eac240fb059e43866714d2a6920e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://089787496c46eb514e279677b1853010b21eac240fb059e43866714d2a6920e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:21:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:21:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:21:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.773697 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.787437 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.798203 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.814610 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.818616 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/6806ff3c-ab3a-402e-b1c5-cc37c0810a65-host-run-multus-certs\") pod \"multus-8q82q\" (UID: \"6806ff3c-ab3a-402e-b1c5-cc37c0810a65\") " pod="openshift-multus/multus-8q82q" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.818646 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/6806ff3c-ab3a-402e-b1c5-cc37c0810a65-hostroot\") pod \"multus-8q82q\" (UID: \"6806ff3c-ab3a-402e-b1c5-cc37c0810a65\") " pod="openshift-multus/multus-8q82q" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.818659 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6806ff3c-ab3a-402e-b1c5-cc37c0810a65-multus-conf-dir\") pod \"multus-8q82q\" (UID: \"6806ff3c-ab3a-402e-b1c5-cc37c0810a65\") " pod="openshift-multus/multus-8q82q" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.818674 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6806ff3c-ab3a-402e-b1c5-cc37c0810a65-etc-kubernetes\") pod \"multus-8q82q\" (UID: \"6806ff3c-ab3a-402e-b1c5-cc37c0810a65\") " pod="openshift-multus/multus-8q82q" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.818688 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/96949d92-2365-41eb-8f62-c264c8328c02-hosts-file\") pod \"node-resolver-bh5lq\" (UID: \"96949d92-2365-41eb-8f62-c264c8328c02\") " pod="openshift-dns/node-resolver-bh5lq" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.818704 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/6806ff3c-ab3a-402e-b1c5-cc37c0810a65-multus-socket-dir-parent\") pod \"multus-8q82q\" (UID: \"6806ff3c-ab3a-402e-b1c5-cc37c0810a65\") " pod="openshift-multus/multus-8q82q" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.818720 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" 
(UniqueName: \"kubernetes.io/host-path/6806ff3c-ab3a-402e-b1c5-cc37c0810a65-host-var-lib-cni-multus\") pod \"multus-8q82q\" (UID: \"6806ff3c-ab3a-402e-b1c5-cc37c0810a65\") " pod="openshift-multus/multus-8q82q" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.818735 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6806ff3c-ab3a-402e-b1c5-cc37c0810a65-host-var-lib-kubelet\") pod \"multus-8q82q\" (UID: \"6806ff3c-ab3a-402e-b1c5-cc37c0810a65\") " pod="openshift-multus/multus-8q82q" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.818751 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/6806ff3c-ab3a-402e-b1c5-cc37c0810a65-cni-binary-copy\") pod \"multus-8q82q\" (UID: \"6806ff3c-ab3a-402e-b1c5-cc37c0810a65\") " pod="openshift-multus/multus-8q82q" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.818767 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6806ff3c-ab3a-402e-b1c5-cc37c0810a65-host-run-netns\") pod \"multus-8q82q\" (UID: \"6806ff3c-ab3a-402e-b1c5-cc37c0810a65\") " pod="openshift-multus/multus-8q82q" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.818782 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/95794952-d817-48f2-8956-f7a310f8d1d9-proxy-tls\") pod \"machine-config-daemon-dpqrd\" (UID: \"95794952-d817-48f2-8956-f7a310f8d1d9\") " pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.818797 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lkb7v\" (UniqueName: \"kubernetes.io/projected/96949d92-2365-41eb-8f62-c264c8328c02-kube-api-access-lkb7v\") pod \"node-resolver-bh5lq\" (UID: \"96949d92-2365-41eb-8f62-c264c8328c02\") " pod="openshift-dns/node-resolver-bh5lq" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.818826 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6806ff3c-ab3a-402e-b1c5-cc37c0810a65-multus-cni-dir\") pod \"multus-8q82q\" (UID: \"6806ff3c-ab3a-402e-b1c5-cc37c0810a65\") " pod="openshift-multus/multus-8q82q" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.818852 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6806ff3c-ab3a-402e-b1c5-cc37c0810a65-system-cni-dir\") pod \"multus-8q82q\" (UID: \"6806ff3c-ab3a-402e-b1c5-cc37c0810a65\") " pod="openshift-multus/multus-8q82q" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.818866 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/6806ff3c-ab3a-402e-b1c5-cc37c0810a65-os-release\") pod \"multus-8q82q\" (UID: \"6806ff3c-ab3a-402e-b1c5-cc37c0810a65\") " pod="openshift-multus/multus-8q82q" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.818883 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: 
\"kubernetes.io/host-path/6806ff3c-ab3a-402e-b1c5-cc37c0810a65-host-run-k8s-cni-cncf-io\") pod \"multus-8q82q\" (UID: \"6806ff3c-ab3a-402e-b1c5-cc37c0810a65\") " pod="openshift-multus/multus-8q82q" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.818896 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6806ff3c-ab3a-402e-b1c5-cc37c0810a65-host-var-lib-cni-bin\") pod \"multus-8q82q\" (UID: \"6806ff3c-ab3a-402e-b1c5-cc37c0810a65\") " pod="openshift-multus/multus-8q82q" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.818910 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2pn7q\" (UniqueName: \"kubernetes.io/projected/6806ff3c-ab3a-402e-b1c5-cc37c0810a65-kube-api-access-2pn7q\") pod \"multus-8q82q\" (UID: \"6806ff3c-ab3a-402e-b1c5-cc37c0810a65\") " pod="openshift-multus/multus-8q82q" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.818925 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/95794952-d817-48f2-8956-f7a310f8d1d9-mcd-auth-proxy-config\") pod \"machine-config-daemon-dpqrd\" (UID: \"95794952-d817-48f2-8956-f7a310f8d1d9\") " pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.818946 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/6806ff3c-ab3a-402e-b1c5-cc37c0810a65-cnibin\") pod \"multus-8q82q\" (UID: \"6806ff3c-ab3a-402e-b1c5-cc37c0810a65\") " pod="openshift-multus/multus-8q82q" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.818960 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/6806ff3c-ab3a-402e-b1c5-cc37c0810a65-multus-daemon-config\") pod \"multus-8q82q\" (UID: \"6806ff3c-ab3a-402e-b1c5-cc37c0810a65\") " pod="openshift-multus/multus-8q82q" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.818973 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rnk64\" (UniqueName: \"kubernetes.io/projected/95794952-d817-48f2-8956-f7a310f8d1d9-kube-api-access-rnk64\") pod \"machine-config-daemon-dpqrd\" (UID: \"95794952-d817-48f2-8956-f7a310f8d1d9\") " pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.819000 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/95794952-d817-48f2-8956-f7a310f8d1d9-rootfs\") pod \"machine-config-daemon-dpqrd\" (UID: \"95794952-d817-48f2-8956-f7a310f8d1d9\") " pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.825710 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.839550 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.854535 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e650bb6-f3f0-48f4-b169-1e56a907e88c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://824f50fb36d4c4aee43cc92af10bc43c441f2cee0ee736431dfdc52d254b819a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2786191730ceba9bbcf35f5355098bc52b09c37d20b923f7c6c65fd1b584cea0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e5b3d468173222d5ae851529dbc318b7d727ec4af468e4094ab08b4f29bf100\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8d2b013e19f7ff8f3eb9c42338191b2da273ce85db8bcc6be8dd1f69e83c3be\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://20f71b069b915573d5cfcadeddd0be9f7c196c0aeafe8dbfe73424930432fa0c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T12:21:53Z\\\",\\\"message\\\":\\\"W0930 12:21:53.138223 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0930 
12:21:53.139223 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759234913 cert, and key in /tmp/serving-cert-529706298/serving-signer.crt, /tmp/serving-cert-529706298/serving-signer.key\\\\nI0930 12:21:53.505845 1 observer_polling.go:159] Starting file observer\\\\nW0930 12:21:53.510913 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0930 12:21:53.511041 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 12:21:53.511745 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-529706298/tls.crt::/tmp/serving-cert-529706298/tls.key\\\\\\\"\\\\nF0930 12:21:53.766381 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T12:21:52Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8d2b013e19f7ff8f3eb9c42338191b2da273ce85db8bcc6be8dd1f69e83c3be\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T12:22:10Z\\\",\\\"message\\\":\\\"falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0930 12:22:05.219065 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 12:22:05.219861 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-72394925/tls.crt::/tmp/serving-cert-72394925/tls.key\\\\\\\"\\\\nI0930 12:22:10.657058 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 12:22:10.659952 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 12:22:10.659972 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 12:22:10.659997 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 12:22:10.660003 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 12:22:10.667742 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0930 12:22:10.667775 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 12:22:10.667797 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 12:22:10.667801 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 12:22:10.667808 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 12:22:10.667812 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 12:22:10.667814 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 12:22:10.667817 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0930 12:22:10.670654 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T12:21:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c3f5020d6f8fba5f79d7659bedba02479794acb1f5f3d219802966c781fd10d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:52Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://089787496c46eb514e279677b1853010b21eac240fb059e43866714d2a6920e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://089787496c46eb514e279677b1853010b21eac240fb059e43866714d2a6920e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:21:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:21:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:21:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.867111 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.877410 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"95794952-d817-48f2-8956-f7a310f8d1d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnk64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnk64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dpqrd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.886928 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8q82q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6806ff3c-ab3a-402e-b1c5-cc37c0810a65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pn7q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8q82q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.894440 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bh5lq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"96949d92-2365-41eb-8f62-c264c8328c02\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkb7v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bh5lq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.903869 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.920299 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/95794952-d817-48f2-8956-f7a310f8d1d9-rootfs\") pod \"machine-config-daemon-dpqrd\" (UID: \"95794952-d817-48f2-8956-f7a310f8d1d9\") " pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.920372 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/6806ff3c-ab3a-402e-b1c5-cc37c0810a65-host-run-multus-certs\") pod \"multus-8q82q\" (UID: \"6806ff3c-ab3a-402e-b1c5-cc37c0810a65\") " pod="openshift-multus/multus-8q82q" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.920397 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/96949d92-2365-41eb-8f62-c264c8328c02-hosts-file\") pod \"node-resolver-bh5lq\" (UID: \"96949d92-2365-41eb-8f62-c264c8328c02\") " pod="openshift-dns/node-resolver-bh5lq" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.920417 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/95794952-d817-48f2-8956-f7a310f8d1d9-rootfs\") pod \"machine-config-daemon-dpqrd\" (UID: \"95794952-d817-48f2-8956-f7a310f8d1d9\") " pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.920420 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/6806ff3c-ab3a-402e-b1c5-cc37c0810a65-multus-socket-dir-parent\") pod \"multus-8q82q\" (UID: \"6806ff3c-ab3a-402e-b1c5-cc37c0810a65\") " pod="openshift-multus/multus-8q82q" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.920488 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/6806ff3c-ab3a-402e-b1c5-cc37c0810a65-hostroot\") pod \"multus-8q82q\" (UID: \"6806ff3c-ab3a-402e-b1c5-cc37c0810a65\") 
" pod="openshift-multus/multus-8q82q" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.920531 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6806ff3c-ab3a-402e-b1c5-cc37c0810a65-multus-conf-dir\") pod \"multus-8q82q\" (UID: \"6806ff3c-ab3a-402e-b1c5-cc37c0810a65\") " pod="openshift-multus/multus-8q82q" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.920551 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6806ff3c-ab3a-402e-b1c5-cc37c0810a65-etc-kubernetes\") pod \"multus-8q82q\" (UID: \"6806ff3c-ab3a-402e-b1c5-cc37c0810a65\") " pod="openshift-multus/multus-8q82q" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.920568 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6806ff3c-ab3a-402e-b1c5-cc37c0810a65-host-var-lib-kubelet\") pod \"multus-8q82q\" (UID: \"6806ff3c-ab3a-402e-b1c5-cc37c0810a65\") " pod="openshift-multus/multus-8q82q" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.920608 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/6806ff3c-ab3a-402e-b1c5-cc37c0810a65-host-var-lib-cni-multus\") pod \"multus-8q82q\" (UID: \"6806ff3c-ab3a-402e-b1c5-cc37c0810a65\") " pod="openshift-multus/multus-8q82q" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.920617 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/96949d92-2365-41eb-8f62-c264c8328c02-hosts-file\") pod \"node-resolver-bh5lq\" (UID: \"96949d92-2365-41eb-8f62-c264c8328c02\") " pod="openshift-dns/node-resolver-bh5lq" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.920620 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/6806ff3c-ab3a-402e-b1c5-cc37c0810a65-multus-socket-dir-parent\") pod \"multus-8q82q\" (UID: \"6806ff3c-ab3a-402e-b1c5-cc37c0810a65\") " pod="openshift-multus/multus-8q82q" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.920646 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/6806ff3c-ab3a-402e-b1c5-cc37c0810a65-host-var-lib-cni-multus\") pod \"multus-8q82q\" (UID: \"6806ff3c-ab3a-402e-b1c5-cc37c0810a65\") " pod="openshift-multus/multus-8q82q" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.920632 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lkb7v\" (UniqueName: \"kubernetes.io/projected/96949d92-2365-41eb-8f62-c264c8328c02-kube-api-access-lkb7v\") pod \"node-resolver-bh5lq\" (UID: \"96949d92-2365-41eb-8f62-c264c8328c02\") " pod="openshift-dns/node-resolver-bh5lq" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.920674 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6806ff3c-ab3a-402e-b1c5-cc37c0810a65-multus-conf-dir\") pod \"multus-8q82q\" (UID: \"6806ff3c-ab3a-402e-b1c5-cc37c0810a65\") " pod="openshift-multus/multus-8q82q" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.920670 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/6806ff3c-ab3a-402e-b1c5-cc37c0810a65-etc-kubernetes\") pod \"multus-8q82q\" (UID: \"6806ff3c-ab3a-402e-b1c5-cc37c0810a65\") " pod="openshift-multus/multus-8q82q" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.920716 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6806ff3c-ab3a-402e-b1c5-cc37c0810a65-host-var-lib-kubelet\") pod \"multus-8q82q\" (UID: \"6806ff3c-ab3a-402e-b1c5-cc37c0810a65\") " pod="openshift-multus/multus-8q82q" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.920690 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/6806ff3c-ab3a-402e-b1c5-cc37c0810a65-cni-binary-copy\") pod \"multus-8q82q\" (UID: \"6806ff3c-ab3a-402e-b1c5-cc37c0810a65\") " pod="openshift-multus/multus-8q82q" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.920779 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/6806ff3c-ab3a-402e-b1c5-cc37c0810a65-hostroot\") pod \"multus-8q82q\" (UID: \"6806ff3c-ab3a-402e-b1c5-cc37c0810a65\") " pod="openshift-multus/multus-8q82q" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.920461 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/6806ff3c-ab3a-402e-b1c5-cc37c0810a65-host-run-multus-certs\") pod \"multus-8q82q\" (UID: \"6806ff3c-ab3a-402e-b1c5-cc37c0810a65\") " pod="openshift-multus/multus-8q82q" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.920837 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6806ff3c-ab3a-402e-b1c5-cc37c0810a65-host-run-netns\") pod \"multus-8q82q\" (UID: \"6806ff3c-ab3a-402e-b1c5-cc37c0810a65\") " pod="openshift-multus/multus-8q82q" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.920884 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/95794952-d817-48f2-8956-f7a310f8d1d9-proxy-tls\") pod \"machine-config-daemon-dpqrd\" (UID: \"95794952-d817-48f2-8956-f7a310f8d1d9\") " pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.920889 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6806ff3c-ab3a-402e-b1c5-cc37c0810a65-host-run-netns\") pod \"multus-8q82q\" (UID: \"6806ff3c-ab3a-402e-b1c5-cc37c0810a65\") " pod="openshift-multus/multus-8q82q" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.920930 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6806ff3c-ab3a-402e-b1c5-cc37c0810a65-multus-cni-dir\") pod \"multus-8q82q\" (UID: \"6806ff3c-ab3a-402e-b1c5-cc37c0810a65\") " pod="openshift-multus/multus-8q82q" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.920953 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6806ff3c-ab3a-402e-b1c5-cc37c0810a65-host-var-lib-cni-bin\") pod \"multus-8q82q\" (UID: \"6806ff3c-ab3a-402e-b1c5-cc37c0810a65\") " pod="openshift-multus/multus-8q82q" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 
12:22:11.920980 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6806ff3c-ab3a-402e-b1c5-cc37c0810a65-system-cni-dir\") pod \"multus-8q82q\" (UID: \"6806ff3c-ab3a-402e-b1c5-cc37c0810a65\") " pod="openshift-multus/multus-8q82q" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.921001 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/6806ff3c-ab3a-402e-b1c5-cc37c0810a65-os-release\") pod \"multus-8q82q\" (UID: \"6806ff3c-ab3a-402e-b1c5-cc37c0810a65\") " pod="openshift-multus/multus-8q82q" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.921020 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/6806ff3c-ab3a-402e-b1c5-cc37c0810a65-host-run-k8s-cni-cncf-io\") pod \"multus-8q82q\" (UID: \"6806ff3c-ab3a-402e-b1c5-cc37c0810a65\") " pod="openshift-multus/multus-8q82q" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.921039 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2pn7q\" (UniqueName: \"kubernetes.io/projected/6806ff3c-ab3a-402e-b1c5-cc37c0810a65-kube-api-access-2pn7q\") pod \"multus-8q82q\" (UID: \"6806ff3c-ab3a-402e-b1c5-cc37c0810a65\") " pod="openshift-multus/multus-8q82q" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.921058 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/95794952-d817-48f2-8956-f7a310f8d1d9-mcd-auth-proxy-config\") pod \"machine-config-daemon-dpqrd\" (UID: \"95794952-d817-48f2-8956-f7a310f8d1d9\") " pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.921088 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/6806ff3c-ab3a-402e-b1c5-cc37c0810a65-cnibin\") pod \"multus-8q82q\" (UID: \"6806ff3c-ab3a-402e-b1c5-cc37c0810a65\") " pod="openshift-multus/multus-8q82q" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.921093 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6806ff3c-ab3a-402e-b1c5-cc37c0810a65-host-var-lib-cni-bin\") pod \"multus-8q82q\" (UID: \"6806ff3c-ab3a-402e-b1c5-cc37c0810a65\") " pod="openshift-multus/multus-8q82q" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.921115 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/6806ff3c-ab3a-402e-b1c5-cc37c0810a65-multus-daemon-config\") pod \"multus-8q82q\" (UID: \"6806ff3c-ab3a-402e-b1c5-cc37c0810a65\") " pod="openshift-multus/multus-8q82q" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.921141 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rnk64\" (UniqueName: \"kubernetes.io/projected/95794952-d817-48f2-8956-f7a310f8d1d9-kube-api-access-rnk64\") pod \"machine-config-daemon-dpqrd\" (UID: \"95794952-d817-48f2-8956-f7a310f8d1d9\") " pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.921149 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/6806ff3c-ab3a-402e-b1c5-cc37c0810a65-multus-cni-dir\") pod \"multus-8q82q\" (UID: \"6806ff3c-ab3a-402e-b1c5-cc37c0810a65\") " pod="openshift-multus/multus-8q82q" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.921143 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6806ff3c-ab3a-402e-b1c5-cc37c0810a65-system-cni-dir\") pod \"multus-8q82q\" (UID: \"6806ff3c-ab3a-402e-b1c5-cc37c0810a65\") " pod="openshift-multus/multus-8q82q" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.921147 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/6806ff3c-ab3a-402e-b1c5-cc37c0810a65-host-run-k8s-cni-cncf-io\") pod \"multus-8q82q\" (UID: \"6806ff3c-ab3a-402e-b1c5-cc37c0810a65\") " pod="openshift-multus/multus-8q82q" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.921212 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/6806ff3c-ab3a-402e-b1c5-cc37c0810a65-cnibin\") pod \"multus-8q82q\" (UID: \"6806ff3c-ab3a-402e-b1c5-cc37c0810a65\") " pod="openshift-multus/multus-8q82q" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.921205 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/6806ff3c-ab3a-402e-b1c5-cc37c0810a65-os-release\") pod \"multus-8q82q\" (UID: \"6806ff3c-ab3a-402e-b1c5-cc37c0810a65\") " pod="openshift-multus/multus-8q82q" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.921934 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/95794952-d817-48f2-8956-f7a310f8d1d9-mcd-auth-proxy-config\") pod \"machine-config-daemon-dpqrd\" (UID: \"95794952-d817-48f2-8956-f7a310f8d1d9\") " pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.921999 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/6806ff3c-ab3a-402e-b1c5-cc37c0810a65-cni-binary-copy\") pod \"multus-8q82q\" (UID: \"6806ff3c-ab3a-402e-b1c5-cc37c0810a65\") " pod="openshift-multus/multus-8q82q" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.922157 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/6806ff3c-ab3a-402e-b1c5-cc37c0810a65-multus-daemon-config\") pod \"multus-8q82q\" (UID: \"6806ff3c-ab3a-402e-b1c5-cc37c0810a65\") " pod="openshift-multus/multus-8q82q" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.926947 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/95794952-d817-48f2-8956-f7a310f8d1d9-proxy-tls\") pod \"machine-config-daemon-dpqrd\" (UID: \"95794952-d817-48f2-8956-f7a310f8d1d9\") " pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.936897 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lkb7v\" (UniqueName: \"kubernetes.io/projected/96949d92-2365-41eb-8f62-c264c8328c02-kube-api-access-lkb7v\") pod \"node-resolver-bh5lq\" (UID: \"96949d92-2365-41eb-8f62-c264c8328c02\") " pod="openshift-dns/node-resolver-bh5lq" Sep 30 12:22:11 crc 
kubenswrapper[4672]: I0930 12:22:11.937477 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2pn7q\" (UniqueName: \"kubernetes.io/projected/6806ff3c-ab3a-402e-b1c5-cc37c0810a65-kube-api-access-2pn7q\") pod \"multus-8q82q\" (UID: \"6806ff3c-ab3a-402e-b1c5-cc37c0810a65\") " pod="openshift-multus/multus-8q82q" Sep 30 12:22:11 crc kubenswrapper[4672]: I0930 12:22:11.941419 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rnk64\" (UniqueName: \"kubernetes.io/projected/95794952-d817-48f2-8956-f7a310f8d1d9-kube-api-access-rnk64\") pod \"machine-config-daemon-dpqrd\" (UID: \"95794952-d817-48f2-8956-f7a310f8d1d9\") " pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" Sep 30 12:22:12 crc kubenswrapper[4672]: I0930 12:22:12.013582 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-bh5lq" Sep 30 12:22:12 crc kubenswrapper[4672]: I0930 12:22:12.022159 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 12:22:12 crc kubenswrapper[4672]: I0930 12:22:12.022345 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 12:22:12 crc kubenswrapper[4672]: I0930 12:22:12.022393 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 12:22:12 crc kubenswrapper[4672]: E0930 12:22:12.022473 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 12:22:13.022418926 +0000 UTC m=+24.291656572 (durationBeforeRetry 1s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 12:22:12 crc kubenswrapper[4672]: E0930 12:22:12.022543 4672 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 30 12:22:12 crc kubenswrapper[4672]: E0930 12:22:12.022560 4672 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 30 12:22:12 crc kubenswrapper[4672]: E0930 12:22:12.022571 4672 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 30 12:22:12 crc kubenswrapper[4672]: E0930 12:22:12.022587 4672 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 30 12:22:12 crc kubenswrapper[4672]: E0930 12:22:12.022592 4672 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 12:22:12 crc kubenswrapper[4672]: E0930 12:22:12.022600 4672 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 12:22:12 crc kubenswrapper[4672]: E0930 12:22:12.022655 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-09-30 12:22:13.022632482 +0000 UTC m=+24.291870128 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 12:22:12 crc kubenswrapper[4672]: I0930 12:22:12.022584 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 12:22:12 crc kubenswrapper[4672]: E0930 12:22:12.022716 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-09-30 12:22:13.022707604 +0000 UTC m=+24.291945500 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 12:22:12 crc kubenswrapper[4672]: E0930 12:22:12.022770 4672 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 30 12:22:12 crc kubenswrapper[4672]: I0930 12:22:12.022921 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 12:22:12 crc kubenswrapper[4672]: E0930 12:22:12.022949 4672 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Sep 30 12:22:12 crc kubenswrapper[4672]: E0930 12:22:12.022966 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-30 12:22:13.02295526 +0000 UTC m=+24.292192906 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 30 12:22:12 crc kubenswrapper[4672]: E0930 12:22:12.023153 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-30 12:22:13.023124074 +0000 UTC m=+24.292361750 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Sep 30 12:22:12 crc kubenswrapper[4672]: I0930 12:22:12.032311 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" Sep 30 12:22:12 crc kubenswrapper[4672]: I0930 12:22:12.039787 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-8q82q" Sep 30 12:22:12 crc kubenswrapper[4672]: W0930 12:22:12.043756 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod95794952_d817_48f2_8956_f7a310f8d1d9.slice/crio-312127115c48c2289df85229b0a6d7e35e9eccd979bdf144a649b660c4138503 WatchSource:0}: Error finding container 312127115c48c2289df85229b0a6d7e35e9eccd979bdf144a649b660c4138503: Status 404 returned error can't find the container with id 312127115c48c2289df85229b0a6d7e35e9eccd979bdf144a649b660c4138503 Sep 30 12:22:12 crc kubenswrapper[4672]: I0930 12:22:12.063967 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-89fj9"] Sep 30 12:22:12 crc kubenswrapper[4672]: I0930 12:22:12.065891 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-89fj9" Sep 30 12:22:12 crc kubenswrapper[4672]: I0930 12:22:12.068623 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Sep 30 12:22:12 crc kubenswrapper[4672]: I0930 12:22:12.069865 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-nznsk"] Sep 30 12:22:12 crc kubenswrapper[4672]: I0930 12:22:12.072376 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-nznsk" Sep 30 12:22:12 crc kubenswrapper[4672]: I0930 12:22:12.072703 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Sep 30 12:22:12 crc kubenswrapper[4672]: W0930 12:22:12.072788 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6806ff3c_ab3a_402e_b1c5_cc37c0810a65.slice/crio-5ace5bd81d75df0ffa7644881e811bdda84c700a7fb7f2a1d82142e1cd416804 WatchSource:0}: Error finding container 5ace5bd81d75df0ffa7644881e811bdda84c700a7fb7f2a1d82142e1cd416804: Status 404 returned error can't find the container with id 5ace5bd81d75df0ffa7644881e811bdda84c700a7fb7f2a1d82142e1cd416804 Sep 30 12:22:12 crc kubenswrapper[4672]: I0930 12:22:12.074891 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Sep 30 12:22:12 crc kubenswrapper[4672]: I0930 12:22:12.075294 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Sep 30 12:22:12 crc kubenswrapper[4672]: I0930 12:22:12.075312 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Sep 30 12:22:12 crc kubenswrapper[4672]: I0930 12:22:12.075454 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Sep 30 12:22:12 crc kubenswrapper[4672]: I0930 12:22:12.075631 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Sep 30 12:22:12 crc kubenswrapper[4672]: I0930 12:22:12.075769 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Sep 30 12:22:12 crc kubenswrapper[4672]: I0930 12:22:12.075876 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Sep 30 12:22:12 crc kubenswrapper[4672]: I0930 12:22:12.081501 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 12:22:12 crc kubenswrapper[4672]: I0930 12:22:12.099322 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 12:22:12 crc kubenswrapper[4672]: I0930 12:22:12.114816 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 12:22:12 crc kubenswrapper[4672]: I0930 12:22:12.126120 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 12:22:12 crc kubenswrapper[4672]: I0930 12:22:12.142566 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e650bb6-f3f0-48f4-b169-1e56a907e88c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://824f50fb36d4c4aee43cc92af10bc43c441f2cee0ee736431dfdc52d254b819a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2786191730ceba9bbcf35f5355098bc52b09c37d20b923f7c6c65fd1b584cea0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e5b3d468173222d5ae851529dbc318b7d727ec4af468e4094ab08b4f29bf100\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8d2b013e19f7ff8f3eb9c42338191b2da273ce85db8bcc6be8dd1f69e83c3be\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://20f71b069b915573d5cfcadeddd0be9f7c196c0aeafe8dbfe73424930432fa0c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T12:21:53Z\\\",\\\"message\\\":\\\"W0930 12:21:53.138223 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0930 
12:21:53.139223 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759234913 cert, and key in /tmp/serving-cert-529706298/serving-signer.crt, /tmp/serving-cert-529706298/serving-signer.key\\\\nI0930 12:21:53.505845 1 observer_polling.go:159] Starting file observer\\\\nW0930 12:21:53.510913 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0930 12:21:53.511041 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 12:21:53.511745 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-529706298/tls.crt::/tmp/serving-cert-529706298/tls.key\\\\\\\"\\\\nF0930 12:21:53.766381 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T12:21:52Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8d2b013e19f7ff8f3eb9c42338191b2da273ce85db8bcc6be8dd1f69e83c3be\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T12:22:10Z\\\",\\\"message\\\":\\\"falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0930 12:22:05.219065 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 12:22:05.219861 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-72394925/tls.crt::/tmp/serving-cert-72394925/tls.key\\\\\\\"\\\\nI0930 12:22:10.657058 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 12:22:10.659952 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 12:22:10.659972 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 12:22:10.659997 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 12:22:10.660003 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 12:22:10.667742 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0930 12:22:10.667775 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 12:22:10.667797 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 12:22:10.667801 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 12:22:10.667808 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 12:22:10.667812 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 12:22:10.667814 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 12:22:10.667817 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0930 12:22:10.670654 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T12:21:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c3f5020d6f8fba5f79d7659bedba02479794acb1f5f3d219802966c781fd10d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:52Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://089787496c46eb514e279677b1853010b21eac240fb059e43866714d2a6920e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://089787496c46eb514e279677b1853010b21eac240fb059e43866714d2a6920e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:21:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:21:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:21:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 12:22:12 crc kubenswrapper[4672]: I0930 12:22:12.154164 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 12:22:12 crc kubenswrapper[4672]: I0930 12:22:12.163604 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"95794952-d817-48f2-8956-f7a310f8d1d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnk64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnk64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dpqrd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 12:22:12 crc kubenswrapper[4672]: I0930 12:22:12.177210 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8q82q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6806ff3c-ab3a-402e-b1c5-cc37c0810a65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pn7q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8q82q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 12:22:12 crc kubenswrapper[4672]: I0930 12:22:12.194332 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-89fj9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72003fad-a0fd-4493-9f3b-6efdab22d14c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin 
routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"last
State\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-89fj9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 
127.0.0.1:9743: connect: connection refused" Sep 30 12:22:12 crc kubenswrapper[4672]: I0930 12:22:12.206893 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 12:22:12 crc kubenswrapper[4672]: I0930 12:22:12.216897 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bh5lq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96949d92-2365-41eb-8f62-c264c8328c02\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"message\\\":\\\"containers with 
unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkb7v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bh5lq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 12:22:12 crc kubenswrapper[4672]: I0930 12:22:12.226247 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-llgtp\" (UniqueName: \"kubernetes.io/projected/5da59bc9-84da-42f6-86e9-3399ecf31725-kube-api-access-llgtp\") pod \"ovnkube-node-nznsk\" (UID: \"5da59bc9-84da-42f6-86e9-3399ecf31725\") " pod="openshift-ovn-kubernetes/ovnkube-node-nznsk" Sep 30 12:22:12 crc kubenswrapper[4672]: I0930 12:22:12.226345 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/5da59bc9-84da-42f6-86e9-3399ecf31725-systemd-units\") pod \"ovnkube-node-nznsk\" (UID: \"5da59bc9-84da-42f6-86e9-3399ecf31725\") " pod="openshift-ovn-kubernetes/ovnkube-node-nznsk" Sep 30 12:22:12 crc kubenswrapper[4672]: I0930 12:22:12.226375 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5da59bc9-84da-42f6-86e9-3399ecf31725-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-nznsk\" (UID: \"5da59bc9-84da-42f6-86e9-3399ecf31725\") " pod="openshift-ovn-kubernetes/ovnkube-node-nznsk" Sep 30 12:22:12 crc kubenswrapper[4672]: I0930 12:22:12.226398 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/5da59bc9-84da-42f6-86e9-3399ecf31725-ovnkube-config\") pod \"ovnkube-node-nznsk\" (UID: \"5da59bc9-84da-42f6-86e9-3399ecf31725\") " pod="openshift-ovn-kubernetes/ovnkube-node-nznsk" Sep 30 12:22:12 crc kubenswrapper[4672]: I0930 12:22:12.226474 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/5da59bc9-84da-42f6-86e9-3399ecf31725-ovn-node-metrics-cert\") pod \"ovnkube-node-nznsk\" (UID: \"5da59bc9-84da-42f6-86e9-3399ecf31725\") " pod="openshift-ovn-kubernetes/ovnkube-node-nznsk" Sep 30 12:22:12 crc kubenswrapper[4672]: I0930 12:22:12.226578 4672 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/5da59bc9-84da-42f6-86e9-3399ecf31725-host-slash\") pod \"ovnkube-node-nznsk\" (UID: \"5da59bc9-84da-42f6-86e9-3399ecf31725\") " pod="openshift-ovn-kubernetes/ovnkube-node-nznsk" Sep 30 12:22:12 crc kubenswrapper[4672]: I0930 12:22:12.226615 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/5da59bc9-84da-42f6-86e9-3399ecf31725-run-systemd\") pod \"ovnkube-node-nznsk\" (UID: \"5da59bc9-84da-42f6-86e9-3399ecf31725\") " pod="openshift-ovn-kubernetes/ovnkube-node-nznsk" Sep 30 12:22:12 crc kubenswrapper[4672]: I0930 12:22:12.226644 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/5da59bc9-84da-42f6-86e9-3399ecf31725-run-ovn\") pod \"ovnkube-node-nznsk\" (UID: \"5da59bc9-84da-42f6-86e9-3399ecf31725\") " pod="openshift-ovn-kubernetes/ovnkube-node-nznsk" Sep 30 12:22:12 crc kubenswrapper[4672]: I0930 12:22:12.226687 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w62gk\" (UniqueName: \"kubernetes.io/projected/72003fad-a0fd-4493-9f3b-6efdab22d14c-kube-api-access-w62gk\") pod \"multus-additional-cni-plugins-89fj9\" (UID: \"72003fad-a0fd-4493-9f3b-6efdab22d14c\") " pod="openshift-multus/multus-additional-cni-plugins-89fj9" Sep 30 12:22:12 crc kubenswrapper[4672]: I0930 12:22:12.226724 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/5da59bc9-84da-42f6-86e9-3399ecf31725-ovnkube-script-lib\") pod \"ovnkube-node-nznsk\" (UID: \"5da59bc9-84da-42f6-86e9-3399ecf31725\") " pod="openshift-ovn-kubernetes/ovnkube-node-nznsk" Sep 30 12:22:12 crc kubenswrapper[4672]: I0930 12:22:12.226765 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/72003fad-a0fd-4493-9f3b-6efdab22d14c-cnibin\") pod \"multus-additional-cni-plugins-89fj9\" (UID: \"72003fad-a0fd-4493-9f3b-6efdab22d14c\") " pod="openshift-multus/multus-additional-cni-plugins-89fj9" Sep 30 12:22:12 crc kubenswrapper[4672]: I0930 12:22:12.226791 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/5da59bc9-84da-42f6-86e9-3399ecf31725-log-socket\") pod \"ovnkube-node-nznsk\" (UID: \"5da59bc9-84da-42f6-86e9-3399ecf31725\") " pod="openshift-ovn-kubernetes/ovnkube-node-nznsk" Sep 30 12:22:12 crc kubenswrapper[4672]: I0930 12:22:12.226819 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5da59bc9-84da-42f6-86e9-3399ecf31725-var-lib-openvswitch\") pod \"ovnkube-node-nznsk\" (UID: \"5da59bc9-84da-42f6-86e9-3399ecf31725\") " pod="openshift-ovn-kubernetes/ovnkube-node-nznsk" Sep 30 12:22:12 crc kubenswrapper[4672]: I0930 12:22:12.226845 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5da59bc9-84da-42f6-86e9-3399ecf31725-run-openvswitch\") pod \"ovnkube-node-nznsk\" (UID: \"5da59bc9-84da-42f6-86e9-3399ecf31725\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-nznsk" Sep 30 12:22:12 crc kubenswrapper[4672]: I0930 12:22:12.226870 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/72003fad-a0fd-4493-9f3b-6efdab22d14c-os-release\") pod \"multus-additional-cni-plugins-89fj9\" (UID: \"72003fad-a0fd-4493-9f3b-6efdab22d14c\") " pod="openshift-multus/multus-additional-cni-plugins-89fj9" Sep 30 12:22:12 crc kubenswrapper[4672]: I0930 12:22:12.226897 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/72003fad-a0fd-4493-9f3b-6efdab22d14c-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-89fj9\" (UID: \"72003fad-a0fd-4493-9f3b-6efdab22d14c\") " pod="openshift-multus/multus-additional-cni-plugins-89fj9" Sep 30 12:22:12 crc kubenswrapper[4672]: I0930 12:22:12.226923 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/5da59bc9-84da-42f6-86e9-3399ecf31725-host-kubelet\") pod \"ovnkube-node-nznsk\" (UID: \"5da59bc9-84da-42f6-86e9-3399ecf31725\") " pod="openshift-ovn-kubernetes/ovnkube-node-nznsk" Sep 30 12:22:12 crc kubenswrapper[4672]: I0930 12:22:12.226945 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5da59bc9-84da-42f6-86e9-3399ecf31725-host-run-netns\") pod \"ovnkube-node-nznsk\" (UID: \"5da59bc9-84da-42f6-86e9-3399ecf31725\") " pod="openshift-ovn-kubernetes/ovnkube-node-nznsk" Sep 30 12:22:12 crc kubenswrapper[4672]: I0930 12:22:12.226990 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/72003fad-a0fd-4493-9f3b-6efdab22d14c-tuning-conf-dir\") pod \"multus-additional-cni-plugins-89fj9\" (UID: \"72003fad-a0fd-4493-9f3b-6efdab22d14c\") " pod="openshift-multus/multus-additional-cni-plugins-89fj9" Sep 30 12:22:12 crc kubenswrapper[4672]: I0930 12:22:12.227019 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5da59bc9-84da-42f6-86e9-3399ecf31725-host-cni-bin\") pod \"ovnkube-node-nznsk\" (UID: \"5da59bc9-84da-42f6-86e9-3399ecf31725\") " pod="openshift-ovn-kubernetes/ovnkube-node-nznsk" Sep 30 12:22:12 crc kubenswrapper[4672]: I0930 12:22:12.227074 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/5da59bc9-84da-42f6-86e9-3399ecf31725-host-cni-netd\") pod \"ovnkube-node-nznsk\" (UID: \"5da59bc9-84da-42f6-86e9-3399ecf31725\") " pod="openshift-ovn-kubernetes/ovnkube-node-nznsk" Sep 30 12:22:12 crc kubenswrapper[4672]: I0930 12:22:12.227122 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/5da59bc9-84da-42f6-86e9-3399ecf31725-node-log\") pod \"ovnkube-node-nznsk\" (UID: \"5da59bc9-84da-42f6-86e9-3399ecf31725\") " pod="openshift-ovn-kubernetes/ovnkube-node-nznsk" Sep 30 12:22:12 crc kubenswrapper[4672]: I0930 12:22:12.227141 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/5da59bc9-84da-42f6-86e9-3399ecf31725-host-run-ovn-kubernetes\") pod \"ovnkube-node-nznsk\" (UID: \"5da59bc9-84da-42f6-86e9-3399ecf31725\") " pod="openshift-ovn-kubernetes/ovnkube-node-nznsk" Sep 30 12:22:12 crc kubenswrapper[4672]: I0930 12:22:12.227158 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/72003fad-a0fd-4493-9f3b-6efdab22d14c-system-cni-dir\") pod \"multus-additional-cni-plugins-89fj9\" (UID: \"72003fad-a0fd-4493-9f3b-6efdab22d14c\") " pod="openshift-multus/multus-additional-cni-plugins-89fj9" Sep 30 12:22:12 crc kubenswrapper[4672]: I0930 12:22:12.227175 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5da59bc9-84da-42f6-86e9-3399ecf31725-etc-openvswitch\") pod \"ovnkube-node-nznsk\" (UID: \"5da59bc9-84da-42f6-86e9-3399ecf31725\") " pod="openshift-ovn-kubernetes/ovnkube-node-nznsk" Sep 30 12:22:12 crc kubenswrapper[4672]: I0930 12:22:12.227192 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/72003fad-a0fd-4493-9f3b-6efdab22d14c-cni-binary-copy\") pod \"multus-additional-cni-plugins-89fj9\" (UID: \"72003fad-a0fd-4493-9f3b-6efdab22d14c\") " pod="openshift-multus/multus-additional-cni-plugins-89fj9" Sep 30 12:22:12 crc kubenswrapper[4672]: I0930 12:22:12.227280 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/5da59bc9-84da-42f6-86e9-3399ecf31725-env-overrides\") pod \"ovnkube-node-nznsk\" (UID: \"5da59bc9-84da-42f6-86e9-3399ecf31725\") " pod="openshift-ovn-kubernetes/ovnkube-node-nznsk" Sep 30 12:22:12 crc kubenswrapper[4672]: I0930 12:22:12.230191 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e650bb6-f3f0-48f4-b169-1e56a907e88c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://824f50fb36d4c4aee43cc92af10bc43c441f2cee0ee736431dfdc52d254b819a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2786191730ceba9bbcf35f5355098bc52b09c37d20b923f7c6c65fd1b584cea0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e5b3d468173222d5ae851529dbc318b7d727ec4af468e4094ab08b4f29bf100\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8d2b013e19f7ff8f3eb9c42338191b2da273ce85db8bcc6be8dd1f69e83c3be\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://20f71b069b915573d5cfcadeddd0be9f7c196c0aeafe8dbfe73424930432fa0c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T12:21:53Z\\\",\\\"message\\\":\\\"W0930 12:21:53.138223 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0930 
12:21:53.139223 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759234913 cert, and key in /tmp/serving-cert-529706298/serving-signer.crt, /tmp/serving-cert-529706298/serving-signer.key\\\\nI0930 12:21:53.505845 1 observer_polling.go:159] Starting file observer\\\\nW0930 12:21:53.510913 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0930 12:21:53.511041 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 12:21:53.511745 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-529706298/tls.crt::/tmp/serving-cert-529706298/tls.key\\\\\\\"\\\\nF0930 12:21:53.766381 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T12:21:52Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8d2b013e19f7ff8f3eb9c42338191b2da273ce85db8bcc6be8dd1f69e83c3be\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T12:22:10Z\\\",\\\"message\\\":\\\"falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0930 12:22:05.219065 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 12:22:05.219861 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-72394925/tls.crt::/tmp/serving-cert-72394925/tls.key\\\\\\\"\\\\nI0930 12:22:10.657058 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 12:22:10.659952 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 12:22:10.659972 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 12:22:10.659997 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 12:22:10.660003 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 12:22:10.667742 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0930 12:22:10.667775 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 12:22:10.667797 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 12:22:10.667801 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 12:22:10.667808 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 12:22:10.667812 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 12:22:10.667814 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 12:22:10.667817 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0930 12:22:10.670654 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T12:21:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c3f5020d6f8fba5f79d7659bedba02479794acb1f5f3d219802966c781fd10d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:52Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://089787496c46eb514e279677b1853010b21eac240fb059e43866714d2a6920e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://089787496c46eb514e279677b1853010b21eac240fb059e43866714d2a6920e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:21:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:21:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:21:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 12:22:12 crc kubenswrapper[4672]: I0930 12:22:12.242126 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 12:22:12 crc kubenswrapper[4672]: I0930 12:22:12.255008 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"95794952-d817-48f2-8956-f7a310f8d1d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnk64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnk64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dpqrd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 12:22:12 crc kubenswrapper[4672]: I0930 12:22:12.265672 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8q82q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6806ff3c-ab3a-402e-b1c5-cc37c0810a65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pn7q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8q82q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 12:22:12 crc kubenswrapper[4672]: I0930 12:22:12.280523 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-89fj9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72003fad-a0fd-4493-9f3b-6efdab22d14c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin 
routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"last
State\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-89fj9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 
127.0.0.1:9743: connect: connection refused" Sep 30 12:22:12 crc kubenswrapper[4672]: I0930 12:22:12.290763 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 12:22:12 crc kubenswrapper[4672]: I0930 12:22:12.299958 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bh5lq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96949d92-2365-41eb-8f62-c264c8328c02\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"message\\\":\\\"containers with 
unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkb7v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bh5lq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 12:22:12 crc kubenswrapper[4672]: I0930 12:22:12.311076 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 12:22:12 crc kubenswrapper[4672]: I0930 12:22:12.323347 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 12:22:12 crc kubenswrapper[4672]: I0930 12:22:12.328019 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/5da59bc9-84da-42f6-86e9-3399ecf31725-log-socket\") pod \"ovnkube-node-nznsk\" (UID: \"5da59bc9-84da-42f6-86e9-3399ecf31725\") " pod="openshift-ovn-kubernetes/ovnkube-node-nznsk" Sep 30 12:22:12 crc kubenswrapper[4672]: I0930 12:22:12.328090 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/72003fad-a0fd-4493-9f3b-6efdab22d14c-os-release\") pod \"multus-additional-cni-plugins-89fj9\" (UID: \"72003fad-a0fd-4493-9f3b-6efdab22d14c\") " pod="openshift-multus/multus-additional-cni-plugins-89fj9" Sep 30 12:22:12 crc kubenswrapper[4672]: I0930 12:22:12.328131 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5da59bc9-84da-42f6-86e9-3399ecf31725-var-lib-openvswitch\") pod \"ovnkube-node-nznsk\" (UID: \"5da59bc9-84da-42f6-86e9-3399ecf31725\") " pod="openshift-ovn-kubernetes/ovnkube-node-nznsk" Sep 30 12:22:12 crc kubenswrapper[4672]: I0930 12:22:12.328162 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5da59bc9-84da-42f6-86e9-3399ecf31725-run-openvswitch\") pod \"ovnkube-node-nznsk\" (UID: \"5da59bc9-84da-42f6-86e9-3399ecf31725\") " pod="openshift-ovn-kubernetes/ovnkube-node-nznsk" Sep 30 12:22:12 crc kubenswrapper[4672]: I0930 12:22:12.328190 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/72003fad-a0fd-4493-9f3b-6efdab22d14c-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-89fj9\" (UID: \"72003fad-a0fd-4493-9f3b-6efdab22d14c\") " pod="openshift-multus/multus-additional-cni-plugins-89fj9" Sep 30 12:22:12 crc kubenswrapper[4672]: I0930 12:22:12.328220 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/5da59bc9-84da-42f6-86e9-3399ecf31725-host-kubelet\") pod \"ovnkube-node-nznsk\" (UID: \"5da59bc9-84da-42f6-86e9-3399ecf31725\") " pod="openshift-ovn-kubernetes/ovnkube-node-nznsk" Sep 30 12:22:12 crc kubenswrapper[4672]: I0930 12:22:12.328251 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" 
(UniqueName: \"kubernetes.io/host-path/5da59bc9-84da-42f6-86e9-3399ecf31725-host-run-netns\") pod \"ovnkube-node-nznsk\" (UID: \"5da59bc9-84da-42f6-86e9-3399ecf31725\") " pod="openshift-ovn-kubernetes/ovnkube-node-nznsk" Sep 30 12:22:12 crc kubenswrapper[4672]: I0930 12:22:12.328311 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/72003fad-a0fd-4493-9f3b-6efdab22d14c-tuning-conf-dir\") pod \"multus-additional-cni-plugins-89fj9\" (UID: \"72003fad-a0fd-4493-9f3b-6efdab22d14c\") " pod="openshift-multus/multus-additional-cni-plugins-89fj9" Sep 30 12:22:12 crc kubenswrapper[4672]: I0930 12:22:12.328340 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5da59bc9-84da-42f6-86e9-3399ecf31725-host-cni-bin\") pod \"ovnkube-node-nznsk\" (UID: \"5da59bc9-84da-42f6-86e9-3399ecf31725\") " pod="openshift-ovn-kubernetes/ovnkube-node-nznsk" Sep 30 12:22:12 crc kubenswrapper[4672]: I0930 12:22:12.328366 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/5da59bc9-84da-42f6-86e9-3399ecf31725-host-cni-netd\") pod \"ovnkube-node-nznsk\" (UID: \"5da59bc9-84da-42f6-86e9-3399ecf31725\") " pod="openshift-ovn-kubernetes/ovnkube-node-nznsk" Sep 30 12:22:12 crc kubenswrapper[4672]: I0930 12:22:12.328419 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/5da59bc9-84da-42f6-86e9-3399ecf31725-node-log\") pod \"ovnkube-node-nznsk\" (UID: \"5da59bc9-84da-42f6-86e9-3399ecf31725\") " pod="openshift-ovn-kubernetes/ovnkube-node-nznsk" Sep 30 12:22:12 crc kubenswrapper[4672]: I0930 12:22:12.328447 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5da59bc9-84da-42f6-86e9-3399ecf31725-host-run-ovn-kubernetes\") pod \"ovnkube-node-nznsk\" (UID: \"5da59bc9-84da-42f6-86e9-3399ecf31725\") " pod="openshift-ovn-kubernetes/ovnkube-node-nznsk" Sep 30 12:22:12 crc kubenswrapper[4672]: I0930 12:22:12.328485 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/72003fad-a0fd-4493-9f3b-6efdab22d14c-system-cni-dir\") pod \"multus-additional-cni-plugins-89fj9\" (UID: \"72003fad-a0fd-4493-9f3b-6efdab22d14c\") " pod="openshift-multus/multus-additional-cni-plugins-89fj9" Sep 30 12:22:12 crc kubenswrapper[4672]: I0930 12:22:12.328523 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5da59bc9-84da-42f6-86e9-3399ecf31725-etc-openvswitch\") pod \"ovnkube-node-nznsk\" (UID: \"5da59bc9-84da-42f6-86e9-3399ecf31725\") " pod="openshift-ovn-kubernetes/ovnkube-node-nznsk" Sep 30 12:22:12 crc kubenswrapper[4672]: I0930 12:22:12.328611 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/72003fad-a0fd-4493-9f3b-6efdab22d14c-cni-binary-copy\") pod \"multus-additional-cni-plugins-89fj9\" (UID: \"72003fad-a0fd-4493-9f3b-6efdab22d14c\") " pod="openshift-multus/multus-additional-cni-plugins-89fj9" Sep 30 12:22:12 crc kubenswrapper[4672]: I0930 12:22:12.328653 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/5da59bc9-84da-42f6-86e9-3399ecf31725-env-overrides\") pod \"ovnkube-node-nznsk\" (UID: \"5da59bc9-84da-42f6-86e9-3399ecf31725\") " pod="openshift-ovn-kubernetes/ovnkube-node-nznsk" Sep 30 12:22:12 crc kubenswrapper[4672]: I0930 12:22:12.328685 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-llgtp\" (UniqueName: \"kubernetes.io/projected/5da59bc9-84da-42f6-86e9-3399ecf31725-kube-api-access-llgtp\") pod \"ovnkube-node-nznsk\" (UID: \"5da59bc9-84da-42f6-86e9-3399ecf31725\") " pod="openshift-ovn-kubernetes/ovnkube-node-nznsk" Sep 30 12:22:12 crc kubenswrapper[4672]: I0930 12:22:12.328749 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/5da59bc9-84da-42f6-86e9-3399ecf31725-systemd-units\") pod \"ovnkube-node-nznsk\" (UID: \"5da59bc9-84da-42f6-86e9-3399ecf31725\") " pod="openshift-ovn-kubernetes/ovnkube-node-nznsk" Sep 30 12:22:12 crc kubenswrapper[4672]: I0930 12:22:12.328787 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5da59bc9-84da-42f6-86e9-3399ecf31725-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-nznsk\" (UID: \"5da59bc9-84da-42f6-86e9-3399ecf31725\") " pod="openshift-ovn-kubernetes/ovnkube-node-nznsk" Sep 30 12:22:12 crc kubenswrapper[4672]: I0930 12:22:12.328823 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/5da59bc9-84da-42f6-86e9-3399ecf31725-ovnkube-config\") pod \"ovnkube-node-nznsk\" (UID: \"5da59bc9-84da-42f6-86e9-3399ecf31725\") " pod="openshift-ovn-kubernetes/ovnkube-node-nznsk" Sep 30 12:22:12 crc kubenswrapper[4672]: I0930 12:22:12.328856 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/5da59bc9-84da-42f6-86e9-3399ecf31725-ovn-node-metrics-cert\") pod \"ovnkube-node-nznsk\" (UID: \"5da59bc9-84da-42f6-86e9-3399ecf31725\") " pod="openshift-ovn-kubernetes/ovnkube-node-nznsk" Sep 30 12:22:12 crc kubenswrapper[4672]: I0930 12:22:12.328898 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/5da59bc9-84da-42f6-86e9-3399ecf31725-host-slash\") pod \"ovnkube-node-nznsk\" (UID: \"5da59bc9-84da-42f6-86e9-3399ecf31725\") " pod="openshift-ovn-kubernetes/ovnkube-node-nznsk" Sep 30 12:22:12 crc kubenswrapper[4672]: I0930 12:22:12.328930 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/5da59bc9-84da-42f6-86e9-3399ecf31725-run-systemd\") pod \"ovnkube-node-nznsk\" (UID: \"5da59bc9-84da-42f6-86e9-3399ecf31725\") " pod="openshift-ovn-kubernetes/ovnkube-node-nznsk" Sep 30 12:22:12 crc kubenswrapper[4672]: I0930 12:22:12.328976 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/5da59bc9-84da-42f6-86e9-3399ecf31725-run-ovn\") pod \"ovnkube-node-nznsk\" (UID: \"5da59bc9-84da-42f6-86e9-3399ecf31725\") " pod="openshift-ovn-kubernetes/ovnkube-node-nznsk" Sep 30 12:22:12 crc kubenswrapper[4672]: I0930 12:22:12.329006 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w62gk\" (UniqueName: 
\"kubernetes.io/projected/72003fad-a0fd-4493-9f3b-6efdab22d14c-kube-api-access-w62gk\") pod \"multus-additional-cni-plugins-89fj9\" (UID: \"72003fad-a0fd-4493-9f3b-6efdab22d14c\") " pod="openshift-multus/multus-additional-cni-plugins-89fj9" Sep 30 12:22:12 crc kubenswrapper[4672]: I0930 12:22:12.328998 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/5da59bc9-84da-42f6-86e9-3399ecf31725-host-cni-netd\") pod \"ovnkube-node-nznsk\" (UID: \"5da59bc9-84da-42f6-86e9-3399ecf31725\") " pod="openshift-ovn-kubernetes/ovnkube-node-nznsk" Sep 30 12:22:12 crc kubenswrapper[4672]: I0930 12:22:12.329053 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/5da59bc9-84da-42f6-86e9-3399ecf31725-ovnkube-script-lib\") pod \"ovnkube-node-nznsk\" (UID: \"5da59bc9-84da-42f6-86e9-3399ecf31725\") " pod="openshift-ovn-kubernetes/ovnkube-node-nznsk" Sep 30 12:22:12 crc kubenswrapper[4672]: I0930 12:22:12.329093 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/72003fad-a0fd-4493-9f3b-6efdab22d14c-cnibin\") pod \"multus-additional-cni-plugins-89fj9\" (UID: \"72003fad-a0fd-4493-9f3b-6efdab22d14c\") " pod="openshift-multus/multus-additional-cni-plugins-89fj9" Sep 30 12:22:12 crc kubenswrapper[4672]: I0930 12:22:12.329199 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/72003fad-a0fd-4493-9f3b-6efdab22d14c-cnibin\") pod \"multus-additional-cni-plugins-89fj9\" (UID: \"72003fad-a0fd-4493-9f3b-6efdab22d14c\") " pod="openshift-multus/multus-additional-cni-plugins-89fj9" Sep 30 12:22:12 crc kubenswrapper[4672]: I0930 12:22:12.329308 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/5da59bc9-84da-42f6-86e9-3399ecf31725-log-socket\") pod \"ovnkube-node-nznsk\" (UID: \"5da59bc9-84da-42f6-86e9-3399ecf31725\") " pod="openshift-ovn-kubernetes/ovnkube-node-nznsk" Sep 30 12:22:12 crc kubenswrapper[4672]: I0930 12:22:12.329332 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/5da59bc9-84da-42f6-86e9-3399ecf31725-node-log\") pod \"ovnkube-node-nznsk\" (UID: \"5da59bc9-84da-42f6-86e9-3399ecf31725\") " pod="openshift-ovn-kubernetes/ovnkube-node-nznsk" Sep 30 12:22:12 crc kubenswrapper[4672]: I0930 12:22:12.329443 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5da59bc9-84da-42f6-86e9-3399ecf31725-host-run-ovn-kubernetes\") pod \"ovnkube-node-nznsk\" (UID: \"5da59bc9-84da-42f6-86e9-3399ecf31725\") " pod="openshift-ovn-kubernetes/ovnkube-node-nznsk" Sep 30 12:22:12 crc kubenswrapper[4672]: I0930 12:22:12.329490 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/5da59bc9-84da-42f6-86e9-3399ecf31725-host-kubelet\") pod \"ovnkube-node-nznsk\" (UID: \"5da59bc9-84da-42f6-86e9-3399ecf31725\") " pod="openshift-ovn-kubernetes/ovnkube-node-nznsk" Sep 30 12:22:12 crc kubenswrapper[4672]: I0930 12:22:12.329500 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/72003fad-a0fd-4493-9f3b-6efdab22d14c-system-cni-dir\") pod 
\"multus-additional-cni-plugins-89fj9\" (UID: \"72003fad-a0fd-4493-9f3b-6efdab22d14c\") " pod="openshift-multus/multus-additional-cni-plugins-89fj9" Sep 30 12:22:12 crc kubenswrapper[4672]: I0930 12:22:12.329544 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/5da59bc9-84da-42f6-86e9-3399ecf31725-host-slash\") pod \"ovnkube-node-nznsk\" (UID: \"5da59bc9-84da-42f6-86e9-3399ecf31725\") " pod="openshift-ovn-kubernetes/ovnkube-node-nznsk" Sep 30 12:22:12 crc kubenswrapper[4672]: I0930 12:22:12.329559 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5da59bc9-84da-42f6-86e9-3399ecf31725-var-lib-openvswitch\") pod \"ovnkube-node-nznsk\" (UID: \"5da59bc9-84da-42f6-86e9-3399ecf31725\") " pod="openshift-ovn-kubernetes/ovnkube-node-nznsk" Sep 30 12:22:12 crc kubenswrapper[4672]: I0930 12:22:12.329527 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/5da59bc9-84da-42f6-86e9-3399ecf31725-run-ovn\") pod \"ovnkube-node-nznsk\" (UID: \"5da59bc9-84da-42f6-86e9-3399ecf31725\") " pod="openshift-ovn-kubernetes/ovnkube-node-nznsk" Sep 30 12:22:12 crc kubenswrapper[4672]: I0930 12:22:12.329601 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5da59bc9-84da-42f6-86e9-3399ecf31725-run-openvswitch\") pod \"ovnkube-node-nznsk\" (UID: \"5da59bc9-84da-42f6-86e9-3399ecf31725\") " pod="openshift-ovn-kubernetes/ovnkube-node-nznsk" Sep 30 12:22:12 crc kubenswrapper[4672]: I0930 12:22:12.329587 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5da59bc9-84da-42f6-86e9-3399ecf31725-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-nznsk\" (UID: \"5da59bc9-84da-42f6-86e9-3399ecf31725\") " pod="openshift-ovn-kubernetes/ovnkube-node-nznsk" Sep 30 12:22:12 crc kubenswrapper[4672]: I0930 12:22:12.329654 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/72003fad-a0fd-4493-9f3b-6efdab22d14c-os-release\") pod \"multus-additional-cni-plugins-89fj9\" (UID: \"72003fad-a0fd-4493-9f3b-6efdab22d14c\") " pod="openshift-multus/multus-additional-cni-plugins-89fj9" Sep 30 12:22:12 crc kubenswrapper[4672]: I0930 12:22:12.329484 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5da59bc9-84da-42f6-86e9-3399ecf31725-etc-openvswitch\") pod \"ovnkube-node-nznsk\" (UID: \"5da59bc9-84da-42f6-86e9-3399ecf31725\") " pod="openshift-ovn-kubernetes/ovnkube-node-nznsk" Sep 30 12:22:12 crc kubenswrapper[4672]: I0930 12:22:12.329732 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/5da59bc9-84da-42f6-86e9-3399ecf31725-systemd-units\") pod \"ovnkube-node-nznsk\" (UID: \"5da59bc9-84da-42f6-86e9-3399ecf31725\") " pod="openshift-ovn-kubernetes/ovnkube-node-nznsk" Sep 30 12:22:12 crc kubenswrapper[4672]: I0930 12:22:12.329605 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/5da59bc9-84da-42f6-86e9-3399ecf31725-run-systemd\") pod \"ovnkube-node-nznsk\" (UID: \"5da59bc9-84da-42f6-86e9-3399ecf31725\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-nznsk" Sep 30 12:22:12 crc kubenswrapper[4672]: I0930 12:22:12.329784 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5da59bc9-84da-42f6-86e9-3399ecf31725-host-run-netns\") pod \"ovnkube-node-nznsk\" (UID: \"5da59bc9-84da-42f6-86e9-3399ecf31725\") " pod="openshift-ovn-kubernetes/ovnkube-node-nznsk" Sep 30 12:22:12 crc kubenswrapper[4672]: I0930 12:22:12.329831 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/72003fad-a0fd-4493-9f3b-6efdab22d14c-tuning-conf-dir\") pod \"multus-additional-cni-plugins-89fj9\" (UID: \"72003fad-a0fd-4493-9f3b-6efdab22d14c\") " pod="openshift-multus/multus-additional-cni-plugins-89fj9" Sep 30 12:22:12 crc kubenswrapper[4672]: I0930 12:22:12.329864 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5da59bc9-84da-42f6-86e9-3399ecf31725-host-cni-bin\") pod \"ovnkube-node-nznsk\" (UID: \"5da59bc9-84da-42f6-86e9-3399ecf31725\") " pod="openshift-ovn-kubernetes/ovnkube-node-nznsk" Sep 30 12:22:12 crc kubenswrapper[4672]: I0930 12:22:12.330019 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/5da59bc9-84da-42f6-86e9-3399ecf31725-env-overrides\") pod \"ovnkube-node-nznsk\" (UID: \"5da59bc9-84da-42f6-86e9-3399ecf31725\") " pod="openshift-ovn-kubernetes/ovnkube-node-nznsk" Sep 30 12:22:12 crc kubenswrapper[4672]: I0930 12:22:12.330406 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/72003fad-a0fd-4493-9f3b-6efdab22d14c-cni-binary-copy\") pod \"multus-additional-cni-plugins-89fj9\" (UID: \"72003fad-a0fd-4493-9f3b-6efdab22d14c\") " pod="openshift-multus/multus-additional-cni-plugins-89fj9" Sep 30 12:22:12 crc kubenswrapper[4672]: I0930 12:22:12.330724 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/5da59bc9-84da-42f6-86e9-3399ecf31725-ovnkube-script-lib\") pod \"ovnkube-node-nznsk\" (UID: \"5da59bc9-84da-42f6-86e9-3399ecf31725\") " pod="openshift-ovn-kubernetes/ovnkube-node-nznsk" Sep 30 12:22:12 crc kubenswrapper[4672]: I0930 12:22:12.330797 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/5da59bc9-84da-42f6-86e9-3399ecf31725-ovnkube-config\") pod \"ovnkube-node-nznsk\" (UID: \"5da59bc9-84da-42f6-86e9-3399ecf31725\") " pod="openshift-ovn-kubernetes/ovnkube-node-nznsk" Sep 30 12:22:12 crc kubenswrapper[4672]: I0930 12:22:12.334171 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/5da59bc9-84da-42f6-86e9-3399ecf31725-ovn-node-metrics-cert\") pod \"ovnkube-node-nznsk\" (UID: \"5da59bc9-84da-42f6-86e9-3399ecf31725\") " pod="openshift-ovn-kubernetes/ovnkube-node-nznsk" Sep 30 12:22:12 crc kubenswrapper[4672]: I0930 12:22:12.340218 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 12:22:12 crc kubenswrapper[4672]: I0930 12:22:12.346824 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/72003fad-a0fd-4493-9f3b-6efdab22d14c-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-89fj9\" (UID: \"72003fad-a0fd-4493-9f3b-6efdab22d14c\") " pod="openshift-multus/multus-additional-cni-plugins-89fj9" Sep 30 12:22:12 crc kubenswrapper[4672]: I0930 12:22:12.357464 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-llgtp\" (UniqueName: \"kubernetes.io/projected/5da59bc9-84da-42f6-86e9-3399ecf31725-kube-api-access-llgtp\") pod \"ovnkube-node-nznsk\" (UID: \"5da59bc9-84da-42f6-86e9-3399ecf31725\") " pod="openshift-ovn-kubernetes/ovnkube-node-nznsk" Sep 30 12:22:12 crc kubenswrapper[4672]: I0930 12:22:12.378484 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w62gk\" (UniqueName: \"kubernetes.io/projected/72003fad-a0fd-4493-9f3b-6efdab22d14c-kube-api-access-w62gk\") pod \"multus-additional-cni-plugins-89fj9\" (UID: \"72003fad-a0fd-4493-9f3b-6efdab22d14c\") " pod="openshift-multus/multus-additional-cni-plugins-89fj9" Sep 30 12:22:12 crc kubenswrapper[4672]: I0930 12:22:12.386415 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-89fj9" Sep 30 12:22:12 crc kubenswrapper[4672]: W0930 12:22:12.400517 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod72003fad_a0fd_4493_9f3b_6efdab22d14c.slice/crio-d62e2eb6f8f2d5f4bc6be6eddcd4b8d9a746d9b782035d859dfa2da82909939f WatchSource:0}: Error finding container d62e2eb6f8f2d5f4bc6be6eddcd4b8d9a746d9b782035d859dfa2da82909939f: Status 404 returned error can't find the container with id d62e2eb6f8f2d5f4bc6be6eddcd4b8d9a746d9b782035d859dfa2da82909939f Sep 30 12:22:12 crc kubenswrapper[4672]: I0930 12:22:12.409475 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-nznsk" Sep 30 12:22:12 crc kubenswrapper[4672]: I0930 12:22:12.410147 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 12:22:12 crc kubenswrapper[4672]: I0930 12:22:12.416520 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 12:22:12 crc kubenswrapper[4672]: E0930 12:22:12.416640 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 12:22:12 crc kubenswrapper[4672]: W0930 12:22:12.435072 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5da59bc9_84da_42f6_86e9_3399ecf31725.slice/crio-76994a5d9cde6b9534d626b604ff2f6a0c5f17d81c62a75301d172c331c06d93 WatchSource:0}: Error finding container 76994a5d9cde6b9534d626b604ff2f6a0c5f17d81c62a75301d172c331c06d93: Status 404 returned error can't find the container with id 76994a5d9cde6b9534d626b604ff2f6a0c5f17d81c62a75301d172c331c06d93 Sep 30 12:22:12 crc kubenswrapper[4672]: I0930 12:22:12.453556 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nznsk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5da59bc9-84da-42f6-86e9-3399ecf31725\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":
\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"rea
dOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nznsk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 12:22:12 crc kubenswrapper[4672]: I0930 12:22:12.561680 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"534671434fa48a07e7836049331af60307cf651f4ddde83856ad98e006b32040"} Sep 30 12:22:12 crc kubenswrapper[4672]: I0930 12:22:12.561737 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"bfb0bceb3e99196adb442abedd5215938781b908c2aa92178c9db76672664620"} Sep 30 12:22:12 crc kubenswrapper[4672]: I0930 12:22:12.561749 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"75f45a0d112cd4c85750be381b8265cb418a77435dbecb989ce23304d121cebf"} Sep 30 12:22:12 crc kubenswrapper[4672]: I0930 12:22:12.562991 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-89fj9" event={"ID":"72003fad-a0fd-4493-9f3b-6efdab22d14c","Type":"ContainerStarted","Data":"d62e2eb6f8f2d5f4bc6be6eddcd4b8d9a746d9b782035d859dfa2da82909939f"} Sep 30 12:22:12 crc kubenswrapper[4672]: I0930 12:22:12.563899 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"d798eb841f1090e6f15d91770c9cae1f06f2ae52e0cc152bea7fb286454fd91e"} Sep 30 12:22:12 crc kubenswrapper[4672]: I0930 12:22:12.570718 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Sep 30 12:22:12 crc kubenswrapper[4672]: I0930 12:22:12.577579 4672 scope.go:117] "RemoveContainer" containerID="a8d2b013e19f7ff8f3eb9c42338191b2da273ce85db8bcc6be8dd1f69e83c3be" Sep 30 12:22:12 crc kubenswrapper[4672]: E0930 12:22:12.577782 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Sep 30 12:22:12 crc kubenswrapper[4672]: I0930 12:22:12.578612 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-bh5lq" event={"ID":"96949d92-2365-41eb-8f62-c264c8328c02","Type":"ContainerStarted","Data":"d6caf59dcbea4e0beb0879de377fe92cfb0b3ff55668aee3f10822adacf97fd8"} Sep 30 12:22:12 crc kubenswrapper[4672]: I0930 12:22:12.578670 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-bh5lq" event={"ID":"96949d92-2365-41eb-8f62-c264c8328c02","Type":"ContainerStarted","Data":"50ec7fca2d20ff6959f676b53f20c7d73e99492d27e87c49d50d5bfe0aeed8cb"} Sep 30 12:22:12 crc kubenswrapper[4672]: I0930 12:22:12.582737 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nznsk" event={"ID":"5da59bc9-84da-42f6-86e9-3399ecf31725","Type":"ContainerStarted","Data":"76994a5d9cde6b9534d626b604ff2f6a0c5f17d81c62a75301d172c331c06d93"} Sep 30 12:22:12 crc kubenswrapper[4672]: I0930 12:22:12.582946 4672 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 12:22:12 crc kubenswrapper[4672]: I0930 12:22:12.587872 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-8q82q" event={"ID":"6806ff3c-ab3a-402e-b1c5-cc37c0810a65","Type":"ContainerStarted","Data":"687d353cc41530a0cbee2c4a782b6ae10e9ee58aae0bfe183537c96851611e2e"} Sep 30 12:22:12 crc kubenswrapper[4672]: I0930 12:22:12.587944 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-8q82q" event={"ID":"6806ff3c-ab3a-402e-b1c5-cc37c0810a65","Type":"ContainerStarted","Data":"5ace5bd81d75df0ffa7644881e811bdda84c700a7fb7f2a1d82142e1cd416804"} Sep 30 12:22:12 crc kubenswrapper[4672]: I0930 12:22:12.594712 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"c2761ae4b8b1c7e1d00ee269219ff0ab04aedce9e1bddc6e0a029c496ee89c0f"} Sep 30 12:22:12 crc kubenswrapper[4672]: I0930 12:22:12.594772 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"14ca5b66e47ec59784bebc2b270276bf3b8102f048f1b0a6683119e0f5fd0e09"} Sep 30 12:22:12 crc kubenswrapper[4672]: I0930 12:22:12.598124 4672 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-dns/node-resolver-bh5lq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96949d92-2365-41eb-8f62-c264c8328c02\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkb7v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bh5lq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 12:22:12 crc kubenswrapper[4672]: I0930 12:22:12.601796 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" event={"ID":"95794952-d817-48f2-8956-f7a310f8d1d9","Type":"ContainerStarted","Data":"8a5a1de465a4e5deca994f640e87574ebd406be2d7a92c0e54ed8bece4a8e6d3"} Sep 30 12:22:12 crc kubenswrapper[4672]: I0930 12:22:12.601838 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" event={"ID":"95794952-d817-48f2-8956-f7a310f8d1d9","Type":"ContainerStarted","Data":"ba615a53d8f50a3891080fd06384bb6e00eca07025bb60fdf1764d9c1d4a5f76"} Sep 30 12:22:12 crc kubenswrapper[4672]: I0930 12:22:12.601849 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" event={"ID":"95794952-d817-48f2-8956-f7a310f8d1d9","Type":"ContainerStarted","Data":"312127115c48c2289df85229b0a6d7e35e9eccd979bdf144a649b660c4138503"} Sep 30 12:22:12 crc kubenswrapper[4672]: I0930 12:22:12.613233 4672 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 12:22:12 crc kubenswrapper[4672]: I0930 12:22:12.626982 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://534671434fa48a07e7836049331af60307cf651f4ddde83856ad98e006b32040\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfb0bceb3e99196adb442abedd5215938781b908c2aa92178c9db76672664620\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 12:22:12 crc kubenswrapper[4672]: I0930 12:22:12.651226 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nznsk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5da59bc9-84da-42f6-86e9-3399ecf31725\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\
\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nznsk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 12:22:12 crc kubenswrapper[4672]: I0930 12:22:12.687899 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:12Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:12 crc kubenswrapper[4672]: I0930 12:22:12.727950 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:12Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:12 crc kubenswrapper[4672]: I0930 12:22:12.785532 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8q82q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6806ff3c-ab3a-402e-b1c5-cc37c0810a65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pn7q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8q82q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:12Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:12 crc kubenswrapper[4672]: I0930 12:22:12.818581 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-89fj9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"72003fad-a0fd-4493-9f3b-6efdab22d14c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"
name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\
\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-89fj9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:12Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:12 crc kubenswrapper[4672]: I0930 12:22:12.850649 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e650bb6-f3f0-48f4-b169-1e56a907e88c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://824f50fb36d4c4aee43cc92af10bc43c441f2cee0ee736431dfdc52d254b819a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2786191730ceba9bbcf35f5355098bc52b09c37d20b923f7c6c65fd1b584cea0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e5b3d468173222d5ae851529dbc318b7d727ec4af468e4094ab08b4f29bf100\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8d2b013e19f7ff8f3eb9c42338191b2da273ce85db8bcc6be8dd1f69e83c3be\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://20f71b069b915573d5cfcadeddd0be9f7c196c0aeafe8dbfe73424930432fa0c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T12:21:53Z\\\",\\\"message\\\":\\\"W0930 12:21:53.138223 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0930 
12:21:53.139223 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759234913 cert, and key in /tmp/serving-cert-529706298/serving-signer.crt, /tmp/serving-cert-529706298/serving-signer.key\\\\nI0930 12:21:53.505845 1 observer_polling.go:159] Starting file observer\\\\nW0930 12:21:53.510913 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0930 12:21:53.511041 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 12:21:53.511745 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-529706298/tls.crt::/tmp/serving-cert-529706298/tls.key\\\\\\\"\\\\nF0930 12:21:53.766381 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T12:21:52Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8d2b013e19f7ff8f3eb9c42338191b2da273ce85db8bcc6be8dd1f69e83c3be\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T12:22:10Z\\\",\\\"message\\\":\\\"falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0930 12:22:05.219065 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 12:22:05.219861 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-72394925/tls.crt::/tmp/serving-cert-72394925/tls.key\\\\\\\"\\\\nI0930 12:22:10.657058 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 12:22:10.659952 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 12:22:10.659972 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 12:22:10.659997 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 12:22:10.660003 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 12:22:10.667742 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0930 12:22:10.667775 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 12:22:10.667797 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 12:22:10.667801 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 12:22:10.667808 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 12:22:10.667812 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 12:22:10.667814 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 12:22:10.667817 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0930 12:22:10.670654 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T12:21:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c3f5020d6f8fba5f79d7659bedba02479794acb1f5f3d219802966c781fd10d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:52Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://089787496c46eb514e279677b1853010b21eac240fb059e43866714d2a6920e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://089787496c46eb514e279677b1853010b21eac240fb059e43866714d2a6920e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:21:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:21:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:21:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:12Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:12 crc kubenswrapper[4672]: I0930 12:22:12.896859 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:12Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:12 crc kubenswrapper[4672]: I0930 12:22:12.933673 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"95794952-d817-48f2-8956-f7a310f8d1d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnk64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnk64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dpqrd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:12Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:12 crc kubenswrapper[4672]: I0930 12:22:12.973717 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 30 12:22:12 crc kubenswrapper[4672]: I0930 12:22:12.974557 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e650bb6-f3f0-48f4-b169-1e56a907e88c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://824f50fb36d4c4aee43cc92af10bc43c441f2cee0ee736431dfdc52d254b819a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2786191730ceba9bbcf35f5355098bc52b09c37d20b923f7c6c65fd1b584cea0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e5b3d468173222d5ae851529dbc318b7d727ec4af468e4094ab08b4f29bf100\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8d2b013e19f7ff8f3eb9c42338191b2da273ce85db8bcc6be8dd1f69e83c3be\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8d2b013e19f7ff8f3eb9c42338191b2da273ce85db8bcc6be8dd1f69e83c3be\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T12:22:10Z\\\",\\\"message\\\":\\\"falling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0930 12:22:05.219065 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 12:22:05.219861 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-72394925/tls.crt::/tmp/serving-cert-72394925/tls.key\\\\\\\"\\\\nI0930 12:22:10.657058 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 12:22:10.659952 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 12:22:10.659972 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 12:22:10.659997 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 12:22:10.660003 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 12:22:10.667742 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0930 12:22:10.667775 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 12:22:10.667797 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 12:22:10.667801 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 12:22:10.667808 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 12:22:10.667812 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 12:22:10.667814 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 12:22:10.667817 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0930 12:22:10.670654 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T12:21:54Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c3f5020d6f8fba5f79d7659bedba02479794acb1f5f3d219802966c781fd10d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:52Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://089787496c46eb514e279677b1853010b21eac240fb059e43866714d2a6920e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://089787496c46eb514e279677b1853010b21eac240fb059e43866714d2a6920e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:21:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:21:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:21:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:12Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:12 crc kubenswrapper[4672]: I0930 12:22:12.981799 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 30 12:22:12 crc kubenswrapper[4672]: I0930 12:22:12.987410 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Sep 30 12:22:13 crc kubenswrapper[4672]: I0930 12:22:13.014313 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Sep 30 12:22:13 crc kubenswrapper[4672]: I0930 12:22:13.030545 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Sep 30 12:22:13 crc kubenswrapper[4672]: I0930 12:22:13.033249 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:13Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:13 crc kubenswrapper[4672]: I0930 12:22:13.038639 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 12:22:13 crc kubenswrapper[4672]: I0930 12:22:13.038787 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 12:22:13 crc kubenswrapper[4672]: I0930 12:22:13.038828 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 12:22:13 crc kubenswrapper[4672]: E0930 12:22:13.038854 4672 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 12:22:15.038823788 +0000 UTC m=+26.308061434 (durationBeforeRetry 2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 12:22:13 crc kubenswrapper[4672]: I0930 12:22:13.038917 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 12:22:13 crc kubenswrapper[4672]: E0930 12:22:13.038960 4672 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 30 12:22:13 crc kubenswrapper[4672]: E0930 12:22:13.038961 4672 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Sep 30 12:22:13 crc kubenswrapper[4672]: I0930 12:22:13.038979 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 12:22:13 crc kubenswrapper[4672]: E0930 12:22:13.038981 4672 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 30 12:22:13 crc kubenswrapper[4672]: E0930 12:22:13.039055 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-30 12:22:15.039036923 +0000 UTC m=+26.308274569 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Sep 30 12:22:13 crc kubenswrapper[4672]: E0930 12:22:13.039061 4672 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 12:22:13 crc kubenswrapper[4672]: E0930 12:22:13.039084 4672 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 30 12:22:13 crc kubenswrapper[4672]: E0930 12:22:13.039102 4672 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 30 12:22:13 crc kubenswrapper[4672]: E0930 12:22:13.039133 4672 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 30 12:22:13 crc kubenswrapper[4672]: E0930 12:22:13.039149 4672 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 12:22:13 crc kubenswrapper[4672]: E0930 12:22:13.039116 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-09-30 12:22:15.039099515 +0000 UTC m=+26.308337171 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 12:22:13 crc kubenswrapper[4672]: E0930 12:22:13.039206 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-30 12:22:15.039194487 +0000 UTC m=+26.308432223 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 30 12:22:13 crc kubenswrapper[4672]: E0930 12:22:13.039230 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-09-30 12:22:15.039223208 +0000 UTC m=+26.308460964 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 12:22:13 crc kubenswrapper[4672]: I0930 12:22:13.090076 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"95794952-d817-48f2-8956-f7a310f8d1d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a5a1de465a4e5deca994f640e87574ebd406be2d7a92c0e54ed8bece4a8e6d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnk64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba615a53d8f50a3891080fd06384bb6e00eca07025bb60fdf1764d9c1d4a5f76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\"
:\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnk64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dpqrd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:13Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:13 crc kubenswrapper[4672]: I0930 12:22:13.093844 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Sep 30 12:22:13 crc kubenswrapper[4672]: I0930 12:22:13.130232 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8q82q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6806ff3c-ab3a-402e-b1c5-cc37c0810a65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://687d353cc41530a0cbee2c4a782b6ae10e9ee58aae0bfe183537c96851611e2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"na
me\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pn7q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8q82q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:13Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:13 crc kubenswrapper[4672]: I0930 12:22:13.180782 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-89fj9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72003fad-a0fd-4493-9f3b-6efdab22d14c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-89fj9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:13Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:13 crc kubenswrapper[4672]: I0930 12:22:13.209171 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:13Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:13 crc kubenswrapper[4672]: I0930 12:22:13.247724 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bh5lq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"96949d92-2365-41eb-8f62-c264c8328c02\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6caf59dcbea4e0beb0879de377fe92cfb0b3ff55668aee3f10822adacf97fd8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkb7v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bh5lq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:13Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:13 crc kubenswrapper[4672]: I0930 12:22:13.291306 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:13Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:13 crc kubenswrapper[4672]: I0930 12:22:13.332339 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2761ae4b8b1c7e1d00ee269219ff0ab04aedce9e1bddc6e0a029c496ee89c0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:13Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:13 crc kubenswrapper[4672]: I0930 12:22:13.366650 4672 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 30 12:22:13 crc kubenswrapper[4672]: I0930 12:22:13.370032 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:13Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:13 crc kubenswrapper[4672]: I0930 12:22:13.411625 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://534671434fa48a07e7836049331af60307cf651f4ddde83856ad98e006b32040\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfb0bceb3e99196adb442abedd5215938781b908c2aa92178c9db76672664620\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/
webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:13Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:13 crc kubenswrapper[4672]: I0930 12:22:13.416829 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 12:22:13 crc kubenswrapper[4672]: E0930 12:22:13.417005 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 12:22:13 crc kubenswrapper[4672]: I0930 12:22:13.416851 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 12:22:13 crc kubenswrapper[4672]: E0930 12:22:13.417423 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 12:22:13 crc kubenswrapper[4672]: I0930 12:22:13.420860 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Sep 30 12:22:13 crc kubenswrapper[4672]: I0930 12:22:13.421673 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Sep 30 12:22:13 crc kubenswrapper[4672]: I0930 12:22:13.422752 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Sep 30 12:22:13 crc kubenswrapper[4672]: I0930 12:22:13.423346 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Sep 30 12:22:13 crc kubenswrapper[4672]: I0930 12:22:13.423907 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Sep 30 12:22:13 crc kubenswrapper[4672]: I0930 12:22:13.425153 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Sep 30 12:22:13 crc kubenswrapper[4672]: I0930 12:22:13.425690 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Sep 30 12:22:13 crc kubenswrapper[4672]: I0930 12:22:13.426794 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Sep 30 12:22:13 crc kubenswrapper[4672]: I0930 12:22:13.427328 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Sep 30 12:22:13 crc kubenswrapper[4672]: I0930 12:22:13.428178 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Sep 30 12:22:13 crc kubenswrapper[4672]: I0930 12:22:13.428819 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Sep 30 12:22:13 crc kubenswrapper[4672]: I0930 12:22:13.456697 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nznsk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5da59bc9-84da-42f6-86e9-3399ecf31725\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\
\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nznsk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:13Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:13 crc kubenswrapper[4672]: I0930 12:22:13.488516 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"763dda13-8721-4cce-8b21-aa1f1b190978\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab6412d7fdf1976c75ccfc72d05f00e2997d72f3971c73cd0887e038d1cbc059\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5575589d0517d5a1c285c3e9e566fb5cd9b9591a1b311a4bb80e595e44044c33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resou
rces\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b656dc1ad8af099828eaaeb654e6d52dbbd057023dd07583a6114316670aa334\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5082c3fa78648584ab9a29b4ab239ce0d52411666ca3fa598e9c6745b8ea067e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:21:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:13Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:13 crc kubenswrapper[4672]: I0930 12:22:13.533745 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2761ae4b8b1c7e1d00ee269219ff0ab04aedce9e1bddc6e0a029c496ee89c0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:13Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:13 crc kubenswrapper[4672]: I0930 12:22:13.565972 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-ztjlj"] Sep 30 12:22:13 crc kubenswrapper[4672]: I0930 12:22:13.566728 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-ztjlj" Sep 30 12:22:13 crc kubenswrapper[4672]: I0930 12:22:13.571958 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:13Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:13 crc kubenswrapper[4672]: I0930 12:22:13.579197 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Sep 30 12:22:13 crc kubenswrapper[4672]: I0930 12:22:13.599642 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Sep 30 12:22:13 crc kubenswrapper[4672]: I0930 12:22:13.605742 4672 generic.go:334] "Generic (PLEG): container finished" podID="5da59bc9-84da-42f6-86e9-3399ecf31725" containerID="4aa532f2ebe19e36977596bd41e8b116c84bcd7d5cd3470986e3192d4d62dea7" exitCode=0 Sep 30 12:22:13 crc kubenswrapper[4672]: I0930 12:22:13.605812 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nznsk" event={"ID":"5da59bc9-84da-42f6-86e9-3399ecf31725","Type":"ContainerDied","Data":"4aa532f2ebe19e36977596bd41e8b116c84bcd7d5cd3470986e3192d4d62dea7"} Sep 
30 12:22:13 crc kubenswrapper[4672]: I0930 12:22:13.607038 4672 generic.go:334] "Generic (PLEG): container finished" podID="72003fad-a0fd-4493-9f3b-6efdab22d14c" containerID="31087335de8b3b895d49c4677350e9ab6cfa0651188653dab05e7eafdab3d05f" exitCode=0 Sep 30 12:22:13 crc kubenswrapper[4672]: I0930 12:22:13.607137 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-89fj9" event={"ID":"72003fad-a0fd-4493-9f3b-6efdab22d14c","Type":"ContainerDied","Data":"31087335de8b3b895d49c4677350e9ab6cfa0651188653dab05e7eafdab3d05f"} Sep 30 12:22:13 crc kubenswrapper[4672]: I0930 12:22:13.608174 4672 scope.go:117] "RemoveContainer" containerID="a8d2b013e19f7ff8f3eb9c42338191b2da273ce85db8bcc6be8dd1f69e83c3be" Sep 30 12:22:13 crc kubenswrapper[4672]: E0930 12:22:13.608409 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Sep 30 12:22:13 crc kubenswrapper[4672]: I0930 12:22:13.619567 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Sep 30 12:22:13 crc kubenswrapper[4672]: I0930 12:22:13.640236 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Sep 30 12:22:13 crc kubenswrapper[4672]: I0930 12:22:13.644745 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/25dc0f4c-76a8-49ff-bf68-c0718cfc3e62-host\") pod \"node-ca-ztjlj\" (UID: \"25dc0f4c-76a8-49ff-bf68-c0718cfc3e62\") " pod="openshift-image-registry/node-ca-ztjlj" Sep 30 12:22:13 crc kubenswrapper[4672]: I0930 12:22:13.644824 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6ps67\" (UniqueName: \"kubernetes.io/projected/25dc0f4c-76a8-49ff-bf68-c0718cfc3e62-kube-api-access-6ps67\") pod \"node-ca-ztjlj\" (UID: \"25dc0f4c-76a8-49ff-bf68-c0718cfc3e62\") " pod="openshift-image-registry/node-ca-ztjlj" Sep 30 12:22:13 crc kubenswrapper[4672]: I0930 12:22:13.644879 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/25dc0f4c-76a8-49ff-bf68-c0718cfc3e62-serviceca\") pod \"node-ca-ztjlj\" (UID: \"25dc0f4c-76a8-49ff-bf68-c0718cfc3e62\") " pod="openshift-image-registry/node-ca-ztjlj" Sep 30 12:22:13 crc kubenswrapper[4672]: I0930 12:22:13.691617 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://534671434fa48a07e7836049331af60307cf651f4ddde83856ad98e006b32040\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfb0bceb3e99196adb442abedd5215938781b908c2aa92178c9db76672664620\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:13Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:13 crc kubenswrapper[4672]: I0930 12:22:13.740893 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nznsk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5da59bc9-84da-42f6-86e9-3399ecf31725\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\
\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nznsk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:13Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:13 crc kubenswrapper[4672]: I0930 12:22:13.746381 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6ps67\" (UniqueName: \"kubernetes.io/projected/25dc0f4c-76a8-49ff-bf68-c0718cfc3e62-kube-api-access-6ps67\") pod \"node-ca-ztjlj\" (UID: \"25dc0f4c-76a8-49ff-bf68-c0718cfc3e62\") " pod="openshift-image-registry/node-ca-ztjlj" Sep 30 12:22:13 crc kubenswrapper[4672]: I0930 12:22:13.746473 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/25dc0f4c-76a8-49ff-bf68-c0718cfc3e62-serviceca\") pod \"node-ca-ztjlj\" (UID: \"25dc0f4c-76a8-49ff-bf68-c0718cfc3e62\") " pod="openshift-image-registry/node-ca-ztjlj" Sep 30 12:22:13 crc kubenswrapper[4672]: I0930 12:22:13.747502 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/25dc0f4c-76a8-49ff-bf68-c0718cfc3e62-serviceca\") pod \"node-ca-ztjlj\" (UID: \"25dc0f4c-76a8-49ff-bf68-c0718cfc3e62\") " pod="openshift-image-registry/node-ca-ztjlj" Sep 30 12:22:13 crc kubenswrapper[4672]: I0930 12:22:13.747546 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/25dc0f4c-76a8-49ff-bf68-c0718cfc3e62-host\") pod \"node-ca-ztjlj\" (UID: \"25dc0f4c-76a8-49ff-bf68-c0718cfc3e62\") " pod="openshift-image-registry/node-ca-ztjlj" Sep 30 12:22:13 crc kubenswrapper[4672]: I0930 12:22:13.747639 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/25dc0f4c-76a8-49ff-bf68-c0718cfc3e62-host\") pod \"node-ca-ztjlj\" (UID: \"25dc0f4c-76a8-49ff-bf68-c0718cfc3e62\") " pod="openshift-image-registry/node-ca-ztjlj" Sep 30 12:22:13 crc kubenswrapper[4672]: I0930 12:22:13.771625 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e650bb6-f3f0-48f4-b169-1e56a907e88c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://824f50fb36d4c4aee43cc92af10bc43c441f2cee0ee736431dfdc52d254b819a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2786191730ceba9bbcf35f5355098bc52b09c37d20b923f7c6c65fd1b584cea0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e5b3d468173222d5ae851529dbc318b7d727ec4af468e4094ab08b4f29bf100\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8d2b013e19f7ff8f3eb9c42338191b2da273ce85db8bcc6be8dd1f69e83c3be\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\
\":\\\"cri-o://a8d2b013e19f7ff8f3eb9c42338191b2da273ce85db8bcc6be8dd1f69e83c3be\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T12:22:10Z\\\",\\\"message\\\":\\\"falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0930 12:22:05.219065 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 12:22:05.219861 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-72394925/tls.crt::/tmp/serving-cert-72394925/tls.key\\\\\\\"\\\\nI0930 12:22:10.657058 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 12:22:10.659952 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 12:22:10.659972 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 12:22:10.659997 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 12:22:10.660003 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 12:22:10.667742 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0930 12:22:10.667775 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 12:22:10.667797 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 12:22:10.667801 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 12:22:10.667808 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 12:22:10.667812 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 12:22:10.667814 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 12:22:10.667817 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0930 12:22:10.670654 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T12:21:54Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c3f5020d6f8fba5f79d7659bedba02479794acb1f5f3d219802966c781fd10d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:52Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://089787496c46eb514e279677b1853010b21eac240fb059e43866714d2a6920e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://089787496c46eb514e279677b1853010b21eac240fb059e43866714d2a6920e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:21:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:21:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:21:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:13Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:13 crc kubenswrapper[4672]: I0930 12:22:13.800642 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6ps67\" (UniqueName: \"kubernetes.io/projected/25dc0f4c-76a8-49ff-bf68-c0718cfc3e62-kube-api-access-6ps67\") pod \"node-ca-ztjlj\" (UID: \"25dc0f4c-76a8-49ff-bf68-c0718cfc3e62\") " pod="openshift-image-registry/node-ca-ztjlj" Sep 30 12:22:13 crc kubenswrapper[4672]: I0930 12:22:13.828947 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:13Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:13 crc kubenswrapper[4672]: I0930 12:22:13.868542 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"95794952-d817-48f2-8956-f7a310f8d1d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a5a1de465a4e5deca994f640e87574ebd406be2d7a92c0e54ed8bece4a8e6d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnk64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba615a53d8f50a3891080fd06384bb6e00eca07025bb60fdf1764d9c1d4a5f76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnk64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dpqrd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:13Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:13 crc kubenswrapper[4672]: I0930 12:22:13.884721 4672 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openshift-image-registry/node-ca-ztjlj" Sep 30 12:22:13 crc kubenswrapper[4672]: W0930 12:22:13.902768 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod25dc0f4c_76a8_49ff_bf68_c0718cfc3e62.slice/crio-b3e5a9eb056ed0824d6218dfdaa87081650bde380424dc2bd74ea761fe933f1e WatchSource:0}: Error finding container b3e5a9eb056ed0824d6218dfdaa87081650bde380424dc2bd74ea761fe933f1e: Status 404 returned error can't find the container with id b3e5a9eb056ed0824d6218dfdaa87081650bde380424dc2bd74ea761fe933f1e Sep 30 12:22:13 crc kubenswrapper[4672]: I0930 12:22:13.914474 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8q82q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6806ff3c-ab3a-402e-b1c5-cc37c0810a65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://687d353cc41530a0cbee2c4a782b6ae10e9ee58aae0bfe183537c96851611e2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-conf
ig\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pn7q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8q82q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:13Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:13 crc kubenswrapper[4672]: I0930 12:22:13.956946 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-89fj9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72003fad-a0fd-4493-9f3b-6efdab22d14c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-89fj9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:13Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:14 crc kubenswrapper[4672]: I0930 12:22:14.062509 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4339dd7c-5112-40c5-9b58-050ce364a2b9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7db3193c2afadb91d3220e67720ff33bc959f9c50cec6f243144a6e2a2e57e04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5f54820ce67d6e8f397d068e1fadcb4577d6685406812c4e29fe0274612697d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1351a4e8380fea9edb834239053b57384a50add562afe59e67bbd7159bf1bf6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b9741c5edaae364733b400bebf21a4674ec206
194908ec6a942317235c62a14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://775be30891ac2947ef160e3fe75abad42b78fc39e35414f27756ce086fed4f7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27513f0766801d5155c6e5c376a2a52241030eca2171b3218710ca6ee006a0b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27513f0766801d5155c6e5c376a2a52241030eca2171b3218710ca6ee006a0b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:21:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:21:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0afdc82c21601c37c7d9b4897273ade1e8e35bd093d63df9a63e93a9c0c6228c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0afdc82c21601c37c7d9b4897273ade1e8e35bd093d63df9a63e93a9c0c6228c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:21:51Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://52fa5ee34bc8e9e381d4288d30fce5db76ed5bde6f199ec13fe443481ed4c8e8\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52fa5ee34bc8e9e381d4288d30fce5db76ed5bde6f199ec13fe443481ed4c8e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:21:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:21:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:21:49Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:14Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:14 crc kubenswrapper[4672]: I0930 12:22:14.082207 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:14Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:14 crc kubenswrapper[4672]: I0930 12:22:14.097200 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bh5lq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96949d92-2365-41eb-8f62-c264c8328c02\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6caf59dcbea4e0beb0879de377fe92cfb0b3ff55668aee3f10822adacf97fd8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkb7v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bh5lq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-09-30T12:22:14Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:14 crc kubenswrapper[4672]: I0930 12:22:14.115179 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:14Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:14 crc kubenswrapper[4672]: I0930 12:22:14.150109 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"95794952-d817-48f2-8956-f7a310f8d1d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a5a1de465a4e5deca994f640e87574ebd406be2d7a92c0e54ed8bece4a8e6d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnk64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba615a53d8f50a3891080fd06384bb6e00eca07025bb60fdf1764d9c1d4a5f76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnk64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dpqrd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:14Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:14 crc kubenswrapper[4672]: I0930 12:22:14.192904 4672 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-8q82q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6806ff3c-ab3a-402e-b1c5-cc37c0810a65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://687d353cc41530a0cbee2c4a782b6ae10e9ee58aae0bfe183537c96851611e2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pn7q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:11Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-8q82q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:14Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:14 crc kubenswrapper[4672]: I0930 12:22:14.232718 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-89fj9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72003fad-a0fd-4493-9f3b-6efdab22d14c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31087335de8b3b895d49c4677350e9ab6cfa0651188653dab05e7eafdab3d05f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31087335de8b3b895d49c4677350e9ab6cfa0651188653dab05e7eafdab3d05f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reaso
n\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-89fj9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:14Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:14 crc 
kubenswrapper[4672]: I0930 12:22:14.268015 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ztjlj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"25dc0f4c-76a8-49ff-bf68-c0718cfc3e62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6ps67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ztjlj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:14Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:14 crc kubenswrapper[4672]: I0930 12:22:14.312706 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e650bb6-f3f0-48f4-b169-1e56a907e88c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://824f50fb36d4c4aee43cc92af10bc43c441f2cee0ee736431dfdc52d254b819a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2786191730ceba9bbcf35f5355098bc52b09c37d20b923f7c6c65fd1b584cea0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e5b3d468173222d5ae851529dbc318b7d727ec4af468e4094ab08b4f29bf100\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8d2b013e19f7ff8f3eb9c42338191b2da273ce85db8bcc6be8dd1f69e83c3be\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\
\":\\\"cri-o://a8d2b013e19f7ff8f3eb9c42338191b2da273ce85db8bcc6be8dd1f69e83c3be\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T12:22:10Z\\\",\\\"message\\\":\\\"falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0930 12:22:05.219065 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 12:22:05.219861 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-72394925/tls.crt::/tmp/serving-cert-72394925/tls.key\\\\\\\"\\\\nI0930 12:22:10.657058 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 12:22:10.659952 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 12:22:10.659972 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 12:22:10.659997 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 12:22:10.660003 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 12:22:10.667742 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0930 12:22:10.667775 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 12:22:10.667797 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 12:22:10.667801 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 12:22:10.667808 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 12:22:10.667812 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 12:22:10.667814 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 12:22:10.667817 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0930 12:22:10.670654 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T12:21:54Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c3f5020d6f8fba5f79d7659bedba02479794acb1f5f3d219802966c781fd10d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:52Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://089787496c46eb514e279677b1853010b21eac240fb059e43866714d2a6920e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://089787496c46eb514e279677b1853010b21eac240fb059e43866714d2a6920e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:21:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:21:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:21:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:14Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:14 crc kubenswrapper[4672]: I0930 12:22:14.353097 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:14Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:14 crc kubenswrapper[4672]: I0930 12:22:14.397029 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4339dd7c-5112-40c5-9b58-050ce364a2b9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7db3193c2afadb91d3220e67720ff33bc959f9c50cec6f243144a6e2a2e57e04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5f54820ce67d6e8f397d068e1fadcb4577d6685406812c4e29fe
0274612697d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1351a4e8380fea9edb834239053b57384a50add562afe59e67bbd7159bf1bf6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b9741c5edaae364733b400bebf21a4674ec206194908ec6a942317235c62a14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://775be30891ac2947ef160e3fe75abad42b78fc39e35414f27756ce086fed4f7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27513f0766801d5155c6e5c376a2a52241030eca2171b3218710ca6ee006a0b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27513f0766801d5155c6e5c376a2a52241030eca2171b3218710ca6ee006a0b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:21:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:21:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0afdc82c21601c37c7d9b4897273ade1e8e35bd093d63df9a63e93a9c0c6228c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0afdc82c21601c37c7d9b4897273ade1e8e35bd093d63df9a63e93a9c0c6228c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:21:51Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://52fa5ee34bc8e9e381d4288d30fce5db76ed5bde6f199ec13fe443481ed4c8e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52fa5ee34bc8e9e381d4288d30fce5db76ed5bde6f199ec13fe443481ed4c8e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:21:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:21:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:21:49Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:14Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:14 crc kubenswrapper[4672]: I0930 12:22:14.416607 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 12:22:14 crc kubenswrapper[4672]: E0930 12:22:14.416762 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 12:22:14 crc kubenswrapper[4672]: I0930 12:22:14.431307 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:14Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:14 crc kubenswrapper[4672]: I0930 12:22:14.466956 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bh5lq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96949d92-2365-41eb-8f62-c264c8328c02\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6caf59dcbea4e0beb0879de377fe92cfb0b3ff55668aee3f10822adacf97fd8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkb7v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bh5lq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-09-30T12:22:14Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:14 crc kubenswrapper[4672]: I0930 12:22:14.514951 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:14Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:14 crc kubenswrapper[4672]: I0930 12:22:14.551938 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:14Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:14 crc kubenswrapper[4672]: I0930 12:22:14.589925 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://534671434fa48a07e7836049331af60307cf651f4ddde83856ad98e006b32040\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfb0bceb3e99196adb442abedd5215938781b908c2aa92178c9db76672664620\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:14Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:14 crc kubenswrapper[4672]: I0930 12:22:14.613141 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-ztjlj" event={"ID":"25dc0f4c-76a8-49ff-bf68-c0718cfc3e62","Type":"ContainerStarted","Data":"a5a3ba5101b35ce4981484637222a0c074b67e3998c273dc763ddbc6ca0c0c86"} Sep 30 12:22:14 crc kubenswrapper[4672]: I0930 12:22:14.613205 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-ztjlj" 
event={"ID":"25dc0f4c-76a8-49ff-bf68-c0718cfc3e62","Type":"ContainerStarted","Data":"b3e5a9eb056ed0824d6218dfdaa87081650bde380424dc2bd74ea761fe933f1e"} Sep 30 12:22:14 crc kubenswrapper[4672]: I0930 12:22:14.616521 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nznsk" event={"ID":"5da59bc9-84da-42f6-86e9-3399ecf31725","Type":"ContainerStarted","Data":"b73fba4d62613a1c27b69a74ebcca91fa6ea43a5267efddf783bd0d243d711f2"} Sep 30 12:22:14 crc kubenswrapper[4672]: I0930 12:22:14.616568 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nznsk" event={"ID":"5da59bc9-84da-42f6-86e9-3399ecf31725","Type":"ContainerStarted","Data":"c2131429928dbbd60e2d17b7ada4689ff5d161d1df921d74a8b9f71df9e11899"} Sep 30 12:22:14 crc kubenswrapper[4672]: I0930 12:22:14.616581 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nznsk" event={"ID":"5da59bc9-84da-42f6-86e9-3399ecf31725","Type":"ContainerStarted","Data":"c8e8be814fc5bfb8db3632cda1fcf749a02b642c8e7ff479e59ce5d9f987a756"} Sep 30 12:22:14 crc kubenswrapper[4672]: I0930 12:22:14.616593 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nznsk" event={"ID":"5da59bc9-84da-42f6-86e9-3399ecf31725","Type":"ContainerStarted","Data":"9e10db0132d159755257aef8c0cee40e5de4a0d3727ce59883e1ff90650e4f35"} Sep 30 12:22:14 crc kubenswrapper[4672]: I0930 12:22:14.616605 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nznsk" event={"ID":"5da59bc9-84da-42f6-86e9-3399ecf31725","Type":"ContainerStarted","Data":"bb7a32827707f8654d36f56fef7e81aff6b80550ef51f7c090a37f53a0e53406"} Sep 30 12:22:14 crc kubenswrapper[4672]: I0930 12:22:14.618323 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-89fj9" event={"ID":"72003fad-a0fd-4493-9f3b-6efdab22d14c","Type":"ContainerStarted","Data":"2a607f773b81f6bdde800c2a36627d7f84602e6c3c777f6531391668dd7c08e6"} Sep 30 12:22:14 crc kubenswrapper[4672]: I0930 12:22:14.632561 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nznsk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5da59bc9-84da-42f6-86e9-3399ecf31725\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4aa532f2ebe19e36977596bd41e8b116c84bcd7d5cd3470986e3192d4d62dea7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4aa532f2ebe19e36977596bd41e8b116c84bcd7d5cd3470986e3192d4d62dea7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nznsk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:14Z 
is after 2025-08-24T17:21:41Z" Sep 30 12:22:14 crc kubenswrapper[4672]: I0930 12:22:14.673701 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"763dda13-8721-4cce-8b21-aa1f1b190978\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab6412d7fdf1976c75ccfc72d05f00e2997d72f3971c73cd0887e038d1cbc059\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5575589d0517d5a1c285c3e9e566fb5cd9b9591a1b311a4bb80e595e44044c33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b656dc1ad8af099828eaaeb654e6d52dbbd057023dd07583a6114316670aa334\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\
\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5082c3fa78648584ab9a29b4ab239ce0d52411666ca3fa598e9c6745b8ea067e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:21:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:14Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:14 crc kubenswrapper[4672]: I0930 12:22:14.714401 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2761ae4b8b1c7e1d00ee269219ff0ab04aedce9e1bddc6e0a029c496ee89c0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:14Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:14 crc kubenswrapper[4672]: I0930 12:22:14.747428 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:14Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:14 crc kubenswrapper[4672]: I0930 12:22:14.798830 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nznsk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5da59bc9-84da-42f6-86e9-3399ecf31725\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4aa532f2ebe19e36977596bd41e8b116c84bcd7d5cd3470986e3192d4d62dea7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4aa532f2ebe19e36977596bd41e8b116c84bcd7d5cd3470986e3192d4d62dea7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nznsk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:14Z 
is after 2025-08-24T17:21:41Z" Sep 30 12:22:14 crc kubenswrapper[4672]: I0930 12:22:14.829540 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"763dda13-8721-4cce-8b21-aa1f1b190978\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab6412d7fdf1976c75ccfc72d05f00e2997d72f3971c73cd0887e038d1cbc059\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5575589d0517d5a1c285c3e9e566fb5cd9b9591a1b311a4bb80e595e44044c33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b656dc1ad8af099828eaaeb654e6d52dbbd057023dd07583a6114316670aa334\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\
\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5082c3fa78648584ab9a29b4ab239ce0d52411666ca3fa598e9c6745b8ea067e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:21:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:14Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:14 crc kubenswrapper[4672]: I0930 12:22:14.868320 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2761ae4b8b1c7e1d00ee269219ff0ab04aedce9e1bddc6e0a029c496ee89c0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:14Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:14 crc kubenswrapper[4672]: I0930 12:22:14.910704 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:14Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:14 crc kubenswrapper[4672]: I0930 12:22:14.947940 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://534671434fa48a07e7836049331af60307cf651f4ddde83856ad98e006b32040\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfb0bceb3e99196adb442abedd5215938781b908c2aa92178c9db76672664620\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/
webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:14Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:14 crc kubenswrapper[4672]: I0930 12:22:14.998488 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-89fj9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72003fad-a0fd-4493-9f3b-6efdab22d14c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31087335de8b3b895d49c4677350e9ab6cfa0651188653dab05e7eafdab3d05f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31087335de8b3b895d49c4677350e9ab6cfa0651188653dab05e7eafdab3d05f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a607f773b81f6bdde800c2a36627d7f84602e6c3c777f6531391668dd7c08e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64
b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-89fj9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:14Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:15 crc kubenswrapper[4672]: I0930 12:22:15.037772 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ztjlj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"25dc0f4c-76a8-49ff-bf68-c0718cfc3e62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5a3ba5101b35ce4981484637222a0c074b67e3998c273dc763ddbc6ca0c0c86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6ps67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ztjlj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:15Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:15 crc kubenswrapper[4672]: I0930 12:22:15.060699 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 12:22:15 crc kubenswrapper[4672]: I0930 12:22:15.060812 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: 
\"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 12:22:15 crc kubenswrapper[4672]: I0930 12:22:15.060838 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 12:22:15 crc kubenswrapper[4672]: E0930 12:22:15.060896 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 12:22:19.060852819 +0000 UTC m=+30.330090475 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 12:22:15 crc kubenswrapper[4672]: E0930 12:22:15.060950 4672 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 30 12:22:15 crc kubenswrapper[4672]: E0930 12:22:15.060969 4672 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 30 12:22:15 crc kubenswrapper[4672]: E0930 12:22:15.060986 4672 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 12:22:15 crc kubenswrapper[4672]: I0930 12:22:15.060963 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 12:22:15 crc kubenswrapper[4672]: E0930 12:22:15.061038 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-09-30 12:22:19.061022983 +0000 UTC m=+30.330260629 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 12:22:15 crc kubenswrapper[4672]: I0930 12:22:15.061095 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 12:22:15 crc kubenswrapper[4672]: E0930 12:22:15.061132 4672 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 30 12:22:15 crc kubenswrapper[4672]: E0930 12:22:15.061151 4672 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 30 12:22:15 crc kubenswrapper[4672]: E0930 12:22:15.061167 4672 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 12:22:15 crc kubenswrapper[4672]: E0930 12:22:15.061183 4672 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 30 12:22:15 crc kubenswrapper[4672]: E0930 12:22:15.061222 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-09-30 12:22:19.061213408 +0000 UTC m=+30.330451054 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 12:22:15 crc kubenswrapper[4672]: E0930 12:22:15.061246 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-30 12:22:19.061238159 +0000 UTC m=+30.330475805 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 30 12:22:15 crc kubenswrapper[4672]: E0930 12:22:15.061302 4672 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Sep 30 12:22:15 crc kubenswrapper[4672]: E0930 12:22:15.061412 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-30 12:22:19.061392163 +0000 UTC m=+30.330629809 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Sep 30 12:22:15 crc kubenswrapper[4672]: I0930 12:22:15.069435 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e650bb6-f3f0-48f4-b169-1e56a907e88c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://824f50fb36d4c4aee43cc92af10bc43c441f2cee0ee736431dfdc52d254b819a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2786191730ceba9bbcf35f5355098bc52b09c37d20b923f7c6c65fd1b584cea0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e5b3d468173222d5ae851529dbc318b7d727ec4af468e4094ab08b4f29bf100\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8d2b013e19f7ff8f3eb9c42338191b2da273ce85db8bcc6be8dd1f69e83c3be\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8d2b013e19f7ff8f3eb9c42338191b2da273ce85db8bcc6be8dd1f69e83c3be\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T12:22:10Z\\\",\\\"message\\\":\\\"falling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0930 12:22:05.219065 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 12:22:05.219861 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-72394925/tls.crt::/tmp/serving-cert-72394925/tls.key\\\\\\\"\\\\nI0930 12:22:10.657058 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 12:22:10.659952 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 12:22:10.659972 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 12:22:10.659997 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 12:22:10.660003 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 12:22:10.667742 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0930 12:22:10.667775 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 12:22:10.667797 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 12:22:10.667801 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 12:22:10.667808 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 12:22:10.667812 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 12:22:10.667814 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 12:22:10.667817 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0930 12:22:10.670654 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T12:21:54Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c3f5020d6f8fba5f79d7659bedba02479794acb1f5f3d219802966c781fd10d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:52Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://089787496c46eb514e279677b1853010b21eac240fb059e43866714d2a6920e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://089787496c46eb514e279677b1853010b21eac240fb059e43866714d2a6920e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:21:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:21:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:21:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:15Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:15 crc kubenswrapper[4672]: I0930 12:22:15.108609 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:15Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:15 crc kubenswrapper[4672]: I0930 12:22:15.149794 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"95794952-d817-48f2-8956-f7a310f8d1d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a5a1de465a4e5deca994f640e87574ebd406be2d7a92c0e54ed8bece4a8e6d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnk64\\\",\\\"readOnly\\\":true,\\\"rec
ursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba615a53d8f50a3891080fd06384bb6e00eca07025bb60fdf1764d9c1d4a5f76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnk64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dpqrd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:15Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:15 crc kubenswrapper[4672]: I0930 12:22:15.191905 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8q82q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6806ff3c-ab3a-402e-b1c5-cc37c0810a65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://687d353cc41530a0cbee2c4a782b6ae10e9ee58aae0bfe183537c96851611e2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-
dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pn7q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8q82q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:15Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:15 crc kubenswrapper[4672]: I0930 12:22:15.229154 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bh5lq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"96949d92-2365-41eb-8f62-c264c8328c02\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6caf59dcbea4e0beb0879de377fe92cfb0b3ff55668aee3f10822adacf97fd8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkb7v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bh5lq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:15Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:15 crc kubenswrapper[4672]: I0930 12:22:15.278081 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4339dd7c-5112-40c5-9b58-050ce364a2b9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7db3193c2afadb91d3220e67720ff33bc959f9c50cec6f243144a6e2a2e57e04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5f54820ce67d6e8f397d068e1fadcb4577d6685406812c4e29fe0274612697d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1351a4e8380fea9edb834239053b57384a50add562afe59e67bbd7159bf1bf6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b9741c5edaae364733b400bebf21a4674ec206
194908ec6a942317235c62a14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://775be30891ac2947ef160e3fe75abad42b78fc39e35414f27756ce086fed4f7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27513f0766801d5155c6e5c376a2a52241030eca2171b3218710ca6ee006a0b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27513f0766801d5155c6e5c376a2a52241030eca2171b3218710ca6ee006a0b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:21:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:21:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0afdc82c21601c37c7d9b4897273ade1e8e35bd093d63df9a63e93a9c0c6228c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0afdc82c21601c37c7d9b4897273ade1e8e35bd093d63df9a63e93a9c0c6228c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:21:51Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://52fa5ee34bc8e9e381d4288d30fce5db76ed5bde6f199ec13fe443481ed4c8e8\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52fa5ee34bc8e9e381d4288d30fce5db76ed5bde6f199ec13fe443481ed4c8e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:21:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:21:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:21:49Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:15Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:15 crc kubenswrapper[4672]: I0930 12:22:15.309037 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:15Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:15 crc kubenswrapper[4672]: I0930 12:22:15.416193 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 12:22:15 crc kubenswrapper[4672]: I0930 12:22:15.416192 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 12:22:15 crc kubenswrapper[4672]: E0930 12:22:15.417506 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 12:22:15 crc kubenswrapper[4672]: E0930 12:22:15.424507 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 12:22:15 crc kubenswrapper[4672]: I0930 12:22:15.628784 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nznsk" event={"ID":"5da59bc9-84da-42f6-86e9-3399ecf31725","Type":"ContainerStarted","Data":"114d67fea982a6c6475e1925f670ba7dae14ace593022f77ef7c6441dde49687"} Sep 30 12:22:15 crc kubenswrapper[4672]: I0930 12:22:15.631199 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"1ece716e9efed328b509480f11b1a6aa253fe23776bf45f49d4110cf3b4acbab"} Sep 30 12:22:15 crc kubenswrapper[4672]: I0930 12:22:15.634284 4672 generic.go:334] "Generic (PLEG): container finished" podID="72003fad-a0fd-4493-9f3b-6efdab22d14c" containerID="2a607f773b81f6bdde800c2a36627d7f84602e6c3c777f6531391668dd7c08e6" exitCode=0 Sep 30 12:22:15 crc kubenswrapper[4672]: I0930 12:22:15.634382 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-89fj9" event={"ID":"72003fad-a0fd-4493-9f3b-6efdab22d14c","Type":"ContainerDied","Data":"2a607f773b81f6bdde800c2a36627d7f84602e6c3c777f6531391668dd7c08e6"} Sep 30 12:22:15 crc kubenswrapper[4672]: I0930 12:22:15.653597 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ece716e9efed328b509480f11b1a6aa253fe23776bf45f49d4110cf3b4acbab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-30T12:22:15Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:15 crc kubenswrapper[4672]: I0930 12:22:15.677419 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nznsk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5da59bc9-84da-42f6-86e9-3399ecf31725\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"
ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",
\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"c
ri-o://4aa532f2ebe19e36977596bd41e8b116c84bcd7d5cd3470986e3192d4d62dea7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4aa532f2ebe19e36977596bd41e8b116c84bcd7d5cd3470986e3192d4d62dea7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nznsk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:15Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:15 crc kubenswrapper[4672]: I0930 12:22:15.697496 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"763dda13-8721-4cce-8b21-aa1f1b190978\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab6412d7fdf1976c75ccfc72d05f00e2997d72f3971c73cd0887e038d1cbc059\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5575589d0517d5a1c285c3e9e566fb5cd9b9591a1b311a4bb80e595e44044c33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b656dc1ad8af099828eaaeb654e6d52dbbd057023dd07583a6114316670aa334\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5082c3fa78648584ab9a29b4ab239ce0d52411666ca3fa598e9c6745b8ea067e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:21:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:15Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:15 crc kubenswrapper[4672]: I0930 12:22:15.717666 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2761ae4b8b1c7e1d00ee269219ff0ab04aedce9e1bddc6e0a029c496ee89c0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:15Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:15 crc kubenswrapper[4672]: I0930 12:22:15.730326 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:15Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:15 crc kubenswrapper[4672]: I0930 12:22:15.747226 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://534671434fa48a07e7836049331af60307cf651f4ddde83856ad98e006b32040\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfb0bceb3e99196adb442abedd5215938781b908c2aa92178c9db76672664620\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:15Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:15 crc kubenswrapper[4672]: I0930 12:22:15.766339 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-89fj9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"72003fad-a0fd-4493-9f3b-6efdab22d14c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31087335de8b3b895d49c4677350e9ab6cfa0651188653dab05e7eafdab3d05f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31087335de8b3b895d49c4677350e9ab6cfa0651188653dab05e7eafdab3d05f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a607f773b81f6bdde800c2a36627d7f84602e6c3c777f6531391668dd7c08e6\\\",\\
\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-89fj9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:15Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:15 crc kubenswrapper[4672]: I0930 12:22:15.779369 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ztjlj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"25dc0f4c-76a8-49ff-bf68-c0718cfc3e62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5a3ba5101b35ce4981484637222a0c074b67e3998c273dc763ddbc6ca0c0c86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6ps67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]
,\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ztjlj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:15Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:15 crc kubenswrapper[4672]: I0930 12:22:15.796464 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e650bb6-f3f0-48f4-b169-1e56a907e88c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://824f50fb36d4c4aee43cc92af10bc43c441f2cee0ee736431dfdc52d254b819a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2786191730ceba9bbcf35f5355098bc52b09c37d20b923f7c6c65fd1b584cea0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\
\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e5b3d468173222d5ae851529dbc318b7d727ec4af468e4094ab08b4f29bf100\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8d2b013e19f7ff8f3eb9c42338191b2da273ce85db8bcc6be8dd1f69e83c3be\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8d2b013e19f7ff8f3eb9c42338191b2da273ce85db8bcc6be8dd1f69e83c3be\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T12:22:10Z\\\",\\\"message\\\":\\\"falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0930 12:22:05.219065 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 12:22:05.219861 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-72394925/tls.crt::/tmp/serving-cert-72394925/tls.key\\\\\\\"\\\\nI0930 12:22:10.657058 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 12:22:10.659952 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 12:22:10.659972 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 12:22:10.659997 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 12:22:10.660003 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 12:22:10.667742 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0930 12:22:10.667775 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 12:22:10.667797 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 12:22:10.667801 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 12:22:10.667808 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 12:22:10.667812 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 12:22:10.667814 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 12:22:10.667817 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0930 12:22:10.670654 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T12:21:54Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c3f5020d6f8fba5f79d7659bedba02479794acb1f5f3d219802966c781fd10d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:52Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://089787496c46eb514e279677b1853010b21eac240fb059e43866714d2a6920e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://089787496c46eb514e279677b1853010b21eac240fb059e43866714d2a6920e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:21:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:21:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:21:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:15Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:15 crc kubenswrapper[4672]: I0930 12:22:15.810235 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:15Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:15 crc kubenswrapper[4672]: I0930 12:22:15.822363 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"95794952-d817-48f2-8956-f7a310f8d1d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a5a1de465a4e5deca994f640e87574ebd406be2d7a92c0e54ed8bece4a8e6d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnk64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba615a53d8f50a3891080fd06384bb6e00eca07025bb60fdf1764d9c1d4a5f76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnk64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dpqrd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:15Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:15 crc kubenswrapper[4672]: I0930 12:22:15.839173 4672 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-8q82q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6806ff3c-ab3a-402e-b1c5-cc37c0810a65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://687d353cc41530a0cbee2c4a782b6ae10e9ee58aae0bfe183537c96851611e2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pn7q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:11Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-8q82q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:15Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:15 crc kubenswrapper[4672]: I0930 12:22:15.852262 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bh5lq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96949d92-2365-41eb-8f62-c264c8328c02\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6caf59dcbea4e0beb0879de377fe92cfb0b3ff55668aee3f10822adacf97fd8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkb7v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bh5lq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:15Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:15 crc kubenswrapper[4672]: I0930 12:22:15.885030 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4339dd7c-5112-40c5-9b58-050ce364a2b9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7db3193c2afadb91d3220e67720ff33bc959f9c50cec6f243144a6e2a2e57e04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5f54820ce67d6e8f397d068e1fadcb4577d6685406812c4e29fe0274612697d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1351a4e8380fea9edb834239053b57384a50add562afe59e67bbd7159bf1bf6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b9741c5edaae364733b400bebf21a4674ec206
194908ec6a942317235c62a14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://775be30891ac2947ef160e3fe75abad42b78fc39e35414f27756ce086fed4f7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27513f0766801d5155c6e5c376a2a52241030eca2171b3218710ca6ee006a0b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27513f0766801d5155c6e5c376a2a52241030eca2171b3218710ca6ee006a0b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:21:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:21:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0afdc82c21601c37c7d9b4897273ade1e8e35bd093d63df9a63e93a9c0c6228c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0afdc82c21601c37c7d9b4897273ade1e8e35bd093d63df9a63e93a9c0c6228c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:21:51Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://52fa5ee34bc8e9e381d4288d30fce5db76ed5bde6f199ec13fe443481ed4c8e8\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52fa5ee34bc8e9e381d4288d30fce5db76ed5bde6f199ec13fe443481ed4c8e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:21:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:21:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:21:49Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:15Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:15 crc kubenswrapper[4672]: I0930 12:22:15.912013 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:15Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:15 crc kubenswrapper[4672]: I0930 12:22:15.951307 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ece716e9efed328b509480f11b1a6aa253fe23776bf45f49d4110cf3b4acbab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:15Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:15 crc kubenswrapper[4672]: I0930 12:22:15.988957 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://534671434fa48a07e7836049331af60307cf651f4ddde83856ad98e006b32040\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfb0bceb3e99196adb442abedd5215938781b908c2aa92178c9db76672664620\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:15Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:16 crc kubenswrapper[4672]: I0930 12:22:16.041408 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nznsk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5da59bc9-84da-42f6-86e9-3399ecf31725\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{
},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4aa532f2ebe19e36977596bd41e8b116c84bcd7d5cd3470986e3192d4d62dea7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36
cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4aa532f2ebe19e36977596bd41e8b116c84bcd7d5cd3470986e3192d4d62dea7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nznsk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:16Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:16 crc kubenswrapper[4672]: I0930 12:22:16.075597 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"763dda13-8721-4cce-8b21-aa1f1b190978\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab6412d7fdf1976c75ccfc72d05f00e2997d72f3971c73cd0887e038d1cbc059\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5575589d0517d5a1c285c3e9e566fb5cd9b9591a1b311a4bb80e595e44044c33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de25971
26bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b656dc1ad8af099828eaaeb654e6d52dbbd057023dd07583a6114316670aa334\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5082c3fa78648584ab9a29b4ab239ce0d52411666ca3fa598e9c6745b8ea067e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:21:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:16Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:16 crc kubenswrapper[4672]: I0930 12:22:16.116349 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2761ae4b8b1c7e1d00ee269219ff0ab04aedce9e1bddc6e0a029c496ee89c0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:16Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:16 crc kubenswrapper[4672]: I0930 12:22:16.153857 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:16Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:16 crc kubenswrapper[4672]: I0930 12:22:16.196559 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8q82q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6806ff3c-ab3a-402e-b1c5-cc37c0810a65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://687d353cc41530a0cbee2c4a782b6ae10e9ee58aae0bfe183537c96851611e2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cn
i/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pn7q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8q82q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:16Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:16 crc kubenswrapper[4672]: I0930 12:22:16.231586 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-89fj9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72003fad-a0fd-4493-9f3b-6efdab22d14c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31087335de8b3b895d49c4677350e9ab6cfa0651188653dab05e7eafdab3d05f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31087335de8b3b895d49c4677350e9ab6cfa0651188653dab05e7eafdab3d05f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a607f773b81f6bdde800c2a36627d7f84602e6c3c777f6531391668dd7c08e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2a607f773b81f6bdde800c2a36627d7f84602e6c3c777f6531391668dd7c08e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:22:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-
30T12:22:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-89fj9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:16Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:16 crc kubenswrapper[4672]: I0930 12:22:16.271353 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ztjlj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"25dc0f4c-76a8-49ff-bf68-c0718cfc3e62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5a3ba5101b35ce4981484637222a0c074b67e3998c273dc763ddbc6ca0c0c86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6ps67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ztjlj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:16Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:16 crc kubenswrapper[4672]: I0930 12:22:16.312368 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e650bb6-f3f0-48f4-b169-1e56a907e88c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://824f50fb36d4c4aee43cc92af10bc43c441f2cee0ee736431dfdc52d254b819a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2786191730ceba9bbcf35f5355098bc52b09c37d20b923f7c6c65fd1b584cea0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e5b3d468173222d5ae851529dbc318b7d727ec4af468e4094ab08b4f29bf100\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8d2b013e19f7ff8f3eb9c42338191b2da273ce85db8bcc6be8dd1f69e83c3be\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8d2b013e19f7ff8f3eb9c42338191b2da273ce85db8bcc6be8dd1f69e83c3be\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T12:22:10Z\\\",\\\"message\\\":\\\"falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0930 12:22:05.219065 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 12:22:05.219861 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-72394925/tls.crt::/tmp/serving-cert-72394925/tls.key\\\\\\\"\\\\nI0930 12:22:10.657058 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 12:22:10.659952 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 12:22:10.659972 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 12:22:10.659997 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 12:22:10.660003 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 12:22:10.667742 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0930 12:22:10.667775 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 12:22:10.667797 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 12:22:10.667801 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 12:22:10.667808 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 12:22:10.667812 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 12:22:10.667814 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 12:22:10.667817 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0930 12:22:10.670654 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T12:21:54Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c3f5020d6f8fba5f79d7659bedba02479794acb1f5f3d219802966c781fd10d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:52Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://089787496c46eb514e279677b1853010b21eac240fb059e43866714d2a6920e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://089787496c46eb514e279677b1853010b21eac240fb059e43866714d2a6920e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:21:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:21:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:21:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:16Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:16 crc kubenswrapper[4672]: I0930 12:22:16.356117 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:16Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:16 crc kubenswrapper[4672]: I0930 12:22:16.395555 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"95794952-d817-48f2-8956-f7a310f8d1d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a5a1de465a4e5deca994f640e87574ebd406be2d7a92c0e54ed8bece4a8e6d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnk64\\\",\\\"readOnly\\\":true,\\\"rec
ursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba615a53d8f50a3891080fd06384bb6e00eca07025bb60fdf1764d9c1d4a5f76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnk64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dpqrd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:16Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:16 crc kubenswrapper[4672]: I0930 12:22:16.416468 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 12:22:16 crc kubenswrapper[4672]: E0930 12:22:16.416655 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 12:22:16 crc kubenswrapper[4672]: I0930 12:22:16.434825 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:16Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:16 crc kubenswrapper[4672]: I0930 12:22:16.468607 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bh5lq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"96949d92-2365-41eb-8f62-c264c8328c02\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6caf59dcbea4e0beb0879de377fe92cfb0b3ff55668aee3f10822adacf97fd8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkb7v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bh5lq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:16Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:16 crc kubenswrapper[4672]: I0930 12:22:16.520495 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4339dd7c-5112-40c5-9b58-050ce364a2b9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7db3193c2afadb91d3220e67720ff33bc959f9c50cec6f243144a6e2a2e57e04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5f54820ce67d6e8f397d068e1fadcb4577d6685406812c4e29fe0274612697d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1351a4e8380fea9edb834239053b57384a50add562afe59e67bbd7159bf1bf6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b9741c5edaae364733b400bebf21a4674ec206
194908ec6a942317235c62a14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://775be30891ac2947ef160e3fe75abad42b78fc39e35414f27756ce086fed4f7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27513f0766801d5155c6e5c376a2a52241030eca2171b3218710ca6ee006a0b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27513f0766801d5155c6e5c376a2a52241030eca2171b3218710ca6ee006a0b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:21:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:21:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0afdc82c21601c37c7d9b4897273ade1e8e35bd093d63df9a63e93a9c0c6228c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0afdc82c21601c37c7d9b4897273ade1e8e35bd093d63df9a63e93a9c0c6228c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:21:51Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://52fa5ee34bc8e9e381d4288d30fce5db76ed5bde6f199ec13fe443481ed4c8e8\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52fa5ee34bc8e9e381d4288d30fce5db76ed5bde6f199ec13fe443481ed4c8e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:21:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:21:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:21:49Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:16Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:16 crc kubenswrapper[4672]: I0930 12:22:16.640701 4672 generic.go:334] "Generic (PLEG): container finished" podID="72003fad-a0fd-4493-9f3b-6efdab22d14c" containerID="70fc6887f3dde42017d03aa3f247ca4eda56f93c3ec54a95ceabcc1c1b4651d8" exitCode=0 Sep 30 12:22:16 crc kubenswrapper[4672]: I0930 12:22:16.640896 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-89fj9" event={"ID":"72003fad-a0fd-4493-9f3b-6efdab22d14c","Type":"ContainerDied","Data":"70fc6887f3dde42017d03aa3f247ca4eda56f93c3ec54a95ceabcc1c1b4651d8"} Sep 30 12:22:16 crc kubenswrapper[4672]: I0930 12:22:16.669517 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4339dd7c-5112-40c5-9b58-050ce364a2b9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7db3193c2afadb91d3220e67720ff33bc959f9c50cec6f243144a6e2a2e57e04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5f54820ce67d6e8f397d068e1fadcb4577d6685406812c4e29fe0274612697d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1351a4e8380fea9edb834239053b57384a50add562afe59e67bbd7159bf1bf6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b9741c5edaae364733b400bebf21a4674ec206
194908ec6a942317235c62a14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://775be30891ac2947ef160e3fe75abad42b78fc39e35414f27756ce086fed4f7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27513f0766801d5155c6e5c376a2a52241030eca2171b3218710ca6ee006a0b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27513f0766801d5155c6e5c376a2a52241030eca2171b3218710ca6ee006a0b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:21:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:21:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0afdc82c21601c37c7d9b4897273ade1e8e35bd093d63df9a63e93a9c0c6228c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0afdc82c21601c37c7d9b4897273ade1e8e35bd093d63df9a63e93a9c0c6228c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:21:51Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://52fa5ee34bc8e9e381d4288d30fce5db76ed5bde6f199ec13fe443481ed4c8e8\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52fa5ee34bc8e9e381d4288d30fce5db76ed5bde6f199ec13fe443481ed4c8e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:21:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:21:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:21:49Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:16Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:16 crc kubenswrapper[4672]: I0930 12:22:16.683766 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:16Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:16 crc kubenswrapper[4672]: I0930 12:22:16.697908 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bh5lq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96949d92-2365-41eb-8f62-c264c8328c02\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6caf59dcbea4e0beb0879de377fe92cfb0b3ff55668aee3f10822adacf97fd8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkb7v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bh5lq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-09-30T12:22:16Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:16 crc kubenswrapper[4672]: I0930 12:22:16.709827 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ece716e9efed328b509480f11b1a6aa253fe23776bf45f49d4110cf3b4acbab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:16Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:16 crc kubenswrapper[4672]: I0930 12:22:16.725439 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"763dda13-8721-4cce-8b21-aa1f1b190978\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab6412d7fdf1976c75ccfc72d05f00e2997d72f3971c73cd0887e038d1cbc059\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5575589d0517d5a1c285c3e9e566fb5cd9b9591a1b311a4bb80e595e44044c33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b656dc1ad8af099828eaaeb654e6d52dbbd057023dd07583a6114316670aa334\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5082c3fa78648584ab9a29b4ab239ce0d52411666ca3fa598e9c6745b8ea067e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:21:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:16Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:16 crc kubenswrapper[4672]: I0930 12:22:16.750047 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2761ae4b8b1c7e1d00ee269219ff0ab04aedce9e1bddc6e0a029c496ee89c0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:16Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:16 crc kubenswrapper[4672]: I0930 12:22:16.790192 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:16Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:16 crc kubenswrapper[4672]: I0930 12:22:16.829532 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://534671434fa48a07e7836049331af60307cf651f4ddde83856ad98e006b32040\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfb0bceb3e99196adb442abedd5215938781b908c2aa92178c9db76672664620\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:16Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:16 crc kubenswrapper[4672]: I0930 12:22:16.875432 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nznsk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5da59bc9-84da-42f6-86e9-3399ecf31725\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{
},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4aa532f2ebe19e36977596bd41e8b116c84bcd7d5cd3470986e3192d4d62dea7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36
cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4aa532f2ebe19e36977596bd41e8b116c84bcd7d5cd3470986e3192d4d62dea7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nznsk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:16Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:16 crc kubenswrapper[4672]: I0930 12:22:16.900158 4672 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 12:22:16 crc kubenswrapper[4672]: I0930 12:22:16.903058 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:16 crc kubenswrapper[4672]: I0930 12:22:16.903097 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:16 crc kubenswrapper[4672]: I0930 12:22:16.903106 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:16 crc kubenswrapper[4672]: I0930 12:22:16.903228 4672 kubelet_node_status.go:76] "Attempting to register node" node="crc" Sep 30 12:22:16 crc kubenswrapper[4672]: I0930 12:22:16.909252 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e650bb6-f3f0-48f4-b169-1e56a907e88c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://824f50fb36d4c4aee43cc92af10bc43c441f2cee0ee736431dfdc52d254b819a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2786191730ceba9bbcf35f5355098bc52b09c37d20b923f7c6c65fd1b584cea0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e5b3d468173222d5ae851529dbc318b7d727ec4af468e4094ab08b4f29bf100\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8d2b013e19f7ff8f3eb9c42338191b2da273ce85db8bcc6be8dd1f69e83c3be\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8d2b013e19f7ff8f3eb9c42338191b2da273ce85db8bcc6be8dd1f69e83c3be\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T12:22:10Z\\\",\\\"message\\\":\\\"falling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0930 12:22:05.219065 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 12:22:05.219861 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-72394925/tls.crt::/tmp/serving-cert-72394925/tls.key\\\\\\\"\\\\nI0930 12:22:10.657058 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 12:22:10.659952 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 12:22:10.659972 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 12:22:10.659997 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 12:22:10.660003 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 12:22:10.667742 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0930 12:22:10.667775 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 12:22:10.667797 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 12:22:10.667801 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 12:22:10.667808 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 12:22:10.667812 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 12:22:10.667814 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 12:22:10.667817 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0930 12:22:10.670654 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T12:21:54Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c3f5020d6f8fba5f79d7659bedba02479794acb1f5f3d219802966c781fd10d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:52Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://089787496c46eb514e279677b1853010b21eac240fb059e43866714d2a6920e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://089787496c46eb514e279677b1853010b21eac240fb059e43866714d2a6920e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:21:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:21:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:21:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:16Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:16 crc kubenswrapper[4672]: I0930 12:22:16.961891 4672 kubelet_node_status.go:115] "Node was previously registered" node="crc" Sep 30 12:22:16 crc kubenswrapper[4672]: I0930 12:22:16.962224 4672 kubelet_node_status.go:79] "Successfully registered node" node="crc" Sep 30 12:22:16 crc kubenswrapper[4672]: I0930 12:22:16.963630 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:16 crc kubenswrapper[4672]: I0930 12:22:16.963663 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:16 crc kubenswrapper[4672]: I0930 12:22:16.963674 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:16 crc kubenswrapper[4672]: I0930 12:22:16.963691 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:16 crc kubenswrapper[4672]: I0930 
12:22:16.963703 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:16Z","lastTransitionTime":"2025-09-30T12:22:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:16 crc kubenswrapper[4672]: E0930 12:22:16.985455 4672 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T12:22:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T12:22:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T12:22:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T12:22:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a240ac94-c7cd-47b9-85fc-82a9db2c4d67\\\",\\\"systemUUID\\\":\\\"9545f671-f742-45e6-b9f7-3b3404b22825\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:16Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:16 crc kubenswrapper[4672]: I0930 12:22:16.989000 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:16 crc kubenswrapper[4672]: I0930 12:22:16.989037 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Sep 30 12:22:16 crc kubenswrapper[4672]: I0930 12:22:16.989046 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:16 crc kubenswrapper[4672]: I0930 12:22:16.989061 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:16 crc kubenswrapper[4672]: I0930 12:22:16.989073 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:16Z","lastTransitionTime":"2025-09-30T12:22:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:16 crc kubenswrapper[4672]: I0930 12:22:16.990222 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:16Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:17 crc kubenswrapper[4672]: E0930 12:22:17.009214 4672 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T12:22:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T12:22:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T12:22:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T12:22:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a240ac94-c7cd-47b9-85fc-82a9db2c4d67\\\",\\\"systemUUID\\\":\\\"9545f671-f742-45e6-b9f7-3b3404b22825\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:17Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:17 crc kubenswrapper[4672]: I0930 12:22:17.014114 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:17 crc kubenswrapper[4672]: I0930 12:22:17.014165 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Sep 30 12:22:17 crc kubenswrapper[4672]: I0930 12:22:17.014182 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:17 crc kubenswrapper[4672]: I0930 12:22:17.014207 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:17 crc kubenswrapper[4672]: I0930 12:22:17.014226 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:17Z","lastTransitionTime":"2025-09-30T12:22:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:17 crc kubenswrapper[4672]: I0930 12:22:17.027171 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"95794952-d817-48f2-8956-f7a310f8d1d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a5a1de465a4e5deca994f640e87574ebd406be2d7a92c0e54ed8bece4a8e6d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnk64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba615a53d8f50a3891080fd06384bb6e00eca07025bb60fdf1764d9c1d4a5f76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running
\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnk64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dpqrd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:17Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:17 crc kubenswrapper[4672]: E0930 12:22:17.029091 4672 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T12:22:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T12:22:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T12:22:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T12:22:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a240ac94-c7cd-47b9-85fc-82a9db2c4d67\\\",\\\"systemUUID\\\":\\\"9545f671-f742-45e6-b9f7-3b3404b22825\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:17Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:17 crc kubenswrapper[4672]: I0930 12:22:17.033438 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:17 crc kubenswrapper[4672]: I0930 12:22:17.033747 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Sep 30 12:22:17 crc kubenswrapper[4672]: I0930 12:22:17.033895 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:17 crc kubenswrapper[4672]: I0930 12:22:17.034039 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:17 crc kubenswrapper[4672]: I0930 12:22:17.034180 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:17Z","lastTransitionTime":"2025-09-30T12:22:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:17 crc kubenswrapper[4672]: E0930 12:22:17.050368 4672 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T12:22:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T12:22:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T12:22:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T12:22:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a240ac94-c7cd-47b9-85fc-82a9db2c4d67\\\",\\\"systemUUID\\\":\\\"9545f671-f742-45e6-b9f7-3b3404b22825\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:17Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:17 crc kubenswrapper[4672]: I0930 12:22:17.054305 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:17 crc kubenswrapper[4672]: I0930 12:22:17.054350 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Sep 30 12:22:17 crc kubenswrapper[4672]: I0930 12:22:17.054365 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:17 crc kubenswrapper[4672]: I0930 12:22:17.054388 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:17 crc kubenswrapper[4672]: I0930 12:22:17.054403 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:17Z","lastTransitionTime":"2025-09-30T12:22:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:17 crc kubenswrapper[4672]: E0930 12:22:17.069215 4672 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T12:22:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T12:22:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T12:22:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T12:22:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a240ac94-c7cd-47b9-85fc-82a9db2c4d67\\\",\\\"systemUUID\\\":\\\"9545f671-f742-45e6-b9f7-3b3404b22825\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:17Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:17 crc kubenswrapper[4672]: E0930 12:22:17.069408 4672 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Sep 30 12:22:17 crc kubenswrapper[4672]: I0930 12:22:17.071337 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8q82q" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6806ff3c-ab3a-402e-b1c5-cc37c0810a65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://687d353cc41530a0cbee2c4a782b6ae10e9ee58aae0bfe183537c96851611e2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pn7q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8q82q\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:17Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:17 crc kubenswrapper[4672]: I0930 12:22:17.072040 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:17 crc kubenswrapper[4672]: I0930 12:22:17.072092 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:17 crc kubenswrapper[4672]: I0930 12:22:17.072110 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:17 crc kubenswrapper[4672]: I0930 12:22:17.072134 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:17 crc kubenswrapper[4672]: I0930 12:22:17.072152 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:17Z","lastTransitionTime":"2025-09-30T12:22:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:17 crc kubenswrapper[4672]: I0930 12:22:17.116477 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-89fj9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72003fad-a0fd-4493-9f3b-6efdab22d14c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31087335de8b3b895d49c4677350e9ab6cfa0651188653dab05e7eafdab3d05f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31087335de8b3b895d49c4677350e9ab6cfa0651188653dab05e7eafdab3d05f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a607f773b81f6bdde800c2a36627d7f84602e6c3c777f6531391668dd7c08e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2a607f773b81f6bdde800c2a36627d7f84602e6c3c777f6531391668dd7c08e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:22:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70fc6887f3dde42017d03aa3f247ca4eda56f93c3ec54a95ceabcc1c1b4651d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70fc6887f3dde42017d03aa3f247ca4eda56f93c3ec54a95ceabcc1c1b4651d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:22:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/
cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-89fj9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:17Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:17 crc kubenswrapper[4672]: I0930 12:22:17.148915 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ztjlj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"25dc0f4c-76a8-49ff-bf68-c0718cfc3e62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5a3ba5101b35ce4981484637222a0c074b67e3998c273dc763ddbc6ca0c0c86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6ps67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ztjlj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:17Z is after 
2025-08-24T17:21:41Z" Sep 30 12:22:17 crc kubenswrapper[4672]: I0930 12:22:17.175802 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:17 crc kubenswrapper[4672]: I0930 12:22:17.175848 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:17 crc kubenswrapper[4672]: I0930 12:22:17.175867 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:17 crc kubenswrapper[4672]: I0930 12:22:17.175891 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:17 crc kubenswrapper[4672]: I0930 12:22:17.175908 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:17Z","lastTransitionTime":"2025-09-30T12:22:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:17 crc kubenswrapper[4672]: I0930 12:22:17.278886 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:17 crc kubenswrapper[4672]: I0930 12:22:17.278926 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:17 crc kubenswrapper[4672]: I0930 12:22:17.278942 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:17 crc kubenswrapper[4672]: I0930 12:22:17.278965 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:17 crc kubenswrapper[4672]: I0930 12:22:17.278983 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:17Z","lastTransitionTime":"2025-09-30T12:22:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:17 crc kubenswrapper[4672]: I0930 12:22:17.381194 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:17 crc kubenswrapper[4672]: I0930 12:22:17.381235 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:17 crc kubenswrapper[4672]: I0930 12:22:17.381248 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:17 crc kubenswrapper[4672]: I0930 12:22:17.381281 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:17 crc kubenswrapper[4672]: I0930 12:22:17.381296 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:17Z","lastTransitionTime":"2025-09-30T12:22:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:22:17 crc kubenswrapper[4672]: I0930 12:22:17.416303 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 12:22:17 crc kubenswrapper[4672]: I0930 12:22:17.416349 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 12:22:17 crc kubenswrapper[4672]: E0930 12:22:17.416442 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 12:22:17 crc kubenswrapper[4672]: E0930 12:22:17.416605 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 12:22:17 crc kubenswrapper[4672]: I0930 12:22:17.484153 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:17 crc kubenswrapper[4672]: I0930 12:22:17.484198 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:17 crc kubenswrapper[4672]: I0930 12:22:17.484211 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:17 crc kubenswrapper[4672]: I0930 12:22:17.484227 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:17 crc kubenswrapper[4672]: I0930 12:22:17.484237 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:17Z","lastTransitionTime":"2025-09-30T12:22:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:22:17 crc kubenswrapper[4672]: I0930 12:22:17.586994 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:17 crc kubenswrapper[4672]: I0930 12:22:17.587040 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:17 crc kubenswrapper[4672]: I0930 12:22:17.587053 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:17 crc kubenswrapper[4672]: I0930 12:22:17.587068 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:17 crc kubenswrapper[4672]: I0930 12:22:17.587078 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:17Z","lastTransitionTime":"2025-09-30T12:22:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:17 crc kubenswrapper[4672]: I0930 12:22:17.648383 4672 generic.go:334] "Generic (PLEG): container finished" podID="72003fad-a0fd-4493-9f3b-6efdab22d14c" containerID="95e6805719bbdb9859b827860fb87ed979a4397061f76b433189f463c1e7cc39" exitCode=0 Sep 30 12:22:17 crc kubenswrapper[4672]: I0930 12:22:17.648477 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-89fj9" event={"ID":"72003fad-a0fd-4493-9f3b-6efdab22d14c","Type":"ContainerDied","Data":"95e6805719bbdb9859b827860fb87ed979a4397061f76b433189f463c1e7cc39"} Sep 30 12:22:17 crc kubenswrapper[4672]: I0930 12:22:17.654790 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nznsk" event={"ID":"5da59bc9-84da-42f6-86e9-3399ecf31725","Type":"ContainerStarted","Data":"7058516ce3ff1d1d2929107a66d739fccb24b4f5589a11510ab42fe728a33196"} Sep 30 12:22:17 crc kubenswrapper[4672]: I0930 12:22:17.664047 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ece716e9efed328b509480f11b1a6aa253fe23776bf45f49d4110cf3b4acbab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:17Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:17 crc kubenswrapper[4672]: I0930 12:22:17.690175 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:17 crc kubenswrapper[4672]: I0930 12:22:17.690254 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:17 crc kubenswrapper[4672]: I0930 12:22:17.690409 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:17 crc kubenswrapper[4672]: I0930 12:22:17.690440 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:17 crc kubenswrapper[4672]: I0930 12:22:17.690458 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:17Z","lastTransitionTime":"2025-09-30T12:22:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:22:17 crc kubenswrapper[4672]: I0930 12:22:17.695152 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nznsk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5da59bc9-84da-42f6-86e9-3399ecf31725\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fa
lse,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\
\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4aa532f2ebe19e36977596bd41e8b116c84bcd7d5cd3470986
e3192d4d62dea7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4aa532f2ebe19e36977596bd41e8b116c84bcd7d5cd3470986e3192d4d62dea7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nznsk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:17Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:17 crc kubenswrapper[4672]: I0930 12:22:17.714311 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"763dda13-8721-4cce-8b21-aa1f1b190978\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab6412d7fdf1976c75ccfc72d05f00e2997d72f3971c73cd0887e038d1cbc059\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5575589d0517d5a1c285c3e9e566fb5cd9b9591a1b311a4bb80e595e44044c33\\\",\\\"image\\\":\\\"quay.io/
openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b656dc1ad8af099828eaaeb654e6d52dbbd057023dd07583a6114316670aa334\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5082c3fa78648584ab9a29b4ab239ce0d52411666ca3fa598e9c6745b8ea067e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:21:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:17Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:17 crc kubenswrapper[4672]: I0930 12:22:17.732348 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2761ae4b8b1c7e1d00ee269219ff0ab04aedce9e1bddc6e0a029c496ee89c0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:17Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:17 crc kubenswrapper[4672]: I0930 12:22:17.750496 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:17Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:17 crc kubenswrapper[4672]: I0930 12:22:17.769865 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://534671434fa48a07e7836049331af60307cf651f4ddde83856ad98e006b32040\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfb0bceb3e99196adb442abedd5215938781b908c2aa92178c9db76672664620\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:11Z\\\"}},\\\"volumeMounts\\\":[{\
\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:17Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:17 crc kubenswrapper[4672]: I0930 12:22:17.793785 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:17 crc kubenswrapper[4672]: I0930 12:22:17.793829 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:17 crc kubenswrapper[4672]: I0930 12:22:17.793838 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:17 crc kubenswrapper[4672]: I0930 12:22:17.793852 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:17 crc kubenswrapper[4672]: I0930 12:22:17.793861 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:17Z","lastTransitionTime":"2025-09-30T12:22:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:22:17 crc kubenswrapper[4672]: I0930 12:22:17.801442 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-89fj9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72003fad-a0fd-4493-9f3b-6efdab22d14c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31087335de8b3b895d49c4677350e9ab6cfa0651188653dab05e7eafdab3d05f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31087335de8b3b895d49c4677350e9ab6cfa0651188653dab05e7eafdab3d05f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a607f773b81f6bdde800c2a36627d7f84602e6c3c777f6531391668dd7c08e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2a607f773b81f6bdde800c2a36627d7f84602e6c3c777f6531391668dd7c08e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:22:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70fc6887f3dde42017d03aa3f247ca4eda56f93c3ec54a95ceabcc1c1b4651d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70fc6887f3dde42017d03aa3f247ca4eda56f93c3ec54a95ceabcc1c1b4651d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:22:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://95e6805719bbdb9859b827860fb87ed979a4397061f76b433189f463c1e7cc39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://95e6805719bbdb9859b827860fb87ed979a4397061f76b433189f463c1e7cc39\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:
22:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-89fj9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:17Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:17 crc kubenswrapper[4672]: I0930 12:22:17.818849 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ztjlj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"25dc0f4c-76a8-49ff-bf68-c0718cfc3e62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5a3ba5101b35ce4981484637222a0c074b67e3998c273dc763ddbc6ca0c0c86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6ps67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ztjlj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:17Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:17 crc kubenswrapper[4672]: I0930 12:22:17.840875 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e650bb6-f3f0-48f4-b169-1e56a907e88c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"message\\\":\\\"containers with 
unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://824f50fb36d4c4aee43cc92af10bc43c441f2cee0ee736431dfdc52d254b819a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2786191730ceba9bbcf35f5355098bc52b09c37d20b923f7c6c65fd1b584cea0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e5b3d468173222d5ae851529dbc318b7d727ec4af468e4094ab08b4f29bf100\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8d2b013e19f7ff8f3eb9c42338191b2da273ce85db8bcc6be8dd1f69e83c3be\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8d2b013e19f7ff8f3eb9c42338191b2da273ce85db8bcc6be8dd1f69e83c3be\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T12:22:10Z\\\",\\\"message\\\":\\\"falling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0930 12:22:05.219065 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 12:22:05.219861 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-72394925/tls.crt::/tmp/serving-cert-72394925/tls.key\\\\\\\"\\\\nI0930 12:22:10.657058 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 12:22:10.659952 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 12:22:10.659972 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 12:22:10.659997 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 12:22:10.660003 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 12:22:10.667742 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0930 12:22:10.667775 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 12:22:10.667797 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 12:22:10.667801 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 12:22:10.667808 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 12:22:10.667812 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 12:22:10.667814 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 12:22:10.667817 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0930 12:22:10.670654 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T12:21:54Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c3f5020d6f8fba5f79d7659bedba02479794acb1f5f3d219802966c781fd10d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:52Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://089787496c46eb514e279677b1853010b21eac240fb059e43866714d2a6920e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://089787496c46eb514e279677b1853010b21eac240fb059e43866714d2a6920e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:21:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:21:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:21:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:17Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:17 crc kubenswrapper[4672]: I0930 12:22:17.861130 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:17Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:17 crc kubenswrapper[4672]: I0930 12:22:17.874975 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"95794952-d817-48f2-8956-f7a310f8d1d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a5a1de465a4e5deca994f640e87574ebd406be2d7a92c0e54ed8bece4a8e6d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnk64\\\",\\\"readOnly\\\":true,\\\"rec
ursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba615a53d8f50a3891080fd06384bb6e00eca07025bb60fdf1764d9c1d4a5f76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnk64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dpqrd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:17Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:17 crc kubenswrapper[4672]: I0930 12:22:17.892704 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8q82q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6806ff3c-ab3a-402e-b1c5-cc37c0810a65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://687d353cc41530a0cbee2c4a782b6ae10e9ee58aae0bfe183537c96851611e2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-
dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pn7q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8q82q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:17Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:17 crc kubenswrapper[4672]: I0930 12:22:17.896136 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:17 crc kubenswrapper[4672]: I0930 12:22:17.896169 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:17 crc kubenswrapper[4672]: I0930 12:22:17.896181 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:17 crc kubenswrapper[4672]: I0930 12:22:17.896198 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:17 crc kubenswrapper[4672]: I0930 12:22:17.896210 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:17Z","lastTransitionTime":"2025-09-30T12:22:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:22:17 crc kubenswrapper[4672]: I0930 12:22:17.904402 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bh5lq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96949d92-2365-41eb-8f62-c264c8328c02\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6caf59dcbea4e0beb0879de377fe92cfb0b3ff55668aee3f10822adacf97fd8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkb7v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bh5lq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:17Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:17 crc kubenswrapper[4672]: I0930 12:22:17.923023 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4339dd7c-5112-40c5-9b58-050ce364a2b9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7db3193c2afadb91d3220e67720ff33bc959f9c50cec6f243144a6e2a2e57e04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5f54820ce67d6e8f397d068e1fadcb4577d6685406812c4e29fe0274612697d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1351a4e8380fea9edb834239053b57384a50add562afe59e67bbd7159bf1bf6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b9741c5edaae364733b400bebf21a4674ec206
194908ec6a942317235c62a14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://775be30891ac2947ef160e3fe75abad42b78fc39e35414f27756ce086fed4f7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27513f0766801d5155c6e5c376a2a52241030eca2171b3218710ca6ee006a0b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27513f0766801d5155c6e5c376a2a52241030eca2171b3218710ca6ee006a0b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:21:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:21:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0afdc82c21601c37c7d9b4897273ade1e8e35bd093d63df9a63e93a9c0c6228c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0afdc82c21601c37c7d9b4897273ade1e8e35bd093d63df9a63e93a9c0c6228c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:21:51Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://52fa5ee34bc8e9e381d4288d30fce5db76ed5bde6f199ec13fe443481ed4c8e8\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52fa5ee34bc8e9e381d4288d30fce5db76ed5bde6f199ec13fe443481ed4c8e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:21:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:21:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:21:49Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:17Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:17 crc kubenswrapper[4672]: I0930 12:22:17.934852 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:17Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:17 crc kubenswrapper[4672]: I0930 12:22:17.998388 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:17 crc kubenswrapper[4672]: I0930 12:22:17.998430 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:17 crc kubenswrapper[4672]: I0930 12:22:17.998441 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:17 crc kubenswrapper[4672]: I0930 12:22:17.998460 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:17 crc kubenswrapper[4672]: I0930 12:22:17.998504 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:17Z","lastTransitionTime":"2025-09-30T12:22:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:18 crc kubenswrapper[4672]: I0930 12:22:18.100827 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:18 crc kubenswrapper[4672]: I0930 12:22:18.100870 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:18 crc kubenswrapper[4672]: I0930 12:22:18.100883 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:18 crc kubenswrapper[4672]: I0930 12:22:18.100901 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:18 crc kubenswrapper[4672]: I0930 12:22:18.100913 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:18Z","lastTransitionTime":"2025-09-30T12:22:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:22:18 crc kubenswrapper[4672]: I0930 12:22:18.203997 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:18 crc kubenswrapper[4672]: I0930 12:22:18.204046 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:18 crc kubenswrapper[4672]: I0930 12:22:18.204064 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:18 crc kubenswrapper[4672]: I0930 12:22:18.204085 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:18 crc kubenswrapper[4672]: I0930 12:22:18.204103 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:18Z","lastTransitionTime":"2025-09-30T12:22:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:18 crc kubenswrapper[4672]: I0930 12:22:18.312422 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:18 crc kubenswrapper[4672]: I0930 12:22:18.312464 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:18 crc kubenswrapper[4672]: I0930 12:22:18.312474 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:18 crc kubenswrapper[4672]: I0930 12:22:18.312490 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:18 crc kubenswrapper[4672]: I0930 12:22:18.312500 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:18Z","lastTransitionTime":"2025-09-30T12:22:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:18 crc kubenswrapper[4672]: I0930 12:22:18.415228 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:18 crc kubenswrapper[4672]: I0930 12:22:18.415330 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:18 crc kubenswrapper[4672]: I0930 12:22:18.415350 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:18 crc kubenswrapper[4672]: I0930 12:22:18.415374 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:18 crc kubenswrapper[4672]: I0930 12:22:18.415393 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:18Z","lastTransitionTime":"2025-09-30T12:22:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:22:18 crc kubenswrapper[4672]: I0930 12:22:18.416145 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 12:22:18 crc kubenswrapper[4672]: E0930 12:22:18.416393 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 12:22:18 crc kubenswrapper[4672]: I0930 12:22:18.519456 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:18 crc kubenswrapper[4672]: I0930 12:22:18.519533 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:18 crc kubenswrapper[4672]: I0930 12:22:18.519553 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:18 crc kubenswrapper[4672]: I0930 12:22:18.519578 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:18 crc kubenswrapper[4672]: I0930 12:22:18.519597 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:18Z","lastTransitionTime":"2025-09-30T12:22:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:18 crc kubenswrapper[4672]: I0930 12:22:18.622479 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:18 crc kubenswrapper[4672]: I0930 12:22:18.622533 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:18 crc kubenswrapper[4672]: I0930 12:22:18.622545 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:18 crc kubenswrapper[4672]: I0930 12:22:18.622563 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:18 crc kubenswrapper[4672]: I0930 12:22:18.622577 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:18Z","lastTransitionTime":"2025-09-30T12:22:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:22:18 crc kubenswrapper[4672]: I0930 12:22:18.662286 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-89fj9" event={"ID":"72003fad-a0fd-4493-9f3b-6efdab22d14c","Type":"ContainerStarted","Data":"85068f88c57bccc42cc7234d13298a30c574379f4a212059a9924553fce8620e"} Sep 30 12:22:18 crc kubenswrapper[4672]: I0930 12:22:18.687312 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4339dd7c-5112-40c5-9b58-050ce364a2b9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7db3193c2afadb91d3220e67720ff33bc959f9c50cec6f243144a6e2a2e57e04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5f54820ce67d6e8f397d068e1fadcb4577d6685406812c4e29fe0274612697d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1351a4e8380fea9edb834239053b57384a50add562afe59e67bbd7159bf1bf6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/o
cp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b9741c5edaae364733b400bebf21a4674ec206194908ec6a942317235c62a14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://775be30891ac2947ef160e3fe75abad42b78fc39e35414f27756ce086fed4f7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27513f0766801d5155c6e5c376a2a52241030eca2171b3218710ca6ee006a0b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27513f0766801d5155c6e5c376a2a52241030eca2171b3218710ca6ee006a0b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:21:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:21:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0afdc82c21601c37c7d9b4897273ade1e8e35bd093d63df9a63e93a9c0c6228c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b
90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0afdc82c21601c37c7d9b4897273ade1e8e35bd093d63df9a63e93a9c0c6228c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:21:51Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://52fa5ee34bc8e9e381d4288d30fce5db76ed5bde6f199ec13fe443481ed4c8e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52fa5ee34bc8e9e381d4288d30fce5db76ed5bde6f199ec13fe443481ed4c8e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:21:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:21:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:21:49Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:18Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:18 crc kubenswrapper[4672]: I0930 12:22:18.701362 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:18Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:18 crc kubenswrapper[4672]: I0930 12:22:18.712855 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bh5lq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96949d92-2365-41eb-8f62-c264c8328c02\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6caf59dcbea4e0beb0879de377fe92cfb0b3ff55668aee3f10822adacf97fd8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkb7v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bh5lq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-09-30T12:22:18Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:18 crc kubenswrapper[4672]: I0930 12:22:18.724747 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ece716e9efed328b509480f11b1a6aa253fe23776bf45f49d4110cf3b4acbab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:18Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:18 crc kubenswrapper[4672]: I0930 12:22:18.725290 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:18 crc kubenswrapper[4672]: I0930 12:22:18.725335 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:18 crc kubenswrapper[4672]: I0930 12:22:18.725350 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:18 crc kubenswrapper[4672]: I0930 12:22:18.725367 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:18 crc kubenswrapper[4672]: I0930 12:22:18.725377 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:18Z","lastTransitionTime":"2025-09-30T12:22:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:22:18 crc kubenswrapper[4672]: I0930 12:22:18.741337 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"763dda13-8721-4cce-8b21-aa1f1b190978\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab6412d7fdf1976c75ccfc72d05f00e2997d72f3971c73cd0887e038d1cbc059\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5575589d0517d5a1c285c3e9e566fb5cd9b9591a1b311a4bb80e595e44044c33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b656dc1ad8af099828eaaeb654e6d52dbbd057023dd07583a6114316670aa334\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5082c3fa78648584ab9a29b4ab239ce0d52411666ca3fa598e9c6745b8ea067e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:21:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:18Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:18 crc kubenswrapper[4672]: I0930 12:22:18.754192 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2761ae4b8b1c7e1d00ee269219ff0ab04aedce9e1bddc6e0a029c496ee89c0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for 
pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:18Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:18 crc kubenswrapper[4672]: I0930 12:22:18.771853 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:18Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:18 crc kubenswrapper[4672]: I0930 12:22:18.786598 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://534671434fa48a07e7836049331af60307cf651f4ddde83856ad98e006b32040\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfb0bceb3e99196adb442abedd5215938781b908c2aa92178c9db76672664620\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/
webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:18Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:18 crc kubenswrapper[4672]: I0930 12:22:18.807849 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nznsk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5da59bc9-84da-42f6-86e9-3399ecf31725\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4aa532f2ebe19e36977596bd41e8b116c84bcd7d5cd3470986e3192d4d62dea7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4aa532f2ebe19e36977596bd41e8b116c84bcd7d5cd3470986e3192d4d62dea7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nznsk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:18Z 
is after 2025-08-24T17:21:41Z" Sep 30 12:22:18 crc kubenswrapper[4672]: I0930 12:22:18.825121 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e650bb6-f3f0-48f4-b169-1e56a907e88c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://824f50fb36d4c4aee43cc92af10bc43c441f2cee0ee736431dfdc52d254b819a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2786191730ceba9bbcf35f5355098bc52b09c37d20b923f7c6c65fd1b584cea0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e5b3d468173222d5ae851529dbc318b7d727ec4af468e4094ab08b4f29bf100\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\
\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8d2b013e19f7ff8f3eb9c42338191b2da273ce85db8bcc6be8dd1f69e83c3be\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8d2b013e19f7ff8f3eb9c42338191b2da273ce85db8bcc6be8dd1f69e83c3be\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T12:22:10Z\\\",\\\"message\\\":\\\"falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0930 12:22:05.219065 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 12:22:05.219861 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-72394925/tls.crt::/tmp/serving-cert-72394925/tls.key\\\\\\\"\\\\nI0930 12:22:10.657058 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 12:22:10.659952 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 12:22:10.659972 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 12:22:10.659997 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 12:22:10.660003 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 12:22:10.667742 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0930 12:22:10.667775 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 12:22:10.667797 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 12:22:10.667801 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 12:22:10.667808 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 12:22:10.667812 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 12:22:10.667814 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 12:22:10.667817 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0930 12:22:10.670654 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T12:21:54Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c3f5020d6f8fba5f79d7659bedba02479794acb1f5f3d219802966c781fd10d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:52Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://089787496c46eb514e279677b1853010b21eac240fb059e43866714d2a6920e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://089787496c46eb514e279677b1853010b21eac240fb059e43866714d2a6920e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:21:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:21:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:21:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:18Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:18 crc kubenswrapper[4672]: I0930 12:22:18.827846 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:18 crc kubenswrapper[4672]: I0930 12:22:18.827871 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:18 crc kubenswrapper[4672]: I0930 12:22:18.827881 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:18 crc kubenswrapper[4672]: I0930 12:22:18.827894 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:18 crc kubenswrapper[4672]: I0930 12:22:18.827904 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:18Z","lastTransitionTime":"2025-09-30T12:22:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:18 crc kubenswrapper[4672]: I0930 12:22:18.840849 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:18Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:18 crc kubenswrapper[4672]: I0930 12:22:18.855182 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"95794952-d817-48f2-8956-f7a310f8d1d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a5a1de465a4e5deca994f640e87574ebd406be2d7a92c0e54ed8bece4a8e6d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnk64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba615a53d8f50a3891080fd06384bb6e00eca07025bb60fdf1764d9c1d4a5f76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnk64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dpqrd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:18Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:18 crc kubenswrapper[4672]: I0930 12:22:18.870011 4672 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-8q82q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6806ff3c-ab3a-402e-b1c5-cc37c0810a65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://687d353cc41530a0cbee2c4a782b6ae10e9ee58aae0bfe183537c96851611e2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pn7q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:11Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-8q82q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:18Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:18 crc kubenswrapper[4672]: I0930 12:22:18.883400 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-89fj9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72003fad-a0fd-4493-9f3b-6efdab22d14c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31087335de8b3b895d49c4677350e9ab6cfa0651188653dab05e7eafdab3d05f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31087335de8b3b895d49c4677350e9ab6cfa0651188653dab05e7eafdab3d05f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-bina
ry-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a607f773b81f6bdde800c2a36627d7f84602e6c3c777f6531391668dd7c08e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2a607f773b81f6bdde800c2a36627d7f84602e6c3c777f6531391668dd7c08e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:22:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70fc6887f3dde42017d03aa3f247ca4eda56f93c3ec54a95ceabcc1c1b4651d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70fc6887f3dde42017d03aa3f247ca4eda56f93c3ec54a95ceabcc1c1b4651d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:22:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://95e6805719bbdb9859b827860fb87ed979a4397061f76b433189f463c1e7cc39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\
",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://95e6805719bbdb9859b827860fb87ed979a4397061f76b433189f463c1e7cc39\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:22:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85068f88c57bccc42cc7234d13298a30c574379f4a212059a9924553fce8620e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-89fj9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:18Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:18 crc kubenswrapper[4672]: I0930 12:22:18.893483 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ztjlj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"25dc0f4c-76a8-49ff-bf68-c0718cfc3e62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5a3ba5101b35ce4981484637222a0c074b67e3998c273dc763ddbc6ca0c0c86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6ps67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ztjlj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:18Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:18 crc kubenswrapper[4672]: I0930 12:22:18.929901 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:18 crc kubenswrapper[4672]: I0930 12:22:18.929939 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:18 crc kubenswrapper[4672]: I0930 12:22:18.929950 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:18 crc kubenswrapper[4672]: I0930 12:22:18.929968 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:18 crc kubenswrapper[4672]: I0930 12:22:18.929980 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:18Z","lastTransitionTime":"2025-09-30T12:22:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:19 crc kubenswrapper[4672]: I0930 12:22:19.033986 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:19 crc kubenswrapper[4672]: I0930 12:22:19.034569 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:19 crc kubenswrapper[4672]: I0930 12:22:19.034589 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:19 crc kubenswrapper[4672]: I0930 12:22:19.034618 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:19 crc kubenswrapper[4672]: I0930 12:22:19.034642 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:19Z","lastTransitionTime":"2025-09-30T12:22:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:19 crc kubenswrapper[4672]: I0930 12:22:19.107979 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 12:22:19 crc kubenswrapper[4672]: I0930 12:22:19.108170 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 12:22:19 crc kubenswrapper[4672]: E0930 12:22:19.108247 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 12:22:27.108207522 +0000 UTC m=+38.377445178 (durationBeforeRetry 8s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 12:22:19 crc kubenswrapper[4672]: I0930 12:22:19.108328 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 12:22:19 crc kubenswrapper[4672]: E0930 12:22:19.108386 4672 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Sep 30 12:22:19 crc kubenswrapper[4672]: E0930 12:22:19.108488 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-30 12:22:27.108461108 +0000 UTC m=+38.377698794 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Sep 30 12:22:19 crc kubenswrapper[4672]: I0930 12:22:19.108387 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 12:22:19 crc kubenswrapper[4672]: E0930 12:22:19.108524 4672 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 30 12:22:19 crc kubenswrapper[4672]: E0930 12:22:19.108545 4672 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 30 12:22:19 crc kubenswrapper[4672]: E0930 12:22:19.108560 4672 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 12:22:19 crc kubenswrapper[4672]: E0930 12:22:19.108608 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-09-30 12:22:27.108598071 +0000 UTC m=+38.377835727 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 12:22:19 crc kubenswrapper[4672]: I0930 12:22:19.108633 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 12:22:19 crc kubenswrapper[4672]: E0930 12:22:19.108759 4672 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 30 12:22:19 crc kubenswrapper[4672]: E0930 12:22:19.108804 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-30 12:22:27.108795576 +0000 UTC m=+38.378033242 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 30 12:22:19 crc kubenswrapper[4672]: E0930 12:22:19.108951 4672 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 30 12:22:19 crc kubenswrapper[4672]: E0930 12:22:19.109013 4672 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 30 12:22:19 crc kubenswrapper[4672]: E0930 12:22:19.109041 4672 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 12:22:19 crc kubenswrapper[4672]: E0930 12:22:19.109158 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-09-30 12:22:27.109122074 +0000 UTC m=+38.378359920 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 12:22:19 crc kubenswrapper[4672]: I0930 12:22:19.137042 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:19 crc kubenswrapper[4672]: I0930 12:22:19.137097 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:19 crc kubenswrapper[4672]: I0930 12:22:19.137111 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:19 crc kubenswrapper[4672]: I0930 12:22:19.137133 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:19 crc kubenswrapper[4672]: I0930 12:22:19.137206 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:19Z","lastTransitionTime":"2025-09-30T12:22:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:19 crc kubenswrapper[4672]: I0930 12:22:19.239561 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:19 crc kubenswrapper[4672]: I0930 12:22:19.239630 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:19 crc kubenswrapper[4672]: I0930 12:22:19.239644 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:19 crc kubenswrapper[4672]: I0930 12:22:19.239666 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:19 crc kubenswrapper[4672]: I0930 12:22:19.239681 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:19Z","lastTransitionTime":"2025-09-30T12:22:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:22:19 crc kubenswrapper[4672]: I0930 12:22:19.343193 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:19 crc kubenswrapper[4672]: I0930 12:22:19.343246 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:19 crc kubenswrapper[4672]: I0930 12:22:19.343256 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:19 crc kubenswrapper[4672]: I0930 12:22:19.343297 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:19 crc kubenswrapper[4672]: I0930 12:22:19.343314 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:19Z","lastTransitionTime":"2025-09-30T12:22:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:19 crc kubenswrapper[4672]: I0930 12:22:19.416040 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 12:22:19 crc kubenswrapper[4672]: I0930 12:22:19.416124 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 12:22:19 crc kubenswrapper[4672]: E0930 12:22:19.416208 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 12:22:19 crc kubenswrapper[4672]: E0930 12:22:19.416292 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 12:22:19 crc kubenswrapper[4672]: I0930 12:22:19.434197 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:19Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:19 crc kubenswrapper[4672]: I0930 12:22:19.446256 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:19 crc kubenswrapper[4672]: I0930 12:22:19.446327 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:19 crc kubenswrapper[4672]: I0930 12:22:19.446341 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:19 crc kubenswrapper[4672]: I0930 12:22:19.446399 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:19 crc kubenswrapper[4672]: I0930 12:22:19.446415 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:19Z","lastTransitionTime":"2025-09-30T12:22:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:19 crc kubenswrapper[4672]: I0930 12:22:19.452443 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bh5lq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96949d92-2365-41eb-8f62-c264c8328c02\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6caf59dcbea4e0beb0879de377fe92cfb0b3ff55668aee3f10822adacf97fd8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkb7v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bh5lq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:19Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:19 crc kubenswrapper[4672]: I0930 12:22:19.481294 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4339dd7c-5112-40c5-9b58-050ce364a2b9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7db3193c2afadb91d3220e67720ff33bc959f9c50cec6f243144a6e2a2e57e04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5f54820ce67d6e8f397d068e1fadcb4577d6685406812c4e29fe0274612697d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1351a4e8380fea9edb834239053b57384a50add562afe59e67bbd7159bf1bf6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b9741c5edaae364733b400bebf21a4674ec206
194908ec6a942317235c62a14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://775be30891ac2947ef160e3fe75abad42b78fc39e35414f27756ce086fed4f7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27513f0766801d5155c6e5c376a2a52241030eca2171b3218710ca6ee006a0b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27513f0766801d5155c6e5c376a2a52241030eca2171b3218710ca6ee006a0b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:21:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:21:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0afdc82c21601c37c7d9b4897273ade1e8e35bd093d63df9a63e93a9c0c6228c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0afdc82c21601c37c7d9b4897273ade1e8e35bd093d63df9a63e93a9c0c6228c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:21:51Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://52fa5ee34bc8e9e381d4288d30fce5db76ed5bde6f199ec13fe443481ed4c8e8\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52fa5ee34bc8e9e381d4288d30fce5db76ed5bde6f199ec13fe443481ed4c8e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:21:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:21:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:21:49Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:19Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:19 crc kubenswrapper[4672]: I0930 12:22:19.499315 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ece716e9efed328b509480f11b1a6aa253fe23776bf45f49d4110cf3b4acbab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:19Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:19 crc kubenswrapper[4672]: I0930 12:22:19.519868 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://534671434fa48a07e7836049331af60307cf651f4ddde83856ad98e006b32040\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfb0bceb3e99196adb442abedd5215938781b908c2aa92178c9db76672664620\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:19Z is after 
2025-08-24T17:21:41Z" Sep 30 12:22:19 crc kubenswrapper[4672]: I0930 12:22:19.545839 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nznsk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5da59bc9-84da-42f6-86e9-3399ecf31725\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\
\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"rec
ursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4aa532f2ebe19e36977596bd41e8b116c84bcd7d5cd3470986e3192d4d62dea7\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4aa532f2ebe19e36977596bd41e8b116c84bcd7d5cd3470986e3192d4d62dea7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nznsk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:19Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:19 crc kubenswrapper[4672]: I0930 12:22:19.548540 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:19 crc kubenswrapper[4672]: I0930 12:22:19.548572 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:19 crc kubenswrapper[4672]: I0930 12:22:19.548586 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:19 crc kubenswrapper[4672]: I0930 12:22:19.548601 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:19 crc kubenswrapper[4672]: I0930 12:22:19.548613 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:19Z","lastTransitionTime":"2025-09-30T12:22:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:22:19 crc kubenswrapper[4672]: I0930 12:22:19.561859 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"763dda13-8721-4cce-8b21-aa1f1b190978\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab6412d7fdf1976c75ccfc72d05f00e2997d72f3971c73cd0887e038d1cbc059\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5575589d0517d5a1c285c3e9e566fb5cd9b9591a1b311a4bb80e595e44044c33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b656dc1ad8af099828eaaeb654e6d52dbbd057023dd07583a6114316670aa334\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5082c3fa78648584ab9a29b4ab239ce0d52411666ca3fa598e9c6745b8ea067e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:21:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:19Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:19 crc kubenswrapper[4672]: I0930 12:22:19.580098 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2761ae4b8b1c7e1d00ee269219ff0ab04aedce9e1bddc6e0a029c496ee89c0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for 
pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:19Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:19 crc kubenswrapper[4672]: I0930 12:22:19.596898 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:19Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:19 crc kubenswrapper[4672]: I0930 12:22:19.609539 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8q82q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6806ff3c-ab3a-402e-b1c5-cc37c0810a65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://687d353cc41530a0cbee2c4a782b6ae10e9ee58aae0bfe183537c96851611e2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\
"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pn7q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8q82q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:19Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:19 crc kubenswrapper[4672]: I0930 12:22:19.630503 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-89fj9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72003fad-a0fd-4493-9f3b-6efdab22d14c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31087335de8b3b895d49c4677350e9ab6cfa0651188653dab05e7eafdab3d05f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31087335de8b3b895d49c4677350e9ab6cfa0651188653dab05e7eafdab3d05f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a607f773b81f6bdde800c2a36627d7f84602e6c3c777f6531391668dd7c08e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2a607f773b81f6bdde800c2a36627d7f84602e6c3c777f6531391668dd7c08e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:22:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70fc6887f3dde42017d03aa3f247ca4eda56f93c3ec54a95ceabcc1c1b4651d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70fc6887f3dde42017d03aa3f247ca4eda56f93c3ec54a95ceabcc1c1b4651d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:22:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://95e6805719bbdb9859b827860fb87ed979a4397061f76b433189f463c1e7cc39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://95e6805719bbdb9859b827860fb87ed979a4397061f76b433189f463c1e7cc39\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:22:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85068f88c57bccc42cc7234d13298a30c574379f4a212059a9924553fce8620e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly
\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-89fj9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:19Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:19 crc kubenswrapper[4672]: I0930 12:22:19.647089 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ztjlj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"25dc0f4c-76a8-49ff-bf68-c0718cfc3e62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5a3ba5101b35ce4981484637222a0c074b67e3998c273dc763ddbc6ca0c0c86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6ps67\\\",\\\"readOnly\\\":true,\\\"recursi
veReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ztjlj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:19Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:19 crc kubenswrapper[4672]: I0930 12:22:19.651047 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:19 crc kubenswrapper[4672]: I0930 12:22:19.651094 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:19 crc kubenswrapper[4672]: I0930 12:22:19.651105 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:19 crc kubenswrapper[4672]: I0930 12:22:19.651124 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:19 crc kubenswrapper[4672]: I0930 12:22:19.651137 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:19Z","lastTransitionTime":"2025-09-30T12:22:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:22:19 crc kubenswrapper[4672]: I0930 12:22:19.669514 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e650bb6-f3f0-48f4-b169-1e56a907e88c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://824f50fb36d4c4aee43cc92af10bc43c441f2cee0ee736431dfdc52d254b819a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2786191730ceba9bbcf35f5355098bc52b09c37d20b923f7c6c65fd1b584cea0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e5b3d468173222d5ae851529dbc318b7d727ec4af468e4094ab08b4f29bf100\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8d2b013e19f7ff8f3eb9c42338191b2da273ce85db8bcc6be8dd1f69e83c3be\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8d2b013e19f7ff8f3eb9c42338191b2da273ce85db8bcc6be8dd1f69e83c3be\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T12:22:10Z\\\",\\\"message\\\":\\\"falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0930 12:22:05.219065 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 12:22:05.219861 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-72394925/tls.crt::/tmp/serving-cert-72394925/tls.key\\\\\\\"\\\\nI0930 12:22:10.657058 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 12:22:10.659952 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 12:22:10.659972 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 12:22:10.659997 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 12:22:10.660003 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 12:22:10.667742 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0930 12:22:10.667775 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 12:22:10.667797 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 12:22:10.667801 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 12:22:10.667808 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 12:22:10.667812 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 12:22:10.667814 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 12:22:10.667817 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0930 12:22:10.670654 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T12:21:54Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c3f5020d6f8fba5f79d7659bedba02479794acb1f5f3d219802966c781fd10d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:52Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://089787496c46eb514e279677b1853010b21eac240fb059e43866714d2a6920e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://089787496c46eb514e279677b1853010b21eac240fb059e43866714d2a6920e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:21:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:21:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:21:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:19Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:19 crc kubenswrapper[4672]: I0930 12:22:19.670248 4672 generic.go:334] "Generic (PLEG): container finished" podID="72003fad-a0fd-4493-9f3b-6efdab22d14c" containerID="85068f88c57bccc42cc7234d13298a30c574379f4a212059a9924553fce8620e" exitCode=0 Sep 30 12:22:19 crc kubenswrapper[4672]: I0930 12:22:19.670325 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-89fj9" event={"ID":"72003fad-a0fd-4493-9f3b-6efdab22d14c","Type":"ContainerDied","Data":"85068f88c57bccc42cc7234d13298a30c574379f4a212059a9924553fce8620e"} Sep 30 12:22:19 crc kubenswrapper[4672]: I0930 12:22:19.689315 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:19Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:19 crc kubenswrapper[4672]: I0930 12:22:19.706795 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"95794952-d817-48f2-8956-f7a310f8d1d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a5a1de465a4e5deca994f640e87574ebd406be2d7a92c0e54ed8bece4a8e6d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnk64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba615a53d8f50a3891080fd06384bb6e00eca07025bb60fdf1764d9c1d4a5f76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnk64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dpqrd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:19Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:19 crc kubenswrapper[4672]: I0930 12:22:19.749305 4672 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ece716e9efed328b509480f11b1a6aa253fe23776bf45f49d4110cf3b4acbab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:19Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:19 crc kubenswrapper[4672]: I0930 12:22:19.757797 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:19 crc kubenswrapper[4672]: I0930 12:22:19.757849 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:19 crc kubenswrapper[4672]: I0930 12:22:19.757859 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:19 crc kubenswrapper[4672]: I0930 12:22:19.757880 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:19 crc kubenswrapper[4672]: I0930 12:22:19.757891 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:19Z","lastTransitionTime":"2025-09-30T12:22:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:22:19 crc kubenswrapper[4672]: I0930 12:22:19.793513 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"763dda13-8721-4cce-8b21-aa1f1b190978\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab6412d7fdf1976c75ccfc72d05f00e2997d72f3971c73cd0887e038d1cbc059\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5575589d0517d5a1c285c3e9e566fb5cd9b9591a1b311a4bb80e595e44044c33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b656dc1ad8af099828eaaeb654e6d52dbbd057023dd07583a6114316670aa334\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5082c3fa78648584ab9a29b4ab239ce0d52411666ca3fa598e9c6745b8ea067e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:21:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:19Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:19 crc kubenswrapper[4672]: I0930 12:22:19.805678 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2761ae4b8b1c7e1d00ee269219ff0ab04aedce9e1bddc6e0a029c496ee89c0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for 
pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:19Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:19 crc kubenswrapper[4672]: I0930 12:22:19.822616 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:19Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:19 crc kubenswrapper[4672]: I0930 12:22:19.837100 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://534671434fa48a07e7836049331af60307cf651f4ddde83856ad98e006b32040\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfb0bceb3e99196adb442abedd5215938781b908c2aa92178c9db76672664620\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/
webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:19Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:19 crc kubenswrapper[4672]: I0930 12:22:19.855379 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nznsk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5da59bc9-84da-42f6-86e9-3399ecf31725\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4aa532f2ebe19e36977596bd41e8b116c84bcd7d5cd3470986e3192d4d62dea7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4aa532f2ebe19e36977596bd41e8b116c84bcd7d5cd3470986e3192d4d62dea7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nznsk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:19Z 
is after 2025-08-24T17:21:41Z" Sep 30 12:22:19 crc kubenswrapper[4672]: I0930 12:22:19.865159 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:19 crc kubenswrapper[4672]: I0930 12:22:19.865354 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:19 crc kubenswrapper[4672]: I0930 12:22:19.865446 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:19 crc kubenswrapper[4672]: I0930 12:22:19.865595 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:19 crc kubenswrapper[4672]: I0930 12:22:19.865720 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:19Z","lastTransitionTime":"2025-09-30T12:22:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:19 crc kubenswrapper[4672]: I0930 12:22:19.873504 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e650bb6-f3f0-48f4-b169-1e56a907e88c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://824f50fb36d4c4aee43cc92af10bc43c441f2cee0ee736431dfdc52d254b819a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2786191730ceba9bbcf35f5355098bc52b09c37d20b923f7c6c65fd1b584cea0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e5b3d468173222d5ae851529dbc318b7d727ec4af468e4094ab08b4f29bf100\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8d2b013e19f7ff8f3eb9c42338191b2da273ce85db8bcc6be8dd1f69e83c3be\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8d2b013e19f7ff8f3eb9c42338191b2da273ce85db8bcc6be8dd1f69e83c3be\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T12:22:10Z\\\",\\\"message\\\":\\\"falling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0930 12:22:05.219065 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 12:22:05.219861 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-72394925/tls.crt::/tmp/serving-cert-72394925/tls.key\\\\\\\"\\\\nI0930 12:22:10.657058 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 12:22:10.659952 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 12:22:10.659972 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 12:22:10.659997 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 12:22:10.660003 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 12:22:10.667742 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0930 12:22:10.667775 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 12:22:10.667797 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 12:22:10.667801 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 12:22:10.667808 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 12:22:10.667812 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 12:22:10.667814 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 12:22:10.667817 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0930 12:22:10.670654 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T12:21:54Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c3f5020d6f8fba5f79d7659bedba02479794acb1f5f3d219802966c781fd10d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:52Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://089787496c46eb514e279677b1853010b21eac240fb059e43866714d2a6920e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://089787496c46eb514e279677b1853010b21eac240fb059e43866714d2a6920e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:21:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:21:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:21:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:19Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:19 crc kubenswrapper[4672]: I0930 12:22:19.887338 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:19Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:19 crc kubenswrapper[4672]: I0930 12:22:19.897814 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"95794952-d817-48f2-8956-f7a310f8d1d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a5a1de465a4e5deca994f640e87574ebd406be2d7a92c0e54ed8bece4a8e6d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnk64\\\",\\\"readOnly\\\":true,\\\"rec
ursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba615a53d8f50a3891080fd06384bb6e00eca07025bb60fdf1764d9c1d4a5f76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnk64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dpqrd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:19Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:19 crc kubenswrapper[4672]: I0930 12:22:19.911504 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8q82q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6806ff3c-ab3a-402e-b1c5-cc37c0810a65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://687d353cc41530a0cbee2c4a782b6ae10e9ee58aae0bfe183537c96851611e2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-
dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pn7q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8q82q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:19Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:19 crc kubenswrapper[4672]: I0930 12:22:19.925803 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-89fj9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72003fad-a0fd-4493-9f3b-6efdab22d14c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31087335de8b3b895d49c4677350e9ab6cfa0651188653dab05e7eafdab3d05f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31087335de8b3b895d49c4677350e9ab6cfa0651188653dab05e7eafdab3d05f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a607f773b81f6bdde800c2a36627d7f84602e6c3c777f6531391668dd7c08e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2a607f773b81f6bdde800c2a36627d7f84602e6c3c777f6531391668dd7c08e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:22:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70fc6887f3dde42017d03aa3f247ca4eda56f93c3ec54a95ceabcc1c1b4651d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70fc6887f3dde42017d03aa3f247ca4eda56f93c3ec54a95ceabcc1c1b4651d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:22:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://95e6805719bbdb9859b827860fb87ed979a4397061f76b433189f463c1e7cc39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://95e6805719bbdb9859b827860fb87ed979a4397061f76b433189f463c1e7cc39\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:22:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85068f88c57bccc42cc7234d13298a30c574379f4a212059a9924553fce8620e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85068f88c57bccc42cc7234d13298a30c574379f4a212059a9924553fce8620e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:22:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",
\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-89fj9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:19Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:19 crc kubenswrapper[4672]: I0930 12:22:19.940942 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ztjlj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"25dc0f4c-76a8-49ff-bf68-c0718cfc3e62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5a3ba5101b35ce4981484637222a0c074b67e3998c273dc763ddbc6ca0c0c86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6ps67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ztjlj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:19Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:19 crc kubenswrapper[4672]: I0930 12:22:19.963822 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4339dd7c-5112-40c5-9b58-050ce364a2b9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7db3193c2afadb91d3220e67720ff33bc959f9c50cec6f243144a6e2a2e57e04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5f54820ce67d6e8f397d068e1fadcb4577d6685406812c4e29fe0274612697d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1351a4e8380fea9edb834239053b57384a50add562afe59e67bbd7159bf1bf6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b9741c5edaae364733b400bebf21a4674ec206
194908ec6a942317235c62a14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://775be30891ac2947ef160e3fe75abad42b78fc39e35414f27756ce086fed4f7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27513f0766801d5155c6e5c376a2a52241030eca2171b3218710ca6ee006a0b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27513f0766801d5155c6e5c376a2a52241030eca2171b3218710ca6ee006a0b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:21:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:21:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0afdc82c21601c37c7d9b4897273ade1e8e35bd093d63df9a63e93a9c0c6228c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0afdc82c21601c37c7d9b4897273ade1e8e35bd093d63df9a63e93a9c0c6228c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:21:51Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://52fa5ee34bc8e9e381d4288d30fce5db76ed5bde6f199ec13fe443481ed4c8e8\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52fa5ee34bc8e9e381d4288d30fce5db76ed5bde6f199ec13fe443481ed4c8e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:21:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:21:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:21:49Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:19Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:19 crc kubenswrapper[4672]: I0930 12:22:19.967955 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:19 crc kubenswrapper[4672]: I0930 12:22:19.967991 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:19 crc kubenswrapper[4672]: I0930 12:22:19.968000 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:19 crc kubenswrapper[4672]: I0930 12:22:19.968017 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:19 crc kubenswrapper[4672]: I0930 12:22:19.968030 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:19Z","lastTransitionTime":"2025-09-30T12:22:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:22:19 crc kubenswrapper[4672]: I0930 12:22:19.983064 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:19Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:19 crc kubenswrapper[4672]: I0930 12:22:19.993440 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bh5lq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"96949d92-2365-41eb-8f62-c264c8328c02\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6caf59dcbea4e0beb0879de377fe92cfb0b3ff55668aee3f10822adacf97fd8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkb7v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bh5lq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:19Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:20 crc kubenswrapper[4672]: I0930 12:22:20.071060 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:20 crc kubenswrapper[4672]: I0930 12:22:20.071109 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:20 crc kubenswrapper[4672]: I0930 12:22:20.071123 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:20 crc kubenswrapper[4672]: I0930 12:22:20.071141 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:20 crc kubenswrapper[4672]: I0930 12:22:20.071154 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:20Z","lastTransitionTime":"2025-09-30T12:22:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:20 crc kubenswrapper[4672]: I0930 12:22:20.173850 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:20 crc kubenswrapper[4672]: I0930 12:22:20.173908 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:20 crc kubenswrapper[4672]: I0930 12:22:20.173923 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:20 crc kubenswrapper[4672]: I0930 12:22:20.173941 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:20 crc kubenswrapper[4672]: I0930 12:22:20.173953 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:20Z","lastTransitionTime":"2025-09-30T12:22:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:20 crc kubenswrapper[4672]: I0930 12:22:20.276609 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:20 crc kubenswrapper[4672]: I0930 12:22:20.276675 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:20 crc kubenswrapper[4672]: I0930 12:22:20.276695 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:20 crc kubenswrapper[4672]: I0930 12:22:20.276730 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:20 crc kubenswrapper[4672]: I0930 12:22:20.276749 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:20Z","lastTransitionTime":"2025-09-30T12:22:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:22:20 crc kubenswrapper[4672]: I0930 12:22:20.380978 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:20 crc kubenswrapper[4672]: I0930 12:22:20.381025 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:20 crc kubenswrapper[4672]: I0930 12:22:20.381042 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:20 crc kubenswrapper[4672]: I0930 12:22:20.381061 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:20 crc kubenswrapper[4672]: I0930 12:22:20.381078 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:20Z","lastTransitionTime":"2025-09-30T12:22:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:20 crc kubenswrapper[4672]: I0930 12:22:20.416147 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 12:22:20 crc kubenswrapper[4672]: E0930 12:22:20.416332 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 12:22:20 crc kubenswrapper[4672]: I0930 12:22:20.484380 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:20 crc kubenswrapper[4672]: I0930 12:22:20.484444 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:20 crc kubenswrapper[4672]: I0930 12:22:20.484459 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:20 crc kubenswrapper[4672]: I0930 12:22:20.484478 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:20 crc kubenswrapper[4672]: I0930 12:22:20.484490 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:20Z","lastTransitionTime":"2025-09-30T12:22:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:22:20 crc kubenswrapper[4672]: I0930 12:22:20.586960 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:20 crc kubenswrapper[4672]: I0930 12:22:20.587006 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:20 crc kubenswrapper[4672]: I0930 12:22:20.587017 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:20 crc kubenswrapper[4672]: I0930 12:22:20.587032 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:20 crc kubenswrapper[4672]: I0930 12:22:20.587041 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:20Z","lastTransitionTime":"2025-09-30T12:22:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:20 crc kubenswrapper[4672]: I0930 12:22:20.678974 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nznsk" event={"ID":"5da59bc9-84da-42f6-86e9-3399ecf31725","Type":"ContainerStarted","Data":"fbee2ee7be1b8b91a21a721e325065d47e0e59bf02acc8a328f2adaa419d8f50"} Sep 30 12:22:20 crc kubenswrapper[4672]: I0930 12:22:20.679320 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-nznsk" Sep 30 12:22:20 crc kubenswrapper[4672]: I0930 12:22:20.679344 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-nznsk" Sep 30 12:22:20 crc kubenswrapper[4672]: I0930 12:22:20.683763 4672 generic.go:334] "Generic (PLEG): container finished" podID="72003fad-a0fd-4493-9f3b-6efdab22d14c" containerID="f4355ef186193e8e0d9344faf8975065f9cd4576e7f6c79054670fda9acf3646" exitCode=0 Sep 30 12:22:20 crc kubenswrapper[4672]: I0930 12:22:20.683816 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-89fj9" event={"ID":"72003fad-a0fd-4493-9f3b-6efdab22d14c","Type":"ContainerDied","Data":"f4355ef186193e8e0d9344faf8975065f9cd4576e7f6c79054670fda9acf3646"} Sep 30 12:22:20 crc kubenswrapper[4672]: I0930 12:22:20.689098 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:20 crc kubenswrapper[4672]: I0930 12:22:20.689158 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:20 crc kubenswrapper[4672]: I0930 12:22:20.689186 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:20 crc kubenswrapper[4672]: I0930 12:22:20.689213 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:20 crc kubenswrapper[4672]: I0930 12:22:20.689236 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:20Z","lastTransitionTime":"2025-09-30T12:22:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network 
plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:20 crc kubenswrapper[4672]: I0930 12:22:20.717772 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4339dd7c-5112-40c5-9b58-050ce364a2b9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7db3193c2afadb91d3220e67720ff33bc959f9c50cec6f243144a6e2a2e57e04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5f54820ce67d6e8f397d068e1fadcb4577d6685406812c4e29fe0274612697d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1351a4e8380fea9edb834239053b57384a50add562afe59e67bbd7159bf1bf6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"start
edAt\\\":\\\"2025-09-30T12:21:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b9741c5edaae364733b400bebf21a4674ec206194908ec6a942317235c62a14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://775be30891ac2947ef160e3fe75abad42b78fc39e35414f27756ce086fed4f7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27513f0766801d5155c6e5c376a2a52241030eca2171b3218710ca6ee006a0b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27513f0766801d5155c6e5c376a2a52241030eca2171b3218710ca6ee006a0b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:21:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:21:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0afdc82c21601c37c7d9b4897273ade1e8e35bd093d63df9a63e93a9c0c6228c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0afdc82c21601c37c7d9b
4897273ade1e8e35bd093d63df9a63e93a9c0c6228c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:21:51Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://52fa5ee34bc8e9e381d4288d30fce5db76ed5bde6f199ec13fe443481ed4c8e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52fa5ee34bc8e9e381d4288d30fce5db76ed5bde6f199ec13fe443481ed4c8e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:21:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:21:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:21:49Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:20Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:20 crc kubenswrapper[4672]: I0930 12:22:20.730727 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-nznsk" Sep 30 12:22:20 crc kubenswrapper[4672]: I0930 12:22:20.731109 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-nznsk" Sep 30 12:22:20 crc kubenswrapper[4672]: I0930 12:22:20.738294 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:20Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:20 crc kubenswrapper[4672]: I0930 12:22:20.755331 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bh5lq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96949d92-2365-41eb-8f62-c264c8328c02\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6caf59dcbea4e0beb0879de377fe92cfb0b3ff55668aee3f10822adacf97fd8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkb7v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\
\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bh5lq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:20Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:20 crc kubenswrapper[4672]: I0930 12:22:20.774542 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ece716e9efed328b509480f11b1a6aa253fe23776bf45f49d4110cf3b4acbab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:20Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:20 crc kubenswrapper[4672]: I0930 12:22:20.790187 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"763dda13-8721-4cce-8b21-aa1f1b190978\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab6412d7fdf1976c75ccfc72d05f00e2997d72f3971c73cd0887e038d1cbc059\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5575589d0517d5a1c285c3e9e566fb5cd9b9591a1b311a4bb80e595e44044c33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b656dc1ad8af099828eaaeb654e6d52dbbd057023dd07583a6114316670aa334\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5082c3fa78648584ab9a29b4ab239ce0d52411666ca3fa598e9c6745b8ea067e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:21:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:20Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:20 crc kubenswrapper[4672]: I0930 12:22:20.792755 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:20 crc kubenswrapper[4672]: I0930 12:22:20.792827 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:20 crc kubenswrapper[4672]: I0930 12:22:20.792841 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:20 crc kubenswrapper[4672]: I0930 12:22:20.792864 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:20 crc kubenswrapper[4672]: I0930 12:22:20.792892 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:20Z","lastTransitionTime":"2025-09-30T12:22:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:22:20 crc kubenswrapper[4672]: I0930 12:22:20.807071 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2761ae4b8b1c7e1d00ee269219ff0ab04aedce9e1bddc6e0a029c496ee89c0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:20Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:20 crc kubenswrapper[4672]: I0930 12:22:20.821384 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:20Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:20 crc kubenswrapper[4672]: I0930 12:22:20.837477 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://534671434fa48a07e7836049331af60307cf651f4ddde83856ad98e006b32040\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfb0bceb3e99196adb442abedd5215938781b908c2aa92178c9db76672664620\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:20Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:20 crc kubenswrapper[4672]: I0930 12:22:20.861606 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nznsk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5da59bc9-84da-42f6-86e9-3399ecf31725\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8e8be814fc5bfb8db3632cda1fcf749a02b642c8e7ff479e59ce5d9f987a756\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2131429928dbbd60e2d17b7ada4689ff5d161d1df921d74a8b9f71df9e11899\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://114d67fea982a6c6475e1925f670ba7dae14ace593022f77ef7c6441dde49687\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\
\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b73fba4d62613a1c27b69a74ebcca91fa6ea43a5267efddf783bd0d243d711f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e10db0132d159755257aef8c0cee40e5de4a0d3727ce59883e1ff90650e4f35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb7a32827707f8654d36f56fef7e81aff6b80550ef51f7c090a37f53a0e53406\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\
"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fbee2ee7be1b8b91a21a721e325065d47e0e59bf02acc8a328f2adaa419d8f50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recurs
iveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7058516ce3ff1d1d2929107a66d739fccb24b4f5589a11510ab42fe728a33196\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4aa532f2ebe19e36977596bd41e8b116c84bcd7d5cd3470986e3192d4d62dea7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4aa532f2ebe19e36977596bd41e8b116c84bcd7d5cd3470986e3192d4d62dea7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nznsk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:20Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:20 crc kubenswrapper[4672]: I0930 12:22:20.876798 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e650bb6-f3f0-48f4-b169-1e56a907e88c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://824f50fb36d4c4aee43cc92af10bc43c441f2cee0ee736431dfdc52d254b819a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2786191730ceba9bbcf35f5355098bc52b09c37d20b923f7c6c65fd1b584cea0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e5b3d468173222d5ae851529dbc318b7d727ec4af468e4094ab08b4f29bf100\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8d2b013e19f7ff8f3eb9c42338191b2da273ce85db8bcc6be8dd1f69e83c3be\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8d2b013e19f7ff8f3eb9c42338191b2da273ce85db8bcc6be8dd1f69e83c3be\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T12:22:10Z\\\",\\\"message\\\":\\\"falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0930 12:22:05.219065 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 12:22:05.219861 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-72394925/tls.crt::/tmp/serving-cert-72394925/tls.key\\\\\\\"\\\\nI0930 12:22:10.657058 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 12:22:10.659952 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 12:22:10.659972 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 12:22:10.659997 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 12:22:10.660003 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 12:22:10.667742 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0930 12:22:10.667775 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 12:22:10.667797 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 12:22:10.667801 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 12:22:10.667808 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 12:22:10.667812 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 12:22:10.667814 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 12:22:10.667817 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0930 12:22:10.670654 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T12:21:54Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c3f5020d6f8fba5f79d7659bedba02479794acb1f5f3d219802966c781fd10d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:52Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://089787496c46eb514e279677b1853010b21eac240fb059e43866714d2a6920e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://089787496c46eb514e279677b1853010b21eac240fb059e43866714d2a6920e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:21:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:21:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:21:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:20Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:20 crc kubenswrapper[4672]: I0930 12:22:20.892596 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:20Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:20 crc kubenswrapper[4672]: I0930 12:22:20.895931 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:20 crc kubenswrapper[4672]: I0930 12:22:20.895986 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:20 crc kubenswrapper[4672]: I0930 12:22:20.895997 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:20 crc kubenswrapper[4672]: I0930 12:22:20.896019 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:20 crc kubenswrapper[4672]: I0930 12:22:20.896034 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:20Z","lastTransitionTime":"2025-09-30T12:22:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:22:20 crc kubenswrapper[4672]: I0930 12:22:20.905729 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"95794952-d817-48f2-8956-f7a310f8d1d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a5a1de465a4e5deca994f640e87574ebd406be2d7a92c0e54ed8bece4a8e6d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnk64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba615a53d8f50a3891080fd06384bb6e00eca07025bb60fdf1764d9c1d4a5f76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnk64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dpqrd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:20Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:20 crc kubenswrapper[4672]: I0930 12:22:20.921019 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8q82q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6806ff3c-ab3a-402e-b1c5-cc37c0810a65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://687d353cc41530a0cbee2c4a782b6ae10e9ee58aae0bfe183537c96851611e2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pn7q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\
\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8q82q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:20Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:20 crc kubenswrapper[4672]: I0930 12:22:20.936335 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-89fj9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72003fad-a0fd-4493-9f3b-6efdab22d14c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31087335de8b3b895d49c4677350e9ab6cfa0651188653dab05e7eafdab3d05f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31087335de8b3b895d49c4677350e9ab6cfa0651188653dab05e7eafdab3d05f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a607f773b81f6bdde800c2a36627d7f84602e6c3c777f6531391668dd7c08e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2a607f773b81f6bdde800c2a36627d7f84602e6c3c777f6531391668dd7c08e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:22:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70fc6887f3dde42017d03aa3f247ca4eda56f93c3ec54a95ceabcc1c1b4651d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70fc6887f3dde42017d03aa3f247ca4eda56f93c3ec54a95ceabcc1c1b4651d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:22:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://95e6805719bbdb9859b827860fb87ed979a4397061f76b433189f463c1e7cc39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://95e6805719bbdb9859b827860fb87ed979a4397061f76b433189f463c1e7cc39\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:22:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85068f88c57bccc42cc7234d13298a30c574379f4a212059a9924553fce8620e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85068f88c57bccc42cc7234d13298a30c574379f4a212059a9924553fce8620e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:22:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",
\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-89fj9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:20Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:20 crc kubenswrapper[4672]: I0930 12:22:20.947087 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ztjlj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"25dc0f4c-76a8-49ff-bf68-c0718cfc3e62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5a3ba5101b35ce4981484637222a0c074b67e3998c273dc763ddbc6ca0c0c86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6ps67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ztjlj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:20Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:20 crc kubenswrapper[4672]: I0930 12:22:20.967664 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4339dd7c-5112-40c5-9b58-050ce364a2b9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7db3193c2afadb91d3220e67720ff33bc959f9c50cec6f243144a6e2a2e57e04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5f54820ce67d6e8f397d068e1fadcb4577d6685406812c4e29fe0274612697d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1351a4e8380fea9edb834239053b57384a50add562afe59e67bbd7159bf1bf6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b9741c5edaae364733b400bebf21a4674ec206
194908ec6a942317235c62a14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://775be30891ac2947ef160e3fe75abad42b78fc39e35414f27756ce086fed4f7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27513f0766801d5155c6e5c376a2a52241030eca2171b3218710ca6ee006a0b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27513f0766801d5155c6e5c376a2a52241030eca2171b3218710ca6ee006a0b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:21:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:21:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0afdc82c21601c37c7d9b4897273ade1e8e35bd093d63df9a63e93a9c0c6228c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0afdc82c21601c37c7d9b4897273ade1e8e35bd093d63df9a63e93a9c0c6228c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:21:51Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://52fa5ee34bc8e9e381d4288d30fce5db76ed5bde6f199ec13fe443481ed4c8e8\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52fa5ee34bc8e9e381d4288d30fce5db76ed5bde6f199ec13fe443481ed4c8e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:21:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:21:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:21:49Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:20Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:20 crc kubenswrapper[4672]: I0930 12:22:20.981742 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:20Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:20 crc kubenswrapper[4672]: I0930 12:22:20.995543 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bh5lq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96949d92-2365-41eb-8f62-c264c8328c02\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6caf59dcbea4e0beb0879de377fe92cfb0b3ff55668aee3f10822adacf97fd8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkb7v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bh5lq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-09-30T12:22:20Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:21 crc kubenswrapper[4672]: I0930 12:22:21.000196 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:21 crc kubenswrapper[4672]: I0930 12:22:21.000242 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:21 crc kubenswrapper[4672]: I0930 12:22:21.000255 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:21 crc kubenswrapper[4672]: I0930 12:22:21.000289 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:21 crc kubenswrapper[4672]: I0930 12:22:21.000301 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:21Z","lastTransitionTime":"2025-09-30T12:22:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:21 crc kubenswrapper[4672]: I0930 12:22:21.015129 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ece716e9efed328b509480f11b1a6aa253fe23776bf45f49d4110cf3b4acbab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:21Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:21 crc 
kubenswrapper[4672]: I0930 12:22:21.027901 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"763dda13-8721-4cce-8b21-aa1f1b190978\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab6412d7fdf1976c75ccfc72d05f00e2997d72f3971c73cd0887e038d1cbc059\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5575589d0517d5a1c285c3e9e566fb5cd9b9591a1b311a4bb80e595e44044c33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b656dc1ad8af099828eaaeb654e6d52dbbd057023dd07583a6114316670aa334\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\
":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5082c3fa78648584ab9a29b4ab239ce0d52411666ca3fa598e9c6745b8ea067e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:21:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:21Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:21 crc kubenswrapper[4672]: I0930 12:22:21.053823 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2761ae4b8b1c7e1d00ee269219ff0ab04aedce9e1bddc6e0a029c496ee89c0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:21Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:21 crc kubenswrapper[4672]: I0930 12:22:21.070338 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:21Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:21 crc kubenswrapper[4672]: I0930 12:22:21.087925 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://534671434fa48a07e7836049331af60307cf651f4ddde83856ad98e006b32040\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfb0bceb3e99196adb442abedd5215938781b908c2aa92178c9db76672664620\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/
webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:21Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:21 crc kubenswrapper[4672]: I0930 12:22:21.103132 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:21 crc kubenswrapper[4672]: I0930 12:22:21.103193 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:21 crc kubenswrapper[4672]: I0930 12:22:21.103211 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:21 crc kubenswrapper[4672]: I0930 12:22:21.103237 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:21 crc kubenswrapper[4672]: I0930 12:22:21.103285 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:21Z","lastTransitionTime":"2025-09-30T12:22:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:22:21 crc kubenswrapper[4672]: I0930 12:22:21.111442 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nznsk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5da59bc9-84da-42f6-86e9-3399ecf31725\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8e8be814fc5bfb8db3632cda1fcf749a02b642c8e7ff479e59ce5d9f987a756\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2131429928dbbd60e2d17b7ada4689ff5d161d1df921d74a8b9f71df9e11899\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\
":\\\"cri-o://114d67fea982a6c6475e1925f670ba7dae14ace593022f77ef7c6441dde49687\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b73fba4d62613a1c27b69a74ebcca91fa6ea43a5267efddf783bd0d243d711f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e10db0132d159755257aef8c0cee40e5de4a0d3727ce59883e1ff90650e4f35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb7a32827707f8654d36f56fef7e81aff6b80550ef51f7c090a37f53a0e53406\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fbee2ee7be1b8b91a21a721e325065d47e0e59bf02acc8a328f2adaa419d8f50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\
"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7058516ce3ff1d1d2929107a66d739fccb24b4f5589a11510ab42fe728a33196\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4aa532f2ebe19e36977596bd41e8b116c84bcd7d5cd3470986e3192d4d62dea7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4aa532f2ebe19e36977596bd41e8b116c84bcd7d5cd3470986e3192d4d62dea7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nznsk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:21Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:21 crc kubenswrapper[4672]: I0930 12:22:21.125145 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e650bb6-f3f0-48f4-b169-1e56a907e88c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://824f50fb36d4c4aee43cc92af10bc43c441f2cee0ee736431dfdc52d254b819a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2786191730ceba9bbcf35f5355098bc52b09c37d20b923f7c6c65fd1b584cea0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e5b3d468173222d5ae851529dbc318b7d727ec4af468e4094ab08b4f29bf100\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8d2b013e19f7ff8f3eb9c42338191b2da273ce85db8bcc6be8dd1f69e83c3be\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8d2b013e19f7ff8f3eb9c42338191b2da273ce85db8bcc6be8dd1f69e83c3be\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T12:22:10Z\\\",\\\"message\\\":\\\"falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0930 12:22:05.219065 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 12:22:05.219861 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-72394925/tls.crt::/tmp/serving-cert-72394925/tls.key\\\\\\\"\\\\nI0930 12:22:10.657058 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 12:22:10.659952 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 12:22:10.659972 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 12:22:10.659997 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 12:22:10.660003 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 12:22:10.667742 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0930 12:22:10.667775 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 12:22:10.667797 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 12:22:10.667801 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 12:22:10.667808 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 12:22:10.667812 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 12:22:10.667814 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 12:22:10.667817 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0930 12:22:10.670654 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T12:21:54Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c3f5020d6f8fba5f79d7659bedba02479794acb1f5f3d219802966c781fd10d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:52Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://089787496c46eb514e279677b1853010b21eac240fb059e43866714d2a6920e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://089787496c46eb514e279677b1853010b21eac240fb059e43866714d2a6920e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:21:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:21:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:21:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:21Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:21 crc kubenswrapper[4672]: I0930 12:22:21.136847 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:21Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:21 crc kubenswrapper[4672]: I0930 12:22:21.147874 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"95794952-d817-48f2-8956-f7a310f8d1d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a5a1de465a4e5deca994f640e87574ebd406be2d7a92c0e54ed8bece4a8e6d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnk64\\\",\\\"readOnly\\\":true,\\\"rec
ursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba615a53d8f50a3891080fd06384bb6e00eca07025bb60fdf1764d9c1d4a5f76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnk64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dpqrd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:21Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:21 crc kubenswrapper[4672]: I0930 12:22:21.160015 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8q82q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6806ff3c-ab3a-402e-b1c5-cc37c0810a65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://687d353cc41530a0cbee2c4a782b6ae10e9ee58aae0bfe183537c96851611e2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-
dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pn7q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8q82q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:21Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:21 crc kubenswrapper[4672]: I0930 12:22:21.173610 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-89fj9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72003fad-a0fd-4493-9f3b-6efdab22d14c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31087335de8b3b895d49c4677350e9ab6cfa0651188653dab05e7eafdab3d05f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31087335de8b3b895d49c4677350e9ab6cfa0651188653dab05e7eafdab3d05f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a607f773b81f6bdde800c2a36627d7f84602e6c3c777f6531391668dd7c08e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2a607f773b81f6bdde800c2a36627d7f84602e6c3c777f6531391668dd7c08e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:22:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70fc6887f3dde42017d03aa3f247ca4eda56f93c3ec54a95ceabcc1c1b4651d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70fc6887f3dde42017d03aa3f247ca4eda56f93c3ec54a95ceabcc1c1b4651d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:22:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://95e6805719bbdb9859b827860fb87ed979a4397061f76b433189f463c1e7cc39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://95e6805719bbdb9859b827860fb87ed979a4397061f76b433189f463c1e7cc39\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:22:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85068f88c57bccc42cc7234d13298a30c574379f4a212059a9924553fce8620e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85068f88c57bccc42cc7234d13298a30c574379f4a212059a9924553fce8620e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:22:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",
\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4355ef186193e8e0d9344faf8975065f9cd4576e7f6c79054670fda9acf3646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4355ef186193e8e0d9344faf8975065f9cd4576e7f6c79054670fda9acf3646\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:22:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-89fj9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:21Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:21 crc kubenswrapper[4672]: I0930 12:22:21.182724 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ztjlj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"25dc0f4c-76a8-49ff-bf68-c0718cfc3e62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5a3ba5101b35ce4981484637222a0c074b67e3998c273dc763ddbc6ca0c0c86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6ps67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ztjlj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:21Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:21 crc kubenswrapper[4672]: I0930 12:22:21.205329 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:21 crc kubenswrapper[4672]: I0930 12:22:21.205374 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:21 crc kubenswrapper[4672]: I0930 12:22:21.205387 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:21 crc kubenswrapper[4672]: I0930 12:22:21.205405 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:21 crc kubenswrapper[4672]: I0930 12:22:21.205414 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:21Z","lastTransitionTime":"2025-09-30T12:22:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:21 crc kubenswrapper[4672]: I0930 12:22:21.308557 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:21 crc kubenswrapper[4672]: I0930 12:22:21.308601 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:21 crc kubenswrapper[4672]: I0930 12:22:21.308611 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:21 crc kubenswrapper[4672]: I0930 12:22:21.308628 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:21 crc kubenswrapper[4672]: I0930 12:22:21.308640 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:21Z","lastTransitionTime":"2025-09-30T12:22:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:21 crc kubenswrapper[4672]: I0930 12:22:21.412149 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:21 crc kubenswrapper[4672]: I0930 12:22:21.412217 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:21 crc kubenswrapper[4672]: I0930 12:22:21.412237 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:21 crc kubenswrapper[4672]: I0930 12:22:21.412290 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:21 crc kubenswrapper[4672]: I0930 12:22:21.412310 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:21Z","lastTransitionTime":"2025-09-30T12:22:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:21 crc kubenswrapper[4672]: I0930 12:22:21.416516 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 12:22:21 crc kubenswrapper[4672]: I0930 12:22:21.416558 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 12:22:21 crc kubenswrapper[4672]: E0930 12:22:21.416809 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 12:22:21 crc kubenswrapper[4672]: E0930 12:22:21.416973 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 12:22:21 crc kubenswrapper[4672]: I0930 12:22:21.515916 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:21 crc kubenswrapper[4672]: I0930 12:22:21.515970 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:21 crc kubenswrapper[4672]: I0930 12:22:21.515983 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:21 crc kubenswrapper[4672]: I0930 12:22:21.516005 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:21 crc kubenswrapper[4672]: I0930 12:22:21.516019 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:21Z","lastTransitionTime":"2025-09-30T12:22:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:21 crc kubenswrapper[4672]: I0930 12:22:21.619709 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:21 crc kubenswrapper[4672]: I0930 12:22:21.619763 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:21 crc kubenswrapper[4672]: I0930 12:22:21.619772 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:21 crc kubenswrapper[4672]: I0930 12:22:21.619789 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:21 crc kubenswrapper[4672]: I0930 12:22:21.619801 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:21Z","lastTransitionTime":"2025-09-30T12:22:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:22:21 crc kubenswrapper[4672]: I0930 12:22:21.692191 4672 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 30 12:22:21 crc kubenswrapper[4672]: I0930 12:22:21.693705 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-89fj9" event={"ID":"72003fad-a0fd-4493-9f3b-6efdab22d14c","Type":"ContainerStarted","Data":"fd26dcf24863265c55a06cfc788e84363473a6ca5f1b835fa5dbb683ebb82d68"} Sep 30 12:22:21 crc kubenswrapper[4672]: I0930 12:22:21.718152 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4339dd7c-5112-40c5-9b58-050ce364a2b9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7db3193c2afadb91d3220e67720ff33bc959f9c50cec6f243144a6e2a2e57e04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5f54820ce67d6e8f397d068e1fadcb4577d6685406812c4e29fe0274612697d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1351a4e8380fea9edb834239053b57384a50add562afe59e67bbd7159bf1bf6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/oc
p-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b9741c5edaae364733b400bebf21a4674ec206194908ec6a942317235c62a14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://775be30891ac2947ef160e3fe75abad42b78fc39e35414f27756ce086fed4f7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27513f0766801d5155c6e5c376a2a52241030eca2171b3218710ca6ee006a0b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27513f0766801d5155c6e5c376a2a52241030eca2171b3218710ca6ee006a0b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:21:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:21:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0afdc82c21601c37c7d9b4897273ade1e8e35bd093d63df9a63e93a9c0c6228c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b9
0092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0afdc82c21601c37c7d9b4897273ade1e8e35bd093d63df9a63e93a9c0c6228c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:21:51Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://52fa5ee34bc8e9e381d4288d30fce5db76ed5bde6f199ec13fe443481ed4c8e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52fa5ee34bc8e9e381d4288d30fce5db76ed5bde6f199ec13fe443481ed4c8e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:21:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:21:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:21:49Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:21Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:21 crc kubenswrapper[4672]: I0930 12:22:21.726829 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:21 crc kubenswrapper[4672]: I0930 12:22:21.726869 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:21 crc kubenswrapper[4672]: I0930 12:22:21.726878 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:21 crc kubenswrapper[4672]: I0930 12:22:21.726894 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:21 crc kubenswrapper[4672]: I0930 12:22:21.726905 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:21Z","lastTransitionTime":"2025-09-30T12:22:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:22:21 crc kubenswrapper[4672]: I0930 12:22:21.731998 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:21Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:21 crc kubenswrapper[4672]: I0930 12:22:21.743117 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bh5lq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"96949d92-2365-41eb-8f62-c264c8328c02\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6caf59dcbea4e0beb0879de377fe92cfb0b3ff55668aee3f10822adacf97fd8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkb7v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bh5lq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:21Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:21 crc kubenswrapper[4672]: I0930 12:22:21.757051 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ece716e9efed328b509480f11b1a6aa253fe23776bf45f49d4110cf3b4acbab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:21Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:21 crc kubenswrapper[4672]: I0930 12:22:21.773220 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"763dda13-8721-4cce-8b21-aa1f1b190978\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab6412d7fdf1976c75ccfc72d05f00e2997d72f3971c73cd0887e038d1cbc059\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5575589d0517d5a1c285c3e9e566fb5cd9b9591a1b311a4bb80e595e44044c33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b656dc1ad8af099828eaaeb654e6d52dbbd057023dd07583a6114316670aa334\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5082c3fa78648584ab9a29b4ab239ce0d52411666ca3fa598e9c6745b8ea067e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:21:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:21Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:21 crc kubenswrapper[4672]: I0930 12:22:21.793968 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2761ae4b8b1c7e1d00ee269219ff0ab04aedce9e1bddc6e0a029c496ee89c0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:21Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:21 crc kubenswrapper[4672]: I0930 12:22:21.807525 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:21Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:21 crc kubenswrapper[4672]: I0930 12:22:21.824169 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://534671434fa48a07e7836049331af60307cf651f4ddde83856ad98e006b32040\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfb0bceb3e99196adb442abedd5215938781b908c2aa92178c9db76672664620\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:21Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:21 crc kubenswrapper[4672]: I0930 12:22:21.841195 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:21 crc kubenswrapper[4672]: I0930 12:22:21.841232 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:21 crc kubenswrapper[4672]: I0930 12:22:21.841243 4672 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Sep 30 12:22:21 crc kubenswrapper[4672]: I0930 12:22:21.841259 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:21 crc kubenswrapper[4672]: I0930 12:22:21.841296 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:21Z","lastTransitionTime":"2025-09-30T12:22:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:21 crc kubenswrapper[4672]: I0930 12:22:21.846230 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nznsk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5da59bc9-84da-42f6-86e9-3399ecf31725\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8e8be814fc5bfb8db3632cda1fcf749a02b642c8e7ff479e59ce5d9f987a756\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2131429928dbbd60e2d17b7ada4689ff5d161d1df921d74a8b9f71df9e11899\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://114d67fea982a6c6475e1925f670ba7dae14ace593022f77ef7c6441dde49687\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b73fba4d62613a1c27b69a74ebcca91fa6ea43a5267efddf783bd0d243d711f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e10db0132d159755257aef8c0cee40e5de4a0d3727ce59883e1ff90650e4f35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb7a32827707f8654d36f56fef7e81aff6b80550ef51f7c090a37f53a0e53406\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fbee2ee7be1b8b91a21a721e325065d47e0e59bf
02acc8a328f2adaa419d8f50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7058516ce3ff1d1d2929107a66d739fccb24b4f5589a11510ab42fe728a33196\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccoun
t\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4aa532f2ebe19e36977596bd41e8b116c84bcd7d5cd3470986e3192d4d62dea7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4aa532f2ebe19e36977596bd41e8b116c84bcd7d5cd3470986e3192d4d62dea7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nznsk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:21Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:21 crc kubenswrapper[4672]: I0930 12:22:21.861790 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ztjlj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"25dc0f4c-76a8-49ff-bf68-c0718cfc3e62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5a3ba5101b35ce4981484637222a0c074b67e3998c273dc763ddbc6ca0c0c86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6ps67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ztjlj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:21Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:21 crc kubenswrapper[4672]: I0930 12:22:21.874092 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e650bb6-f3f0-48f4-b169-1e56a907e88c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"message\\\":\\\"containers with 
unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://824f50fb36d4c4aee43cc92af10bc43c441f2cee0ee736431dfdc52d254b819a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2786191730ceba9bbcf35f5355098bc52b09c37d20b923f7c6c65fd1b584cea0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e5b3d468173222d5ae851529dbc318b7d727ec4af468e4094ab08b4f29bf100\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8d2b013e19f7ff8f3eb9c42338191b2da273ce85db8bcc6be8dd1f69e83c3be\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8d2b013e19f7ff8f3eb9c42338191b2da273ce85db8bcc6be8dd1f69e83c3be\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T12:22:10Z\\\",\\\"message\\\":\\\"falling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0930 12:22:05.219065 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 12:22:05.219861 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-72394925/tls.crt::/tmp/serving-cert-72394925/tls.key\\\\\\\"\\\\nI0930 12:22:10.657058 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 12:22:10.659952 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 12:22:10.659972 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 12:22:10.659997 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 12:22:10.660003 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 12:22:10.667742 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0930 12:22:10.667775 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 12:22:10.667797 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 12:22:10.667801 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 12:22:10.667808 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 12:22:10.667812 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 12:22:10.667814 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 12:22:10.667817 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0930 12:22:10.670654 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T12:21:54Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c3f5020d6f8fba5f79d7659bedba02479794acb1f5f3d219802966c781fd10d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:52Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://089787496c46eb514e279677b1853010b21eac240fb059e43866714d2a6920e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://089787496c46eb514e279677b1853010b21eac240fb059e43866714d2a6920e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:21:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:21:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:21:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:21Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:21 crc kubenswrapper[4672]: I0930 12:22:21.886137 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:21Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:21 crc kubenswrapper[4672]: I0930 12:22:21.897505 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"95794952-d817-48f2-8956-f7a310f8d1d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a5a1de465a4e5deca994f640e87574ebd406be2d7a92c0e54ed8bece4a8e6d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnk64\\\",\\\"readOnly\\\":true,\\\"rec
ursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba615a53d8f50a3891080fd06384bb6e00eca07025bb60fdf1764d9c1d4a5f76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnk64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dpqrd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:21Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:21 crc kubenswrapper[4672]: I0930 12:22:21.911509 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8q82q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6806ff3c-ab3a-402e-b1c5-cc37c0810a65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://687d353cc41530a0cbee2c4a782b6ae10e9ee58aae0bfe183537c96851611e2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-
dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pn7q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8q82q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:21Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:21 crc kubenswrapper[4672]: I0930 12:22:21.931155 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-89fj9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"72003fad-a0fd-4493-9f3b-6efdab22d14c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd26dcf24863265c55a06cfc788e84363473a6ca5f1b835fa5dbb683ebb82d68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31087335de8b3b895d49c4677350e9ab6cfa0651188653dab05e7eafdab3d05f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31087335de8b3b895d49c4677350e9ab6cfa0651188653dab05e7eafdab3d05f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a607f773b81f6bdde800c2a36627d7f84602e6c3c777f6531391668dd7c08e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2a607f773b81f6bdde800c2a36627d7f84602e6c3c777f6531391668dd7c08e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:22:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70fc6887f3dde42017d03aa3f247ca4eda56f93c3ec54a95ceabcc1c1b4651d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70fc6887f3dde42017d03aa3f247ca4eda56f93c3ec54a95ceabcc1c1b4651d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:22:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://95e6805719bbdb9859b827860fb87ed979a4397061f76b433189f463c1e7cc39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://95e6805719bbdb9859b827860fb87ed979a4397061f76b433189f463c1e7cc39\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:22:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85068f88c57bccc42cc7234d13298a30c574379f4a212059a9924553fce8620e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85068f88c57bccc42cc7234d13298a30c574379f4a212059a9924553fce8620e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:22:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4355ef186193e8e0d9344faf8975065f9cd4576e7f6c79054670fda9acf3646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4355ef186193e8e0d9344faf8975065f9cd4576e7f6c79054670fda9acf3646\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:22:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-89fj9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:21Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:21 crc kubenswrapper[4672]: I0930 12:22:21.944335 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:21 crc kubenswrapper[4672]: I0930 12:22:21.944379 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:21 crc 
kubenswrapper[4672]: I0930 12:22:21.944393 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:21 crc kubenswrapper[4672]: I0930 12:22:21.944408 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:21 crc kubenswrapper[4672]: I0930 12:22:21.944420 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:21Z","lastTransitionTime":"2025-09-30T12:22:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:22 crc kubenswrapper[4672]: I0930 12:22:22.047576 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:22 crc kubenswrapper[4672]: I0930 12:22:22.047621 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:22 crc kubenswrapper[4672]: I0930 12:22:22.047631 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:22 crc kubenswrapper[4672]: I0930 12:22:22.047647 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:22 crc kubenswrapper[4672]: I0930 12:22:22.047658 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:22Z","lastTransitionTime":"2025-09-30T12:22:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:22 crc kubenswrapper[4672]: I0930 12:22:22.150082 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:22 crc kubenswrapper[4672]: I0930 12:22:22.150130 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:22 crc kubenswrapper[4672]: I0930 12:22:22.150144 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:22 crc kubenswrapper[4672]: I0930 12:22:22.150163 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:22 crc kubenswrapper[4672]: I0930 12:22:22.150177 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:22Z","lastTransitionTime":"2025-09-30T12:22:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:22:22 crc kubenswrapper[4672]: I0930 12:22:22.252893 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:22 crc kubenswrapper[4672]: I0930 12:22:22.253248 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:22 crc kubenswrapper[4672]: I0930 12:22:22.253395 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:22 crc kubenswrapper[4672]: I0930 12:22:22.253473 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:22 crc kubenswrapper[4672]: I0930 12:22:22.253541 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:22Z","lastTransitionTime":"2025-09-30T12:22:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:22 crc kubenswrapper[4672]: I0930 12:22:22.355942 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:22 crc kubenswrapper[4672]: I0930 12:22:22.355985 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:22 crc kubenswrapper[4672]: I0930 12:22:22.355995 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:22 crc kubenswrapper[4672]: I0930 12:22:22.356011 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:22 crc kubenswrapper[4672]: I0930 12:22:22.356022 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:22Z","lastTransitionTime":"2025-09-30T12:22:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:22 crc kubenswrapper[4672]: I0930 12:22:22.417187 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 12:22:22 crc kubenswrapper[4672]: E0930 12:22:22.417372 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 12:22:22 crc kubenswrapper[4672]: I0930 12:22:22.457940 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:22 crc kubenswrapper[4672]: I0930 12:22:22.458042 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:22 crc kubenswrapper[4672]: I0930 12:22:22.458054 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:22 crc kubenswrapper[4672]: I0930 12:22:22.458070 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:22 crc kubenswrapper[4672]: I0930 12:22:22.458082 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:22Z","lastTransitionTime":"2025-09-30T12:22:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:22 crc kubenswrapper[4672]: I0930 12:22:22.560975 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:22 crc kubenswrapper[4672]: I0930 12:22:22.561027 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:22 crc kubenswrapper[4672]: I0930 12:22:22.561040 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:22 crc kubenswrapper[4672]: I0930 12:22:22.561061 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:22 crc kubenswrapper[4672]: I0930 12:22:22.561076 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:22Z","lastTransitionTime":"2025-09-30T12:22:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:22:22 crc kubenswrapper[4672]: I0930 12:22:22.663392 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:22 crc kubenswrapper[4672]: I0930 12:22:22.663432 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:22 crc kubenswrapper[4672]: I0930 12:22:22.663446 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:22 crc kubenswrapper[4672]: I0930 12:22:22.663464 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:22 crc kubenswrapper[4672]: I0930 12:22:22.663476 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:22Z","lastTransitionTime":"2025-09-30T12:22:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:22 crc kubenswrapper[4672]: I0930 12:22:22.695624 4672 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 30 12:22:22 crc kubenswrapper[4672]: I0930 12:22:22.766043 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:22 crc kubenswrapper[4672]: I0930 12:22:22.766084 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:22 crc kubenswrapper[4672]: I0930 12:22:22.766094 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:22 crc kubenswrapper[4672]: I0930 12:22:22.766109 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:22 crc kubenswrapper[4672]: I0930 12:22:22.766121 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:22Z","lastTransitionTime":"2025-09-30T12:22:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:22:22 crc kubenswrapper[4672]: I0930 12:22:22.869619 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:22 crc kubenswrapper[4672]: I0930 12:22:22.869667 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:22 crc kubenswrapper[4672]: I0930 12:22:22.869686 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:22 crc kubenswrapper[4672]: I0930 12:22:22.869711 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:22 crc kubenswrapper[4672]: I0930 12:22:22.869723 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:22Z","lastTransitionTime":"2025-09-30T12:22:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:22 crc kubenswrapper[4672]: I0930 12:22:22.972823 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:22 crc kubenswrapper[4672]: I0930 12:22:22.972869 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:22 crc kubenswrapper[4672]: I0930 12:22:22.972880 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:22 crc kubenswrapper[4672]: I0930 12:22:22.972898 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:22 crc kubenswrapper[4672]: I0930 12:22:22.972910 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:22Z","lastTransitionTime":"2025-09-30T12:22:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:23 crc kubenswrapper[4672]: I0930 12:22:23.076101 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:23 crc kubenswrapper[4672]: I0930 12:22:23.076181 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:23 crc kubenswrapper[4672]: I0930 12:22:23.076197 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:23 crc kubenswrapper[4672]: I0930 12:22:23.076220 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:23 crc kubenswrapper[4672]: I0930 12:22:23.076237 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:23Z","lastTransitionTime":"2025-09-30T12:22:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:22:23 crc kubenswrapper[4672]: I0930 12:22:23.180141 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:23 crc kubenswrapper[4672]: I0930 12:22:23.180322 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:23 crc kubenswrapper[4672]: I0930 12:22:23.180338 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:23 crc kubenswrapper[4672]: I0930 12:22:23.180361 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:23 crc kubenswrapper[4672]: I0930 12:22:23.180378 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:23Z","lastTransitionTime":"2025-09-30T12:22:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:23 crc kubenswrapper[4672]: I0930 12:22:23.283615 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:23 crc kubenswrapper[4672]: I0930 12:22:23.283657 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:23 crc kubenswrapper[4672]: I0930 12:22:23.283668 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:23 crc kubenswrapper[4672]: I0930 12:22:23.283685 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:23 crc kubenswrapper[4672]: I0930 12:22:23.283700 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:23Z","lastTransitionTime":"2025-09-30T12:22:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:23 crc kubenswrapper[4672]: I0930 12:22:23.386892 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:23 crc kubenswrapper[4672]: I0930 12:22:23.386961 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:23 crc kubenswrapper[4672]: I0930 12:22:23.386971 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:23 crc kubenswrapper[4672]: I0930 12:22:23.386991 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:23 crc kubenswrapper[4672]: I0930 12:22:23.387003 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:23Z","lastTransitionTime":"2025-09-30T12:22:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:22:23 crc kubenswrapper[4672]: I0930 12:22:23.416394 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 12:22:23 crc kubenswrapper[4672]: I0930 12:22:23.416483 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 12:22:23 crc kubenswrapper[4672]: E0930 12:22:23.416593 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 12:22:23 crc kubenswrapper[4672]: E0930 12:22:23.416712 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 12:22:23 crc kubenswrapper[4672]: I0930 12:22:23.490786 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:23 crc kubenswrapper[4672]: I0930 12:22:23.490881 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:23 crc kubenswrapper[4672]: I0930 12:22:23.490893 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:23 crc kubenswrapper[4672]: I0930 12:22:23.490910 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:23 crc kubenswrapper[4672]: I0930 12:22:23.490921 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:23Z","lastTransitionTime":"2025-09-30T12:22:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:22:23 crc kubenswrapper[4672]: I0930 12:22:23.594884 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:23 crc kubenswrapper[4672]: I0930 12:22:23.594962 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:23 crc kubenswrapper[4672]: I0930 12:22:23.594986 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:23 crc kubenswrapper[4672]: I0930 12:22:23.595022 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:23 crc kubenswrapper[4672]: I0930 12:22:23.595041 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:23Z","lastTransitionTime":"2025-09-30T12:22:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:23 crc kubenswrapper[4672]: I0930 12:22:23.699165 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:23 crc kubenswrapper[4672]: I0930 12:22:23.699209 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:23 crc kubenswrapper[4672]: I0930 12:22:23.699226 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:23 crc kubenswrapper[4672]: I0930 12:22:23.699247 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:23 crc kubenswrapper[4672]: I0930 12:22:23.699286 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:23Z","lastTransitionTime":"2025-09-30T12:22:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:23 crc kubenswrapper[4672]: I0930 12:22:23.802849 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:23 crc kubenswrapper[4672]: I0930 12:22:23.802898 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:23 crc kubenswrapper[4672]: I0930 12:22:23.802908 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:23 crc kubenswrapper[4672]: I0930 12:22:23.802929 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:23 crc kubenswrapper[4672]: I0930 12:22:23.802940 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:23Z","lastTransitionTime":"2025-09-30T12:22:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:22:23 crc kubenswrapper[4672]: I0930 12:22:23.906001 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:23 crc kubenswrapper[4672]: I0930 12:22:23.906455 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:23 crc kubenswrapper[4672]: I0930 12:22:23.906537 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:23 crc kubenswrapper[4672]: I0930 12:22:23.906611 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:23 crc kubenswrapper[4672]: I0930 12:22:23.906683 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:23Z","lastTransitionTime":"2025-09-30T12:22:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:24 crc kubenswrapper[4672]: I0930 12:22:24.009239 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:24 crc kubenswrapper[4672]: I0930 12:22:24.009317 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:24 crc kubenswrapper[4672]: I0930 12:22:24.009331 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:24 crc kubenswrapper[4672]: I0930 12:22:24.009352 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:24 crc kubenswrapper[4672]: I0930 12:22:24.009368 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:24Z","lastTransitionTime":"2025-09-30T12:22:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:24 crc kubenswrapper[4672]: I0930 12:22:24.111710 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:24 crc kubenswrapper[4672]: I0930 12:22:24.111772 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:24 crc kubenswrapper[4672]: I0930 12:22:24.111789 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:24 crc kubenswrapper[4672]: I0930 12:22:24.111813 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:24 crc kubenswrapper[4672]: I0930 12:22:24.111827 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:24Z","lastTransitionTime":"2025-09-30T12:22:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:22:24 crc kubenswrapper[4672]: I0930 12:22:24.214448 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:24 crc kubenswrapper[4672]: I0930 12:22:24.214864 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:24 crc kubenswrapper[4672]: I0930 12:22:24.214959 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:24 crc kubenswrapper[4672]: I0930 12:22:24.215060 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:24 crc kubenswrapper[4672]: I0930 12:22:24.215138 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:24Z","lastTransitionTime":"2025-09-30T12:22:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:24 crc kubenswrapper[4672]: I0930 12:22:24.317789 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:24 crc kubenswrapper[4672]: I0930 12:22:24.318099 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:24 crc kubenswrapper[4672]: I0930 12:22:24.318195 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:24 crc kubenswrapper[4672]: I0930 12:22:24.318320 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:24 crc kubenswrapper[4672]: I0930 12:22:24.318403 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:24Z","lastTransitionTime":"2025-09-30T12:22:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:24 crc kubenswrapper[4672]: I0930 12:22:24.416733 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 12:22:24 crc kubenswrapper[4672]: E0930 12:22:24.416971 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 12:22:24 crc kubenswrapper[4672]: I0930 12:22:24.417076 4672 scope.go:117] "RemoveContainer" containerID="a8d2b013e19f7ff8f3eb9c42338191b2da273ce85db8bcc6be8dd1f69e83c3be" Sep 30 12:22:24 crc kubenswrapper[4672]: I0930 12:22:24.421636 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:24 crc kubenswrapper[4672]: I0930 12:22:24.421706 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:24 crc kubenswrapper[4672]: I0930 12:22:24.421721 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:24 crc kubenswrapper[4672]: I0930 12:22:24.421745 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:24 crc kubenswrapper[4672]: I0930 12:22:24.421759 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:24Z","lastTransitionTime":"2025-09-30T12:22:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:24 crc kubenswrapper[4672]: I0930 12:22:24.525661 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:24 crc kubenswrapper[4672]: I0930 12:22:24.525717 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:24 crc kubenswrapper[4672]: I0930 12:22:24.525731 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:24 crc kubenswrapper[4672]: I0930 12:22:24.525748 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:24 crc kubenswrapper[4672]: I0930 12:22:24.525758 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:24Z","lastTransitionTime":"2025-09-30T12:22:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:22:24 crc kubenswrapper[4672]: I0930 12:22:24.628584 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:24 crc kubenswrapper[4672]: I0930 12:22:24.628626 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:24 crc kubenswrapper[4672]: I0930 12:22:24.628636 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:24 crc kubenswrapper[4672]: I0930 12:22:24.628651 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:24 crc kubenswrapper[4672]: I0930 12:22:24.628662 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:24Z","lastTransitionTime":"2025-09-30T12:22:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:24 crc kubenswrapper[4672]: I0930 12:22:24.704218 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Sep 30 12:22:24 crc kubenswrapper[4672]: I0930 12:22:24.706105 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"217947162b121381c06a210145feb0b9bd45d4ad595fcd83b1a566bb6448f0ca"} Sep 30 12:22:24 crc kubenswrapper[4672]: I0930 12:22:24.706486 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 30 12:22:24 crc kubenswrapper[4672]: I0930 12:22:24.708256 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nznsk_5da59bc9-84da-42f6-86e9-3399ecf31725/ovnkube-controller/0.log" Sep 30 12:22:24 crc kubenswrapper[4672]: I0930 12:22:24.711706 4672 generic.go:334] "Generic (PLEG): container finished" podID="5da59bc9-84da-42f6-86e9-3399ecf31725" containerID="fbee2ee7be1b8b91a21a721e325065d47e0e59bf02acc8a328f2adaa419d8f50" exitCode=1 Sep 30 12:22:24 crc kubenswrapper[4672]: I0930 12:22:24.711759 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nznsk" event={"ID":"5da59bc9-84da-42f6-86e9-3399ecf31725","Type":"ContainerDied","Data":"fbee2ee7be1b8b91a21a721e325065d47e0e59bf02acc8a328f2adaa419d8f50"} Sep 30 12:22:24 crc kubenswrapper[4672]: I0930 12:22:24.712979 4672 scope.go:117] "RemoveContainer" containerID="fbee2ee7be1b8b91a21a721e325065d47e0e59bf02acc8a328f2adaa419d8f50" Sep 30 12:22:24 crc kubenswrapper[4672]: I0930 12:22:24.726444 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4339dd7c-5112-40c5-9b58-050ce364a2b9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7db3193c2afadb91d3220e67720ff33bc959f9c50cec6f243144a6e2a2e57e04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5f54820ce67d6e8f397d068e1fadcb4577d6685406812c4e29fe0274612697d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1351a4e8380fea9edb834239053b57384a50add562afe59e67bbd7159bf1bf6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b9741c5edaae364733b400bebf21a4674ec206
194908ec6a942317235c62a14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://775be30891ac2947ef160e3fe75abad42b78fc39e35414f27756ce086fed4f7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27513f0766801d5155c6e5c376a2a52241030eca2171b3218710ca6ee006a0b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27513f0766801d5155c6e5c376a2a52241030eca2171b3218710ca6ee006a0b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:21:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:21:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0afdc82c21601c37c7d9b4897273ade1e8e35bd093d63df9a63e93a9c0c6228c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0afdc82c21601c37c7d9b4897273ade1e8e35bd093d63df9a63e93a9c0c6228c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:21:51Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://52fa5ee34bc8e9e381d4288d30fce5db76ed5bde6f199ec13fe443481ed4c8e8\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52fa5ee34bc8e9e381d4288d30fce5db76ed5bde6f199ec13fe443481ed4c8e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:21:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:21:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:21:49Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:24Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:24 crc kubenswrapper[4672]: I0930 12:22:24.736448 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:24 crc kubenswrapper[4672]: I0930 12:22:24.736632 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:24 crc kubenswrapper[4672]: I0930 12:22:24.736757 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:24 crc kubenswrapper[4672]: I0930 12:22:24.736848 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:24 crc kubenswrapper[4672]: I0930 12:22:24.736934 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:24Z","lastTransitionTime":"2025-09-30T12:22:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:22:24 crc kubenswrapper[4672]: I0930 12:22:24.739415 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:24Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:24 crc kubenswrapper[4672]: I0930 12:22:24.750303 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bh5lq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"96949d92-2365-41eb-8f62-c264c8328c02\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6caf59dcbea4e0beb0879de377fe92cfb0b3ff55668aee3f10822adacf97fd8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkb7v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bh5lq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:24Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:24 crc kubenswrapper[4672]: I0930 12:22:24.762554 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ece716e9efed328b509480f11b1a6aa253fe23776bf45f49d4110cf3b4acbab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:24Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:24 crc kubenswrapper[4672]: I0930 12:22:24.775528 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"763dda13-8721-4cce-8b21-aa1f1b190978\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab6412d7fdf1976c75ccfc72d05f00e2997d72f3971c73cd0887e038d1cbc059\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5575589d0517d5a1c285c3e9e566fb5cd9b9591a1b311a4bb80e595e44044c33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b656dc1ad8af099828eaaeb654e6d52dbbd057023dd07583a6114316670aa334\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5082c3fa78648584ab9a29b4ab239ce0d52411666ca3fa598e9c6745b8ea067e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:21:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:24Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:24 crc kubenswrapper[4672]: I0930 12:22:24.790077 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2761ae4b8b1c7e1d00ee269219ff0ab04aedce9e1bddc6e0a029c496ee89c0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:24Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:24 crc kubenswrapper[4672]: I0930 12:22:24.802590 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:24Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:24 crc kubenswrapper[4672]: I0930 12:22:24.815009 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://534671434fa48a07e7836049331af60307cf651f4ddde83856ad98e006b32040\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfb0bceb3e99196adb442abedd5215938781b908c2aa92178c9db76672664620\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:24Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:24 crc kubenswrapper[4672]: I0930 12:22:24.826629 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-485qk"] Sep 30 12:22:24 crc kubenswrapper[4672]: I0930 12:22:24.827321 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-485qk" Sep 30 12:22:24 crc kubenswrapper[4672]: I0930 12:22:24.829660 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Sep 30 12:22:24 crc kubenswrapper[4672]: I0930 12:22:24.829713 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Sep 30 12:22:24 crc kubenswrapper[4672]: I0930 12:22:24.836451 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nznsk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5da59bc9-84da-42f6-86e9-3399ecf31725\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8e8be814fc5bfb8db3632cda1fcf749a02b642c8e7ff479e59ce5d9f987a756\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2131429928dbbd60e2d17b7ada4689ff5d161d1df921d74a8b9f71df9e11899\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:14Z\\\"}},\\\"volumeMounts
\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://114d67fea982a6c6475e1925f670ba7dae14ace593022f77ef7c6441dde49687\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b73fba4d62613a1c27b69a74ebcca91fa6ea43a5267efddf783bd0d243d711f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e10db0132d159755257aef8c0cee40e5de4a0d3727ce59883e1ff90650e4f35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{
\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb7a32827707f8654d36f56fef7e81aff6b80550ef51f7c090a37f53a0e53406\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fbee2ee7be1b8b91a21a721e325065d47e0e59bf02acc8a328f2adaa419d8f50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var
/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7058516ce3ff1d1d2929107a66d739fccb24b4f5589a11510ab42fe728a33196\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4aa532f2ebe19e36977596bd41e8b116c84bcd7d5cd3470986e3192d4d62dea7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4aa532f2ebe19e36977596bd41e8b116c84bcd7d5cd3470986e3192d4d62dea7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nznsk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-09-30T12:22:24Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:24 crc kubenswrapper[4672]: I0930 12:22:24.838853 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:24 crc kubenswrapper[4672]: I0930 12:22:24.838895 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:24 crc kubenswrapper[4672]: I0930 12:22:24.838906 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:24 crc kubenswrapper[4672]: I0930 12:22:24.838920 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:24 crc kubenswrapper[4672]: I0930 12:22:24.838932 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:24Z","lastTransitionTime":"2025-09-30T12:22:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:24 crc kubenswrapper[4672]: I0930 12:22:24.854317 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e650bb6-f3f0-48f4-b169-1e56a907e88c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://824f50fb36d4c4aee43cc92af10bc43c441f2cee0ee736431dfdc52d254b819a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2786191730ceba9bbcf35f5355098bc52b09c37d20b923f7c6c65fd1b584cea0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e5b3d468173222d5ae851529dbc318b7d727ec4af468e4094ab08b4f29bf100\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://217947162b121381c06a210145feb0b9bd45d4ad595fcd83b1a566bb6448f0ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8d2b013e19f7ff8f3eb9c42338191b2da273ce85db8bcc6be8dd1f69e83c3be\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T12:22:10Z\\\",\\\"message\\\":\\\"falling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0930 12:22:05.219065 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 12:22:05.219861 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-72394925/tls.crt::/tmp/serving-cert-72394925/tls.key\\\\\\\"\\\\nI0930 12:22:10.657058 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 12:22:10.659952 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 12:22:10.659972 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 12:22:10.659997 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 12:22:10.660003 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 12:22:10.667742 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0930 12:22:10.667775 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 12:22:10.667797 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 12:22:10.667801 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 12:22:10.667808 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 12:22:10.667812 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 12:22:10.667814 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 12:22:10.667817 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0930 12:22:10.670654 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T12:21:54Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c3f5020d6f8fba5f79d7659bedba02479794acb1f5f3d219802966c781fd10d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:52Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://089787496c46eb514e279677b1853010b21eac240fb059e43866714d2a6920e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://089787496c46eb514e279677b1853010b21eac240fb059e43866714d2a6920e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:21:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:21:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:21:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:24Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:24 crc kubenswrapper[4672]: I0930 12:22:24.868279 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:24Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:24 crc kubenswrapper[4672]: I0930 12:22:24.870796 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/db981c53-85b2-4b3c-b025-da893771308f-env-overrides\") pod \"ovnkube-control-plane-749d76644c-485qk\" (UID: \"db981c53-85b2-4b3c-b025-da893771308f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-485qk" Sep 30 12:22:24 crc kubenswrapper[4672]: I0930 12:22:24.870845 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/db981c53-85b2-4b3c-b025-da893771308f-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-485qk\" (UID: \"db981c53-85b2-4b3c-b025-da893771308f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-485qk" Sep 30 12:22:24 crc kubenswrapper[4672]: I0930 12:22:24.870893 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/db981c53-85b2-4b3c-b025-da893771308f-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-485qk\" (UID: \"db981c53-85b2-4b3c-b025-da893771308f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-485qk" Sep 30 12:22:24 crc kubenswrapper[4672]: I0930 12:22:24.870910 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2t8xk\" (UniqueName: \"kubernetes.io/projected/db981c53-85b2-4b3c-b025-da893771308f-kube-api-access-2t8xk\") pod \"ovnkube-control-plane-749d76644c-485qk\" (UID: \"db981c53-85b2-4b3c-b025-da893771308f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-485qk" Sep 30 12:22:24 crc kubenswrapper[4672]: I0930 12:22:24.885371 4672 
status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"95794952-d817-48f2-8956-f7a310f8d1d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a5a1de465a4e5deca994f640e87574ebd406be2d7a92c0e54ed8bece4a8e6d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnk64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba615a53d8f50a3891080fd06384bb6e00eca07025bb60fdf1764d9c1d4a5f76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnk64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dpqrd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-09-30T12:22:24Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:24 crc kubenswrapper[4672]: I0930 12:22:24.900180 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8q82q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6806ff3c-ab3a-402e-b1c5-cc37c0810a65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://687d353cc41530a0cbee2c4a782b6ae10e9ee58aae0bfe183537c96851611e2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pn7q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":
\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8q82q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:24Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:24 crc kubenswrapper[4672]: I0930 12:22:24.916977 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-89fj9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72003fad-a0fd-4493-9f3b-6efdab22d14c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd26dcf24863265c55a06cfc788e84363473a6ca5f1b835fa5dbb683ebb82d68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31087335de8b3b895d49c4677350e9ab6cfa0651188653dab05e7eafdab3d05f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31087335de8b3b895d49c4677350e9ab6cfa0651188653dab05e7eafdab3d05f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"
name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a607f773b81f6bdde800c2a36627d7f84602e6c3c777f6531391668dd7c08e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2a607f773b81f6bdde800c2a36627d7f84602e6c3c777f6531391668dd7c08e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:22:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70fc6887f3dde42017d03aa3f247ca4eda56f93c3ec54a95ceabcc1c1b4651d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70fc6887f3dde42017d03aa3f247ca4eda56f93c3ec54a95ceabcc1c1b4651d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:22:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://95e6805719bbdb9859b827860fb87ed979a4397061f76b433189f463c1e7cc39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",
\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://95e6805719bbdb9859b827860fb87ed979a4397061f76b433189f463c1e7cc39\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:22:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85068f88c57bccc42cc7234d13298a30c574379f4a212059a9924553fce8620e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85068f88c57bccc42cc7234d13298a30c574379f4a212059a9924553fce8620e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:22:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4355ef186193e8e0d9344faf8975065f9cd4576e7f6c79054670fda9acf3646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4355ef186193e8e0d9344faf8975065f9cd4576e7f6c79054670fda9acf3646\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:22:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-89fj9\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:24Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:24 crc kubenswrapper[4672]: I0930 12:22:24.934422 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ztjlj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"25dc0f4c-76a8-49ff-bf68-c0718cfc3e62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5a3ba5101b35ce4981484637222a0c074b67e3998c273dc763ddbc6ca0c0c86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6ps67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ztjlj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:24Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:24 crc kubenswrapper[4672]: I0930 12:22:24.941629 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:24 crc kubenswrapper[4672]: I0930 12:22:24.941673 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:24 crc kubenswrapper[4672]: I0930 12:22:24.941688 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Sep 30 12:22:24 crc kubenswrapper[4672]: I0930 12:22:24.941707 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:24 crc kubenswrapper[4672]: I0930 12:22:24.941717 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:24Z","lastTransitionTime":"2025-09-30T12:22:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:24 crc kubenswrapper[4672]: I0930 12:22:24.949104 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-485qk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db981c53-85b2-4b3c-b025-da893771308f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2t8xk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2t8xk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-485qk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:24Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:24 crc kubenswrapper[4672]: I0930 12:22:24.964491 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e650bb6-f3f0-48f4-b169-1e56a907e88c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://824f50fb36d4c4aee43cc92af10bc43c441f2cee0ee736431dfdc52d254b819a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2786191730ceba9bbcf35f5355098bc52b09c37d20b923f7c6c65fd1b584cea0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e5b3d468173222d5ae851529dbc318b7d727ec4af468e4094ab08b4f29bf100\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://217947162b121381c06a210145feb0b9bd45d4ad595fcd83b1a566bb6448f0ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8d2b013e19f7ff8f3eb9c42338191b2da273ce85db8bcc6be8dd1f69e83c3be\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T12:22:10Z\\\",\\\"message\\\":\\\"falling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0930 12:22:05.219065 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 12:22:05.219861 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-72394925/tls.crt::/tmp/serving-cert-72394925/tls.key\\\\\\\"\\\\nI0930 12:22:10.657058 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 12:22:10.659952 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 12:22:10.659972 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 12:22:10.659997 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 12:22:10.660003 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 12:22:10.667742 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0930 12:22:10.667775 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 12:22:10.667797 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 12:22:10.667801 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 12:22:10.667808 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 12:22:10.667812 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 12:22:10.667814 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 12:22:10.667817 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0930 12:22:10.670654 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T12:21:54Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c3f5020d6f8fba5f79d7659bedba02479794acb1f5f3d219802966c781fd10d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:52Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://089787496c46eb514e279677b1853010b21eac240fb059e43866714d2a6920e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://089787496c46eb514e279677b1853010b21eac240fb059e43866714d2a6920e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:21:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:21:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:21:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:24Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:24 crc kubenswrapper[4672]: I0930 12:22:24.971412 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/db981c53-85b2-4b3c-b025-da893771308f-env-overrides\") pod \"ovnkube-control-plane-749d76644c-485qk\" (UID: \"db981c53-85b2-4b3c-b025-da893771308f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-485qk" Sep 30 12:22:24 crc kubenswrapper[4672]: I0930 12:22:24.971466 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/db981c53-85b2-4b3c-b025-da893771308f-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-485qk\" (UID: \"db981c53-85b2-4b3c-b025-da893771308f\") " 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-485qk" Sep 30 12:22:24 crc kubenswrapper[4672]: I0930 12:22:24.971521 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/db981c53-85b2-4b3c-b025-da893771308f-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-485qk\" (UID: \"db981c53-85b2-4b3c-b025-da893771308f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-485qk" Sep 30 12:22:24 crc kubenswrapper[4672]: I0930 12:22:24.971539 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2t8xk\" (UniqueName: \"kubernetes.io/projected/db981c53-85b2-4b3c-b025-da893771308f-kube-api-access-2t8xk\") pod \"ovnkube-control-plane-749d76644c-485qk\" (UID: \"db981c53-85b2-4b3c-b025-da893771308f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-485qk" Sep 30 12:22:24 crc kubenswrapper[4672]: I0930 12:22:24.972219 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/db981c53-85b2-4b3c-b025-da893771308f-env-overrides\") pod \"ovnkube-control-plane-749d76644c-485qk\" (UID: \"db981c53-85b2-4b3c-b025-da893771308f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-485qk" Sep 30 12:22:24 crc kubenswrapper[4672]: I0930 12:22:24.972670 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/db981c53-85b2-4b3c-b025-da893771308f-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-485qk\" (UID: \"db981c53-85b2-4b3c-b025-da893771308f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-485qk" Sep 30 12:22:24 crc kubenswrapper[4672]: I0930 12:22:24.980206 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:24Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:24 crc kubenswrapper[4672]: I0930 12:22:24.986245 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/db981c53-85b2-4b3c-b025-da893771308f-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-485qk\" (UID: \"db981c53-85b2-4b3c-b025-da893771308f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-485qk" Sep 30 12:22:24 crc kubenswrapper[4672]: I0930 12:22:24.994577 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"95794952-d817-48f2-8956-f7a310f8d1d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a5a1de465a4e5deca994f640e87574ebd406be2d7a92c0e54ed8bece4a8e6d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnk64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://ba615a53d8f50a3891080fd06384bb6e00eca07025bb60fdf1764d9c1d4a5f76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnk64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dpqrd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:24Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:24 crc kubenswrapper[4672]: I0930 12:22:24.995086 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2t8xk\" (UniqueName: \"kubernetes.io/projected/db981c53-85b2-4b3c-b025-da893771308f-kube-api-access-2t8xk\") pod \"ovnkube-control-plane-749d76644c-485qk\" (UID: \"db981c53-85b2-4b3c-b025-da893771308f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-485qk" Sep 30 12:22:25 crc kubenswrapper[4672]: I0930 12:22:25.008202 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8q82q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6806ff3c-ab3a-402e-b1c5-cc37c0810a65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://687d353cc41530a0cbee2c4a782b6ae10e9ee58aae0bfe183537c96851611e2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pn7q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8q82q\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:25Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:25 crc kubenswrapper[4672]: I0930 12:22:25.032580 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-89fj9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72003fad-a0fd-4493-9f3b-6efdab22d14c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd26dcf24863265c55a06cfc788e84363473a6ca5f1b835fa5dbb683ebb82d68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31087335de8b3b895d49c4677350e9ab6cfa0651188653dab05e7eafdab3d05f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31087335de8b3b895d49c4677350e9ab6cfa0651188653dab05e7eafdab3d05f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a607f773b81f6bdde800c2a36627d7f84602e6c3c777f6531391668dd7c08e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2a607f773b81f6bdde800c2a36627d7f84602e6c3c777f6531391668dd7c08e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:22:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70fc6887f3dde42017d03aa3f247ca4eda56f93c3ec54a95ceabcc1c1b4651d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70fc6887f3dde42017d03aa3f247ca4eda56f93c3ec54a95ceabcc1c1b4651d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:22:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://95e6805719bbdb9859b827860fb87ed979a4397061f76b433189f463c1e7cc39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://95e6805719bbdb9859b827860fb87ed979a4397061f76b433189f463c1e7cc39\\\",\\\"exitCode\\\":0,\\\
"finishedAt\\\":\\\"2025-09-30T12:22:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85068f88c57bccc42cc7234d13298a30c574379f4a212059a9924553fce8620e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85068f88c57bccc42cc7234d13298a30c574379f4a212059a9924553fce8620e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:22:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4355ef186193e8e0d9344faf8975065f9cd4576e7f6c79054670fda9acf3646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4355ef186193e8e0d9344faf8975065f9cd4576e7f6c79054670fda9acf3646\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:22:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-89fj9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-30T12:22:25Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:25 crc kubenswrapper[4672]: I0930 12:22:25.042438 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ztjlj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"25dc0f4c-76a8-49ff-bf68-c0718cfc3e62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5a3ba5101b35ce4981484637222a0c074b67e3998c273dc763ddbc6ca0c0c86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6ps67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ztjlj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:25Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:25 crc kubenswrapper[4672]: I0930 12:22:25.044456 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:25 crc kubenswrapper[4672]: I0930 12:22:25.044506 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:25 crc kubenswrapper[4672]: I0930 12:22:25.044521 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:25 crc kubenswrapper[4672]: I0930 12:22:25.044538 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:25 crc kubenswrapper[4672]: I0930 
12:22:25.044550 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:25Z","lastTransitionTime":"2025-09-30T12:22:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:25 crc kubenswrapper[4672]: I0930 12:22:25.061452 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4339dd7c-5112-40c5-9b58-050ce364a2b9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7db3193c2afadb91d3220e67720ff33bc959f9c50cec6f243144a6e2a2e57e04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5f54820ce67d6e8f397d068e1fadcb4577d6685406812c4e29fe0274612697d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1351a4e8380fea9edb834239053b57384a50add562afe59e67bbd7159bf1bf6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f4
2928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b9741c5edaae364733b400bebf21a4674ec206194908ec6a942317235c62a14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://775be30891ac2947ef160e3fe75abad42b78fc39e35414f27756ce086fed4f7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27513f0766801d5155c6e5c376a2a52241030eca2171b3218710ca6ee006a0b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27513f0766801d5155c6e5c376a2a52241030eca2171b3218710ca6ee006a0b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:21:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:21:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0afdc82c21601c37c7d9b4897273ade1e8e35bd093d63df9a63e93a9c0c6228c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0afdc82c21601c37c7d9b4897273ade1e8e35bd093d63df9a63e93a9c0c6228c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:21:51Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://52fa5ee34bc8e9e381d4288d30fce5db76ed5bde6f199ec13fe443481ed4c8e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52fa5ee34bc8e9e381d4288d30fce5db76ed5bde6f199ec13fe443481ed4c8e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:21:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:21:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:21:49Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:25Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:25 crc kubenswrapper[4672]: I0930 12:22:25.074829 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The 
container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:25Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:25 crc kubenswrapper[4672]: I0930 12:22:25.084071 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bh5lq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96949d92-2365-41eb-8f62-c264c8328c02\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6caf59dcbea4e0beb0879de377fe92cfb0b3ff55668aee3f10822adacf97fd8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkb7v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bh5lq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-30T12:22:25Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:25 crc kubenswrapper[4672]: I0930 12:22:25.095351 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ece716e9efed328b509480f11b1a6aa253fe23776bf45f49d4110cf3b4acbab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:25Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:25 crc kubenswrapper[4672]: I0930 12:22:25.108682 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"763dda13-8721-4cce-8b21-aa1f1b190978\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab6412d7fdf1976c75ccfc72d05f00e2997d72f3971c73cd0887e038d1cbc059\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5575589d0517d5a1c285c3e9e566fb5cd9b9591a1b311a4bb80e595e44044c33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b656dc1ad8af099828eaaeb654e6d52dbbd057023dd07583a6114316670aa334\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5082c3fa78648584ab9a29b4ab239ce0d52411666ca3fa598e9c6745b8ea067e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:21:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:25Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:25 crc kubenswrapper[4672]: I0930 12:22:25.121581 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2761ae4b8b1c7e1d00ee269219ff0ab04aedce9e1bddc6e0a029c496ee89c0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:25Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:25 crc kubenswrapper[4672]: I0930 12:22:25.134225 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:25Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:25 crc kubenswrapper[4672]: I0930 12:22:25.145400 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://534671434fa48a07e7836049331af60307cf651f4ddde83856ad98e006b32040\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfb0bceb3e99196adb442abedd5215938781b908c2aa92178c9db76672664620\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:25Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:25 crc kubenswrapper[4672]: I0930 12:22:25.146927 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:25 crc kubenswrapper[4672]: I0930 12:22:25.146961 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:25 crc kubenswrapper[4672]: I0930 12:22:25.146969 4672 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Sep 30 12:22:25 crc kubenswrapper[4672]: I0930 12:22:25.146984 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:25 crc kubenswrapper[4672]: I0930 12:22:25.146993 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:25Z","lastTransitionTime":"2025-09-30T12:22:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:25 crc kubenswrapper[4672]: I0930 12:22:25.150708 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-485qk" Sep 30 12:22:25 crc kubenswrapper[4672]: I0930 12:22:25.168740 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nznsk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5da59bc9-84da-42f6-86e9-3399ecf31725\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8e8be814fc5bfb8db3632cda1fcf749a02b642c8e7ff479e59ce5d9f987a756\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2131429928dbbd60e2d17b7ada4689ff5d161d1df921d74a8b9f71df9e11899\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://114d67fea982a6c6475e1925f670ba7dae14ace593022f77ef7c6441dde49687\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b73fba4d62613a1c27b69a74ebcca91fa6ea43a5267efddf783bd0d243d711f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e10db0132d159755257aef8c0cee40e5de4a0d3727ce59883e1ff90650e4f35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb7a32827707f8654d36f56fef7e81aff6b80550ef51f7c090a37f53a0e53406\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fbee2ee7be1b8b91a21a721e325065d47e0e59bf
02acc8a328f2adaa419d8f50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fbee2ee7be1b8b91a21a721e325065d47e0e59bf02acc8a328f2adaa419d8f50\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T12:22:24Z\\\",\\\"message\\\":\\\"930 12:22:24.456648 5977 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 12:22:24.456996 5977 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0930 12:22:24.457536 5977 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 12:22:24.457652 5977 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 12:22:24.458388 5977 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0930 12:22:24.458405 5977 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0930 12:22:24.458443 5977 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0930 12:22:24.458462 5977 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0930 12:22:24.458485 5977 factory.go:656] Stopping watch factory\\\\nI0930 12:22:24.458505 5977 ovnkube.go:599] Stopped ovnkube\\\\nI0930 12:22:24.458533 5977 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0930 12:22:24.458549 5977 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0930 12:22:24.458556 5977 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0930 
12:22:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7058516ce3ff1d1d2929107a66d739fccb24b4f5589a11510ab42fe728a33196\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4aa532f2ebe19e36977596bd41e8b116c84bcd7d5cd3470986e3192d4d62dea7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d
1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4aa532f2ebe19e36977596bd41e8b116c84bcd7d5cd3470986e3192d4d62dea7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nznsk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:25Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:25 crc kubenswrapper[4672]: I0930 12:22:25.249225 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:25 crc kubenswrapper[4672]: I0930 12:22:25.249279 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:25 crc kubenswrapper[4672]: I0930 12:22:25.249289 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:25 crc kubenswrapper[4672]: I0930 12:22:25.249303 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:25 crc kubenswrapper[4672]: I0930 12:22:25.249314 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:25Z","lastTransitionTime":"2025-09-30T12:22:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:22:25 crc kubenswrapper[4672]: I0930 12:22:25.351595 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:25 crc kubenswrapper[4672]: I0930 12:22:25.351637 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:25 crc kubenswrapper[4672]: I0930 12:22:25.351650 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:25 crc kubenswrapper[4672]: I0930 12:22:25.351667 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:25 crc kubenswrapper[4672]: I0930 12:22:25.351681 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:25Z","lastTransitionTime":"2025-09-30T12:22:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:25 crc kubenswrapper[4672]: I0930 12:22:25.416947 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 12:22:25 crc kubenswrapper[4672]: I0930 12:22:25.417086 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 12:22:25 crc kubenswrapper[4672]: E0930 12:22:25.417246 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 12:22:25 crc kubenswrapper[4672]: E0930 12:22:25.417426 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 12:22:25 crc kubenswrapper[4672]: I0930 12:22:25.454676 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:25 crc kubenswrapper[4672]: I0930 12:22:25.454706 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:25 crc kubenswrapper[4672]: I0930 12:22:25.454716 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:25 crc kubenswrapper[4672]: I0930 12:22:25.454729 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:25 crc kubenswrapper[4672]: I0930 12:22:25.454738 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:25Z","lastTransitionTime":"2025-09-30T12:22:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:25 crc kubenswrapper[4672]: I0930 12:22:25.557829 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:25 crc kubenswrapper[4672]: I0930 12:22:25.557884 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:25 crc kubenswrapper[4672]: I0930 12:22:25.557895 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:25 crc kubenswrapper[4672]: I0930 12:22:25.557914 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:25 crc kubenswrapper[4672]: I0930 12:22:25.557926 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:25Z","lastTransitionTime":"2025-09-30T12:22:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:22:25 crc kubenswrapper[4672]: I0930 12:22:25.660230 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:25 crc kubenswrapper[4672]: I0930 12:22:25.660311 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:25 crc kubenswrapper[4672]: I0930 12:22:25.660321 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:25 crc kubenswrapper[4672]: I0930 12:22:25.660344 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:25 crc kubenswrapper[4672]: I0930 12:22:25.660355 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:25Z","lastTransitionTime":"2025-09-30T12:22:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:25 crc kubenswrapper[4672]: I0930 12:22:25.721580 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nznsk_5da59bc9-84da-42f6-86e9-3399ecf31725/ovnkube-controller/0.log" Sep 30 12:22:25 crc kubenswrapper[4672]: I0930 12:22:25.733375 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nznsk" event={"ID":"5da59bc9-84da-42f6-86e9-3399ecf31725","Type":"ContainerStarted","Data":"43ba42d9f8b7b2e60651ffc15c2291e5c9274fd32383509b048bfb2ea381d29b"} Sep 30 12:22:25 crc kubenswrapper[4672]: I0930 12:22:25.733597 4672 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 30 12:22:25 crc kubenswrapper[4672]: I0930 12:22:25.735766 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-485qk" event={"ID":"db981c53-85b2-4b3c-b025-da893771308f","Type":"ContainerStarted","Data":"503616f93ea6b36f5e3c95c88471c6d06c281a4d0be102b6cb0edf6a1513f78c"} Sep 30 12:22:25 crc kubenswrapper[4672]: I0930 12:22:25.735813 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-485qk" event={"ID":"db981c53-85b2-4b3c-b025-da893771308f","Type":"ContainerStarted","Data":"57231120d83ed4e95272ab2b0bac2146b066dde2b5ca3afe469a3dc36d6187ad"} Sep 30 12:22:25 crc kubenswrapper[4672]: I0930 12:22:25.735829 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-485qk" event={"ID":"db981c53-85b2-4b3c-b025-da893771308f","Type":"ContainerStarted","Data":"37040f1d4b85f95d0f51a1fb376ea697546661ada2054e6aef04f221230a8798"} Sep 30 12:22:25 crc kubenswrapper[4672]: I0930 12:22:25.763385 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:25 crc kubenswrapper[4672]: I0930 12:22:25.763444 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:25 crc kubenswrapper[4672]: I0930 12:22:25.763458 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:25 crc kubenswrapper[4672]: I0930 12:22:25.763478 4672 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:25 crc kubenswrapper[4672]: I0930 12:22:25.763489 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:25Z","lastTransitionTime":"2025-09-30T12:22:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:25 crc kubenswrapper[4672]: I0930 12:22:25.764348 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4339dd7c-5112-40c5-9b58-050ce364a2b9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7db3193c2afadb91d3220e67720ff33bc959f9c50cec6f243144a6e2a2e57e04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5f54820ce67d6e8f397d068e1fadcb4577d6685406812c4e29fe0274612697d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1351a4e8380fea9edb834239053b57384a50add562afe59e67bbd715
9bf1bf6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b9741c5edaae364733b400bebf21a4674ec206194908ec6a942317235c62a14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://775be30891ac2947ef160e3fe75abad42b78fc39e35414f27756ce086fed4f7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27513f0766801d5155c6e5c376a2a52241030eca2171b3218710ca6ee006a0b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27513f0766801d5155c6e5c376a2a52241030eca2171b3218710ca6ee006a0b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:21:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:21:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0afdc82c21601c37c7d9b4897273ade1e8e35bd093d63df9a63e93a9c0c6228c\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0afdc82c21601c37c7d9b4897273ade1e8e35bd093d63df9a63e93a9c0c6228c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:21:51Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://52fa5ee34bc8e9e381d4288d30fce5db76ed5bde6f199ec13fe443481ed4c8e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52fa5ee34bc8e9e381d4288d30fce5db76ed5bde6f199ec13fe443481ed4c8e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:21:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:21:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:21:49Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:25Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:25 crc kubenswrapper[4672]: I0930 12:22:25.794719 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:25Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:25 crc kubenswrapper[4672]: I0930 12:22:25.809107 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bh5lq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96949d92-2365-41eb-8f62-c264c8328c02\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6caf59dcbea4e0beb0879de377fe92cfb0b3ff55668aee3f10822adacf97fd8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkb7v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\
\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bh5lq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:25Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:25 crc kubenswrapper[4672]: I0930 12:22:25.827193 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ece716e9efed328b509480f11b1a6aa253fe23776bf45f49d4110cf3b4acbab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:25Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:25 crc kubenswrapper[4672]: I0930 12:22:25.849457 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"763dda13-8721-4cce-8b21-aa1f1b190978\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab6412d7fdf1976c75ccfc72d05f00e2997d72f3971c73cd0887e038d1cbc059\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5575589d0517d5a1c285c3e9e566fb5cd9b9591a1b311a4bb80e595e44044c33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b656dc1ad8af099828eaaeb654e6d52dbbd057023dd07583a6114316670aa334\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5082c3fa78648584ab9a29b4ab239ce0d52411666ca3fa598e9c6745b8ea067e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:21:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:25Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:25 crc kubenswrapper[4672]: I0930 12:22:25.865873 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:25 crc kubenswrapper[4672]: I0930 12:22:25.865929 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:25 crc kubenswrapper[4672]: I0930 12:22:25.865941 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:25 crc kubenswrapper[4672]: I0930 12:22:25.865963 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:25 crc kubenswrapper[4672]: I0930 12:22:25.865978 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:25Z","lastTransitionTime":"2025-09-30T12:22:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:22:25 crc kubenswrapper[4672]: I0930 12:22:25.868158 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2761ae4b8b1c7e1d00ee269219ff0ab04aedce9e1bddc6e0a029c496ee89c0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:25Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:25 crc kubenswrapper[4672]: I0930 12:22:25.883842 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:25Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:25 crc kubenswrapper[4672]: I0930 12:22:25.898425 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://534671434fa48a07e7836049331af60307cf651f4ddde83856ad98e006b32040\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfb0bceb3e99196adb442abedd5215938781b908c2aa92178c9db76672664620\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:25Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:25 crc kubenswrapper[4672]: I0930 12:22:25.912693 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-n7wwp"] Sep 30 12:22:25 crc kubenswrapper[4672]: I0930 12:22:25.913505 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-n7wwp" Sep 30 12:22:25 crc kubenswrapper[4672]: E0930 12:22:25.913637 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n7wwp" podUID="42618cd5-d9f9-45ba-8081-660ca47bebf4" Sep 30 12:22:25 crc kubenswrapper[4672]: I0930 12:22:25.918378 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nznsk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5da59bc9-84da-42f6-86e9-3399ecf31725\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8e8be814fc5bfb8db3632cda1fcf749a02b642c8e7ff479e59ce5d9f987a756\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2131429928dbbd60e2d17b7ada4689ff5d161d1df921d74a8b9f71df9e11899\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://114d67fea982a6c6475e1925f670ba7dae14ace593022f77ef7c6441dde49687\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b73fba4d62613a1c27b69a74ebcca91fa6ea43a5267efddf783bd0d243d711f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e10db0132d159755257aef8c0cee40e5de4a0d3727ce59883e1ff90650e4f35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb7a32827707f8654d36f56fef7e81aff6b80550ef51f7c090a37f53a0e53406\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43ba42d9f8b7b2e60651ffc15c2291e5c9274fd3
2383509b048bfb2ea381d29b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fbee2ee7be1b8b91a21a721e325065d47e0e59bf02acc8a328f2adaa419d8f50\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T12:22:24Z\\\",\\\"message\\\":\\\"930 12:22:24.456648 5977 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 12:22:24.456996 5977 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0930 12:22:24.457536 5977 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 12:22:24.457652 5977 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 12:22:24.458388 5977 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0930 12:22:24.458405 5977 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0930 12:22:24.458443 5977 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0930 12:22:24.458462 5977 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0930 12:22:24.458485 5977 factory.go:656] Stopping watch factory\\\\nI0930 12:22:24.458505 5977 ovnkube.go:599] Stopped ovnkube\\\\nI0930 12:22:24.458533 5977 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0930 12:22:24.458549 5977 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0930 12:22:24.458556 5977 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0930 
12:22:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:19Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7058516ce3ff1d1d2929107a66d739fccb24b4f5589a11510ab42fe728a33196\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\
"containerID\\\":\\\"cri-o://4aa532f2ebe19e36977596bd41e8b116c84bcd7d5cd3470986e3192d4d62dea7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4aa532f2ebe19e36977596bd41e8b116c84bcd7d5cd3470986e3192d4d62dea7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nznsk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:25Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:25 crc kubenswrapper[4672]: I0930 12:22:25.934246 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e650bb6-f3f0-48f4-b169-1e56a907e88c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://824f50fb36d4c4aee43cc92af10bc43c441f2cee0ee736431dfdc52d254b819a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2786191730ceba9bbcf35f5355098bc52b09c37d20b923f7c6c65fd1b584cea0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e5b3d468173222d5ae851529dbc318b7d727ec4af468e4094ab08b4f29bf100\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://217947162b121381c06a210145feb0b9bd45d4ad595fcd83b1a566bb6448f0ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8d2b013e19f7ff8f3eb9c42338191b2da273ce85db8bcc6be8dd1f69e83c3be\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T12:22:10Z\\\",\\\"message\\\":\\\"falling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0930 12:22:05.219065 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 12:22:05.219861 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-72394925/tls.crt::/tmp/serving-cert-72394925/tls.key\\\\\\\"\\\\nI0930 12:22:10.657058 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 12:22:10.659952 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 12:22:10.659972 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 12:22:10.659997 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 12:22:10.660003 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 12:22:10.667742 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0930 12:22:10.667775 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 12:22:10.667797 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 12:22:10.667801 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 12:22:10.667808 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 12:22:10.667812 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 12:22:10.667814 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 12:22:10.667817 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0930 12:22:10.670654 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T12:21:54Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c3f5020d6f8fba5f79d7659bedba02479794acb1f5f3d219802966c781fd10d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:52Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://089787496c46eb514e279677b1853010b21eac240fb059e43866714d2a6920e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://089787496c46eb514e279677b1853010b21eac240fb059e43866714d2a6920e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:21:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:21:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:21:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:25Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:25 crc kubenswrapper[4672]: I0930 12:22:25.948683 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:25Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:25 crc kubenswrapper[4672]: I0930 12:22:25.965964 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"95794952-d817-48f2-8956-f7a310f8d1d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a5a1de465a4e5deca994f640e87574ebd406be2d7a92c0e54ed8bece4a8e6d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnk64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba615a53d8f50a3891080fd06384bb6e00eca07025bb60fdf1764d9c1d4a5f76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnk64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dpqrd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:25Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:25 crc kubenswrapper[4672]: I0930 12:22:25.968371 4672 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:25 crc kubenswrapper[4672]: I0930 12:22:25.968424 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:25 crc kubenswrapper[4672]: I0930 12:22:25.968435 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:25 crc kubenswrapper[4672]: I0930 12:22:25.968452 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:25 crc kubenswrapper[4672]: I0930 12:22:25.968465 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:25Z","lastTransitionTime":"2025-09-30T12:22:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:25 crc kubenswrapper[4672]: I0930 12:22:25.980485 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/42618cd5-d9f9-45ba-8081-660ca47bebf4-metrics-certs\") pod \"network-metrics-daemon-n7wwp\" (UID: \"42618cd5-d9f9-45ba-8081-660ca47bebf4\") " pod="openshift-multus/network-metrics-daemon-n7wwp" Sep 30 12:22:25 crc kubenswrapper[4672]: I0930 12:22:25.980556 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-78cfr\" (UniqueName: \"kubernetes.io/projected/42618cd5-d9f9-45ba-8081-660ca47bebf4-kube-api-access-78cfr\") pod \"network-metrics-daemon-n7wwp\" (UID: \"42618cd5-d9f9-45ba-8081-660ca47bebf4\") " pod="openshift-multus/network-metrics-daemon-n7wwp" Sep 30 12:22:25 crc kubenswrapper[4672]: I0930 12:22:25.982549 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8q82q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6806ff3c-ab3a-402e-b1c5-cc37c0810a65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://687d353cc41530a0cbee2c4a782b6ae10e9ee58aae0bfe183537c96851611e2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pn7q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8q82q\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:25Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:26 crc kubenswrapper[4672]: I0930 12:22:26.001286 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-89fj9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72003fad-a0fd-4493-9f3b-6efdab22d14c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd26dcf24863265c55a06cfc788e84363473a6ca5f1b835fa5dbb683ebb82d68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31087335de8b3b895d49c4677350e9ab6cfa0651188653dab05e7eafdab3d05f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31087335de8b3b895d49c4677350e9ab6cfa0651188653dab05e7eafdab3d05f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a607f773b81f6bdde800c2a36627d7f84602e6c3c777f6531391668dd7c08e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2a607f773b81f6bdde800c2a36627d7f84602e6c3c777f6531391668dd7c08e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:22:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70fc6887f3dde42017d03aa3f247ca4eda56f93c3ec54a95ceabcc1c1b4651d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70fc6887f3dde42017d03aa3f247ca4eda56f93c3ec54a95ceabcc1c1b4651d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:22:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://95e6805719bbdb9859b827860fb87ed979a4397061f76b433189f463c1e7cc39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://95e6805719bbdb9859b827860fb87ed979a4397061f76b433189f463c1e7cc39\\\",\\\"exitCode\\\":0,\\\
"finishedAt\\\":\\\"2025-09-30T12:22:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85068f88c57bccc42cc7234d13298a30c574379f4a212059a9924553fce8620e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85068f88c57bccc42cc7234d13298a30c574379f4a212059a9924553fce8620e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:22:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4355ef186193e8e0d9344faf8975065f9cd4576e7f6c79054670fda9acf3646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4355ef186193e8e0d9344faf8975065f9cd4576e7f6c79054670fda9acf3646\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:22:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-89fj9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-30T12:22:25Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:26 crc kubenswrapper[4672]: I0930 12:22:26.015580 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ztjlj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"25dc0f4c-76a8-49ff-bf68-c0718cfc3e62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5a3ba5101b35ce4981484637222a0c074b67e3998c273dc763ddbc6ca0c0c86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6ps67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ztjlj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:26Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:26 crc kubenswrapper[4672]: I0930 12:22:26.031555 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-485qk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"db981c53-85b2-4b3c-b025-da893771308f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2t8xk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2t8xk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-485qk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:26Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:26 crc kubenswrapper[4672]: I0930 12:22:26.048002 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"95794952-d817-48f2-8956-f7a310f8d1d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a5a1de465a4e5deca994f640e87574ebd406be2d7a92c0e54ed8bece4a8e6d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnk64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba615a53d8f50a3891080fd06384bb6e00eca07025bb60fdf1764d9c1d4a5f76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnk64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dpqrd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:26Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:26 crc kubenswrapper[4672]: I0930 12:22:26.063004 4672 
status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8q82q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6806ff3c-ab3a-402e-b1c5-cc37c0810a65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://687d353cc41530a0cbee2c4a782b6ae10e9ee58aae0bfe183537c96851611e2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pn7q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:11Z\\
\"}}\" for pod \"openshift-multus\"/\"multus-8q82q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:26Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:26 crc kubenswrapper[4672]: I0930 12:22:26.070964 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:26 crc kubenswrapper[4672]: I0930 12:22:26.071020 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:26 crc kubenswrapper[4672]: I0930 12:22:26.071036 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:26 crc kubenswrapper[4672]: I0930 12:22:26.071059 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:26 crc kubenswrapper[4672]: I0930 12:22:26.071078 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:26Z","lastTransitionTime":"2025-09-30T12:22:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:26 crc kubenswrapper[4672]: I0930 12:22:26.081543 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-89fj9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72003fad-a0fd-4493-9f3b-6efdab22d14c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd26dcf24863265c55a06cfc788e84363473a6ca5f1b835fa5dbb683ebb82d68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP
\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31087335de8b3b895d49c4677350e9ab6cfa0651188653dab05e7eafdab3d05f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31087335de8b3b895d49c4677350e9ab6cfa0651188653dab05e7eafdab3d05f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a607f773b81f6bdde800c2a36627d7f84602e6c3c777f6531391668dd7c08e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2a607f773b81f6bdde800c2a36627d7f84602e6c3c777f6531391668dd7c08e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:22:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70fc6887f3dde42017d03aa3f247ca4eda56f93c3ec54a95ceabcc1c1b4651d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70fc6887f3dde42017d03aa3f247ca4eda56f93c3ec54a95ceabcc1c1b4651d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-09-30T12:22:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://95e6805719bbdb9859b827860fb87ed979a4397061f76b433189f463c1e7cc39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://95e6805719bbdb9859b827860fb87ed979a4397061f76b433189f463c1e7cc39\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:22:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85068f88c57bccc42cc7234d13298a30c574379f4a212059a9924553fce8620e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85068f88c57bccc42cc7234d13298a30c574379f4a212059a9924553fce8620e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:22:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4355ef186193e8e0d9344faf8975065f9cd4576e7f6c79054670fda9acf3646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f
6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4355ef186193e8e0d9344faf8975065f9cd4576e7f6c79054670fda9acf3646\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:22:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-89fj9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:26Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:26 crc kubenswrapper[4672]: I0930 12:22:26.081934 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-78cfr\" (UniqueName: \"kubernetes.io/projected/42618cd5-d9f9-45ba-8081-660ca47bebf4-kube-api-access-78cfr\") pod \"network-metrics-daemon-n7wwp\" (UID: \"42618cd5-d9f9-45ba-8081-660ca47bebf4\") " pod="openshift-multus/network-metrics-daemon-n7wwp" Sep 30 12:22:26 crc kubenswrapper[4672]: I0930 12:22:26.082004 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/42618cd5-d9f9-45ba-8081-660ca47bebf4-metrics-certs\") pod \"network-metrics-daemon-n7wwp\" (UID: \"42618cd5-d9f9-45ba-8081-660ca47bebf4\") " pod="openshift-multus/network-metrics-daemon-n7wwp" Sep 30 12:22:26 crc kubenswrapper[4672]: E0930 12:22:26.082163 4672 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Sep 30 12:22:26 crc kubenswrapper[4672]: E0930 12:22:26.082274 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/42618cd5-d9f9-45ba-8081-660ca47bebf4-metrics-certs podName:42618cd5-d9f9-45ba-8081-660ca47bebf4 nodeName:}" failed. No retries permitted until 2025-09-30 12:22:26.582240953 +0000 UTC m=+37.851478599 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/42618cd5-d9f9-45ba-8081-660ca47bebf4-metrics-certs") pod "network-metrics-daemon-n7wwp" (UID: "42618cd5-d9f9-45ba-8081-660ca47bebf4") : object "openshift-multus"/"metrics-daemon-secret" not registered Sep 30 12:22:26 crc kubenswrapper[4672]: I0930 12:22:26.094543 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ztjlj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"25dc0f4c-76a8-49ff-bf68-c0718cfc3e62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5a3ba5101b35ce4981484637222a0c074b67e3998c273dc763ddbc6ca0c0c86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6ps67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ztjlj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:26Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:26 crc kubenswrapper[4672]: I0930 12:22:26.101585 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-78cfr\" (UniqueName: \"kubernetes.io/projected/42618cd5-d9f9-45ba-8081-660ca47bebf4-kube-api-access-78cfr\") pod \"network-metrics-daemon-n7wwp\" (UID: \"42618cd5-d9f9-45ba-8081-660ca47bebf4\") " pod="openshift-multus/network-metrics-daemon-n7wwp" Sep 30 12:22:26 crc kubenswrapper[4672]: I0930 12:22:26.109838 4672 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-485qk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db981c53-85b2-4b3c-b025-da893771308f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57231120d83ed4e95272ab2b0bac2146b066dde2b5ca3afe469a3dc36d6187ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2t8xk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://503616f93ea6b36f5e3c95c88471c6d06c281a4d0be102b6cb0edf6a1513f78c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2t8xk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-485qk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-30T12:22:26Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:26 crc kubenswrapper[4672]: I0930 12:22:26.123490 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-n7wwp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"42618cd5-d9f9-45ba-8081-660ca47bebf4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78cfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78cfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:25Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-n7wwp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:26Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:26 crc kubenswrapper[4672]: I0930 12:22:26.141712 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e650bb6-f3f0-48f4-b169-1e56a907e88c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://824f50fb36d4c4aee43cc92af10bc43c441f2cee0ee736431dfdc52d254b819a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2786191730ceba9bbcf35f5355098bc52b09c37d20b923f7c6c65fd1b584cea0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e5b3d468173222d5ae851529dbc318b7d727ec4af468e4094ab08b4f29bf100\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}
,{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://217947162b121381c06a210145feb0b9bd45d4ad595fcd83b1a566bb6448f0ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8d2b013e19f7ff8f3eb9c42338191b2da273ce85db8bcc6be8dd1f69e83c3be\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T12:22:10Z\\\",\\\"message\\\":\\\"falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0930 12:22:05.219065 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 12:22:05.219861 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-72394925/tls.crt::/tmp/serving-cert-72394925/tls.key\\\\\\\"\\\\nI0930 12:22:10.657058 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 12:22:10.659952 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 12:22:10.659972 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 12:22:10.659997 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 12:22:10.660003 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 12:22:10.667742 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0930 12:22:10.667775 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 12:22:10.667797 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 12:22:10.667801 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 12:22:10.667808 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 12:22:10.667812 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 12:22:10.667814 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 12:22:10.667817 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0930 12:22:10.670654 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T12:21:54Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c3f5020d6f8fba5f79d7659bedba02479794acb1f5f3d219802966c781fd10d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:52Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://089787496c46eb514e279677b1853010b21eac240fb059e43866714d2a6920e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://089787496c46eb514e279677b1853010b21eac240fb059e43866714d2a6920e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:21:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:21:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:21:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:26Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:26 crc kubenswrapper[4672]: I0930 12:22:26.157881 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:26Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:26 crc kubenswrapper[4672]: I0930 12:22:26.173468 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:26 crc kubenswrapper[4672]: I0930 12:22:26.173538 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:26 crc kubenswrapper[4672]: I0930 12:22:26.173551 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:26 crc kubenswrapper[4672]: I0930 12:22:26.173606 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:26 crc kubenswrapper[4672]: I0930 12:22:26.173622 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:26Z","lastTransitionTime":"2025-09-30T12:22:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:22:26 crc kubenswrapper[4672]: I0930 12:22:26.180852 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4339dd7c-5112-40c5-9b58-050ce364a2b9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7db3193c2afadb91d3220e67720ff33bc959f9c50cec6f243144a6e2a2e57e04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5f54820ce67d6e8f397d068e1fadcb4577d6685406812c4e29fe0274612697d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1351a4e8380fea9edb834239053b57384a50add562afe59e67bbd7159bf1bf6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b9741c5edaae364733b400bebf21a4674ec206194908ec6a942317235c62a14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://775be30891ac2947ef160e3fe75abad42b78fc39e35414f27756ce086fed4f7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27513f0766801d5155c6e5c376a2a52241030eca2171b3218710ca6ee006a0b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27513f0766801d5155c6e5c376a2a52241030eca2171b3218710ca6ee006a0b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:21:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:21:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0afdc82c21601c37c7d9b4897273ade1e8e35bd093d63df9a63e93a9c0c6228c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0afdc82c21601c37c7d9b4897273ade1e8e35bd093d63df9a63e93a9c0c6228c\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:21:51Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://52fa5ee34bc8e9e381d4288d30fce5db76ed5bde6f199ec13fe443481ed4c8e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52fa5ee34bc8e9e381d4288d30fce5db76ed5bde6f199ec13fe443481ed4c8e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:21:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:21:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:21:49Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:26Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:26 crc kubenswrapper[4672]: I0930 12:22:26.194376 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:26Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:26 crc kubenswrapper[4672]: I0930 12:22:26.205455 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bh5lq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96949d92-2365-41eb-8f62-c264c8328c02\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6caf59dcbea4e0beb0879de377fe92cfb0b3ff55668aee3f10822adacf97fd8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkb7v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bh5lq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-09-30T12:22:26Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:26 crc kubenswrapper[4672]: I0930 12:22:26.220352 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ece716e9efed328b509480f11b1a6aa253fe23776bf45f49d4110cf3b4acbab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:26Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:26 crc kubenswrapper[4672]: I0930 12:22:26.226137 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-nznsk" Sep 30 12:22:26 crc kubenswrapper[4672]: I0930 12:22:26.235736 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:26Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:26 crc kubenswrapper[4672]: I0930 12:22:26.252195 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://534671434fa48a07e7836049331af60307cf651f4ddde83856ad98e006b32040\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfb0bceb3e99196adb442abedd5215938781b908c2aa92178c9db76672664620\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:26Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:26 crc kubenswrapper[4672]: I0930 12:22:26.271998 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nznsk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5da59bc9-84da-42f6-86e9-3399ecf31725\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8e8be814fc5bfb8db3632cda1fcf749a02b642c8e7ff479e59ce5d9f987a756\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2131429928dbbd60e2d17b7ada4689ff5d161d1df921d74a8b9f71df9e11899\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://114d67fea982a6c6475e1925f670ba7dae14ace593022f77ef7c6441dde49687\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b73fba4d62613a1c27b69a74ebcca91fa6ea43a5267efddf783bd0d243d711f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e10db0132d159755257aef8c0cee40e5de4a0d3727ce59883e1ff90650e4f35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb7a32827707f8654d36f56fef7e81aff6b80550ef51f7c090a37f53a0e53406\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43ba42d9f8b7b2e60651ffc15c2291e5c9274fd32383509b048bfb2ea381d29b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fbee2ee7be1b8b91a21a721e325065d47e0e59bf02acc8a328f2adaa419d8f50\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T12:22:24Z\\\",\\\"message\\\":\\\"930 12:22:24.456648 5977 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 12:22:24.456996 5977 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0930 12:22:24.457536 5977 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 12:22:24.457652 5977 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 12:22:24.458388 5977 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0930 12:22:24.458405 5977 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0930 12:22:24.458443 5977 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0930 12:22:24.458462 5977 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0930 12:22:24.458485 5977 factory.go:656] Stopping watch factory\\\\nI0930 12:22:24.458505 5977 ovnkube.go:599] Stopped ovnkube\\\\nI0930 12:22:24.458533 5977 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0930 12:22:24.458549 5977 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0930 12:22:24.458556 5977 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0930 
12:22:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:19Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7058516ce3ff1d1d2929107a66d739fccb24b4f5589a11510ab42fe728a33196\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\
"containerID\\\":\\\"cri-o://4aa532f2ebe19e36977596bd41e8b116c84bcd7d5cd3470986e3192d4d62dea7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4aa532f2ebe19e36977596bd41e8b116c84bcd7d5cd3470986e3192d4d62dea7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nznsk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:26Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:26 crc kubenswrapper[4672]: I0930 12:22:26.275592 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:26 crc kubenswrapper[4672]: I0930 12:22:26.275640 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:26 crc kubenswrapper[4672]: I0930 12:22:26.275651 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:26 crc kubenswrapper[4672]: I0930 12:22:26.275668 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:26 crc kubenswrapper[4672]: I0930 12:22:26.275679 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:26Z","lastTransitionTime":"2025-09-30T12:22:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:22:26 crc kubenswrapper[4672]: I0930 12:22:26.287955 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"763dda13-8721-4cce-8b21-aa1f1b190978\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab6412d7fdf1976c75ccfc72d05f00e2997d72f3971c73cd0887e038d1cbc059\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5575589d0517d5a1c285c3e9e566fb5cd9b9591a1b311a4bb80e595e44044c33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b656dc1ad8af099828eaaeb654e6d52dbbd057023dd07583a6114316670aa334\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5082c3fa78648584ab9a29b4ab239ce0d52411666ca3fa598e9c6745b8ea067e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:21:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:26Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:26 crc kubenswrapper[4672]: I0930 12:22:26.300553 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2761ae4b8b1c7e1d00ee269219ff0ab04aedce9e1bddc6e0a029c496ee89c0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for 
pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:26Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:26 crc kubenswrapper[4672]: I0930 12:22:26.378590 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:26 crc kubenswrapper[4672]: I0930 12:22:26.378644 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:26 crc kubenswrapper[4672]: I0930 12:22:26.378656 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:26 crc kubenswrapper[4672]: I0930 12:22:26.378674 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:26 crc kubenswrapper[4672]: I0930 12:22:26.378686 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:26Z","lastTransitionTime":"2025-09-30T12:22:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:26 crc kubenswrapper[4672]: I0930 12:22:26.416458 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 12:22:26 crc kubenswrapper[4672]: E0930 12:22:26.416636 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 12:22:26 crc kubenswrapper[4672]: I0930 12:22:26.481213 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:26 crc kubenswrapper[4672]: I0930 12:22:26.481278 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:26 crc kubenswrapper[4672]: I0930 12:22:26.481291 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:26 crc kubenswrapper[4672]: I0930 12:22:26.481313 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:26 crc kubenswrapper[4672]: I0930 12:22:26.481324 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:26Z","lastTransitionTime":"2025-09-30T12:22:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:22:26 crc kubenswrapper[4672]: I0930 12:22:26.584953 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:26 crc kubenswrapper[4672]: I0930 12:22:26.585001 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:26 crc kubenswrapper[4672]: I0930 12:22:26.585018 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:26 crc kubenswrapper[4672]: I0930 12:22:26.585041 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:26 crc kubenswrapper[4672]: I0930 12:22:26.585058 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:26Z","lastTransitionTime":"2025-09-30T12:22:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:26 crc kubenswrapper[4672]: I0930 12:22:26.587083 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/42618cd5-d9f9-45ba-8081-660ca47bebf4-metrics-certs\") pod \"network-metrics-daemon-n7wwp\" (UID: \"42618cd5-d9f9-45ba-8081-660ca47bebf4\") " pod="openshift-multus/network-metrics-daemon-n7wwp" Sep 30 12:22:26 crc kubenswrapper[4672]: E0930 12:22:26.587255 4672 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Sep 30 12:22:26 crc kubenswrapper[4672]: E0930 12:22:26.587362 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/42618cd5-d9f9-45ba-8081-660ca47bebf4-metrics-certs podName:42618cd5-d9f9-45ba-8081-660ca47bebf4 nodeName:}" failed. No retries permitted until 2025-09-30 12:22:27.587338304 +0000 UTC m=+38.856575980 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/42618cd5-d9f9-45ba-8081-660ca47bebf4-metrics-certs") pod "network-metrics-daemon-n7wwp" (UID: "42618cd5-d9f9-45ba-8081-660ca47bebf4") : object "openshift-multus"/"metrics-daemon-secret" not registered Sep 30 12:22:26 crc kubenswrapper[4672]: I0930 12:22:26.687231 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:26 crc kubenswrapper[4672]: I0930 12:22:26.687304 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:26 crc kubenswrapper[4672]: I0930 12:22:26.687319 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:26 crc kubenswrapper[4672]: I0930 12:22:26.687353 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:26 crc kubenswrapper[4672]: I0930 12:22:26.687371 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:26Z","lastTransitionTime":"2025-09-30T12:22:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:26 crc kubenswrapper[4672]: I0930 12:22:26.740811 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nznsk_5da59bc9-84da-42f6-86e9-3399ecf31725/ovnkube-controller/1.log" Sep 30 12:22:26 crc kubenswrapper[4672]: I0930 12:22:26.741669 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nznsk_5da59bc9-84da-42f6-86e9-3399ecf31725/ovnkube-controller/0.log" Sep 30 12:22:26 crc kubenswrapper[4672]: I0930 12:22:26.744850 4672 generic.go:334] "Generic (PLEG): container finished" podID="5da59bc9-84da-42f6-86e9-3399ecf31725" containerID="43ba42d9f8b7b2e60651ffc15c2291e5c9274fd32383509b048bfb2ea381d29b" exitCode=1 Sep 30 12:22:26 crc kubenswrapper[4672]: I0930 12:22:26.744895 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nznsk" event={"ID":"5da59bc9-84da-42f6-86e9-3399ecf31725","Type":"ContainerDied","Data":"43ba42d9f8b7b2e60651ffc15c2291e5c9274fd32383509b048bfb2ea381d29b"} Sep 30 12:22:26 crc kubenswrapper[4672]: I0930 12:22:26.744930 4672 scope.go:117] "RemoveContainer" containerID="fbee2ee7be1b8b91a21a721e325065d47e0e59bf02acc8a328f2adaa419d8f50" Sep 30 12:22:26 crc kubenswrapper[4672]: I0930 12:22:26.745660 4672 scope.go:117] "RemoveContainer" containerID="43ba42d9f8b7b2e60651ffc15c2291e5c9274fd32383509b048bfb2ea381d29b" Sep 30 12:22:26 crc kubenswrapper[4672]: E0930 12:22:26.745850 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-nznsk_openshift-ovn-kubernetes(5da59bc9-84da-42f6-86e9-3399ecf31725)\"" pod="openshift-ovn-kubernetes/ovnkube-node-nznsk" podUID="5da59bc9-84da-42f6-86e9-3399ecf31725" Sep 30 12:22:26 crc kubenswrapper[4672]: I0930 12:22:26.764125 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-89fj9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"72003fad-a0fd-4493-9f3b-6efdab22d14c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd26dcf24863265c55a06cfc788e84363473a6ca5f1b835fa5dbb683ebb82d68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31087335de8b3b895d49c4677350e9ab6cfa0651188653dab05e7eafdab3d05f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31087335de8b3b895d49c4677350e9ab6cfa0651188653dab05e7eafdab3d05f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a607f773b81f6bdde800c2a36627d7f84602e6c3c777f6531391668dd7c08e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2a607f773b81f6bdde800c2a36627d7f84602e6c3c777f6531391668dd7c08e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:22:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70fc6887f3dde42017d03aa3f247ca4eda56f93c3ec54a95ceabcc1c1b4651d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70fc6887f3dde42017d03aa3f247ca4eda56f93c3ec54a95ceabcc1c1b4651d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:22:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://95e6805719bbdb9859b827860fb87ed979a4397061f76b433189f463c1e7cc39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://95e6805719bbdb9859b827860fb87ed979a4397061f76b433189f463c1e7cc39\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:22:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85068f88c57bccc42cc7234d13298a30c574379f4a212059a9924553fce8620e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85068f88c57bccc42cc7234d13298a30c574379f4a212059a9924553fce8620e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:22:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4355ef186193e8e0d9344faf8975065f9cd4576e7f6c79054670fda9acf3646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4355ef186193e8e0d9344faf8975065f9cd4576e7f6c79054670fda9acf3646\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:22:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-89fj9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:26Z is after 2025-08-24T17:21:41Z"
Sep 30 12:22:26 crc kubenswrapper[4672]: I0930 12:22:26.778339 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ztjlj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"25dc0f4c-76a8-49ff-bf68-c0718cfc3e62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5a3ba5101b35ce4981484637222a0c074b67e3998c273dc763ddbc6ca0c0c86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6ps67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ztjlj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:26Z is after 2025-08-24T17:21:41Z"
Sep 30 12:22:26 crc kubenswrapper[4672]: I0930 12:22:26.790249 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 12:22:26 crc kubenswrapper[4672]: I0930 12:22:26.790303 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 12:22:26 crc kubenswrapper[4672]: I0930 12:22:26.790315 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 12:22:26 crc kubenswrapper[4672]: I0930 12:22:26.790331 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 12:22:26 crc kubenswrapper[4672]: I0930 12:22:26.790344 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:26Z","lastTransitionTime":"2025-09-30T12:22:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 12:22:26 crc kubenswrapper[4672]: I0930 12:22:26.792558 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-485qk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db981c53-85b2-4b3c-b025-da893771308f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57231120d83ed4e95272ab2b0bac2146b066dde2b5ca3afe469a3dc36d6187ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2t8xk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://503616f93ea6b36f5e3c95c88471c6d06c281a4d0be102b6cb0edf6a1513f78c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2t8xk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-485qk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:26Z is after 2025-08-24T17:21:41Z"
Sep 30 12:22:26 crc kubenswrapper[4672]: I0930 12:22:26.808016 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-n7wwp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"42618cd5-d9f9-45ba-8081-660ca47bebf4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78cfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78cfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:25Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-n7wwp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:26Z is after 2025-08-24T17:21:41Z"
Sep 30 12:22:26 crc kubenswrapper[4672]: I0930 12:22:26.822578 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e650bb6-f3f0-48f4-b169-1e56a907e88c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://824f50fb36d4c4aee43cc92af10bc43c441f2cee0ee736431dfdc52d254b819a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2786191730ceba9bbcf35f5355098bc52b09c37d20b923f7c6c65fd1b584cea0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e5b3d468173222d5ae851529dbc318b7d727ec4af468e4094ab08b4f29bf100\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"na
me\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://217947162b121381c06a210145feb0b9bd45d4ad595fcd83b1a566bb6448f0ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8d2b013e19f7ff8f3eb9c42338191b2da273ce85db8bcc6be8dd1f69e83c3be\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T12:22:10Z\\\",\\\"message\\\":\\\"falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0930 12:22:05.219065 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 12:22:05.219861 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-72394925/tls.crt::/tmp/serving-cert-72394925/tls.key\\\\\\\"\\\\nI0930 12:22:10.657058 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 12:22:10.659952 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 12:22:10.659972 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 12:22:10.659997 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 12:22:10.660003 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 12:22:10.667742 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0930 12:22:10.667775 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 12:22:10.667797 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 12:22:10.667801 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 12:22:10.667808 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 12:22:10.667812 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 12:22:10.667814 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 12:22:10.667817 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0930 12:22:10.670654 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T12:21:54Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c3f5020d6f8fba5f79d7659bedba02479794acb1f5f3d219802966c781fd10d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:52Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://089787496c46eb514e279677b1853010b21eac240fb059e43866714d2a6920e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://089787496c46eb514e279677b1853010b21eac240fb059e43866714d2a6920e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:21:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:21:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:21:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:26Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:26 crc kubenswrapper[4672]: I0930 12:22:26.840344 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:26Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:26 crc kubenswrapper[4672]: I0930 12:22:26.854135 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"95794952-d817-48f2-8956-f7a310f8d1d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a5a1de465a4e5deca994f640e87574ebd406be2d7a92c0e54ed8bece4a8e6d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnk64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba615a53d8f50a3891080fd06384bb6e00eca07025bb60fdf1764d9c1d4a5f76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnk64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dpqrd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:26Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:26 crc kubenswrapper[4672]: I0930 12:22:26.871126 4672 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-8q82q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6806ff3c-ab3a-402e-b1c5-cc37c0810a65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://687d353cc41530a0cbee2c4a782b6ae10e9ee58aae0bfe183537c96851611e2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pn7q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:11Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-8q82q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:26Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:26 crc kubenswrapper[4672]: I0930 12:22:26.885912 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bh5lq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96949d92-2365-41eb-8f62-c264c8328c02\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6caf59dcbea4e0beb0879de377fe92cfb0b3ff55668aee3f10822adacf97fd8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkb7v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bh5lq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:26Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:26 crc kubenswrapper[4672]: I0930 12:22:26.892854 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:26 crc kubenswrapper[4672]: I0930 12:22:26.892914 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:26 crc kubenswrapper[4672]: I0930 12:22:26.892932 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:26 crc 
kubenswrapper[4672]: I0930 12:22:26.892956 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:26 crc kubenswrapper[4672]: I0930 12:22:26.892974 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:26Z","lastTransitionTime":"2025-09-30T12:22:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:26 crc kubenswrapper[4672]: I0930 12:22:26.914291 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4339dd7c-5112-40c5-9b58-050ce364a2b9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7db3193c2afadb91d3220e67720ff33bc959f9c50cec6f243144a6e2a2e57e04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5f54820ce67d6e8f397d068e1fadcb4577d6685406812c4e29fe0274612697d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containe
rID\\\":\\\"cri-o://1351a4e8380fea9edb834239053b57384a50add562afe59e67bbd7159bf1bf6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b9741c5edaae364733b400bebf21a4674ec206194908ec6a942317235c62a14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://775be30891ac2947ef160e3fe75abad42b78fc39e35414f27756ce086fed4f7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27513f0766801d5155c6e5c376a2a52241030eca2171b3218710ca6ee006a0b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27513f0766801d5155c6e5c376a2a52241030eca2171b3218710ca6ee006a0b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:21:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:21:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0afdc82c21601c37c7d9b4897273ade1e8e35
bd093d63df9a63e93a9c0c6228c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0afdc82c21601c37c7d9b4897273ade1e8e35bd093d63df9a63e93a9c0c6228c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:21:51Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://52fa5ee34bc8e9e381d4288d30fce5db76ed5bde6f199ec13fe443481ed4c8e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52fa5ee34bc8e9e381d4288d30fce5db76ed5bde6f199ec13fe443481ed4c8e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:21:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:21:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:21:49Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:26Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:26 crc kubenswrapper[4672]: I0930 12:22:26.930396 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:26Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:26 crc kubenswrapper[4672]: I0930 12:22:26.946093 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ece716e9efed328b509480f11b1a6aa253fe23776bf45f49d4110cf3b4acbab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:26Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:26 crc kubenswrapper[4672]: I0930 12:22:26.965975 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nznsk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5da59bc9-84da-42f6-86e9-3399ecf31725\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8e8be814fc5bfb8db3632cda1fcf749a02b642c8e7ff479e59ce5d9f987a756\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2131429928dbbd60e2d17b7ada4689ff5d161d1df921d74a8b9f71df9e11899\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kuber
netes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://114d67fea982a6c6475e1925f670ba7dae14ace593022f77ef7c6441dde49687\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b73fba4d62613a1c27b69a74ebcca91fa6ea43a5267efddf783bd0d243d711f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e10db0132d159755257aef8c0cee40e5de4a0d3727ce59883e1ff90650e4f35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb7a32827707f8654d36f56fef7e81aff6b80550ef51f7c090a37f53a0e53406\\\",\\\"image
\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43ba42d9f8b7b2e60651ffc15c2291e5c9274fd32383509b048bfb2ea381d29b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fbee2ee7be1b8b91a21a721e325065d47e0e59bf02acc8a328f2adaa419d8f50\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T12:22:24Z\\\",\\\"message\\\":\\\"930 12:22:24.456648 5977 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 12:22:24.456996 5977 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0930 12:22:24.457536 5977 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 12:22:24.457652 5977 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 12:22:24.458388 5977 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0930 12:22:24.458405 5977 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0930 12:22:24.458443 5977 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0930 12:22:24.458462 5977 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0930 12:22:24.458485 5977 factory.go:656] Stopping watch factory\\\\nI0930 12:22:24.458505 5977 ovnkube.go:599] Stopped ovnkube\\\\nI0930 12:22:24.458533 5977 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0930 12:22:24.458549 5977 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0930 12:22:24.458556 5977 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0930 
12:22:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:19Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://43ba42d9f8b7b2e60651ffc15c2291e5c9274fd32383509b048bfb2ea381d29b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T12:22:25Z\\\",\\\"message\\\":\\\"ntroller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:25Z is after 2025-08-24T17:21:41Z]\\\\nI0930 12:22:25.795759 6165 services_controller.go:434] Service openshift-kube-controller-manager/kube-controller-manager retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{kube-controller-manager openshift-kube-controller-manager 90927ca1-43e2-420d-8485-a35952e82cd9 4812 0 2025-02-23 05:22:57 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[prometheus:kube-controller-manager] map[operator.openshift.io/spec-hash:bb05a56151ce98d11c8554843985ba99e0498dcafd98129435c2d982c5ea4c11 service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-secret-name:serving-cert service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [] [] 
[]},Spec:ServiceSpec{Ports:[]ServicePort{Service\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7058516ce3ff1d1d2929107a66d739fccb24b4f5589a11510ab42fe728a33196\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4aa532f2ebe19e36977596bd41e8b116c84bcd7d5cd3470986e3192d4d62dea7\\\",\\\"image\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4aa532f2ebe19e36977596bd41e8b116c84bcd7d5cd3470986e3192d4d62dea7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nznsk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:26Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:26 crc kubenswrapper[4672]: I0930 12:22:26.980701 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"763dda13-8721-4cce-8b21-aa1f1b190978\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab6412d7fdf1976c75ccfc72d05f00e2997d72f3971c73cd0887e038d1cbc059\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5575589d0517d5a1c285c3e9e566fb5cd9b9591a1b311a4bb80e595e44044c33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b656dc1ad8af099828eaaeb654e6d52dbbd057023dd07583a6114316670aa334\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5082c3fa78648584ab9a29b4ab239ce0d52411666ca3fa598e9c6745b8ea067e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:21:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:26Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:26 crc kubenswrapper[4672]: I0930 12:22:26.995709 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:26 crc kubenswrapper[4672]: I0930 12:22:26.995751 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:26 crc kubenswrapper[4672]: I0930 12:22:26.995761 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Sep 30 12:22:26 crc kubenswrapper[4672]: I0930 12:22:26.995774 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:26 crc kubenswrapper[4672]: I0930 12:22:26.995754 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2761ae4b8b1c7e1d00ee269219ff0ab04aedce9e1bddc6e0a029c496ee89c0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:26Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:26 crc kubenswrapper[4672]: I0930 12:22:26.995783 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:26Z","lastTransitionTime":"2025-09-30T12:22:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:22:27 crc kubenswrapper[4672]: I0930 12:22:27.009292 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:27Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:27 crc kubenswrapper[4672]: I0930 12:22:27.023540 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://534671434fa48a07e7836049331af60307cf651f4ddde83856ad98e006b32040\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfb0bceb3e99196adb442abedd5215938781b908c2aa92178c9db76672664620\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:27Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:27 crc kubenswrapper[4672]: I0930 12:22:27.098231 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:27 crc kubenswrapper[4672]: I0930 12:22:27.098298 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:27 crc kubenswrapper[4672]: I0930 12:22:27.098313 4672 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Sep 30 12:22:27 crc kubenswrapper[4672]: I0930 12:22:27.098341 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:27 crc kubenswrapper[4672]: I0930 12:22:27.098373 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:27Z","lastTransitionTime":"2025-09-30T12:22:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:27 crc kubenswrapper[4672]: I0930 12:22:27.192339 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 12:22:27 crc kubenswrapper[4672]: I0930 12:22:27.192473 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 12:22:27 crc kubenswrapper[4672]: E0930 12:22:27.192557 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 12:22:43.19252706 +0000 UTC m=+54.461764736 (durationBeforeRetry 16s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 12:22:27 crc kubenswrapper[4672]: E0930 12:22:27.192565 4672 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Sep 30 12:22:27 crc kubenswrapper[4672]: I0930 12:22:27.192678 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 12:22:27 crc kubenswrapper[4672]: I0930 12:22:27.192721 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 12:22:27 crc kubenswrapper[4672]: I0930 12:22:27.192775 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 12:22:27 crc kubenswrapper[4672]: E0930 12:22:27.192849 4672 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 30 12:22:27 crc kubenswrapper[4672]: E0930 12:22:27.192894 4672 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 30 12:22:27 crc kubenswrapper[4672]: E0930 12:22:27.192924 4672 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 12:22:27 crc kubenswrapper[4672]: E0930 12:22:27.192921 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-30 12:22:43.192839268 +0000 UTC m=+54.462076914 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Sep 30 12:22:27 crc kubenswrapper[4672]: E0930 12:22:27.192944 4672 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 30 12:22:27 crc kubenswrapper[4672]: E0930 12:22:27.193022 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-09-30 12:22:43.193004722 +0000 UTC m=+54.462242638 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 12:22:27 crc kubenswrapper[4672]: E0930 12:22:27.193080 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-30 12:22:43.193056413 +0000 UTC m=+54.462294099 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 30 12:22:27 crc kubenswrapper[4672]: E0930 12:22:27.193242 4672 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 30 12:22:27 crc kubenswrapper[4672]: E0930 12:22:27.193381 4672 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 30 12:22:27 crc kubenswrapper[4672]: E0930 12:22:27.193415 4672 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 12:22:27 crc kubenswrapper[4672]: E0930 12:22:27.193547 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-09-30 12:22:43.193508454 +0000 UTC m=+54.462746250 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 12:22:27 crc kubenswrapper[4672]: I0930 12:22:27.200705 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:27 crc kubenswrapper[4672]: I0930 12:22:27.200758 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:27 crc kubenswrapper[4672]: I0930 12:22:27.200776 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:27 crc kubenswrapper[4672]: I0930 12:22:27.200799 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:27 crc kubenswrapper[4672]: I0930 12:22:27.200817 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:27Z","lastTransitionTime":"2025-09-30T12:22:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:27 crc kubenswrapper[4672]: I0930 12:22:27.288579 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:27 crc kubenswrapper[4672]: I0930 12:22:27.288668 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:27 crc kubenswrapper[4672]: I0930 12:22:27.288687 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:27 crc kubenswrapper[4672]: I0930 12:22:27.288711 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:27 crc kubenswrapper[4672]: I0930 12:22:27.288728 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:27Z","lastTransitionTime":"2025-09-30T12:22:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Sep 30 12:22:27 crc kubenswrapper[4672]: E0930 12:22:27.306558 4672 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T12:22:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T12:22:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T12:22:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T12:22:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a240ac94-c7cd-47b9-85fc-82a9db2c4d67\\\",\\\"systemUUID\\\":\\\"9545f671-f742-45e6-b9f7-3b3404b22825\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:27Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:27 crc kubenswrapper[4672]: I0930 12:22:27.312373 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:27 crc kubenswrapper[4672]: I0930 12:22:27.312436 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure"
Sep 30 12:22:27 crc kubenswrapper[4672]: I0930 12:22:27.312456 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 12:22:27 crc kubenswrapper[4672]: I0930 12:22:27.312485 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 12:22:27 crc kubenswrapper[4672]: I0930 12:22:27.312504 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:27Z","lastTransitionTime":"2025-09-30T12:22:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 12:22:27 crc kubenswrapper[4672]: E0930 12:22:27.334536 4672 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{...}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:27Z is after 2025-08-24T17:21:41Z" [status patch payload elided; byte-for-byte identical to the 12:22:27.306558 attempt above]
Sep 30 12:22:27 crc kubenswrapper[4672]: I0930 12:22:27.340813 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 12:22:27 crc kubenswrapper[4672]: I0930 12:22:27.340871 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 12:22:27 crc kubenswrapper[4672]: I0930 12:22:27.340921 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 12:22:27 crc kubenswrapper[4672]: I0930 12:22:27.340941 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 12:22:27 crc kubenswrapper[4672]: I0930 12:22:27.340956 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:27Z","lastTransitionTime":"2025-09-30T12:22:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 12:22:27 crc kubenswrapper[4672]: E0930 12:22:27.363068 4672 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{...}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:27Z is after 2025-08-24T17:21:41Z" [status patch payload elided; identical to the 12:22:27.306558 attempt]
Sep 30 12:22:27 crc kubenswrapper[4672]: I0930 12:22:27.369089 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 12:22:27 crc kubenswrapper[4672]: I0930 12:22:27.369162 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 12:22:27 crc kubenswrapper[4672]: I0930 12:22:27.369177 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 12:22:27 crc kubenswrapper[4672]: I0930 12:22:27.369203 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 12:22:27 crc kubenswrapper[4672]: I0930 12:22:27.369222 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:27Z","lastTransitionTime":"2025-09-30T12:22:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 12:22:27 crc kubenswrapper[4672]: E0930 12:22:27.393506 4672 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{...}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:27Z is after 2025-08-24T17:21:41Z" [status patch payload elided; identical to the 12:22:27.306558 attempt]
Sep 30 12:22:27 crc kubenswrapper[4672]: I0930 12:22:27.398524 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 12:22:27 crc kubenswrapper[4672]: I0930 12:22:27.398567 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
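Every status-patch attempt above fails for the same underlying reason: the node.network-node-identity.openshift.io webhook at 127.0.0.1:9743 presents a serving certificate that expired on 2025-08-24, while the node clock reads 2025-09-30. A sketch (editorial addition) that fetches and inspects the presented certificate without trusting it; it assumes the third-party cryptography package, and the host and port are taken verbatim from the webhook URL in the error:

# Retrieve the webhook's serving certificate with verification disabled
# (verification would fail, which is exactly what the kubelet is hitting)
# and print its validity window.
import socket
import ssl
from cryptography import x509

HOST, PORT = "127.0.0.1", 9743  # from the webhook URL in the log

ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
ctx.check_hostname = False
ctx.verify_mode = ssl.CERT_NONE  # inspect only; do not trust

with socket.create_connection((HOST, PORT), timeout=5) as sock:
    with ctx.wrap_socket(sock, server_hostname=HOST) as tls:
        der = tls.getpeercert(binary_form=True)

cert = x509.load_der_x509_certificate(der)
print("subject:  ", cert.subject.rfc4514_string())
print("notBefore:", cert.not_valid_before)
print("notAfter: ", cert.not_valid_after)

On CRC this pattern is common after the VM has been stopped for weeks: internal certificates age out while the cluster is down, and the cluster normally rotates them itself once it has been running long enough. The check above mainly confirms that it is the webhook's serving certificate, not the kubelet's client credentials, that has expired.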
Sep 30 12:22:27 crc kubenswrapper[4672]: I0930 12:22:27.398577 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 12:22:27 crc kubenswrapper[4672]: I0930 12:22:27.398592 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 12:22:27 crc kubenswrapper[4672]: I0930 12:22:27.398604 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:27Z","lastTransitionTime":"2025-09-30T12:22:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 12:22:27 crc kubenswrapper[4672]: E0930 12:22:27.413628 4672 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T12:22:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T12:22:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T12:22:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T12:22:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a240ac94-c7cd-47b9-85fc-82a9db2c4d67\\\",\\\"systemUUID\\\":\\\"9545f671-f742-45e6-b9f7-3b3404b22825\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:27Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:27 crc kubenswrapper[4672]: E0930 12:22:27.413885 4672 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Sep 30 12:22:27 crc kubenswrapper[4672]: I0930 12:22:27.416248 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 12:22:27 crc kubenswrapper[4672]: I0930 12:22:27.416259 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:27 crc kubenswrapper[4672]: I0930 12:22:27.416372 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:27 crc kubenswrapper[4672]: I0930 12:22:27.416398 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:27 crc kubenswrapper[4672]: E0930 12:22:27.416420 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 12:22:27 crc kubenswrapper[4672]: I0930 12:22:27.416434 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:27 crc kubenswrapper[4672]: I0930 12:22:27.416476 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:27Z","lastTransitionTime":"2025-09-30T12:22:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:27 crc kubenswrapper[4672]: I0930 12:22:27.416611 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n7wwp" Sep 30 12:22:27 crc kubenswrapper[4672]: I0930 12:22:27.416663 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 12:22:27 crc kubenswrapper[4672]: E0930 12:22:27.416877 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n7wwp" podUID="42618cd5-d9f9-45ba-8081-660ca47bebf4" Sep 30 12:22:27 crc kubenswrapper[4672]: E0930 12:22:27.417246 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 12:22:27 crc kubenswrapper[4672]: I0930 12:22:27.520007 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:27 crc kubenswrapper[4672]: I0930 12:22:27.520231 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:27 crc kubenswrapper[4672]: I0930 12:22:27.520377 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:27 crc kubenswrapper[4672]: I0930 12:22:27.520500 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:27 crc kubenswrapper[4672]: I0930 12:22:27.520813 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:27Z","lastTransitionTime":"2025-09-30T12:22:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:27 crc kubenswrapper[4672]: I0930 12:22:27.596983 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/42618cd5-d9f9-45ba-8081-660ca47bebf4-metrics-certs\") pod \"network-metrics-daemon-n7wwp\" (UID: \"42618cd5-d9f9-45ba-8081-660ca47bebf4\") " pod="openshift-multus/network-metrics-daemon-n7wwp" Sep 30 12:22:27 crc kubenswrapper[4672]: E0930 12:22:27.597128 4672 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Sep 30 12:22:27 crc kubenswrapper[4672]: E0930 12:22:27.597186 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/42618cd5-d9f9-45ba-8081-660ca47bebf4-metrics-certs podName:42618cd5-d9f9-45ba-8081-660ca47bebf4 nodeName:}" failed. No retries permitted until 2025-09-30 12:22:29.597171296 +0000 UTC m=+40.866408932 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/42618cd5-d9f9-45ba-8081-660ca47bebf4-metrics-certs") pod "network-metrics-daemon-n7wwp" (UID: "42618cd5-d9f9-45ba-8081-660ca47bebf4") : object "openshift-multus"/"metrics-daemon-secret" not registered Sep 30 12:22:27 crc kubenswrapper[4672]: I0930 12:22:27.623526 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:27 crc kubenswrapper[4672]: I0930 12:22:27.623573 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:27 crc kubenswrapper[4672]: I0930 12:22:27.623588 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:27 crc kubenswrapper[4672]: I0930 12:22:27.623609 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:27 crc kubenswrapper[4672]: I0930 12:22:27.623625 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:27Z","lastTransitionTime":"2025-09-30T12:22:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:27 crc kubenswrapper[4672]: I0930 12:22:27.726590 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:27 crc kubenswrapper[4672]: I0930 12:22:27.726643 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:27 crc kubenswrapper[4672]: I0930 12:22:27.726657 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:27 crc kubenswrapper[4672]: I0930 12:22:27.726678 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:27 crc kubenswrapper[4672]: I0930 12:22:27.726696 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:27Z","lastTransitionTime":"2025-09-30T12:22:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:22:27 crc kubenswrapper[4672]: I0930 12:22:27.750497 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nznsk_5da59bc9-84da-42f6-86e9-3399ecf31725/ovnkube-controller/1.log" Sep 30 12:22:27 crc kubenswrapper[4672]: I0930 12:22:27.755759 4672 scope.go:117] "RemoveContainer" containerID="43ba42d9f8b7b2e60651ffc15c2291e5c9274fd32383509b048bfb2ea381d29b" Sep 30 12:22:27 crc kubenswrapper[4672]: E0930 12:22:27.756038 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-nznsk_openshift-ovn-kubernetes(5da59bc9-84da-42f6-86e9-3399ecf31725)\"" pod="openshift-ovn-kubernetes/ovnkube-node-nznsk" podUID="5da59bc9-84da-42f6-86e9-3399ecf31725" Sep 30 12:22:27 crc kubenswrapper[4672]: I0930 12:22:27.774636 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2761ae4b8b1c7e1d00ee269219ff0ab04aedce9e1bddc6e0a029c496ee89c0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:27Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:27 crc kubenswrapper[4672]: I0930 12:22:27.792550 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:27Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:27 crc kubenswrapper[4672]: I0930 12:22:27.808205 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://534671434fa48a07e7836049331af60307cf651f4ddde83856ad98e006b32040\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfb0bceb3e99196adb442abedd5215938781b908c2aa92178c9db76672664620\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:27Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:27 crc kubenswrapper[4672]: I0930 12:22:27.830066 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:27 crc kubenswrapper[4672]: I0930 12:22:27.830107 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:27 crc kubenswrapper[4672]: I0930 12:22:27.830119 4672 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Sep 30 12:22:27 crc kubenswrapper[4672]: I0930 12:22:27.830135 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:27 crc kubenswrapper[4672]: I0930 12:22:27.830148 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:27Z","lastTransitionTime":"2025-09-30T12:22:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:27 crc kubenswrapper[4672]: I0930 12:22:27.833238 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nznsk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5da59bc9-84da-42f6-86e9-3399ecf31725\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8e8be814fc5bfb8db3632cda1fcf749a02b642c8e7ff479e59ce5d9f987a756\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2131429928dbbd60e2d17b7ada4689ff5d161d1df921d74a8b9f71df9e11899\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://114d67fea982a6c6475e1925f670ba7dae14ace593022f77ef7c6441dde49687\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b73fba4d62613a1c27b69a74ebcca91fa6ea43a5267efddf783bd0d243d711f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e10db0132d159755257aef8c0cee40e5de4a0d3727ce59883e1ff90650e4f35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb7a32827707f8654d36f56fef7e81aff6b80550ef51f7c090a37f53a0e53406\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43ba42d9f8b7b2e60651ffc15c2291e5c9274fd3
2383509b048bfb2ea381d29b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://43ba42d9f8b7b2e60651ffc15c2291e5c9274fd32383509b048bfb2ea381d29b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T12:22:25Z\\\",\\\"message\\\":\\\"ntroller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:25Z is after 2025-08-24T17:21:41Z]\\\\nI0930 12:22:25.795759 6165 services_controller.go:434] Service openshift-kube-controller-manager/kube-controller-manager retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{kube-controller-manager openshift-kube-controller-manager 90927ca1-43e2-420d-8485-a35952e82cd9 4812 0 2025-02-23 05:22:57 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[prometheus:kube-controller-manager] map[operator.openshift.io/spec-hash:bb05a56151ce98d11c8554843985ba99e0498dcafd98129435c2d982c5ea4c11 service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-secret-name:serving-cert service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [] [] []},Spec:ServiceSpec{Ports:[]ServicePort{Service\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:24Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-nznsk_openshift-ovn-kubernetes(5da59bc9-84da-42f6-86e9-3399ecf31725)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7058516ce3ff1d1d2929107a66d739fccb24b4f5589a11510ab42fe728a33196\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4aa532f2ebe19e36977596bd41e8b116c84bcd7d5cd3470986e3192d4d62dea7\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4aa532f2ebe19e36977596bd41e8b116c84bcd7d5cd3470986e3192d4d62dea7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nznsk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:27Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:27 crc kubenswrapper[4672]: I0930 12:22:27.849669 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"763dda13-8721-4cce-8b21-aa1f1b190978\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab6412d7fdf1976c75ccfc72d05f00e2997d72f3971c73cd0887e038d1cbc059\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5575589d0517d5a1c285c3e9e566fb5cd9b9591a1b311a4bb80e595e44044c33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b656dc1ad8af099828eaaeb654e6d52dbbd057023dd07583a6114316670aa334\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5082c3fa78648584ab9a29b4ab239ce0d52411666ca3fa598e9c6745b8ea067e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:21:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:27Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:27 crc kubenswrapper[4672]: I0930 12:22:27.862903 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:27Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:27 crc kubenswrapper[4672]: I0930 12:22:27.875172 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"95794952-d817-48f2-8956-f7a310f8d1d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a5a1de465a4e5deca994f640e87574ebd406be2d7a92c0e54ed8bece4a8e6d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnk64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba615a53d8f50a3891080fd06384bb6e00eca07025bb60fdf1764d9c1d4a5f76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnk64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dpqrd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:27Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:27 crc kubenswrapper[4672]: I0930 12:22:27.887227 4672 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-8q82q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6806ff3c-ab3a-402e-b1c5-cc37c0810a65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://687d353cc41530a0cbee2c4a782b6ae10e9ee58aae0bfe183537c96851611e2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pn7q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:11Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-8q82q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:27Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:27 crc kubenswrapper[4672]: I0930 12:22:27.903324 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-89fj9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72003fad-a0fd-4493-9f3b-6efdab22d14c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd26dcf24863265c55a06cfc788e84363473a6ca5f1b835fa5dbb683ebb82d68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31087335de8b3b895d49c4677350e9ab6cfa0651188653dab05e7eafdab3d05f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31087335de8b3b895d49c4677350e9ab6cfa0651188653dab05e7eafdab3d05f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recu
rsiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a607f773b81f6bdde800c2a36627d7f84602e6c3c777f6531391668dd7c08e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2a607f773b81f6bdde800c2a36627d7f84602e6c3c777f6531391668dd7c08e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:22:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70fc6887f3dde42017d03aa3f247ca4eda56f93c3ec54a95ceabcc1c1b4651d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70fc6887f3dde42017d03aa3f247ca4eda56f93c3ec54a95ceabcc1c1b4651d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:22:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://95e6805719bbdb9859b827860fb87ed979a4397061f76b433189f463c1e7cc39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://95e6
805719bbdb9859b827860fb87ed979a4397061f76b433189f463c1e7cc39\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:22:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85068f88c57bccc42cc7234d13298a30c574379f4a212059a9924553fce8620e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85068f88c57bccc42cc7234d13298a30c574379f4a212059a9924553fce8620e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:22:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4355ef186193e8e0d9344faf8975065f9cd4576e7f6c79054670fda9acf3646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4355ef186193e8e0d9344faf8975065f9cd4576e7f6c79054670fda9acf3646\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:22:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-89fj9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:27Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:27 crc kubenswrapper[4672]: I0930 12:22:27.919568 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ztjlj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"25dc0f4c-76a8-49ff-bf68-c0718cfc3e62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5a3ba5101b35ce4981484637222a0c074b67e3998c273dc763ddbc6ca0c0c86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6ps67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ztjlj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:27Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:27 crc kubenswrapper[4672]: I0930 12:22:27.932531 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-485qk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"db981c53-85b2-4b3c-b025-da893771308f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57231120d83ed4e95272ab2b0bac2146b066dde2b5ca3afe469a3dc36d6187ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2t8xk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://503616f93ea6b36f5e3c95c88471c6d06c281a4d0be102b6cb0edf6a1513f78c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2t8xk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-485qk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:27Z is after 2025-08-24T17:21:41Z" Sep 30 
12:22:27 crc kubenswrapper[4672]: I0930 12:22:27.933093 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:27 crc kubenswrapper[4672]: I0930 12:22:27.933131 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:27 crc kubenswrapper[4672]: I0930 12:22:27.933147 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:27 crc kubenswrapper[4672]: I0930 12:22:27.933167 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:27 crc kubenswrapper[4672]: I0930 12:22:27.933182 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:27Z","lastTransitionTime":"2025-09-30T12:22:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:27 crc kubenswrapper[4672]: I0930 12:22:27.945040 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-n7wwp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"42618cd5-d9f9-45ba-8081-660ca47bebf4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78cfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78cfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:25Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-n7wwp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:27Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:27 crc kubenswrapper[4672]: I0930 12:22:27.961763 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e650bb6-f3f0-48f4-b169-1e56a907e88c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://824f50fb36d4c4aee43cc92af10bc43c441f2cee0ee736431dfdc52d254b819a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2786191730ceba9bbcf35f5355098bc52b09c37d20b923f7c6c65fd1b584cea0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e5b3d468173222d5ae851529dbc318b7d727ec4af468e4094ab08b4f29bf100\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://217947162b121381c06a210145feb0b9bd45d4ad595fcd83b1a566bb6448f0ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8d2b013e19f7ff8f3eb9c42338191b2da273ce85db8bcc6be8dd1f69e83c3be\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T12:22:10Z\\\",\\\"message\\\":\\\"falling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0930 12:22:05.219065 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 12:22:05.219861 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-72394925/tls.crt::/tmp/serving-cert-72394925/tls.key\\\\\\\"\\\\nI0930 12:22:10.657058 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 12:22:10.659952 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 12:22:10.659972 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 12:22:10.659997 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 12:22:10.660003 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 12:22:10.667742 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0930 12:22:10.667775 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 12:22:10.667797 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 12:22:10.667801 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 12:22:10.667808 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 12:22:10.667812 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 12:22:10.667814 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 12:22:10.667817 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0930 12:22:10.670654 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T12:21:54Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c3f5020d6f8fba5f79d7659bedba02479794acb1f5f3d219802966c781fd10d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:52Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://089787496c46eb514e279677b1853010b21eac240fb059e43866714d2a6920e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://089787496c46eb514e279677b1853010b21eac240fb059e43866714d2a6920e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:21:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:21:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:21:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:27Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:27 crc kubenswrapper[4672]: I0930 12:22:27.989095 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4339dd7c-5112-40c5-9b58-050ce364a2b9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7db3193c2afadb91d3220e67720ff33bc959f9c50cec6f243144a6e2a2e57e04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5f54820ce67d6e8f397d068e1fadcb4577d6685406812c4e29fe0274612697d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1351a4e8380fea9edb834239053b57384a50add562afe59e67bbd7159bf1bf6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b9741c5edaae364733b400bebf21a4674ec206
194908ec6a942317235c62a14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://775be30891ac2947ef160e3fe75abad42b78fc39e35414f27756ce086fed4f7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27513f0766801d5155c6e5c376a2a52241030eca2171b3218710ca6ee006a0b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27513f0766801d5155c6e5c376a2a52241030eca2171b3218710ca6ee006a0b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:21:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:21:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0afdc82c21601c37c7d9b4897273ade1e8e35bd093d63df9a63e93a9c0c6228c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0afdc82c21601c37c7d9b4897273ade1e8e35bd093d63df9a63e93a9c0c6228c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:21:51Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://52fa5ee34bc8e9e381d4288d30fce5db76ed5bde6f199ec13fe443481ed4c8e8\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52fa5ee34bc8e9e381d4288d30fce5db76ed5bde6f199ec13fe443481ed4c8e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:21:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:21:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:21:49Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:27Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:28 crc kubenswrapper[4672]: I0930 12:22:28.005738 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:28Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:28 crc kubenswrapper[4672]: I0930 12:22:28.018892 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bh5lq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96949d92-2365-41eb-8f62-c264c8328c02\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6caf59dcbea4e0beb0879de377fe92cfb0b3ff55668aee3f10822adacf97fd8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkb7v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bh5lq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-09-30T12:22:28Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:28 crc kubenswrapper[4672]: I0930 12:22:28.033026 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ece716e9efed328b509480f11b1a6aa253fe23776bf45f49d4110cf3b4acbab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:28Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:28 crc kubenswrapper[4672]: I0930 12:22:28.036203 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:28 crc kubenswrapper[4672]: I0930 12:22:28.036259 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:28 crc kubenswrapper[4672]: I0930 12:22:28.036303 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:28 crc kubenswrapper[4672]: I0930 12:22:28.036328 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:28 crc kubenswrapper[4672]: I0930 12:22:28.036343 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:28Z","lastTransitionTime":"2025-09-30T12:22:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:22:28 crc kubenswrapper[4672]: I0930 12:22:28.139727 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:28 crc kubenswrapper[4672]: I0930 12:22:28.139796 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:28 crc kubenswrapper[4672]: I0930 12:22:28.139812 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:28 crc kubenswrapper[4672]: I0930 12:22:28.139837 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:28 crc kubenswrapper[4672]: I0930 12:22:28.139855 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:28Z","lastTransitionTime":"2025-09-30T12:22:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:28 crc kubenswrapper[4672]: I0930 12:22:28.242195 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:28 crc kubenswrapper[4672]: I0930 12:22:28.242245 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:28 crc kubenswrapper[4672]: I0930 12:22:28.242257 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:28 crc kubenswrapper[4672]: I0930 12:22:28.242300 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:28 crc kubenswrapper[4672]: I0930 12:22:28.242314 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:28Z","lastTransitionTime":"2025-09-30T12:22:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:28 crc kubenswrapper[4672]: I0930 12:22:28.345689 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:28 crc kubenswrapper[4672]: I0930 12:22:28.345769 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:28 crc kubenswrapper[4672]: I0930 12:22:28.345789 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:28 crc kubenswrapper[4672]: I0930 12:22:28.345819 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:28 crc kubenswrapper[4672]: I0930 12:22:28.345841 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:28Z","lastTransitionTime":"2025-09-30T12:22:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:22:28 crc kubenswrapper[4672]: I0930 12:22:28.416185 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 12:22:28 crc kubenswrapper[4672]: E0930 12:22:28.416683 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 12:22:28 crc kubenswrapper[4672]: I0930 12:22:28.450408 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:28 crc kubenswrapper[4672]: I0930 12:22:28.450486 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:28 crc kubenswrapper[4672]: I0930 12:22:28.450513 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:28 crc kubenswrapper[4672]: I0930 12:22:28.450552 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:28 crc kubenswrapper[4672]: I0930 12:22:28.450582 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:28Z","lastTransitionTime":"2025-09-30T12:22:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:28 crc kubenswrapper[4672]: I0930 12:22:28.554341 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:28 crc kubenswrapper[4672]: I0930 12:22:28.554424 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:28 crc kubenswrapper[4672]: I0930 12:22:28.554447 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:28 crc kubenswrapper[4672]: I0930 12:22:28.554475 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:28 crc kubenswrapper[4672]: I0930 12:22:28.554496 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:28Z","lastTransitionTime":"2025-09-30T12:22:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:22:28 crc kubenswrapper[4672]: I0930 12:22:28.658255 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:28 crc kubenswrapper[4672]: I0930 12:22:28.658387 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:28 crc kubenswrapper[4672]: I0930 12:22:28.658421 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:28 crc kubenswrapper[4672]: I0930 12:22:28.658459 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:28 crc kubenswrapper[4672]: I0930 12:22:28.658484 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:28Z","lastTransitionTime":"2025-09-30T12:22:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:28 crc kubenswrapper[4672]: I0930 12:22:28.760773 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:28 crc kubenswrapper[4672]: I0930 12:22:28.760835 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:28 crc kubenswrapper[4672]: I0930 12:22:28.760861 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:28 crc kubenswrapper[4672]: I0930 12:22:28.760889 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:28 crc kubenswrapper[4672]: I0930 12:22:28.760920 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:28Z","lastTransitionTime":"2025-09-30T12:22:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:28 crc kubenswrapper[4672]: I0930 12:22:28.866754 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:28 crc kubenswrapper[4672]: I0930 12:22:28.866814 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:28 crc kubenswrapper[4672]: I0930 12:22:28.866840 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:28 crc kubenswrapper[4672]: I0930 12:22:28.866860 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:28 crc kubenswrapper[4672]: I0930 12:22:28.866875 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:28Z","lastTransitionTime":"2025-09-30T12:22:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:22:28 crc kubenswrapper[4672]: I0930 12:22:28.970719 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:28 crc kubenswrapper[4672]: I0930 12:22:28.970778 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:28 crc kubenswrapper[4672]: I0930 12:22:28.970797 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:28 crc kubenswrapper[4672]: I0930 12:22:28.970825 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:28 crc kubenswrapper[4672]: I0930 12:22:28.970855 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:28Z","lastTransitionTime":"2025-09-30T12:22:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:29 crc kubenswrapper[4672]: I0930 12:22:29.074880 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:29 crc kubenswrapper[4672]: I0930 12:22:29.075429 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:29 crc kubenswrapper[4672]: I0930 12:22:29.075701 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:29 crc kubenswrapper[4672]: I0930 12:22:29.075926 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:29 crc kubenswrapper[4672]: I0930 12:22:29.076131 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:29Z","lastTransitionTime":"2025-09-30T12:22:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:29 crc kubenswrapper[4672]: I0930 12:22:29.180202 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:29 crc kubenswrapper[4672]: I0930 12:22:29.180318 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:29 crc kubenswrapper[4672]: I0930 12:22:29.180344 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:29 crc kubenswrapper[4672]: I0930 12:22:29.180387 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:29 crc kubenswrapper[4672]: I0930 12:22:29.180413 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:29Z","lastTransitionTime":"2025-09-30T12:22:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:22:29 crc kubenswrapper[4672]: I0930 12:22:29.284636 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:29 crc kubenswrapper[4672]: I0930 12:22:29.284705 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:29 crc kubenswrapper[4672]: I0930 12:22:29.284723 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:29 crc kubenswrapper[4672]: I0930 12:22:29.284751 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:29 crc kubenswrapper[4672]: I0930 12:22:29.284776 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:29Z","lastTransitionTime":"2025-09-30T12:22:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:29 crc kubenswrapper[4672]: I0930 12:22:29.388861 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:29 crc kubenswrapper[4672]: I0930 12:22:29.388991 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:29 crc kubenswrapper[4672]: I0930 12:22:29.389013 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:29 crc kubenswrapper[4672]: I0930 12:22:29.389044 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:29 crc kubenswrapper[4672]: I0930 12:22:29.389066 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:29Z","lastTransitionTime":"2025-09-30T12:22:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:29 crc kubenswrapper[4672]: I0930 12:22:29.416641 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 12:22:29 crc kubenswrapper[4672]: E0930 12:22:29.416816 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 12:22:29 crc kubenswrapper[4672]: I0930 12:22:29.417599 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-n7wwp" Sep 30 12:22:29 crc kubenswrapper[4672]: E0930 12:22:29.417848 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n7wwp" podUID="42618cd5-d9f9-45ba-8081-660ca47bebf4" Sep 30 12:22:29 crc kubenswrapper[4672]: I0930 12:22:29.417919 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 12:22:29 crc kubenswrapper[4672]: E0930 12:22:29.418135 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 12:22:29 crc kubenswrapper[4672]: I0930 12:22:29.442250 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:29Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:29 crc kubenswrapper[4672]: I0930 12:22:29.461962 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"95794952-d817-48f2-8956-f7a310f8d1d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a5a1de465a4e5deca994f640e87574ebd406be2d7a92c0e54ed8bece4a8e6d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnk64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba615a53d8f50a3891080fd06384bb6e00eca07025bb60fdf1764d9c1d4a5f76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnk64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dpqrd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:29Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:29 crc kubenswrapper[4672]: I0930 12:22:29.479866 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8q82q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6806ff3c-ab3a-402e-b1c5-cc37c0810a65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://687d353cc41530a0cbee2c4a782b6ae10e9ee58aae0bfe183537c96851611e2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-l
ib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pn7q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8q82q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:29Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:29 crc kubenswrapper[4672]: I0930 12:22:29.492597 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:29 crc kubenswrapper[4672]: I0930 12:22:29.492664 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:29 crc kubenswrapper[4672]: I0930 12:22:29.492682 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:29 crc kubenswrapper[4672]: I0930 12:22:29.492708 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:29 crc kubenswrapper[4672]: I0930 12:22:29.492726 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:29Z","lastTransitionTime":"2025-09-30T12:22:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:22:29 crc kubenswrapper[4672]: I0930 12:22:29.498065 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-89fj9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72003fad-a0fd-4493-9f3b-6efdab22d14c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd26dcf24863265c55a06cfc788e84363473a6ca5f1b835fa5dbb683ebb82d68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31087335de8b3b895d49c4677350e9ab6cfa0651188653dab05e7eafdab3d05f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31087335de8b3b895d49c4677350e9ab6cfa0651188653dab05e7eafdab3d05f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a607f773b81f6bdde800c2a36627d7f84602e6c3c777f6531391668dd7c08e6\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2a607f773b81f6bdde800c2a36627d7f84602e6c3c777f6531391668dd7c08e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:22:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70fc6887f3dde42017d03aa3f247ca4eda56f93c3ec54a95ceabcc1c1b4651d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70fc6887f3dde42017d03aa3f247ca4eda56f93c3ec54a95ceabcc1c1b4651d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:22:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://95e6805719bbdb9859b827860fb87ed979a4397061f76b433189f463c1e7cc39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://95e6805719bbdb9859b827860fb87ed979a4397061f76b433189f463c1e7cc39\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:22:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85068f88c57bccc42cc7234d13298a30c574379f4a212059a9924553fce8620e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85068f88c57bccc42cc7234d13298a30c574379f4a212059a9924553fce8620e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:22:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4355ef186193e8e0d9344faf8975065f9cd4576e7f6c79054670fda9acf3646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4355ef186193e8e0d9344faf8975065f9cd4576e7f6c79054670fda9acf3646\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:22:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-89fj9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:29Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:29 crc kubenswrapper[4672]: I0930 12:22:29.512097 4672 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-image-registry/node-ca-ztjlj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"25dc0f4c-76a8-49ff-bf68-c0718cfc3e62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5a3ba5101b35ce4981484637222a0c074b67e3998c273dc763ddbc6ca0c0c86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6ps67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ztjlj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:29Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:29 crc kubenswrapper[4672]: I0930 12:22:29.530345 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-485qk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"db981c53-85b2-4b3c-b025-da893771308f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57231120d83ed4e95272ab2b0bac2146b066dde2b5ca3afe469a3dc36d6187ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2t8xk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://503616f93ea6b36f5e3c95c88471c6d06c281a4d0be102b6cb0edf6a1513f78c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2t8xk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-485qk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:29Z is after 2025-08-24T17:21:41Z" Sep 30 
12:22:29 crc kubenswrapper[4672]: I0930 12:22:29.543849 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-n7wwp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"42618cd5-d9f9-45ba-8081-660ca47bebf4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78cfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78cfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:25Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-n7wwp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:29Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:29 crc kubenswrapper[4672]: I0930 12:22:29.559453 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e650bb6-f3f0-48f4-b169-1e56a907e88c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://824f50fb36d4c4aee43cc92af10bc43c441f2cee0ee736431dfdc52d254b819a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2786191730ceba9bbcf35f5355098bc52b09c37d20b923f7c6c65fd1b584cea0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e5b3d468173222d5ae851529dbc318b7d727ec4af468e4094ab08b4f29bf100\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://217947162b121381c06a210145feb0b9bd45d4ad595fcd83b1a566bb6448f0ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8d2b013e19f7ff8f3eb9c42338191b2da273ce85db8bcc6be8dd1f69e83c3be\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T12:22:10Z\\\",\\\"message\\\":\\\"falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0930 12:22:05.219065 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 12:22:05.219861 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-72394925/tls.crt::/tmp/serving-cert-72394925/tls.key\\\\\\\"\\\\nI0930 12:22:10.657058 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 12:22:10.659952 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 12:22:10.659972 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 12:22:10.659997 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 12:22:10.660003 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 12:22:10.667742 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0930 12:22:10.667775 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 12:22:10.667797 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 12:22:10.667801 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 12:22:10.667808 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 12:22:10.667812 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 12:22:10.667814 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 12:22:10.667817 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0930 12:22:10.670654 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T12:21:54Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c3f5020d6f8fba5f79d7659bedba02479794acb1f5f3d219802966c781fd10d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:52Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://089787496c46eb514e279677b1853010b21eac240fb059e43866714d2a6920e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://089787496c46eb514e279677b1853010b21eac240fb059e43866714d2a6920e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:21:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:21:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:21:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:29Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:29 crc kubenswrapper[4672]: I0930 12:22:29.593770 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4339dd7c-5112-40c5-9b58-050ce364a2b9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7db3193c2afadb91d3220e67720ff33bc959f9c50cec6f243144a6e2a2e57e04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5f54820ce67d6e8f397d068e1fadcb4577d6685406812c4e29fe0274612697d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1351a4e8380fea9edb834239053b57384a50add562afe59e67bbd7159bf1bf6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b9741c5edaae364733b400bebf21a4674ec206
194908ec6a942317235c62a14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://775be30891ac2947ef160e3fe75abad42b78fc39e35414f27756ce086fed4f7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27513f0766801d5155c6e5c376a2a52241030eca2171b3218710ca6ee006a0b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27513f0766801d5155c6e5c376a2a52241030eca2171b3218710ca6ee006a0b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:21:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:21:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0afdc82c21601c37c7d9b4897273ade1e8e35bd093d63df9a63e93a9c0c6228c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0afdc82c21601c37c7d9b4897273ade1e8e35bd093d63df9a63e93a9c0c6228c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:21:51Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://52fa5ee34bc8e9e381d4288d30fce5db76ed5bde6f199ec13fe443481ed4c8e8\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52fa5ee34bc8e9e381d4288d30fce5db76ed5bde6f199ec13fe443481ed4c8e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:21:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:21:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:21:49Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:29Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:29 crc kubenswrapper[4672]: I0930 12:22:29.595665 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:29 crc kubenswrapper[4672]: I0930 12:22:29.595735 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:29 crc kubenswrapper[4672]: I0930 12:22:29.595754 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:29 crc kubenswrapper[4672]: I0930 12:22:29.595875 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:29 crc kubenswrapper[4672]: I0930 12:22:29.595898 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:29Z","lastTransitionTime":"2025-09-30T12:22:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:22:29 crc kubenswrapper[4672]: I0930 12:22:29.610511 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:29Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:29 crc kubenswrapper[4672]: I0930 12:22:29.620686 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bh5lq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"96949d92-2365-41eb-8f62-c264c8328c02\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6caf59dcbea4e0beb0879de377fe92cfb0b3ff55668aee3f10822adacf97fd8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkb7v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bh5lq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:29Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:29 crc kubenswrapper[4672]: I0930 12:22:29.623908 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/42618cd5-d9f9-45ba-8081-660ca47bebf4-metrics-certs\") pod \"network-metrics-daemon-n7wwp\" (UID: \"42618cd5-d9f9-45ba-8081-660ca47bebf4\") " pod="openshift-multus/network-metrics-daemon-n7wwp" Sep 30 12:22:29 crc kubenswrapper[4672]: E0930 12:22:29.624133 4672 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Sep 30 12:22:29 crc kubenswrapper[4672]: E0930 12:22:29.624204 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/42618cd5-d9f9-45ba-8081-660ca47bebf4-metrics-certs podName:42618cd5-d9f9-45ba-8081-660ca47bebf4 nodeName:}" failed. No retries permitted until 2025-09-30 12:22:33.624180701 +0000 UTC m=+44.893418347 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/42618cd5-d9f9-45ba-8081-660ca47bebf4-metrics-certs") pod "network-metrics-daemon-n7wwp" (UID: "42618cd5-d9f9-45ba-8081-660ca47bebf4") : object "openshift-multus"/"metrics-daemon-secret" not registered Sep 30 12:22:29 crc kubenswrapper[4672]: I0930 12:22:29.636429 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ece716e9efed328b509480f11b1a6aa253fe23776bf45f49d4110cf3b4acbab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:29Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:29 crc kubenswrapper[4672]: I0930 12:22:29.651149 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2761ae4b8b1c7e1d00ee269219ff0ab04aedce9e1bddc6e0a029c496ee89c0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:29Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:29 crc kubenswrapper[4672]: I0930 12:22:29.664052 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:29Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:29 crc kubenswrapper[4672]: I0930 12:22:29.677491 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://534671434fa48a07e7836049331af60307cf651f4ddde83856ad98e006b32040\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfb0bceb3e99196adb442abedd5215938781b908c2aa92178c9db76672664620\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:11Z\\\"}},\\\"volumeMounts\\\":[{\
\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:29Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:29 crc kubenswrapper[4672]: I0930 12:22:29.698826 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:29 crc kubenswrapper[4672]: I0930 12:22:29.698881 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:29 crc kubenswrapper[4672]: I0930 12:22:29.698893 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:29 crc kubenswrapper[4672]: I0930 12:22:29.698916 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:29 crc kubenswrapper[4672]: I0930 12:22:29.698932 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:29Z","lastTransitionTime":"2025-09-30T12:22:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:22:29 crc kubenswrapper[4672]: I0930 12:22:29.706625 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nznsk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5da59bc9-84da-42f6-86e9-3399ecf31725\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8e8be814fc5bfb8db3632cda1fcf749a02b642c8e7ff479e59ce5d9f987a756\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2131429928dbbd60e2d17b7ada4689ff5d161d1df921d74a8b9f71df9e11899\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\
":\\\"cri-o://114d67fea982a6c6475e1925f670ba7dae14ace593022f77ef7c6441dde49687\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b73fba4d62613a1c27b69a74ebcca91fa6ea43a5267efddf783bd0d243d711f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e10db0132d159755257aef8c0cee40e5de4a0d3727ce59883e1ff90650e4f35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb7a32827707f8654d36f56fef7e81aff6b80550ef51f7c090a37f53a0e53406\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43ba42d9f8b7b2e60651ffc15c2291e5c9274fd32383509b048bfb2ea381d29b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://43ba42d9f8b7b2e60651ffc15c2291e5c9274fd32383509b048bfb2ea381d29b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T12:22:25Z\\\",\\\"message\\\":\\\"ntroller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:25Z is after 2025-08-24T17:21:41Z]\\\\nI0930 12:22:25.795759 6165 services_controller.go:434] Service openshift-kube-controller-manager/kube-controller-manager retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{kube-controller-manager openshift-kube-controller-manager 90927ca1-43e2-420d-8485-a35952e82cd9 4812 0 2025-02-23 05:22:57 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[prometheus:kube-controller-manager] map[operator.openshift.io/spec-hash:bb05a56151ce98d11c8554843985ba99e0498dcafd98129435c2d982c5ea4c11 service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-secret-name:serving-cert service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [] [] []},Spec:ServiceSpec{Ports:[]ServicePort{Service\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:24Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-nznsk_openshift-ovn-kubernetes(5da59bc9-84da-42f6-86e9-3399ecf31725)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7058516ce3ff1d1d2929107a66d739fccb24b4f5589a11510ab42fe728a33196\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4aa532f2ebe19e36977596bd41e8b116c84bcd7d5cd3470986e3192d4d62dea7\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4aa532f2ebe19e36977596bd41e8b116c84bcd7d5cd3470986e3192d4d62dea7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nznsk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:29Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:29 crc kubenswrapper[4672]: I0930 12:22:29.723957 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"763dda13-8721-4cce-8b21-aa1f1b190978\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab6412d7fdf1976c75ccfc72d05f00e2997d72f3971c73cd0887e038d1cbc059\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5575589d0517d5a1c285c3e9e566fb5cd9b9591a1b311a4bb80e595e44044c33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b656dc1ad8af099828eaaeb654e6d52dbbd057023dd07583a6114316670aa334\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5082c3fa78648584ab9a29b4ab239ce0d52411666ca3fa598e9c6745b8ea067e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:21:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:29Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:29 crc kubenswrapper[4672]: I0930 12:22:29.802671 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:29 crc kubenswrapper[4672]: I0930 12:22:29.802737 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:29 crc kubenswrapper[4672]: I0930 12:22:29.802751 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Sep 30 12:22:29 crc kubenswrapper[4672]: I0930 12:22:29.802772 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:29 crc kubenswrapper[4672]: I0930 12:22:29.802786 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:29Z","lastTransitionTime":"2025-09-30T12:22:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:29 crc kubenswrapper[4672]: I0930 12:22:29.906114 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:29 crc kubenswrapper[4672]: I0930 12:22:29.906170 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:29 crc kubenswrapper[4672]: I0930 12:22:29.906180 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:29 crc kubenswrapper[4672]: I0930 12:22:29.906199 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:29 crc kubenswrapper[4672]: I0930 12:22:29.906212 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:29Z","lastTransitionTime":"2025-09-30T12:22:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:30 crc kubenswrapper[4672]: I0930 12:22:30.009583 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:30 crc kubenswrapper[4672]: I0930 12:22:30.009641 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:30 crc kubenswrapper[4672]: I0930 12:22:30.009655 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:30 crc kubenswrapper[4672]: I0930 12:22:30.009680 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:30 crc kubenswrapper[4672]: I0930 12:22:30.009696 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:30Z","lastTransitionTime":"2025-09-30T12:22:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:22:30 crc kubenswrapper[4672]: I0930 12:22:30.112906 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:30 crc kubenswrapper[4672]: I0930 12:22:30.113194 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:30 crc kubenswrapper[4672]: I0930 12:22:30.113300 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:30 crc kubenswrapper[4672]: I0930 12:22:30.113389 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:30 crc kubenswrapper[4672]: I0930 12:22:30.113450 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:30Z","lastTransitionTime":"2025-09-30T12:22:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:30 crc kubenswrapper[4672]: I0930 12:22:30.216389 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:30 crc kubenswrapper[4672]: I0930 12:22:30.216727 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:30 crc kubenswrapper[4672]: I0930 12:22:30.216952 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:30 crc kubenswrapper[4672]: I0930 12:22:30.217145 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:30 crc kubenswrapper[4672]: I0930 12:22:30.217435 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:30Z","lastTransitionTime":"2025-09-30T12:22:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:30 crc kubenswrapper[4672]: I0930 12:22:30.321002 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:30 crc kubenswrapper[4672]: I0930 12:22:30.321058 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:30 crc kubenswrapper[4672]: I0930 12:22:30.321076 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:30 crc kubenswrapper[4672]: I0930 12:22:30.321099 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:30 crc kubenswrapper[4672]: I0930 12:22:30.321117 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:30Z","lastTransitionTime":"2025-09-30T12:22:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:22:30 crc kubenswrapper[4672]: I0930 12:22:30.416672 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 12:22:30 crc kubenswrapper[4672]: E0930 12:22:30.416854 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 12:22:30 crc kubenswrapper[4672]: I0930 12:22:30.423882 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:30 crc kubenswrapper[4672]: I0930 12:22:30.423956 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:30 crc kubenswrapper[4672]: I0930 12:22:30.423981 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:30 crc kubenswrapper[4672]: I0930 12:22:30.424009 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:30 crc kubenswrapper[4672]: I0930 12:22:30.424033 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:30Z","lastTransitionTime":"2025-09-30T12:22:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:30 crc kubenswrapper[4672]: I0930 12:22:30.527964 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:30 crc kubenswrapper[4672]: I0930 12:22:30.528046 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:30 crc kubenswrapper[4672]: I0930 12:22:30.528069 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:30 crc kubenswrapper[4672]: I0930 12:22:30.528099 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:30 crc kubenswrapper[4672]: I0930 12:22:30.528121 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:30Z","lastTransitionTime":"2025-09-30T12:22:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:22:30 crc kubenswrapper[4672]: I0930 12:22:30.631801 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:30 crc kubenswrapper[4672]: I0930 12:22:30.631875 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:30 crc kubenswrapper[4672]: I0930 12:22:30.631899 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:30 crc kubenswrapper[4672]: I0930 12:22:30.631929 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:30 crc kubenswrapper[4672]: I0930 12:22:30.631953 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:30Z","lastTransitionTime":"2025-09-30T12:22:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:30 crc kubenswrapper[4672]: I0930 12:22:30.735714 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:30 crc kubenswrapper[4672]: I0930 12:22:30.735806 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:30 crc kubenswrapper[4672]: I0930 12:22:30.735828 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:30 crc kubenswrapper[4672]: I0930 12:22:30.735849 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:30 crc kubenswrapper[4672]: I0930 12:22:30.735864 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:30Z","lastTransitionTime":"2025-09-30T12:22:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:30 crc kubenswrapper[4672]: I0930 12:22:30.838927 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:30 crc kubenswrapper[4672]: I0930 12:22:30.838988 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:30 crc kubenswrapper[4672]: I0930 12:22:30.839004 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:30 crc kubenswrapper[4672]: I0930 12:22:30.839027 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:30 crc kubenswrapper[4672]: I0930 12:22:30.839046 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:30Z","lastTransitionTime":"2025-09-30T12:22:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:22:30 crc kubenswrapper[4672]: I0930 12:22:30.942069 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:30 crc kubenswrapper[4672]: I0930 12:22:30.942167 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:30 crc kubenswrapper[4672]: I0930 12:22:30.942187 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:30 crc kubenswrapper[4672]: I0930 12:22:30.942211 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:30 crc kubenswrapper[4672]: I0930 12:22:30.942229 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:30Z","lastTransitionTime":"2025-09-30T12:22:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:31 crc kubenswrapper[4672]: I0930 12:22:31.045959 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:31 crc kubenswrapper[4672]: I0930 12:22:31.046011 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:31 crc kubenswrapper[4672]: I0930 12:22:31.046024 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:31 crc kubenswrapper[4672]: I0930 12:22:31.046046 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:31 crc kubenswrapper[4672]: I0930 12:22:31.046059 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:31Z","lastTransitionTime":"2025-09-30T12:22:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:31 crc kubenswrapper[4672]: I0930 12:22:31.149307 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:31 crc kubenswrapper[4672]: I0930 12:22:31.149376 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:31 crc kubenswrapper[4672]: I0930 12:22:31.149393 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:31 crc kubenswrapper[4672]: I0930 12:22:31.149415 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:31 crc kubenswrapper[4672]: I0930 12:22:31.149433 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:31Z","lastTransitionTime":"2025-09-30T12:22:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:22:31 crc kubenswrapper[4672]: I0930 12:22:31.252079 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:31 crc kubenswrapper[4672]: I0930 12:22:31.252155 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:31 crc kubenswrapper[4672]: I0930 12:22:31.252172 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:31 crc kubenswrapper[4672]: I0930 12:22:31.252200 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:31 crc kubenswrapper[4672]: I0930 12:22:31.252216 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:31Z","lastTransitionTime":"2025-09-30T12:22:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:31 crc kubenswrapper[4672]: I0930 12:22:31.355218 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:31 crc kubenswrapper[4672]: I0930 12:22:31.355316 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:31 crc kubenswrapper[4672]: I0930 12:22:31.355330 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:31 crc kubenswrapper[4672]: I0930 12:22:31.355372 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:31 crc kubenswrapper[4672]: I0930 12:22:31.355385 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:31Z","lastTransitionTime":"2025-09-30T12:22:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:31 crc kubenswrapper[4672]: I0930 12:22:31.417063 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 12:22:31 crc kubenswrapper[4672]: I0930 12:22:31.417128 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n7wwp" Sep 30 12:22:31 crc kubenswrapper[4672]: E0930 12:22:31.417231 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 12:22:31 crc kubenswrapper[4672]: I0930 12:22:31.417254 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 12:22:31 crc kubenswrapper[4672]: E0930 12:22:31.417460 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n7wwp" podUID="42618cd5-d9f9-45ba-8081-660ca47bebf4" Sep 30 12:22:31 crc kubenswrapper[4672]: E0930 12:22:31.417697 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 12:22:31 crc kubenswrapper[4672]: I0930 12:22:31.458711 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:31 crc kubenswrapper[4672]: I0930 12:22:31.458837 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:31 crc kubenswrapper[4672]: I0930 12:22:31.458889 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:31 crc kubenswrapper[4672]: I0930 12:22:31.458924 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:31 crc kubenswrapper[4672]: I0930 12:22:31.458949 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:31Z","lastTransitionTime":"2025-09-30T12:22:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:31 crc kubenswrapper[4672]: I0930 12:22:31.561922 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:31 crc kubenswrapper[4672]: I0930 12:22:31.561996 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:31 crc kubenswrapper[4672]: I0930 12:22:31.562011 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:31 crc kubenswrapper[4672]: I0930 12:22:31.562039 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:31 crc kubenswrapper[4672]: I0930 12:22:31.562057 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:31Z","lastTransitionTime":"2025-09-30T12:22:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Sep 30 12:22:31 crc kubenswrapper[4672]: I0930 12:22:31.665031 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 12:22:31 crc kubenswrapper[4672]: I0930 12:22:31.665141 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 12:22:31 crc kubenswrapper[4672]: I0930 12:22:31.665162 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 12:22:31 crc kubenswrapper[4672]: I0930 12:22:31.665183 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 12:22:31 crc kubenswrapper[4672]: I0930 12:22:31.665199 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:31Z","lastTransitionTime":"2025-09-30T12:22:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 12:22:31 crc kubenswrapper[4672]: I0930 12:22:31.768450 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 12:22:31 crc kubenswrapper[4672]: I0930 12:22:31.768536 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 12:22:31 crc kubenswrapper[4672]: I0930 12:22:31.768555 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 12:22:31 crc kubenswrapper[4672]: I0930 12:22:31.768581 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 12:22:31 crc kubenswrapper[4672]: I0930 12:22:31.768601 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:31Z","lastTransitionTime":"2025-09-30T12:22:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 12:22:31 crc kubenswrapper[4672]: I0930 12:22:31.872584 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 12:22:31 crc kubenswrapper[4672]: I0930 12:22:31.872695 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 12:22:31 crc kubenswrapper[4672]: I0930 12:22:31.872714 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 12:22:31 crc kubenswrapper[4672]: I0930 12:22:31.872742 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 12:22:31 crc kubenswrapper[4672]: I0930 12:22:31.872762 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:31Z","lastTransitionTime":"2025-09-30T12:22:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"} Sep 30 12:22:31 crc kubenswrapper[4672]: I0930 12:22:31.975689 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:31 crc kubenswrapper[4672]: I0930 12:22:31.975753 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:31 crc kubenswrapper[4672]: I0930 12:22:31.975770 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:31 crc kubenswrapper[4672]: I0930 12:22:31.975793 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:31 crc kubenswrapper[4672]: I0930 12:22:31.975813 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:31Z","lastTransitionTime":"2025-09-30T12:22:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:32 crc kubenswrapper[4672]: I0930 12:22:32.079021 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:32 crc kubenswrapper[4672]: I0930 12:22:32.079072 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:32 crc kubenswrapper[4672]: I0930 12:22:32.079091 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:32 crc kubenswrapper[4672]: I0930 12:22:32.079115 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:32 crc kubenswrapper[4672]: I0930 12:22:32.079133 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:32Z","lastTransitionTime":"2025-09-30T12:22:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:32 crc kubenswrapper[4672]: I0930 12:22:32.182053 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:32 crc kubenswrapper[4672]: I0930 12:22:32.182148 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:32 crc kubenswrapper[4672]: I0930 12:22:32.182172 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:32 crc kubenswrapper[4672]: I0930 12:22:32.182205 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:32 crc kubenswrapper[4672]: I0930 12:22:32.182232 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:32Z","lastTransitionTime":"2025-09-30T12:22:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Sep 30 12:22:32 crc kubenswrapper[4672]: I0930 12:22:32.285963 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 12:22:32 crc kubenswrapper[4672]: I0930 12:22:32.286019 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 12:22:32 crc kubenswrapper[4672]: I0930 12:22:32.286035 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 12:22:32 crc kubenswrapper[4672]: I0930 12:22:32.286058 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 12:22:32 crc kubenswrapper[4672]: I0930 12:22:32.286074 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:32Z","lastTransitionTime":"2025-09-30T12:22:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 12:22:32 crc kubenswrapper[4672]: I0930 12:22:32.389886 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 12:22:32 crc kubenswrapper[4672]: I0930 12:22:32.389952 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 12:22:32 crc kubenswrapper[4672]: I0930 12:22:32.389974 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 12:22:32 crc kubenswrapper[4672]: I0930 12:22:32.390002 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 12:22:32 crc kubenswrapper[4672]: I0930 12:22:32.390024 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:32Z","lastTransitionTime":"2025-09-30T12:22:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 12:22:32 crc kubenswrapper[4672]: I0930 12:22:32.416483 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Sep 30 12:22:32 crc kubenswrapper[4672]: E0930 12:22:32.416656 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Sep 30 12:22:32 crc kubenswrapper[4672]: I0930 12:22:32.511858 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 12:22:32 crc kubenswrapper[4672]: I0930 12:22:32.511920 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 12:22:32 crc kubenswrapper[4672]: I0930 12:22:32.511968 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 12:22:32 crc kubenswrapper[4672]: I0930 12:22:32.511993 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 12:22:32 crc kubenswrapper[4672]: I0930 12:22:32.512009 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:32Z","lastTransitionTime":"2025-09-30T12:22:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 12:22:32 crc kubenswrapper[4672]: I0930 12:22:32.615487 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 12:22:32 crc kubenswrapper[4672]: I0930 12:22:32.615574 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 12:22:32 crc kubenswrapper[4672]: I0930 12:22:32.615589 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 12:22:32 crc kubenswrapper[4672]: I0930 12:22:32.615624 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 12:22:32 crc kubenswrapper[4672]: I0930 12:22:32.615639 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:32Z","lastTransitionTime":"2025-09-30T12:22:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"} Sep 30 12:22:32 crc kubenswrapper[4672]: I0930 12:22:32.718957 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:32 crc kubenswrapper[4672]: I0930 12:22:32.719021 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:32 crc kubenswrapper[4672]: I0930 12:22:32.719037 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:32 crc kubenswrapper[4672]: I0930 12:22:32.719059 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:32 crc kubenswrapper[4672]: I0930 12:22:32.719076 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:32Z","lastTransitionTime":"2025-09-30T12:22:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:32 crc kubenswrapper[4672]: I0930 12:22:32.820916 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:32 crc kubenswrapper[4672]: I0930 12:22:32.820968 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:32 crc kubenswrapper[4672]: I0930 12:22:32.820980 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:32 crc kubenswrapper[4672]: I0930 12:22:32.820996 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:32 crc kubenswrapper[4672]: I0930 12:22:32.821007 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:32Z","lastTransitionTime":"2025-09-30T12:22:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:32 crc kubenswrapper[4672]: I0930 12:22:32.924037 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:32 crc kubenswrapper[4672]: I0930 12:22:32.924104 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:32 crc kubenswrapper[4672]: I0930 12:22:32.924123 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:32 crc kubenswrapper[4672]: I0930 12:22:32.924161 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:32 crc kubenswrapper[4672]: I0930 12:22:32.924221 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:32Z","lastTransitionTime":"2025-09-30T12:22:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:22:33 crc kubenswrapper[4672]: I0930 12:22:33.027769 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:33 crc kubenswrapper[4672]: I0930 12:22:33.027841 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:33 crc kubenswrapper[4672]: I0930 12:22:33.027867 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:33 crc kubenswrapper[4672]: I0930 12:22:33.027899 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:33 crc kubenswrapper[4672]: I0930 12:22:33.027923 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:33Z","lastTransitionTime":"2025-09-30T12:22:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:33 crc kubenswrapper[4672]: I0930 12:22:33.131332 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:33 crc kubenswrapper[4672]: I0930 12:22:33.131389 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:33 crc kubenswrapper[4672]: I0930 12:22:33.131407 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:33 crc kubenswrapper[4672]: I0930 12:22:33.131432 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:33 crc kubenswrapper[4672]: I0930 12:22:33.131445 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:33Z","lastTransitionTime":"2025-09-30T12:22:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:33 crc kubenswrapper[4672]: I0930 12:22:33.234935 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:33 crc kubenswrapper[4672]: I0930 12:22:33.234993 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:33 crc kubenswrapper[4672]: I0930 12:22:33.235012 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:33 crc kubenswrapper[4672]: I0930 12:22:33.235040 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:33 crc kubenswrapper[4672]: I0930 12:22:33.235059 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:33Z","lastTransitionTime":"2025-09-30T12:22:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Sep 30 12:22:33 crc kubenswrapper[4672]: I0930 12:22:33.338086 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 12:22:33 crc kubenswrapper[4672]: I0930 12:22:33.338120 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 12:22:33 crc kubenswrapper[4672]: I0930 12:22:33.338132 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 12:22:33 crc kubenswrapper[4672]: I0930 12:22:33.338149 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 12:22:33 crc kubenswrapper[4672]: I0930 12:22:33.338162 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:33Z","lastTransitionTime":"2025-09-30T12:22:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 12:22:33 crc kubenswrapper[4672]: I0930 12:22:33.416448 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n7wwp"
Sep 30 12:22:33 crc kubenswrapper[4672]: I0930 12:22:33.416482 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Sep 30 12:22:33 crc kubenswrapper[4672]: E0930 12:22:33.416566 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n7wwp" podUID="42618cd5-d9f9-45ba-8081-660ca47bebf4"
Sep 30 12:22:33 crc kubenswrapper[4672]: I0930 12:22:33.416506 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Sep 30 12:22:33 crc kubenswrapper[4672]: E0930 12:22:33.416835 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Sep 30 12:22:33 crc kubenswrapper[4672]: E0930 12:22:33.416718 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Sep 30 12:22:33 crc kubenswrapper[4672]: I0930 12:22:33.441297 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 12:22:33 crc kubenswrapper[4672]: I0930 12:22:33.441387 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 12:22:33 crc kubenswrapper[4672]: I0930 12:22:33.441402 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 12:22:33 crc kubenswrapper[4672]: I0930 12:22:33.441427 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 12:22:33 crc kubenswrapper[4672]: I0930 12:22:33.441442 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:33Z","lastTransitionTime":"2025-09-30T12:22:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 12:22:33 crc kubenswrapper[4672]: I0930 12:22:33.544834 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 12:22:33 crc kubenswrapper[4672]: I0930 12:22:33.544902 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 12:22:33 crc kubenswrapper[4672]: I0930 12:22:33.544925 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 12:22:33 crc kubenswrapper[4672]: I0930 12:22:33.544958 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 12:22:33 crc kubenswrapper[4672]: I0930 12:22:33.544984 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:33Z","lastTransitionTime":"2025-09-30T12:22:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"}
Sep 30 12:22:33 crc kubenswrapper[4672]: I0930 12:22:33.647552 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 12:22:33 crc kubenswrapper[4672]: I0930 12:22:33.647597 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 12:22:33 crc kubenswrapper[4672]: I0930 12:22:33.647612 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 12:22:33 crc kubenswrapper[4672]: I0930 12:22:33.647628 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 12:22:33 crc kubenswrapper[4672]: I0930 12:22:33.647639 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:33Z","lastTransitionTime":"2025-09-30T12:22:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 12:22:33 crc kubenswrapper[4672]: I0930 12:22:33.721084 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/42618cd5-d9f9-45ba-8081-660ca47bebf4-metrics-certs\") pod \"network-metrics-daemon-n7wwp\" (UID: \"42618cd5-d9f9-45ba-8081-660ca47bebf4\") " pod="openshift-multus/network-metrics-daemon-n7wwp"
Sep 30 12:22:33 crc kubenswrapper[4672]: E0930 12:22:33.721281 4672 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Sep 30 12:22:33 crc kubenswrapper[4672]: E0930 12:22:33.721343 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/42618cd5-d9f9-45ba-8081-660ca47bebf4-metrics-certs podName:42618cd5-d9f9-45ba-8081-660ca47bebf4 nodeName:}" failed. No retries permitted until 2025-09-30 12:22:41.721327568 +0000 UTC m=+52.990565214 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/42618cd5-d9f9-45ba-8081-660ca47bebf4-metrics-certs") pod "network-metrics-daemon-n7wwp" (UID: "42618cd5-d9f9-45ba-8081-660ca47bebf4") : object "openshift-multus"/"metrics-daemon-secret" not registered
Sep 30 12:22:33 crc kubenswrapper[4672]: I0930 12:22:33.749982 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 12:22:33 crc kubenswrapper[4672]: I0930 12:22:33.750057 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 12:22:33 crc kubenswrapper[4672]: I0930 12:22:33.750080 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 12:22:33 crc kubenswrapper[4672]: I0930 12:22:33.750329 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 12:22:33 crc kubenswrapper[4672]: I0930 12:22:33.750356 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:33Z","lastTransitionTime":"2025-09-30T12:22:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 12:22:33 crc kubenswrapper[4672]: I0930 12:22:33.854556 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 12:22:33 crc kubenswrapper[4672]: I0930 12:22:33.854632 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 12:22:33 crc kubenswrapper[4672]: I0930 12:22:33.854654 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 12:22:33 crc kubenswrapper[4672]: I0930 12:22:33.854683 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 12:22:33 crc kubenswrapper[4672]: I0930 12:22:33.854705 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:33Z","lastTransitionTime":"2025-09-30T12:22:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
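The "not registered" error above means the kubelet's secret manager has not yet registered openshift-multus/metrics-daemon-secret for this pod, so the mount attempt is requeued with a growing delay; the "durationBeforeRetry 8s" and the 12:22:41 deadline are consistent with a doubling backoff from a failure at 12:22:33. A small illustrative sketch of that retry arithmetic follows (Python; the base and cap are assumptions, not the kubelet's exact constants):

from datetime import datetime, timedelta

def next_retry(last_failure: datetime, failures: int,
               base: timedelta = timedelta(seconds=0.5),
               cap: timedelta = timedelta(minutes=2)) -> datetime:
    # Delay doubles per consecutive failure (0.5s, 1s, 2s, 4s, 8s, ...),
    # capped so a persistently failing mount keeps retrying periodically.
    delay = min(base * (2 ** failures), cap)
    return last_failure + delay

# A fifth consecutive failure at 12:22:33 yields a retry no earlier than
# about 8s later, matching the 12:22:41 deadline logged above.
print(next_retry(datetime(2025, 9, 30, 12, 22, 33), failures=4))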
Sep 30 12:22:33 crc kubenswrapper[4672]: I0930 12:22:33.956841 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 12:22:33 crc kubenswrapper[4672]: I0930 12:22:33.956885 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 12:22:33 crc kubenswrapper[4672]: I0930 12:22:33.956895 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 12:22:33 crc kubenswrapper[4672]: I0930 12:22:33.956911 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 12:22:33 crc kubenswrapper[4672]: I0930 12:22:33.956922 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:33Z","lastTransitionTime":"2025-09-30T12:22:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 12:22:34 crc kubenswrapper[4672]: I0930 12:22:34.060658 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 12:22:34 crc kubenswrapper[4672]: I0930 12:22:34.060704 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 12:22:34 crc kubenswrapper[4672]: I0930 12:22:34.060718 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 12:22:34 crc kubenswrapper[4672]: I0930 12:22:34.060740 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 12:22:34 crc kubenswrapper[4672]: I0930 12:22:34.060755 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:34Z","lastTransitionTime":"2025-09-30T12:22:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 12:22:34 crc kubenswrapper[4672]: I0930 12:22:34.162928 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 12:22:34 crc kubenswrapper[4672]: I0930 12:22:34.162987 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 12:22:34 crc kubenswrapper[4672]: I0930 12:22:34.162998 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 12:22:34 crc kubenswrapper[4672]: I0930 12:22:34.163012 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 12:22:34 crc kubenswrapper[4672]: I0930 12:22:34.163022 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:34Z","lastTransitionTime":"2025-09-30T12:22:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"}
Sep 30 12:22:34 crc kubenswrapper[4672]: I0930 12:22:34.266707 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 12:22:34 crc kubenswrapper[4672]: I0930 12:22:34.266780 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 12:22:34 crc kubenswrapper[4672]: I0930 12:22:34.266813 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 12:22:34 crc kubenswrapper[4672]: I0930 12:22:34.266855 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 12:22:34 crc kubenswrapper[4672]: I0930 12:22:34.266886 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:34Z","lastTransitionTime":"2025-09-30T12:22:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 12:22:34 crc kubenswrapper[4672]: I0930 12:22:34.369691 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 12:22:34 crc kubenswrapper[4672]: I0930 12:22:34.369749 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 12:22:34 crc kubenswrapper[4672]: I0930 12:22:34.369762 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 12:22:34 crc kubenswrapper[4672]: I0930 12:22:34.369780 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 12:22:34 crc kubenswrapper[4672]: I0930 12:22:34.369792 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:34Z","lastTransitionTime":"2025-09-30T12:22:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 12:22:34 crc kubenswrapper[4672]: I0930 12:22:34.417057 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Sep 30 12:22:34 crc kubenswrapper[4672]: E0930 12:22:34.417250 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Sep 30 12:22:34 crc kubenswrapper[4672]: I0930 12:22:34.473586 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 12:22:34 crc kubenswrapper[4672]: I0930 12:22:34.473660 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 12:22:34 crc kubenswrapper[4672]: I0930 12:22:34.473683 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 12:22:34 crc kubenswrapper[4672]: I0930 12:22:34.473717 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 12:22:34 crc kubenswrapper[4672]: I0930 12:22:34.473738 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:34Z","lastTransitionTime":"2025-09-30T12:22:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 12:22:34 crc kubenswrapper[4672]: I0930 12:22:34.577041 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 12:22:34 crc kubenswrapper[4672]: I0930 12:22:34.577102 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 12:22:34 crc kubenswrapper[4672]: I0930 12:22:34.577119 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 12:22:34 crc kubenswrapper[4672]: I0930 12:22:34.577141 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 12:22:34 crc kubenswrapper[4672]: I0930 12:22:34.577156 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:34Z","lastTransitionTime":"2025-09-30T12:22:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"} Sep 30 12:22:34 crc kubenswrapper[4672]: I0930 12:22:34.679923 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:34 crc kubenswrapper[4672]: I0930 12:22:34.679970 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:34 crc kubenswrapper[4672]: I0930 12:22:34.679980 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:34 crc kubenswrapper[4672]: I0930 12:22:34.679998 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:34 crc kubenswrapper[4672]: I0930 12:22:34.680008 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:34Z","lastTransitionTime":"2025-09-30T12:22:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:34 crc kubenswrapper[4672]: I0930 12:22:34.782774 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:34 crc kubenswrapper[4672]: I0930 12:22:34.782840 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:34 crc kubenswrapper[4672]: I0930 12:22:34.782865 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:34 crc kubenswrapper[4672]: I0930 12:22:34.782899 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:34 crc kubenswrapper[4672]: I0930 12:22:34.782925 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:34Z","lastTransitionTime":"2025-09-30T12:22:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:34 crc kubenswrapper[4672]: I0930 12:22:34.886525 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:34 crc kubenswrapper[4672]: I0930 12:22:34.886677 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:34 crc kubenswrapper[4672]: I0930 12:22:34.886694 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:34 crc kubenswrapper[4672]: I0930 12:22:34.886713 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:34 crc kubenswrapper[4672]: I0930 12:22:34.886726 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:34Z","lastTransitionTime":"2025-09-30T12:22:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:22:34 crc kubenswrapper[4672]: I0930 12:22:34.989374 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:34 crc kubenswrapper[4672]: I0930 12:22:34.989630 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:34 crc kubenswrapper[4672]: I0930 12:22:34.989639 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:34 crc kubenswrapper[4672]: I0930 12:22:34.989652 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:34 crc kubenswrapper[4672]: I0930 12:22:34.989662 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:34Z","lastTransitionTime":"2025-09-30T12:22:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:35 crc kubenswrapper[4672]: I0930 12:22:35.093174 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:35 crc kubenswrapper[4672]: I0930 12:22:35.093219 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:35 crc kubenswrapper[4672]: I0930 12:22:35.093230 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:35 crc kubenswrapper[4672]: I0930 12:22:35.093247 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:35 crc kubenswrapper[4672]: I0930 12:22:35.093259 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:35Z","lastTransitionTime":"2025-09-30T12:22:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:35 crc kubenswrapper[4672]: I0930 12:22:35.196032 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:35 crc kubenswrapper[4672]: I0930 12:22:35.196108 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:35 crc kubenswrapper[4672]: I0930 12:22:35.196127 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:35 crc kubenswrapper[4672]: I0930 12:22:35.196159 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:35 crc kubenswrapper[4672]: I0930 12:22:35.196178 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:35Z","lastTransitionTime":"2025-09-30T12:22:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Sep 30 12:22:35 crc kubenswrapper[4672]: I0930 12:22:35.300178 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 12:22:35 crc kubenswrapper[4672]: I0930 12:22:35.300239 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 12:22:35 crc kubenswrapper[4672]: I0930 12:22:35.300255 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 12:22:35 crc kubenswrapper[4672]: I0930 12:22:35.300303 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 12:22:35 crc kubenswrapper[4672]: I0930 12:22:35.300321 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:35Z","lastTransitionTime":"2025-09-30T12:22:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 12:22:35 crc kubenswrapper[4672]: I0930 12:22:35.403382 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 12:22:35 crc kubenswrapper[4672]: I0930 12:22:35.403693 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 12:22:35 crc kubenswrapper[4672]: I0930 12:22:35.403718 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 12:22:35 crc kubenswrapper[4672]: I0930 12:22:35.403772 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 12:22:35 crc kubenswrapper[4672]: I0930 12:22:35.403792 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:35Z","lastTransitionTime":"2025-09-30T12:22:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 12:22:35 crc kubenswrapper[4672]: I0930 12:22:35.415888 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Sep 30 12:22:35 crc kubenswrapper[4672]: E0930 12:22:35.415999 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Sep 30 12:22:35 crc kubenswrapper[4672]: I0930 12:22:35.416047 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n7wwp"
Sep 30 12:22:35 crc kubenswrapper[4672]: I0930 12:22:35.416052 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Sep 30 12:22:35 crc kubenswrapper[4672]: E0930 12:22:35.416107 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n7wwp" podUID="42618cd5-d9f9-45ba-8081-660ca47bebf4"
Sep 30 12:22:35 crc kubenswrapper[4672]: E0930 12:22:35.416157 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Sep 30 12:22:35 crc kubenswrapper[4672]: I0930 12:22:35.506873 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 12:22:35 crc kubenswrapper[4672]: I0930 12:22:35.506955 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 12:22:35 crc kubenswrapper[4672]: I0930 12:22:35.506982 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 12:22:35 crc kubenswrapper[4672]: I0930 12:22:35.507017 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 12:22:35 crc kubenswrapper[4672]: I0930 12:22:35.507044 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:35Z","lastTransitionTime":"2025-09-30T12:22:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 12:22:35 crc kubenswrapper[4672]: I0930 12:22:35.610170 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 12:22:35 crc kubenswrapper[4672]: I0930 12:22:35.610600 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 12:22:35 crc kubenswrapper[4672]: I0930 12:22:35.610740 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 12:22:35 crc kubenswrapper[4672]: I0930 12:22:35.610819 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 12:22:35 crc kubenswrapper[4672]: I0930 12:22:35.610886 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:35Z","lastTransitionTime":"2025-09-30T12:22:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"} Sep 30 12:22:35 crc kubenswrapper[4672]: I0930 12:22:35.713875 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:35 crc kubenswrapper[4672]: I0930 12:22:35.713929 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:35 crc kubenswrapper[4672]: I0930 12:22:35.713940 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:35 crc kubenswrapper[4672]: I0930 12:22:35.713968 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:35 crc kubenswrapper[4672]: I0930 12:22:35.713982 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:35Z","lastTransitionTime":"2025-09-30T12:22:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:35 crc kubenswrapper[4672]: I0930 12:22:35.817037 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:35 crc kubenswrapper[4672]: I0930 12:22:35.817083 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:35 crc kubenswrapper[4672]: I0930 12:22:35.817098 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:35 crc kubenswrapper[4672]: I0930 12:22:35.817115 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:35 crc kubenswrapper[4672]: I0930 12:22:35.817129 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:35Z","lastTransitionTime":"2025-09-30T12:22:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:35 crc kubenswrapper[4672]: I0930 12:22:35.920318 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:35 crc kubenswrapper[4672]: I0930 12:22:35.920378 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:35 crc kubenswrapper[4672]: I0930 12:22:35.920392 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:35 crc kubenswrapper[4672]: I0930 12:22:35.920412 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:35 crc kubenswrapper[4672]: I0930 12:22:35.920423 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:35Z","lastTransitionTime":"2025-09-30T12:22:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:22:36 crc kubenswrapper[4672]: I0930 12:22:36.023187 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:36 crc kubenswrapper[4672]: I0930 12:22:36.023274 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:36 crc kubenswrapper[4672]: I0930 12:22:36.023345 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:36 crc kubenswrapper[4672]: I0930 12:22:36.023379 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:36 crc kubenswrapper[4672]: I0930 12:22:36.023403 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:36Z","lastTransitionTime":"2025-09-30T12:22:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:36 crc kubenswrapper[4672]: I0930 12:22:36.126932 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:36 crc kubenswrapper[4672]: I0930 12:22:36.127032 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:36 crc kubenswrapper[4672]: I0930 12:22:36.127050 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:36 crc kubenswrapper[4672]: I0930 12:22:36.127084 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:36 crc kubenswrapper[4672]: I0930 12:22:36.127103 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:36Z","lastTransitionTime":"2025-09-30T12:22:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:36 crc kubenswrapper[4672]: I0930 12:22:36.230212 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:36 crc kubenswrapper[4672]: I0930 12:22:36.230299 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:36 crc kubenswrapper[4672]: I0930 12:22:36.230314 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:36 crc kubenswrapper[4672]: I0930 12:22:36.230336 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:36 crc kubenswrapper[4672]: I0930 12:22:36.230350 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:36Z","lastTransitionTime":"2025-09-30T12:22:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:22:36 crc kubenswrapper[4672]: I0930 12:22:36.333943 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:36 crc kubenswrapper[4672]: I0930 12:22:36.334402 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:36 crc kubenswrapper[4672]: I0930 12:22:36.334566 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:36 crc kubenswrapper[4672]: I0930 12:22:36.334731 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:36 crc kubenswrapper[4672]: I0930 12:22:36.334856 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:36Z","lastTransitionTime":"2025-09-30T12:22:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:36 crc kubenswrapper[4672]: I0930 12:22:36.415904 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 12:22:36 crc kubenswrapper[4672]: E0930 12:22:36.416046 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 12:22:36 crc kubenswrapper[4672]: I0930 12:22:36.438030 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:36 crc kubenswrapper[4672]: I0930 12:22:36.438098 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:36 crc kubenswrapper[4672]: I0930 12:22:36.438108 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:36 crc kubenswrapper[4672]: I0930 12:22:36.438123 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:36 crc kubenswrapper[4672]: I0930 12:22:36.438156 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:36Z","lastTransitionTime":"2025-09-30T12:22:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:22:36 crc kubenswrapper[4672]: I0930 12:22:36.540480 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:36 crc kubenswrapper[4672]: I0930 12:22:36.540526 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:36 crc kubenswrapper[4672]: I0930 12:22:36.540537 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:36 crc kubenswrapper[4672]: I0930 12:22:36.540551 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:36 crc kubenswrapper[4672]: I0930 12:22:36.540559 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:36Z","lastTransitionTime":"2025-09-30T12:22:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:36 crc kubenswrapper[4672]: I0930 12:22:36.643840 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:36 crc kubenswrapper[4672]: I0930 12:22:36.643935 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:36 crc kubenswrapper[4672]: I0930 12:22:36.643953 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:36 crc kubenswrapper[4672]: I0930 12:22:36.644007 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:36 crc kubenswrapper[4672]: I0930 12:22:36.644030 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:36Z","lastTransitionTime":"2025-09-30T12:22:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:36 crc kubenswrapper[4672]: I0930 12:22:36.746875 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:36 crc kubenswrapper[4672]: I0930 12:22:36.746985 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:36 crc kubenswrapper[4672]: I0930 12:22:36.747024 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:36 crc kubenswrapper[4672]: I0930 12:22:36.747060 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:36 crc kubenswrapper[4672]: I0930 12:22:36.747080 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:36Z","lastTransitionTime":"2025-09-30T12:22:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:22:36 crc kubenswrapper[4672]: I0930 12:22:36.850400 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:36 crc kubenswrapper[4672]: I0930 12:22:36.850492 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:36 crc kubenswrapper[4672]: I0930 12:22:36.850511 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:36 crc kubenswrapper[4672]: I0930 12:22:36.850543 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:36 crc kubenswrapper[4672]: I0930 12:22:36.850563 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:36Z","lastTransitionTime":"2025-09-30T12:22:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:36 crc kubenswrapper[4672]: I0930 12:22:36.953406 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:36 crc kubenswrapper[4672]: I0930 12:22:36.953493 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:36 crc kubenswrapper[4672]: I0930 12:22:36.953510 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:36 crc kubenswrapper[4672]: I0930 12:22:36.953531 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:36 crc kubenswrapper[4672]: I0930 12:22:36.953545 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:36Z","lastTransitionTime":"2025-09-30T12:22:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:37 crc kubenswrapper[4672]: I0930 12:22:37.056869 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:37 crc kubenswrapper[4672]: I0930 12:22:37.056953 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:37 crc kubenswrapper[4672]: I0930 12:22:37.056970 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:37 crc kubenswrapper[4672]: I0930 12:22:37.056991 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:37 crc kubenswrapper[4672]: I0930 12:22:37.057034 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:37Z","lastTransitionTime":"2025-09-30T12:22:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:22:37 crc kubenswrapper[4672]: I0930 12:22:37.160085 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:37 crc kubenswrapper[4672]: I0930 12:22:37.160153 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:37 crc kubenswrapper[4672]: I0930 12:22:37.160174 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:37 crc kubenswrapper[4672]: I0930 12:22:37.160201 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:37 crc kubenswrapper[4672]: I0930 12:22:37.160220 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:37Z","lastTransitionTime":"2025-09-30T12:22:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:37 crc kubenswrapper[4672]: I0930 12:22:37.262890 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:37 crc kubenswrapper[4672]: I0930 12:22:37.262930 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:37 crc kubenswrapper[4672]: I0930 12:22:37.262940 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:37 crc kubenswrapper[4672]: I0930 12:22:37.262955 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:37 crc kubenswrapper[4672]: I0930 12:22:37.262965 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:37Z","lastTransitionTime":"2025-09-30T12:22:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:37 crc kubenswrapper[4672]: I0930 12:22:37.365980 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:37 crc kubenswrapper[4672]: I0930 12:22:37.366052 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:37 crc kubenswrapper[4672]: I0930 12:22:37.366081 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:37 crc kubenswrapper[4672]: I0930 12:22:37.366117 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:37 crc kubenswrapper[4672]: I0930 12:22:37.366139 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:37Z","lastTransitionTime":"2025-09-30T12:22:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:22:37 crc kubenswrapper[4672]: I0930 12:22:37.416115 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n7wwp" Sep 30 12:22:37 crc kubenswrapper[4672]: I0930 12:22:37.416115 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 12:22:37 crc kubenswrapper[4672]: I0930 12:22:37.416297 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 12:22:37 crc kubenswrapper[4672]: E0930 12:22:37.416472 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n7wwp" podUID="42618cd5-d9f9-45ba-8081-660ca47bebf4" Sep 30 12:22:37 crc kubenswrapper[4672]: E0930 12:22:37.417132 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 12:22:37 crc kubenswrapper[4672]: E0930 12:22:37.417325 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 12:22:37 crc kubenswrapper[4672]: I0930 12:22:37.470389 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:37 crc kubenswrapper[4672]: I0930 12:22:37.470475 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:37 crc kubenswrapper[4672]: I0930 12:22:37.470500 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:37 crc kubenswrapper[4672]: I0930 12:22:37.470532 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:37 crc kubenswrapper[4672]: I0930 12:22:37.470554 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:37Z","lastTransitionTime":"2025-09-30T12:22:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:22:37 crc kubenswrapper[4672]: I0930 12:22:37.574005 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:37 crc kubenswrapper[4672]: I0930 12:22:37.574077 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:37 crc kubenswrapper[4672]: I0930 12:22:37.574094 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:37 crc kubenswrapper[4672]: I0930 12:22:37.574119 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:37 crc kubenswrapper[4672]: I0930 12:22:37.574140 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:37Z","lastTransitionTime":"2025-09-30T12:22:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:37 crc kubenswrapper[4672]: I0930 12:22:37.660656 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:37 crc kubenswrapper[4672]: I0930 12:22:37.660730 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:37 crc kubenswrapper[4672]: I0930 12:22:37.660752 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:37 crc kubenswrapper[4672]: I0930 12:22:37.660783 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:37 crc kubenswrapper[4672]: I0930 12:22:37.660803 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:37Z","lastTransitionTime":"2025-09-30T12:22:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:22:37 crc kubenswrapper[4672]: E0930 12:22:37.684583 4672 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T12:22:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T12:22:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T12:22:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T12:22:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a240ac94-c7cd-47b9-85fc-82a9db2c4d67\\\",\\\"systemUUID\\\":\\\"9545f671-f742-45e6-b9f7-3b3404b22825\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:37Z is after 
2025-08-24T17:21:41Z" Sep 30 12:22:37 crc kubenswrapper[4672]: I0930 12:22:37.689554 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:37 crc kubenswrapper[4672]: I0930 12:22:37.689609 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:37 crc kubenswrapper[4672]: I0930 12:22:37.689625 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:37 crc kubenswrapper[4672]: I0930 12:22:37.689647 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:37 crc kubenswrapper[4672]: I0930 12:22:37.689663 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:37Z","lastTransitionTime":"2025-09-30T12:22:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:37 crc kubenswrapper[4672]: E0930 12:22:37.706825 4672 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T12:22:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T12:22:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T12:22:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T12:22:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a240ac94-c7cd-47b9-85fc-82a9db2c4d67\\\",\\\"systemUUID\\\":\\\"9545f671-f742-45e6-b9f7-3b3404b22825\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:37Z is after 
2025-08-24T17:21:41Z" Sep 30 12:22:37 crc kubenswrapper[4672]: I0930 12:22:37.711964 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:37 crc kubenswrapper[4672]: I0930 12:22:37.712007 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:37 crc kubenswrapper[4672]: I0930 12:22:37.712023 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:37 crc kubenswrapper[4672]: I0930 12:22:37.712044 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:37 crc kubenswrapper[4672]: I0930 12:22:37.712058 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:37Z","lastTransitionTime":"2025-09-30T12:22:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:37 crc kubenswrapper[4672]: E0930 12:22:37.728166 4672 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T12:22:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T12:22:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T12:22:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T12:22:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a240ac94-c7cd-47b9-85fc-82a9db2c4d67\\\",\\\"systemUUID\\\":\\\"9545f671-f742-45e6-b9f7-3b3404b22825\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:37Z is after 
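The "Error updating node status, will retry" records here all end in the same root cause clause: the API server's call to the node.network-node-identity.openshift.io webhook at https://127.0.0.1:9743 cannot pass TLS verification because the webhook's serving certificate expired on 2025-08-24T17:21:41Z, while the node's clock reads 2025-09-30T12:22:37Z, so every node-status patch is rejected. The size of the gap can be computed from the error text itself; a short standard-library Python sketch, with the err literal abbreviated from the record above:

    import re
    from datetime import datetime

    err = ('tls: failed to verify certificate: x509: certificate has expired or is '
           'not yet valid: current time 2025-09-30T12:22:37Z is after 2025-08-24T17:21:41Z')

    # First timestamp is the node's current time, second is the certificate's NotAfter.
    now, not_after = (datetime.strptime(t, '%Y-%m-%dT%H:%M:%SZ')
                      for t in re.findall(r'\d{4}-\d{2}-\d{2}T\d{2}:\d{2}:\d{2}Z', err))
    print('certificate expired', now - not_after, 'before this attempt')

This prints a gap of roughly 36 days and 19 hours. On the node itself the same NotAfter could be confirmed against the live endpoint (for example with openssl s_client), but the log message already carries both timestamps, so nothing beyond the journal is needed to identify the expired webhook certificate as the reason the node status can never be patched.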
2025-08-24T17:21:41Z" Sep 30 12:22:37 crc kubenswrapper[4672]: I0930 12:22:37.732961 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:37 crc kubenswrapper[4672]: I0930 12:22:37.733004 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:37 crc kubenswrapper[4672]: I0930 12:22:37.733016 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:37 crc kubenswrapper[4672]: I0930 12:22:37.733034 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:37 crc kubenswrapper[4672]: I0930 12:22:37.733045 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:37Z","lastTransitionTime":"2025-09-30T12:22:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:37 crc kubenswrapper[4672]: E0930 12:22:37.746337 4672 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T12:22:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T12:22:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T12:22:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T12:22:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a240ac94-c7cd-47b9-85fc-82a9db2c4d67\\\",\\\"systemUUID\\\":\\\"9545f671-f742-45e6-b9f7-3b3404b22825\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:37Z is after 
2025-08-24T17:21:41Z" Sep 30 12:22:37 crc kubenswrapper[4672]: I0930 12:22:37.751260 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:37 crc kubenswrapper[4672]: I0930 12:22:37.751336 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:37 crc kubenswrapper[4672]: I0930 12:22:37.751354 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:37 crc kubenswrapper[4672]: I0930 12:22:37.751378 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:37 crc kubenswrapper[4672]: I0930 12:22:37.751396 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:37Z","lastTransitionTime":"2025-09-30T12:22:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:37 crc kubenswrapper[4672]: E0930 12:22:37.766017 4672 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T12:22:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T12:22:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T12:22:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T12:22:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a240ac94-c7cd-47b9-85fc-82a9db2c4d67\\\",\\\"systemUUID\\\":\\\"9545f671-f742-45e6-b9f7-3b3404b22825\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:37Z is after 
2025-08-24T17:21:41Z" Sep 30 12:22:37 crc kubenswrapper[4672]: E0930 12:22:37.766170 4672 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Sep 30 12:22:37 crc kubenswrapper[4672]: I0930 12:22:37.768224 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:37 crc kubenswrapper[4672]: I0930 12:22:37.768282 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:37 crc kubenswrapper[4672]: I0930 12:22:37.768302 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:37 crc kubenswrapper[4672]: I0930 12:22:37.768318 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:37 crc kubenswrapper[4672]: I0930 12:22:37.768331 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:37Z","lastTransitionTime":"2025-09-30T12:22:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:37 crc kubenswrapper[4672]: I0930 12:22:37.871955 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:37 crc kubenswrapper[4672]: I0930 12:22:37.872011 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:37 crc kubenswrapper[4672]: I0930 12:22:37.872022 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:37 crc kubenswrapper[4672]: I0930 12:22:37.872057 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:37 crc kubenswrapper[4672]: I0930 12:22:37.872073 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:37Z","lastTransitionTime":"2025-09-30T12:22:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:22:37 crc kubenswrapper[4672]: I0930 12:22:37.976468 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:37 crc kubenswrapper[4672]: I0930 12:22:37.976534 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:37 crc kubenswrapper[4672]: I0930 12:22:37.976555 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:37 crc kubenswrapper[4672]: I0930 12:22:37.976587 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:37 crc kubenswrapper[4672]: I0930 12:22:37.976602 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:37Z","lastTransitionTime":"2025-09-30T12:22:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:38 crc kubenswrapper[4672]: I0930 12:22:38.079858 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:38 crc kubenswrapper[4672]: I0930 12:22:38.079929 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:38 crc kubenswrapper[4672]: I0930 12:22:38.079944 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:38 crc kubenswrapper[4672]: I0930 12:22:38.079968 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:38 crc kubenswrapper[4672]: I0930 12:22:38.079981 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:38Z","lastTransitionTime":"2025-09-30T12:22:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:38 crc kubenswrapper[4672]: I0930 12:22:38.183179 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:38 crc kubenswrapper[4672]: I0930 12:22:38.183238 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:38 crc kubenswrapper[4672]: I0930 12:22:38.183250 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:38 crc kubenswrapper[4672]: I0930 12:22:38.183272 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:38 crc kubenswrapper[4672]: I0930 12:22:38.183286 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:38Z","lastTransitionTime":"2025-09-30T12:22:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:22:38 crc kubenswrapper[4672]: I0930 12:22:38.286340 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:38 crc kubenswrapper[4672]: I0930 12:22:38.286387 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:38 crc kubenswrapper[4672]: I0930 12:22:38.286399 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:38 crc kubenswrapper[4672]: I0930 12:22:38.286417 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:38 crc kubenswrapper[4672]: I0930 12:22:38.286430 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:38Z","lastTransitionTime":"2025-09-30T12:22:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:38 crc kubenswrapper[4672]: I0930 12:22:38.389318 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:38 crc kubenswrapper[4672]: I0930 12:22:38.389397 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:38 crc kubenswrapper[4672]: I0930 12:22:38.389415 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:38 crc kubenswrapper[4672]: I0930 12:22:38.389443 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:38 crc kubenswrapper[4672]: I0930 12:22:38.389461 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:38Z","lastTransitionTime":"2025-09-30T12:22:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:38 crc kubenswrapper[4672]: I0930 12:22:38.415924 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 12:22:38 crc kubenswrapper[4672]: E0930 12:22:38.416149 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 12:22:38 crc kubenswrapper[4672]: I0930 12:22:38.492486 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:38 crc kubenswrapper[4672]: I0930 12:22:38.492584 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:38 crc kubenswrapper[4672]: I0930 12:22:38.492608 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:38 crc kubenswrapper[4672]: I0930 12:22:38.492643 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:38 crc kubenswrapper[4672]: I0930 12:22:38.492668 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:38Z","lastTransitionTime":"2025-09-30T12:22:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:38 crc kubenswrapper[4672]: I0930 12:22:38.596191 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:38 crc kubenswrapper[4672]: I0930 12:22:38.596244 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:38 crc kubenswrapper[4672]: I0930 12:22:38.596257 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:38 crc kubenswrapper[4672]: I0930 12:22:38.596294 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:38 crc kubenswrapper[4672]: I0930 12:22:38.596309 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:38Z","lastTransitionTime":"2025-09-30T12:22:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:22:38 crc kubenswrapper[4672]: I0930 12:22:38.698976 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:38 crc kubenswrapper[4672]: I0930 12:22:38.699361 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:38 crc kubenswrapper[4672]: I0930 12:22:38.699456 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:38 crc kubenswrapper[4672]: I0930 12:22:38.699548 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:38 crc kubenswrapper[4672]: I0930 12:22:38.699645 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:38Z","lastTransitionTime":"2025-09-30T12:22:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:38 crc kubenswrapper[4672]: I0930 12:22:38.802326 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:38 crc kubenswrapper[4672]: I0930 12:22:38.802731 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:38 crc kubenswrapper[4672]: I0930 12:22:38.802890 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:38 crc kubenswrapper[4672]: I0930 12:22:38.803004 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:38 crc kubenswrapper[4672]: I0930 12:22:38.803091 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:38Z","lastTransitionTime":"2025-09-30T12:22:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:38 crc kubenswrapper[4672]: I0930 12:22:38.906525 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:38 crc kubenswrapper[4672]: I0930 12:22:38.906574 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:38 crc kubenswrapper[4672]: I0930 12:22:38.906589 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:38 crc kubenswrapper[4672]: I0930 12:22:38.906609 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:38 crc kubenswrapper[4672]: I0930 12:22:38.906622 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:38Z","lastTransitionTime":"2025-09-30T12:22:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:22:39 crc kubenswrapper[4672]: I0930 12:22:39.009511 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:39 crc kubenswrapper[4672]: I0930 12:22:39.009555 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:39 crc kubenswrapper[4672]: I0930 12:22:39.009571 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:39 crc kubenswrapper[4672]: I0930 12:22:39.009592 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:39 crc kubenswrapper[4672]: I0930 12:22:39.009607 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:39Z","lastTransitionTime":"2025-09-30T12:22:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:39 crc kubenswrapper[4672]: I0930 12:22:39.111843 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:39 crc kubenswrapper[4672]: I0930 12:22:39.111897 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:39 crc kubenswrapper[4672]: I0930 12:22:39.111906 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:39 crc kubenswrapper[4672]: I0930 12:22:39.111922 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:39 crc kubenswrapper[4672]: I0930 12:22:39.111933 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:39Z","lastTransitionTime":"2025-09-30T12:22:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:39 crc kubenswrapper[4672]: I0930 12:22:39.215084 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:39 crc kubenswrapper[4672]: I0930 12:22:39.215578 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:39 crc kubenswrapper[4672]: I0930 12:22:39.215613 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:39 crc kubenswrapper[4672]: I0930 12:22:39.215640 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:39 crc kubenswrapper[4672]: I0930 12:22:39.215659 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:39Z","lastTransitionTime":"2025-09-30T12:22:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:22:39 crc kubenswrapper[4672]: I0930 12:22:39.318895 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:39 crc kubenswrapper[4672]: I0930 12:22:39.318936 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:39 crc kubenswrapper[4672]: I0930 12:22:39.318947 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:39 crc kubenswrapper[4672]: I0930 12:22:39.318965 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:39 crc kubenswrapper[4672]: I0930 12:22:39.318976 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:39Z","lastTransitionTime":"2025-09-30T12:22:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:39 crc kubenswrapper[4672]: I0930 12:22:39.416464 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 12:22:39 crc kubenswrapper[4672]: I0930 12:22:39.416677 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n7wwp" Sep 30 12:22:39 crc kubenswrapper[4672]: E0930 12:22:39.416941 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 12:22:39 crc kubenswrapper[4672]: I0930 12:22:39.417324 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 12:22:39 crc kubenswrapper[4672]: E0930 12:22:39.417574 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n7wwp" podUID="42618cd5-d9f9-45ba-8081-660ca47bebf4" Sep 30 12:22:39 crc kubenswrapper[4672]: E0930 12:22:39.417784 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 12:22:39 crc kubenswrapper[4672]: I0930 12:22:39.423592 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:39 crc kubenswrapper[4672]: I0930 12:22:39.423693 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:39 crc kubenswrapper[4672]: I0930 12:22:39.423717 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:39 crc kubenswrapper[4672]: I0930 12:22:39.423781 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:39 crc kubenswrapper[4672]: I0930 12:22:39.423802 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:39Z","lastTransitionTime":"2025-09-30T12:22:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:39 crc kubenswrapper[4672]: I0930 12:22:39.439621 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"95794952-d817-48f2-8956-f7a310f8d1d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a5a1de465a4e5deca994f640e87574ebd406be2d7a92c0e54ed8bece4a8e6d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnk64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba615a53d8f50a3891080fd06384bb6e00eca07025bb60fdf1764d9c1d4a5f76\\\",\
\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnk64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dpqrd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:39Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:39 crc kubenswrapper[4672]: I0930 12:22:39.463188 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8q82q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6806ff3c-ab3a-402e-b1c5-cc37c0810a65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://687d353cc41530a0cbee2c4a782b6ae10e9ee58aae0bfe183537c96851611e2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\
\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pn7q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8q82q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:39Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:39 crc kubenswrapper[4672]: I0930 12:22:39.487437 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-89fj9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"72003fad-a0fd-4493-9f3b-6efdab22d14c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd26dcf24863265c55a06cfc788e84363473a6ca5f1b835fa5dbb683ebb82d68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31087335de8b3b895d49c4677350e9ab6cfa0651188653dab05e7eafdab3d05f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31087335de8b3b895d49c4677350e9ab6cfa0651188653dab05e7eafdab3d05f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a607f773b81f6bdde800c2a36627d7f84602e6c3c777f6531391668dd7c08e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2a607f773b81f6bdde800c2a36627d7f84602e6c3c777f6531391668dd7c08e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:22:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70fc6887f3dde42017d03aa3f247ca4eda56f93c3ec54a95ceabcc1c1b4651d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70fc6887f3dde42017d03aa3f247ca4eda56f93c3ec54a95ceabcc1c1b4651d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:22:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://95e6805719bbdb9859b827860fb87ed979a4397061f76b433189f463c1e7cc39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://95e6805719bbdb9859b827860fb87ed979a4397061f76b433189f463c1e7cc39\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:22:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85068f88c57bccc42cc7234d13298a30c574379f4a212059a9924553fce8620e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85068f88c57bccc42cc7234d13298a30c574379f4a212059a9924553fce8620e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:22:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4355ef186193e8e0d9344faf8975065f9cd4576e7f6c79054670fda9acf3646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4355ef186193e8e0d9344faf8975065f9cd4576e7f6c79054670fda9acf3646\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:22:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-89fj9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:39Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:39 crc kubenswrapper[4672]: I0930 12:22:39.503502 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ztjlj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"25dc0f4c-76a8-49ff-bf68-c0718cfc3e62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5a3ba5101b35ce4981484637222a0c074b67e3998c273dc763ddbc6ca0c0c86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6ps67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ztjlj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:39Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:39 crc kubenswrapper[4672]: I0930 12:22:39.512652 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Sep 30 12:22:39 crc kubenswrapper[4672]: I0930 12:22:39.524973 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Sep 30 12:22:39 crc kubenswrapper[4672]: I0930 12:22:39.526377 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-485qk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"db981c53-85b2-4b3c-b025-da893771308f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57231120d83ed4e95272ab2b0bac2146b066dde2b5ca3afe469a3dc36d6187ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2t8xk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://503616f93ea6b36f5e3c95c88471c6d06c281a4d0be102b6cb0edf6a1513f78c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2t8xk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-485qk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:39Z is after 2025-08-24T17:21:41Z" Sep 30 
12:22:39 crc kubenswrapper[4672]: I0930 12:22:39.527908 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:39 crc kubenswrapper[4672]: I0930 12:22:39.527962 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:39 crc kubenswrapper[4672]: I0930 12:22:39.527975 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:39 crc kubenswrapper[4672]: I0930 12:22:39.527993 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:39 crc kubenswrapper[4672]: I0930 12:22:39.528008 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:39Z","lastTransitionTime":"2025-09-30T12:22:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:39 crc kubenswrapper[4672]: I0930 12:22:39.546727 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-n7wwp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"42618cd5-d9f9-45ba-8081-660ca47bebf4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78cfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78cfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:25Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-n7wwp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:39Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:39 crc kubenswrapper[4672]: I0930 12:22:39.561994 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e650bb6-f3f0-48f4-b169-1e56a907e88c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://824f50fb36d4c4aee43cc92af10bc43c441f2cee0ee736431dfdc52d254b819a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2786191730ceba9bbcf35f5355098bc52b09c37d20b923f7c6c65fd1b584cea0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e5b3d468173222d5ae851529dbc318b7d727ec4af468e4094ab08b4f29bf100\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://217947162b121381c06a210145feb0b9bd45d4ad595fcd83b1a566bb6448f0ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8d2b013e19f7ff8f3eb9c42338191b2da273ce85db8bcc6be8dd1f69e83c3be\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T12:22:10Z\\\",\\\"message\\\":\\\"falling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0930 12:22:05.219065 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 12:22:05.219861 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-72394925/tls.crt::/tmp/serving-cert-72394925/tls.key\\\\\\\"\\\\nI0930 12:22:10.657058 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 12:22:10.659952 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 12:22:10.659972 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 12:22:10.659997 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 12:22:10.660003 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 12:22:10.667742 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0930 12:22:10.667775 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 12:22:10.667797 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 12:22:10.667801 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 12:22:10.667808 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 12:22:10.667812 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 12:22:10.667814 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 12:22:10.667817 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0930 12:22:10.670654 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T12:21:54Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c3f5020d6f8fba5f79d7659bedba02479794acb1f5f3d219802966c781fd10d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:52Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://089787496c46eb514e279677b1853010b21eac240fb059e43866714d2a6920e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://089787496c46eb514e279677b1853010b21eac240fb059e43866714d2a6920e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:21:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:21:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:21:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:39Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:39 crc kubenswrapper[4672]: I0930 12:22:39.577097 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:39Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:39 crc kubenswrapper[4672]: I0930 12:22:39.600691 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4339dd7c-5112-40c5-9b58-050ce364a2b9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7db3193c2afadb91d3220e67720ff33bc959f9c50cec6f243144a6e2a2e57e04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5f54820ce67d6e8f397d068e1fadcb4577d6685406812c4e29fe0274612697d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1351a4e8380fea9edb834239053b57384a50add562afe59e67bbd7159bf1bf6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b9741c5edaae364733b400bebf21a4674ec206
194908ec6a942317235c62a14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://775be30891ac2947ef160e3fe75abad42b78fc39e35414f27756ce086fed4f7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27513f0766801d5155c6e5c376a2a52241030eca2171b3218710ca6ee006a0b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27513f0766801d5155c6e5c376a2a52241030eca2171b3218710ca6ee006a0b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:21:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:21:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0afdc82c21601c37c7d9b4897273ade1e8e35bd093d63df9a63e93a9c0c6228c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0afdc82c21601c37c7d9b4897273ade1e8e35bd093d63df9a63e93a9c0c6228c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:21:51Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://52fa5ee34bc8e9e381d4288d30fce5db76ed5bde6f199ec13fe443481ed4c8e8\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52fa5ee34bc8e9e381d4288d30fce5db76ed5bde6f199ec13fe443481ed4c8e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:21:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:21:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:21:49Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:39Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:39 crc kubenswrapper[4672]: I0930 12:22:39.622209 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:39Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:39 crc kubenswrapper[4672]: I0930 12:22:39.631003 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:39 crc kubenswrapper[4672]: I0930 12:22:39.631311 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:39 crc kubenswrapper[4672]: I0930 12:22:39.631485 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:39 crc kubenswrapper[4672]: I0930 12:22:39.631612 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:39 crc kubenswrapper[4672]: I0930 12:22:39.631719 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:39Z","lastTransitionTime":"2025-09-30T12:22:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:22:39 crc kubenswrapper[4672]: I0930 12:22:39.637639 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bh5lq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96949d92-2365-41eb-8f62-c264c8328c02\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6caf59dcbea4e0beb0879de377fe92cfb0b3ff55668aee3f10822adacf97fd8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkb7v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bh5lq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:39Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:39 crc kubenswrapper[4672]: I0930 12:22:39.653773 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ece716e9efed328b509480f11b1a6aa253fe23776bf45f49d4110cf3b4acbab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:39Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:39 crc kubenswrapper[4672]: I0930 12:22:39.673377 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:39Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:39 crc kubenswrapper[4672]: I0930 12:22:39.689075 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://534671434fa48a07e7836049331af60307cf651f4ddde83856ad98e006b32040\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfb0bceb3e99196adb442abedd5215938781b908c2aa92178c9db76672664620\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/
webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:39Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:39 crc kubenswrapper[4672]: I0930 12:22:39.712838 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nznsk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5da59bc9-84da-42f6-86e9-3399ecf31725\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8e8be814fc5bfb8db3632cda1fcf749a02b642c8e7ff479e59ce5d9f987a756\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2131429928dbbd60e2d17b7ada4689ff5d161d1df921d74a8b9f71df9e11899\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://114d67fea982a6c6475e1925f670ba7dae14ace593022f77ef7c6441dde49687\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b73fba4d62613a1c27b69a74ebcca91fa6ea43a5267efddf783bd0d243d711f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e10db0132d159755257aef8c0cee40e5de4a0d3727ce59883e1ff90650e4f35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb7a32827707f8654d36f56fef7e81aff6b80550ef51f7c090a37f53a0e53406\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43ba42d9f8b7b2e60651ffc15c2291e5c9274fd3
2383509b048bfb2ea381d29b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://43ba42d9f8b7b2e60651ffc15c2291e5c9274fd32383509b048bfb2ea381d29b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T12:22:25Z\\\",\\\"message\\\":\\\"ntroller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:25Z is after 2025-08-24T17:21:41Z]\\\\nI0930 12:22:25.795759 6165 services_controller.go:434] Service openshift-kube-controller-manager/kube-controller-manager retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{kube-controller-manager openshift-kube-controller-manager 90927ca1-43e2-420d-8485-a35952e82cd9 4812 0 2025-02-23 05:22:57 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[prometheus:kube-controller-manager] map[operator.openshift.io/spec-hash:bb05a56151ce98d11c8554843985ba99e0498dcafd98129435c2d982c5ea4c11 service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-secret-name:serving-cert service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [] [] []},Spec:ServiceSpec{Ports:[]ServicePort{Service\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:24Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-nznsk_openshift-ovn-kubernetes(5da59bc9-84da-42f6-86e9-3399ecf31725)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7058516ce3ff1d1d2929107a66d739fccb24b4f5589a11510ab42fe728a33196\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4aa532f2ebe19e36977596bd41e8b116c84bcd7d5cd3470986e3192d4d62dea7\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4aa532f2ebe19e36977596bd41e8b116c84bcd7d5cd3470986e3192d4d62dea7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nznsk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:39Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:39 crc kubenswrapper[4672]: I0930 12:22:39.725531 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"763dda13-8721-4cce-8b21-aa1f1b190978\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab6412d7fdf1976c75ccfc72d05f00e2997d72f3971c73cd0887e038d1cbc059\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5575589d0517d5a1c285c3e9e566fb5cd9b9591a1b311a4bb80e595e44044c33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b656dc1ad8af099828eaaeb654e6d52dbbd057023dd07583a6114316670aa334\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5082c3fa78648584ab9a29b4ab239ce0d52411666ca3fa598e9c6745b8ea067e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:21:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:39Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:39 crc kubenswrapper[4672]: I0930 12:22:39.734827 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:39 crc kubenswrapper[4672]: I0930 12:22:39.734855 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:39 crc kubenswrapper[4672]: I0930 12:22:39.734865 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Sep 30 12:22:39 crc kubenswrapper[4672]: I0930 12:22:39.734883 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:39 crc kubenswrapper[4672]: I0930 12:22:39.734895 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:39Z","lastTransitionTime":"2025-09-30T12:22:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:39 crc kubenswrapper[4672]: I0930 12:22:39.741308 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2761ae4b8b1c7e1d00ee269219ff0ab04aedce9e1bddc6e0a029c496ee89c0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:39Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:39 crc kubenswrapper[4672]: I0930 12:22:39.756673 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ece716e9efed328b509480f11b1a6aa253fe23776bf45f49d4110cf3b4acbab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:39Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:39 crc kubenswrapper[4672]: I0930 12:22:39.770863 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"763dda13-8721-4cce-8b21-aa1f1b190978\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab6412d7fdf1976c75ccfc72d05f00e2997d72f3971c73cd0887e038d1cbc059\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5575589d0517d5a1c285c3e9e566fb5cd9b9591a1b311a4bb80e595e44044c33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b656dc1ad8af099828eaaeb654e6d52dbbd057023dd07583a6114316670aa334\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5082c3fa78648584ab9a29b4ab239ce0d52411666ca3fa598e9c6745b8ea067e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:21:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:39Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:39 crc kubenswrapper[4672]: I0930 12:22:39.788733 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2761ae4b8b1c7e1d00ee269219ff0ab04aedce9e1bddc6e0a029c496ee89c0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:39Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:39 crc kubenswrapper[4672]: I0930 12:22:39.806406 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:39Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:39 crc kubenswrapper[4672]: I0930 12:22:39.825854 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://534671434fa48a07e7836049331af60307cf651f4ddde83856ad98e006b32040\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfb0bceb3e99196adb442abedd5215938781b908c2aa92178c9db76672664620\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:39Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:39 crc kubenswrapper[4672]: I0930 12:22:39.837218 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:39 crc kubenswrapper[4672]: I0930 12:22:39.837285 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:39 crc kubenswrapper[4672]: I0930 12:22:39.837302 4672 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Sep 30 12:22:39 crc kubenswrapper[4672]: I0930 12:22:39.837326 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:39 crc kubenswrapper[4672]: I0930 12:22:39.837342 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:39Z","lastTransitionTime":"2025-09-30T12:22:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:39 crc kubenswrapper[4672]: I0930 12:22:39.855597 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nznsk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5da59bc9-84da-42f6-86e9-3399ecf31725\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8e8be814fc5bfb8db3632cda1fcf749a02b642c8e7ff479e59ce5d9f987a756\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2131429928dbbd60e2d17b7ada4689ff5d161d1df921d74a8b9f71df9e11899\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://114d67fea982a6c6475e1925f670ba7dae14ace593022f77ef7c6441dde49687\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b73fba4d62613a1c27b69a74ebcca91fa6ea43a5267efddf783bd0d243d711f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e10db0132d159755257aef8c0cee40e5de4a0d3727ce59883e1ff90650e4f35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb7a32827707f8654d36f56fef7e81aff6b80550ef51f7c090a37f53a0e53406\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43ba42d9f8b7b2e60651ffc15c2291e5c9274fd3
2383509b048bfb2ea381d29b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://43ba42d9f8b7b2e60651ffc15c2291e5c9274fd32383509b048bfb2ea381d29b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T12:22:25Z\\\",\\\"message\\\":\\\"ntroller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:25Z is after 2025-08-24T17:21:41Z]\\\\nI0930 12:22:25.795759 6165 services_controller.go:434] Service openshift-kube-controller-manager/kube-controller-manager retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{kube-controller-manager openshift-kube-controller-manager 90927ca1-43e2-420d-8485-a35952e82cd9 4812 0 2025-02-23 05:22:57 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[prometheus:kube-controller-manager] map[operator.openshift.io/spec-hash:bb05a56151ce98d11c8554843985ba99e0498dcafd98129435c2d982c5ea4c11 service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-secret-name:serving-cert service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [] [] []},Spec:ServiceSpec{Ports:[]ServicePort{Service\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:24Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-nznsk_openshift-ovn-kubernetes(5da59bc9-84da-42f6-86e9-3399ecf31725)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7058516ce3ff1d1d2929107a66d739fccb24b4f5589a11510ab42fe728a33196\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4aa532f2ebe19e36977596bd41e8b116c84bcd7d5cd3470986e3192d4d62dea7\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4aa532f2ebe19e36977596bd41e8b116c84bcd7d5cd3470986e3192d4d62dea7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nznsk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:39Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:39 crc kubenswrapper[4672]: I0930 12:22:39.871038 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-485qk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db981c53-85b2-4b3c-b025-da893771308f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57231120d83ed4e95272ab2b0bac2146b066dde2b5ca3afe469a3dc36d6187ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2t8xk
\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://503616f93ea6b36f5e3c95c88471c6d06c281a4d0be102b6cb0edf6a1513f78c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2t8xk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-485qk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:39Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:39 crc kubenswrapper[4672]: I0930 12:22:39.884453 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-n7wwp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"42618cd5-d9f9-45ba-8081-660ca47bebf4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78cfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78cfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:25Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-n7wwp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:39Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:39 crc kubenswrapper[4672]: I0930 12:22:39.900192 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e650bb6-f3f0-48f4-b169-1e56a907e88c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://824f50fb36d4c4aee43cc92af10bc43c441f2cee0ee736431dfdc52d254b819a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2786191730ceba9bbcf35f5355098bc52b09c37d20b923f7c6c65fd1b584cea0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e5b3d468173222d5ae851529dbc318b7d727ec4af468e4094ab08b4f29bf100\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://217947162b121381c06a210145feb0b9bd45d4ad595fcd83b1a566bb6448f0ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8d2b013e19f7ff8f3eb9c42338191b2da273ce85db8bcc6be8dd1f69e83c3be\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T12:22:10Z\\\",\\\"message\\\":\\\"falling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0930 12:22:05.219065 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 12:22:05.219861 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-72394925/tls.crt::/tmp/serving-cert-72394925/tls.key\\\\\\\"\\\\nI0930 12:22:10.657058 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 12:22:10.659952 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 12:22:10.659972 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 12:22:10.659997 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 12:22:10.660003 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 12:22:10.667742 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0930 12:22:10.667775 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 12:22:10.667797 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 12:22:10.667801 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 12:22:10.667808 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 12:22:10.667812 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 12:22:10.667814 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 12:22:10.667817 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0930 12:22:10.670654 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T12:21:54Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c3f5020d6f8fba5f79d7659bedba02479794acb1f5f3d219802966c781fd10d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:52Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://089787496c46eb514e279677b1853010b21eac240fb059e43866714d2a6920e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://089787496c46eb514e279677b1853010b21eac240fb059e43866714d2a6920e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:21:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:21:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:21:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:39Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:39 crc kubenswrapper[4672]: I0930 12:22:39.915847 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:39Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:39 crc kubenswrapper[4672]: I0930 12:22:39.933542 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"95794952-d817-48f2-8956-f7a310f8d1d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a5a1de465a4e5deca994f640e87574ebd406be2d7a92c0e54ed8bece4a8e6d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnk64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba615a53d8f50a3891080fd06384bb6e00eca07025bb60fdf1764d9c1d4a5f76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnk64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dpqrd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:39Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:39 crc kubenswrapper[4672]: I0930 12:22:39.940136 4672 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:39 crc kubenswrapper[4672]: I0930 12:22:39.940176 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:39 crc kubenswrapper[4672]: I0930 12:22:39.940186 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:39 crc kubenswrapper[4672]: I0930 12:22:39.940204 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:39 crc kubenswrapper[4672]: I0930 12:22:39.940219 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:39Z","lastTransitionTime":"2025-09-30T12:22:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:39 crc kubenswrapper[4672]: I0930 12:22:39.953739 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8q82q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6806ff3c-ab3a-402e-b1c5-cc37c0810a65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://687d353cc41530a0cbee2c4a782b6ae10e9ee58aae0bfe183537c96851611e2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin
\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pn7q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8q82q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:39Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:39 crc kubenswrapper[4672]: I0930 12:22:39.977079 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-89fj9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72003fad-a0fd-4493-9f3b-6efdab22d14c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd26dcf24863265c55a06cfc788e84363473a6ca5f1b835fa5dbb683ebb82d68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",
\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31087335de8b3b895d49c4677350e9ab6cfa0651188653dab05e7eafdab3d05f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31087335de8b3b895d49c4677350e9ab6cfa0651188653dab05e7eafdab3d05f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a607f773b81f6bdde800c2a36627d7f84602e6c3c777f6531391668dd7c08e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2a607f773b81f6bdde800c2a36627d7f84602e6c3c777f6531391668dd7c08e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:22:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70fc6887f3dde42017d03aa3f247ca4eda56f93c3ec54a95ceabcc1c1b4651d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70fc6887f3dde42017d03aa3f247ca4eda56f93c3ec54a95ceabcc1c1b4651d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:22:15Z
\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://95e6805719bbdb9859b827860fb87ed979a4397061f76b433189f463c1e7cc39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://95e6805719bbdb9859b827860fb87ed979a4397061f76b433189f463c1e7cc39\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:22:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85068f88c57bccc42cc7234d13298a30c574379f4a212059a9924553fce8620e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85068f88c57bccc42cc7234d13298a30c574379f4a212059a9924553fce8620e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:22:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4355ef186193e8e0d9344faf8975065f9cd4576e7f6c79054670fda9acf3646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8
ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4355ef186193e8e0d9344faf8975065f9cd4576e7f6c79054670fda9acf3646\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:22:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-89fj9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:39Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:39 crc kubenswrapper[4672]: I0930 12:22:39.991894 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ztjlj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"25dc0f4c-76a8-49ff-bf68-c0718cfc3e62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5a3ba5101b35ce4981484637222a0c074b67e3998c273dc763ddbc6ca0c0c86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6ps67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\"
:\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ztjlj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:39Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:40 crc kubenswrapper[4672]: I0930 12:22:40.009895 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8095997-9982-47c1-850a-260c9e369680\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f437372ebc379d35a2232ab47422ed6127cd29b0d752488dca512f67407d222\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c905db0424e34f487d2db657676de97a3e323ef4c9fbed5d25929164476f62bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb1c08c9fe886c57b5abe5091fa9845022bb33eefd97da1bf595d4b32016dfc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a9e2b46f485ed7a73177e89960925d6b0f9945bf0b7b5604a5b5e84d40f74e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a9e2b46f485ed7a73177e89960925d6b0f9945bf0b7b5604a5b5e84d40f74e3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:21:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:21:50Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:21:49Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:40Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:40 crc kubenswrapper[4672]: I0930 12:22:40.042142 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4339dd7c-5112-40c5-9b58-050ce364a2b9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7db3193c2afadb91d3220e67720ff33bc959f9c50cec6f243144a6e2a2e57e04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5f54820ce67d6e8f397d068e1fadcb4577d6685406812c4e29fe0274612697d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1351a4e8380fea9edb834239053b57384a50add562afe59e67bbd7159bf1bf6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b9741c5edaae364733b400bebf21a4674ec206
194908ec6a942317235c62a14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://775be30891ac2947ef160e3fe75abad42b78fc39e35414f27756ce086fed4f7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27513f0766801d5155c6e5c376a2a52241030eca2171b3218710ca6ee006a0b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27513f0766801d5155c6e5c376a2a52241030eca2171b3218710ca6ee006a0b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:21:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:21:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0afdc82c21601c37c7d9b4897273ade1e8e35bd093d63df9a63e93a9c0c6228c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0afdc82c21601c37c7d9b4897273ade1e8e35bd093d63df9a63e93a9c0c6228c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:21:51Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://52fa5ee34bc8e9e381d4288d30fce5db76ed5bde6f199ec13fe443481ed4c8e8\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52fa5ee34bc8e9e381d4288d30fce5db76ed5bde6f199ec13fe443481ed4c8e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:21:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:21:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:21:49Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:40Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:40 crc kubenswrapper[4672]: I0930 12:22:40.044045 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:40 crc kubenswrapper[4672]: I0930 12:22:40.044095 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:40 crc kubenswrapper[4672]: I0930 12:22:40.044111 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:40 crc kubenswrapper[4672]: I0930 12:22:40.044140 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:40 crc kubenswrapper[4672]: I0930 12:22:40.044157 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:40Z","lastTransitionTime":"2025-09-30T12:22:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:22:40 crc kubenswrapper[4672]: I0930 12:22:40.066285 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:40Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:40 crc kubenswrapper[4672]: I0930 12:22:40.080866 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bh5lq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"96949d92-2365-41eb-8f62-c264c8328c02\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6caf59dcbea4e0beb0879de377fe92cfb0b3ff55668aee3f10822adacf97fd8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkb7v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bh5lq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:40Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:40 crc kubenswrapper[4672]: I0930 12:22:40.148633 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:40 crc kubenswrapper[4672]: I0930 12:22:40.148700 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:40 crc kubenswrapper[4672]: I0930 12:22:40.148720 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:40 crc kubenswrapper[4672]: I0930 12:22:40.148755 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:40 crc kubenswrapper[4672]: I0930 12:22:40.148781 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:40Z","lastTransitionTime":"2025-09-30T12:22:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:40 crc kubenswrapper[4672]: I0930 12:22:40.251636 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:40 crc kubenswrapper[4672]: I0930 12:22:40.251682 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:40 crc kubenswrapper[4672]: I0930 12:22:40.251694 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:40 crc kubenswrapper[4672]: I0930 12:22:40.251714 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:40 crc kubenswrapper[4672]: I0930 12:22:40.251728 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:40Z","lastTransitionTime":"2025-09-30T12:22:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:40 crc kubenswrapper[4672]: I0930 12:22:40.355850 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:40 crc kubenswrapper[4672]: I0930 12:22:40.355903 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:40 crc kubenswrapper[4672]: I0930 12:22:40.355920 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:40 crc kubenswrapper[4672]: I0930 12:22:40.355946 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:40 crc kubenswrapper[4672]: I0930 12:22:40.355967 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:40Z","lastTransitionTime":"2025-09-30T12:22:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:40 crc kubenswrapper[4672]: I0930 12:22:40.416047 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 12:22:40 crc kubenswrapper[4672]: E0930 12:22:40.416432 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 12:22:40 crc kubenswrapper[4672]: I0930 12:22:40.459997 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:40 crc kubenswrapper[4672]: I0930 12:22:40.460089 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:40 crc kubenswrapper[4672]: I0930 12:22:40.460113 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:40 crc kubenswrapper[4672]: I0930 12:22:40.460144 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:40 crc kubenswrapper[4672]: I0930 12:22:40.460167 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:40Z","lastTransitionTime":"2025-09-30T12:22:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:40 crc kubenswrapper[4672]: I0930 12:22:40.563013 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:40 crc kubenswrapper[4672]: I0930 12:22:40.563052 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:40 crc kubenswrapper[4672]: I0930 12:22:40.563062 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:40 crc kubenswrapper[4672]: I0930 12:22:40.563080 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:40 crc kubenswrapper[4672]: I0930 12:22:40.563092 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:40Z","lastTransitionTime":"2025-09-30T12:22:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:22:40 crc kubenswrapper[4672]: I0930 12:22:40.666052 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:40 crc kubenswrapper[4672]: I0930 12:22:40.666115 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:40 crc kubenswrapper[4672]: I0930 12:22:40.666128 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:40 crc kubenswrapper[4672]: I0930 12:22:40.666149 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:40 crc kubenswrapper[4672]: I0930 12:22:40.666163 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:40Z","lastTransitionTime":"2025-09-30T12:22:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:40 crc kubenswrapper[4672]: I0930 12:22:40.768579 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:40 crc kubenswrapper[4672]: I0930 12:22:40.768624 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:40 crc kubenswrapper[4672]: I0930 12:22:40.768633 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:40 crc kubenswrapper[4672]: I0930 12:22:40.768649 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:40 crc kubenswrapper[4672]: I0930 12:22:40.768660 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:40Z","lastTransitionTime":"2025-09-30T12:22:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:40 crc kubenswrapper[4672]: I0930 12:22:40.872164 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:40 crc kubenswrapper[4672]: I0930 12:22:40.872227 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:40 crc kubenswrapper[4672]: I0930 12:22:40.872240 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:40 crc kubenswrapper[4672]: I0930 12:22:40.872265 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:40 crc kubenswrapper[4672]: I0930 12:22:40.872293 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:40Z","lastTransitionTime":"2025-09-30T12:22:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:22:40 crc kubenswrapper[4672]: I0930 12:22:40.976254 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:40 crc kubenswrapper[4672]: I0930 12:22:40.976359 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:40 crc kubenswrapper[4672]: I0930 12:22:40.976369 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:40 crc kubenswrapper[4672]: I0930 12:22:40.976390 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:40 crc kubenswrapper[4672]: I0930 12:22:40.976406 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:40Z","lastTransitionTime":"2025-09-30T12:22:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:41 crc kubenswrapper[4672]: I0930 12:22:41.079818 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:41 crc kubenswrapper[4672]: I0930 12:22:41.079903 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:41 crc kubenswrapper[4672]: I0930 12:22:41.079926 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:41 crc kubenswrapper[4672]: I0930 12:22:41.079957 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:41 crc kubenswrapper[4672]: I0930 12:22:41.079980 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:41Z","lastTransitionTime":"2025-09-30T12:22:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:22:41 crc kubenswrapper[4672]: I0930 12:22:41.171790 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 30 12:22:41 crc kubenswrapper[4672]: I0930 12:22:41.183055 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:41 crc kubenswrapper[4672]: I0930 12:22:41.183142 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:41 crc kubenswrapper[4672]: I0930 12:22:41.183154 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:41 crc kubenswrapper[4672]: I0930 12:22:41.183173 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:41 crc kubenswrapper[4672]: I0930 12:22:41.183207 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:41Z","lastTransitionTime":"2025-09-30T12:22:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:41 crc kubenswrapper[4672]: I0930 12:22:41.190402 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-485qk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db981c53-85b2-4b3c-b025-da893771308f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57231120d83ed4e95272ab2b0bac2146b066dde2b5ca3afe469a3dc36d6187ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2t8xk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\
\"}]},{\\\"containerID\\\":\\\"cri-o://503616f93ea6b36f5e3c95c88471c6d06c281a4d0be102b6cb0edf6a1513f78c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2t8xk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-485qk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:41Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:41 crc kubenswrapper[4672]: I0930 12:22:41.203917 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-n7wwp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"42618cd5-d9f9-45ba-8081-660ca47bebf4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78cfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78cfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:25Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-n7wwp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:41Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:41 crc kubenswrapper[4672]: I0930 12:22:41.221526 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e650bb6-f3f0-48f4-b169-1e56a907e88c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://824f50fb36d4c4aee43cc92af10bc43c441f2cee0ee736431dfdc52d254b819a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2786191730ceba9bbcf35f5355098bc52b09c37d20b923f7c6c65fd1b584cea0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e5b3d468173222d5ae851529dbc318b7d727ec4af468e4094ab08b4f29bf100\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://217947162b121381c06a210145feb0b9bd45d4ad595fcd83b1a566bb6448f0ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8d2b013e19f7ff8f3eb9c42338191b2da273ce85db8bcc6be8dd1f69e83c3be\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T12:22:10Z\\\",\\\"message\\\":\\\"falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0930 12:22:05.219065 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 12:22:05.219861 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-72394925/tls.crt::/tmp/serving-cert-72394925/tls.key\\\\\\\"\\\\nI0930 12:22:10.657058 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 12:22:10.659952 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 12:22:10.659972 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 12:22:10.659997 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 12:22:10.660003 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 12:22:10.667742 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0930 12:22:10.667775 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 12:22:10.667797 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 12:22:10.667801 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 12:22:10.667808 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 12:22:10.667812 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 12:22:10.667814 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 12:22:10.667817 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0930 12:22:10.670654 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T12:21:54Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c3f5020d6f8fba5f79d7659bedba02479794acb1f5f3d219802966c781fd10d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:52Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://089787496c46eb514e279677b1853010b21eac240fb059e43866714d2a6920e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://089787496c46eb514e279677b1853010b21eac240fb059e43866714d2a6920e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:21:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:21:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:21:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:41Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:41 crc kubenswrapper[4672]: I0930 12:22:41.236852 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:41Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:41 crc kubenswrapper[4672]: I0930 12:22:41.248603 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"95794952-d817-48f2-8956-f7a310f8d1d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a5a1de465a4e5deca994f640e87574ebd406be2d7a92c0e54ed8bece4a8e6d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnk64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba615a53d8f50a3891080fd06384bb6e00eca07025bb60fdf1764d9c1d4a5f76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnk64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dpqrd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:41Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:41 crc kubenswrapper[4672]: I0930 12:22:41.275376 4672 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-8q82q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6806ff3c-ab3a-402e-b1c5-cc37c0810a65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://687d353cc41530a0cbee2c4a782b6ae10e9ee58aae0bfe183537c96851611e2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pn7q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:11Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-8q82q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:41Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:41 crc kubenswrapper[4672]: I0930 12:22:41.286721 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:41 crc kubenswrapper[4672]: I0930 12:22:41.286777 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:41 crc kubenswrapper[4672]: I0930 12:22:41.286789 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:41 crc kubenswrapper[4672]: I0930 12:22:41.286812 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:41 crc kubenswrapper[4672]: I0930 12:22:41.286825 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:41Z","lastTransitionTime":"2025-09-30T12:22:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:41 crc kubenswrapper[4672]: I0930 12:22:41.322935 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-89fj9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72003fad-a0fd-4493-9f3b-6efdab22d14c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd26dcf24863265c55a06cfc788e84363473a6ca5f1b835fa5dbb683ebb82d68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.16
8.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31087335de8b3b895d49c4677350e9ab6cfa0651188653dab05e7eafdab3d05f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31087335de8b3b895d49c4677350e9ab6cfa0651188653dab05e7eafdab3d05f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a607f773b81f6bdde800c2a36627d7f84602e6c3c777f6531391668dd7c08e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2a607f773b81f6bdde800c2a36627d7f84602e6c3c777f6531391668dd7c08e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:22:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70fc6887f3dde42017d03aa3f247ca4eda56f93c3ec54a95ceabcc1c1b4651d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70fc6887f3dde42017d03aa3f247ca4eda56f93c3ec54a95ceabcc1c1b4651d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09
-30T12:22:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://95e6805719bbdb9859b827860fb87ed979a4397061f76b433189f463c1e7cc39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://95e6805719bbdb9859b827860fb87ed979a4397061f76b433189f463c1e7cc39\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:22:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85068f88c57bccc42cc7234d13298a30c574379f4a212059a9924553fce8620e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85068f88c57bccc42cc7234d13298a30c574379f4a212059a9924553fce8620e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:22:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4355ef186193e8e0d9344faf8975065f9cd4576e7f6c79054670fda9acf3646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bd
bc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4355ef186193e8e0d9344faf8975065f9cd4576e7f6c79054670fda9acf3646\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:22:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-89fj9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:41Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:41 crc kubenswrapper[4672]: I0930 12:22:41.335577 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ztjlj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"25dc0f4c-76a8-49ff-bf68-c0718cfc3e62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5a3ba5101b35ce4981484637222a0c074b67e3998c273dc763ddbc6ca0c0c86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6ps67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],
\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ztjlj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:41Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:41 crc kubenswrapper[4672]: I0930 12:22:41.348591 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8095997-9982-47c1-850a-260c9e369680\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f437372ebc379d35a2232ab47422ed6127cd29b0d752488dca512f67407d222\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c905db0424e34f487d2db657676de97a3e323ef4c9fbed5d25929164476f62bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb1c08c9fe886c57b5abe5091fa9845022bb33eefd97da1bf595d4b32016dfc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4
.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a9e2b46f485ed7a73177e89960925d6b0f9945bf0b7b5604a5b5e84d40f74e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a9e2b46f485ed7a73177e89960925d6b0f9945bf0b7b5604a5b5e84d40f74e3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:21:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:21:50Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:21:49Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:41Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:41 crc kubenswrapper[4672]: I0930 12:22:41.371087 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4339dd7c-5112-40c5-9b58-050ce364a2b9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7db3193c2afadb91d3220e67720ff33bc959f9c50cec6f243144a6e2a2e57e04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5f54820ce67d6e8f397d068e1fadcb4577d6685406812c4e29fe0274612697d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1351a4e8380fea9edb834239053b57384a50add562afe59e67bbd7159bf1bf6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b9741c5edaae364733b400bebf21a4674ec206
194908ec6a942317235c62a14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://775be30891ac2947ef160e3fe75abad42b78fc39e35414f27756ce086fed4f7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27513f0766801d5155c6e5c376a2a52241030eca2171b3218710ca6ee006a0b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27513f0766801d5155c6e5c376a2a52241030eca2171b3218710ca6ee006a0b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:21:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:21:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0afdc82c21601c37c7d9b4897273ade1e8e35bd093d63df9a63e93a9c0c6228c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0afdc82c21601c37c7d9b4897273ade1e8e35bd093d63df9a63e93a9c0c6228c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:21:51Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://52fa5ee34bc8e9e381d4288d30fce5db76ed5bde6f199ec13fe443481ed4c8e8\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52fa5ee34bc8e9e381d4288d30fce5db76ed5bde6f199ec13fe443481ed4c8e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:21:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:21:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:21:49Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:41Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:41 crc kubenswrapper[4672]: I0930 12:22:41.385627 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:41Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:41 crc kubenswrapper[4672]: I0930 12:22:41.388831 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:41 crc kubenswrapper[4672]: I0930 12:22:41.388871 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:41 crc kubenswrapper[4672]: I0930 12:22:41.388884 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:41 crc kubenswrapper[4672]: I0930 12:22:41.388904 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:41 crc kubenswrapper[4672]: I0930 12:22:41.388927 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:41Z","lastTransitionTime":"2025-09-30T12:22:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:22:41 crc kubenswrapper[4672]: I0930 12:22:41.400418 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bh5lq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96949d92-2365-41eb-8f62-c264c8328c02\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6caf59dcbea4e0beb0879de377fe92cfb0b3ff55668aee3f10822adacf97fd8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkb7v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bh5lq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:41Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:41 crc kubenswrapper[4672]: I0930 12:22:41.415304 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ece716e9efed328b509480f11b1a6aa253fe23776bf45f49d4110cf3b4acbab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:41Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:41 crc kubenswrapper[4672]: I0930 12:22:41.416539 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 12:22:41 crc kubenswrapper[4672]: I0930 12:22:41.416614 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n7wwp" Sep 30 12:22:41 crc kubenswrapper[4672]: E0930 12:22:41.416696 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 12:22:41 crc kubenswrapper[4672]: I0930 12:22:41.416631 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 12:22:41 crc kubenswrapper[4672]: E0930 12:22:41.416819 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-n7wwp" podUID="42618cd5-d9f9-45ba-8081-660ca47bebf4" Sep 30 12:22:41 crc kubenswrapper[4672]: E0930 12:22:41.416930 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 12:22:41 crc kubenswrapper[4672]: I0930 12:22:41.428609 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"763dda13-8721-4cce-8b21-aa1f1b190978\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab6412d7fdf1976c75ccfc72d05f00e2997d72f3971c73cd0887e038d1cbc059\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5575589d0517d5a1c285c3e9e566fb5cd9b9591a1b311a4bb80e595e44044c33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b656dc1ad8af099828eaaeb654e6d52dbbd057023dd07583a6114316670aa334\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faa
f92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5082c3fa78648584ab9a29b4ab239ce0d52411666ca3fa598e9c6745b8ea067e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:21:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:41Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:41 crc kubenswrapper[4672]: I0930 12:22:41.446450 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2761ae4b8b1c7e1d00ee269219ff0ab04aedce9e1bddc6e0a029c496ee89c0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:41Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:41 crc kubenswrapper[4672]: I0930 12:22:41.459940 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:41Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:41 crc kubenswrapper[4672]: I0930 12:22:41.472257 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://534671434fa48a07e7836049331af60307cf651f4ddde83856ad98e006b32040\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfb0bceb3e99196adb442abedd5215938781b908c2aa92178c9db76672664620\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:11Z\\\"}},\\\"volumeMounts\\\":[{\
\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:41Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:41 crc kubenswrapper[4672]: I0930 12:22:41.491009 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nznsk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5da59bc9-84da-42f6-86e9-3399ecf31725\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8e8be814fc5bfb8db3632cda1fcf749a02b642c8e7ff479e59ce5d9f987a756\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2131429928dbbd60e2d17b7ada4689ff5d161d1df921d74a8b9f71df9e11899\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://114d67fea982a6c6475e1925f670ba7dae14ace593022f77ef7c6441dde49687\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b73fba4d62613a1c27b69a74ebcca91fa6ea43a5267efddf783bd0d243d711f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e10db0132d159755257aef8c0cee40e5de4a0d3727ce59883e1ff90650e4f35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb7a32827707f8654d36f56fef7e81aff6b80550ef51f7c090a37f53a0e53406\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43ba42d9f8b7b2e60651ffc15c2291e5c9274fd3
2383509b048bfb2ea381d29b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://43ba42d9f8b7b2e60651ffc15c2291e5c9274fd32383509b048bfb2ea381d29b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T12:22:25Z\\\",\\\"message\\\":\\\"ntroller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:25Z is after 2025-08-24T17:21:41Z]\\\\nI0930 12:22:25.795759 6165 services_controller.go:434] Service openshift-kube-controller-manager/kube-controller-manager retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{kube-controller-manager openshift-kube-controller-manager 90927ca1-43e2-420d-8485-a35952e82cd9 4812 0 2025-02-23 05:22:57 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[prometheus:kube-controller-manager] map[operator.openshift.io/spec-hash:bb05a56151ce98d11c8554843985ba99e0498dcafd98129435c2d982c5ea4c11 service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-secret-name:serving-cert service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [] [] []},Spec:ServiceSpec{Ports:[]ServicePort{Service\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:24Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-nznsk_openshift-ovn-kubernetes(5da59bc9-84da-42f6-86e9-3399ecf31725)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7058516ce3ff1d1d2929107a66d739fccb24b4f5589a11510ab42fe728a33196\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4aa532f2ebe19e36977596bd41e8b116c84bcd7d5cd3470986e3192d4d62dea7\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4aa532f2ebe19e36977596bd41e8b116c84bcd7d5cd3470986e3192d4d62dea7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nznsk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:41Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:41 crc kubenswrapper[4672]: I0930 12:22:41.492155 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:41 crc kubenswrapper[4672]: I0930 12:22:41.492230 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:41 crc kubenswrapper[4672]: I0930 12:22:41.492257 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:41 crc kubenswrapper[4672]: I0930 12:22:41.492308 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:41 crc kubenswrapper[4672]: I0930 12:22:41.492344 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:41Z","lastTransitionTime":"2025-09-30T12:22:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:22:41 crc kubenswrapper[4672]: I0930 12:22:41.596206 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:41 crc kubenswrapper[4672]: I0930 12:22:41.596303 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:41 crc kubenswrapper[4672]: I0930 12:22:41.596328 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:41 crc kubenswrapper[4672]: I0930 12:22:41.596353 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:41 crc kubenswrapper[4672]: I0930 12:22:41.596373 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:41Z","lastTransitionTime":"2025-09-30T12:22:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:41 crc kubenswrapper[4672]: I0930 12:22:41.699365 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:41 crc kubenswrapper[4672]: I0930 12:22:41.699406 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:41 crc kubenswrapper[4672]: I0930 12:22:41.699418 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:41 crc kubenswrapper[4672]: I0930 12:22:41.699434 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:41 crc kubenswrapper[4672]: I0930 12:22:41.699445 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:41Z","lastTransitionTime":"2025-09-30T12:22:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:41 crc kubenswrapper[4672]: I0930 12:22:41.804533 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:41 crc kubenswrapper[4672]: I0930 12:22:41.804583 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:41 crc kubenswrapper[4672]: I0930 12:22:41.804598 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:41 crc kubenswrapper[4672]: I0930 12:22:41.804628 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:41 crc kubenswrapper[4672]: I0930 12:22:41.804643 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:41Z","lastTransitionTime":"2025-09-30T12:22:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:22:41 crc kubenswrapper[4672]: I0930 12:22:41.819183 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/42618cd5-d9f9-45ba-8081-660ca47bebf4-metrics-certs\") pod \"network-metrics-daemon-n7wwp\" (UID: \"42618cd5-d9f9-45ba-8081-660ca47bebf4\") " pod="openshift-multus/network-metrics-daemon-n7wwp" Sep 30 12:22:41 crc kubenswrapper[4672]: E0930 12:22:41.819411 4672 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Sep 30 12:22:41 crc kubenswrapper[4672]: E0930 12:22:41.819731 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/42618cd5-d9f9-45ba-8081-660ca47bebf4-metrics-certs podName:42618cd5-d9f9-45ba-8081-660ca47bebf4 nodeName:}" failed. No retries permitted until 2025-09-30 12:22:57.81970909 +0000 UTC m=+69.088946736 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/42618cd5-d9f9-45ba-8081-660ca47bebf4-metrics-certs") pod "network-metrics-daemon-n7wwp" (UID: "42618cd5-d9f9-45ba-8081-660ca47bebf4") : object "openshift-multus"/"metrics-daemon-secret" not registered Sep 30 12:22:41 crc kubenswrapper[4672]: I0930 12:22:41.907423 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:41 crc kubenswrapper[4672]: I0930 12:22:41.907484 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:41 crc kubenswrapper[4672]: I0930 12:22:41.907502 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:41 crc kubenswrapper[4672]: I0930 12:22:41.907526 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:41 crc kubenswrapper[4672]: I0930 12:22:41.907543 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:41Z","lastTransitionTime":"2025-09-30T12:22:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:22:42 crc kubenswrapper[4672]: I0930 12:22:42.011721 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:42 crc kubenswrapper[4672]: I0930 12:22:42.011835 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:42 crc kubenswrapper[4672]: I0930 12:22:42.011871 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:42 crc kubenswrapper[4672]: I0930 12:22:42.011902 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:42 crc kubenswrapper[4672]: I0930 12:22:42.011924 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:42Z","lastTransitionTime":"2025-09-30T12:22:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:42 crc kubenswrapper[4672]: I0930 12:22:42.115796 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:42 crc kubenswrapper[4672]: I0930 12:22:42.115877 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:42 crc kubenswrapper[4672]: I0930 12:22:42.115916 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:42 crc kubenswrapper[4672]: I0930 12:22:42.115940 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:42 crc kubenswrapper[4672]: I0930 12:22:42.115954 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:42Z","lastTransitionTime":"2025-09-30T12:22:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:42 crc kubenswrapper[4672]: I0930 12:22:42.218663 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:42 crc kubenswrapper[4672]: I0930 12:22:42.218721 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:42 crc kubenswrapper[4672]: I0930 12:22:42.218737 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:42 crc kubenswrapper[4672]: I0930 12:22:42.218781 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:42 crc kubenswrapper[4672]: I0930 12:22:42.218799 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:42Z","lastTransitionTime":"2025-09-30T12:22:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:22:42 crc kubenswrapper[4672]: I0930 12:22:42.321866 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:42 crc kubenswrapper[4672]: I0930 12:22:42.321920 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:42 crc kubenswrapper[4672]: I0930 12:22:42.321932 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:42 crc kubenswrapper[4672]: I0930 12:22:42.321955 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:42 crc kubenswrapper[4672]: I0930 12:22:42.321966 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:42Z","lastTransitionTime":"2025-09-30T12:22:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:42 crc kubenswrapper[4672]: I0930 12:22:42.416379 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 12:22:42 crc kubenswrapper[4672]: E0930 12:22:42.416636 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 12:22:42 crc kubenswrapper[4672]: I0930 12:22:42.417741 4672 scope.go:117] "RemoveContainer" containerID="43ba42d9f8b7b2e60651ffc15c2291e5c9274fd32383509b048bfb2ea381d29b" Sep 30 12:22:42 crc kubenswrapper[4672]: I0930 12:22:42.425101 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:42 crc kubenswrapper[4672]: I0930 12:22:42.425208 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:42 crc kubenswrapper[4672]: I0930 12:22:42.425252 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:42 crc kubenswrapper[4672]: I0930 12:22:42.425313 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:42 crc kubenswrapper[4672]: I0930 12:22:42.425334 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:42Z","lastTransitionTime":"2025-09-30T12:22:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:22:42 crc kubenswrapper[4672]: I0930 12:22:42.528752 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:42 crc kubenswrapper[4672]: I0930 12:22:42.528837 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:42 crc kubenswrapper[4672]: I0930 12:22:42.528859 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:42 crc kubenswrapper[4672]: I0930 12:22:42.528886 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:42 crc kubenswrapper[4672]: I0930 12:22:42.528906 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:42Z","lastTransitionTime":"2025-09-30T12:22:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:42 crc kubenswrapper[4672]: I0930 12:22:42.632751 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:42 crc kubenswrapper[4672]: I0930 12:22:42.632834 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:42 crc kubenswrapper[4672]: I0930 12:22:42.632858 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:42 crc kubenswrapper[4672]: I0930 12:22:42.632896 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:42 crc kubenswrapper[4672]: I0930 12:22:42.632929 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:42Z","lastTransitionTime":"2025-09-30T12:22:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:42 crc kubenswrapper[4672]: I0930 12:22:42.737150 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:42 crc kubenswrapper[4672]: I0930 12:22:42.737249 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:42 crc kubenswrapper[4672]: I0930 12:22:42.737332 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:42 crc kubenswrapper[4672]: I0930 12:22:42.737370 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:42 crc kubenswrapper[4672]: I0930 12:22:42.737395 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:42Z","lastTransitionTime":"2025-09-30T12:22:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:22:42 crc kubenswrapper[4672]: I0930 12:22:42.811543 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nznsk_5da59bc9-84da-42f6-86e9-3399ecf31725/ovnkube-controller/1.log" Sep 30 12:22:42 crc kubenswrapper[4672]: I0930 12:22:42.814136 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nznsk" event={"ID":"5da59bc9-84da-42f6-86e9-3399ecf31725","Type":"ContainerStarted","Data":"b26be7f6864a66af6c7b4f7a000553fcfc870fb86c334481687a8a8bdfff54e8"} Sep 30 12:22:42 crc kubenswrapper[4672]: I0930 12:22:42.814653 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-nznsk" Sep 30 12:22:42 crc kubenswrapper[4672]: I0930 12:22:42.832905 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ece716e9efed328b509480f11b1a6aa253fe23776bf45f49d4110cf3b4acbab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:42Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:42 crc kubenswrapper[4672]: I0930 12:22:42.840202 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:42 crc kubenswrapper[4672]: I0930 12:22:42.840254 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:42 crc kubenswrapper[4672]: I0930 12:22:42.840267 4672 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Sep 30 12:22:42 crc kubenswrapper[4672]: I0930 12:22:42.840300 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:42 crc kubenswrapper[4672]: I0930 12:22:42.840312 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:42Z","lastTransitionTime":"2025-09-30T12:22:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:42 crc kubenswrapper[4672]: I0930 12:22:42.855400 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://534671434fa48a07e7836049331af60307cf651f4ddde83856ad98e006b32040\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfb0bceb3e99196adb442abedd5215938781b908c2aa92178c9db76672664620\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true
,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:42Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:42 crc kubenswrapper[4672]: I0930 12:22:42.881535 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nznsk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5da59bc9-84da-42f6-86e9-3399ecf31725\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8e8be814fc5bfb8db3632cda1fcf749a02b642c8e7ff479e59ce5d9f987a756\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2131429928dbbd60e2d17b7ada4689ff5d161d1df921d74a8b9f71df9e11899\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\
",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://114d67fea982a6c6475e1925f670ba7dae14ace593022f77ef7c6441dde49687\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b73fba4d62613a1c27b69a74ebcca91fa6ea43a5267efddf783bd0d243d711f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e10db0132d159755257aef8c0cee40e5de4a0d3727ce59883e1ff90650e4f35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kuber
netes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb7a32827707f8654d36f56fef7e81aff6b80550ef51f7c090a37f53a0e53406\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b26be7f6864a66af6c7b4f7a000553fcfc870fb86c334481687a8a8bdfff54e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://43ba42d9f8b7b2e60651ffc15c2291e5c9274fd32383509b048bfb2ea381d29b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T12:22:25Z\\\",\\\"message\\\":\\\"ntroller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:25Z is after 2025-08-24T17:21:41Z]\\\\nI0930 12:22:25.795759 6165 services_controller.go:434] Service openshift-kube-controller-manager/kube-controller-manager retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{kube-controller-manager openshift-kube-controller-manager 90927ca1-43e2-420d-8485-a35952e82cd9 4812 0 2025-02-23 05:22:57 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[prometheus:kube-controller-manager] map[operator.openshift.io/spec-hash:bb05a56151ce98d11c8554843985ba99e0498dcafd98129435c2d982c5ea4c11 service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-secret-name:serving-cert 
service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [] [] []},Spec:ServiceSpec{Ports:[]ServicePort{Service\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:24Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7058516ce3ff1d1d2929107a66d739fccb24b4f5589a11510ab42fe728a33196\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4aa532f2ebe19e36977596bd41e8b116c84bcd7d5cd3470986e3192d4d62dea7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4aa532f2ebe19e36977596bd41e8b116c84bcd7d5cd3470986e3192d4d62dea7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nznsk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:42Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:42 crc kubenswrapper[4672]: I0930 12:22:42.897929 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"763dda13-8721-4cce-8b21-aa1f1b190978\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab6412d7fdf1976c75ccfc72d05f00e2997d72f3971c73cd0887e038d1cbc059\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5575589d0517d5a1c285c3e9e566fb5cd9b9591a1b311a4bb80e595e44044c33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b656dc1ad8af099828eaaeb654e6d52dbbd057023dd07583a6114316670aa334\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5082c3fa78648584ab9a29b4ab239ce0d52411666ca3fa598e9c6745b8ea067e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:21:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:42Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:42 crc kubenswrapper[4672]: I0930 12:22:42.923673 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2761ae4b8b1c7e1d00ee269219ff0ab04aedce9e1bddc6e0a029c496ee89c0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:42Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:42 crc kubenswrapper[4672]: I0930 12:22:42.939162 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:42Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:42 crc kubenswrapper[4672]: I0930 12:22:42.943247 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:42 crc kubenswrapper[4672]: I0930 12:22:42.943295 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:42 crc kubenswrapper[4672]: I0930 12:22:42.943311 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:42 crc kubenswrapper[4672]: I0930 12:22:42.943328 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:42 crc kubenswrapper[4672]: I0930 12:22:42.943338 4672 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:42Z","lastTransitionTime":"2025-09-30T12:22:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:42 crc kubenswrapper[4672]: I0930 12:22:42.960758 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8q82q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6806ff3c-ab3a-402e-b1c5-cc37c0810a65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://687d353cc41530a0cbee2c4a782b6ae10e9ee58aae0bfe183537c96851611e2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube
rnetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pn7q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8q82q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:42Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:42 crc kubenswrapper[4672]: I0930 12:22:42.989837 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-89fj9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72003fad-a0fd-4493-9f3b-6efdab22d14c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd26dcf24863265c55a06cfc788e84363473a6ca5f1b835fa5dbb683ebb82d68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31087335de8b3b895d49c4677350e9ab6cfa0651188653dab05e7eafdab3d05f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31087335de8b3b895d49c4677350e9ab6
cfa0651188653dab05e7eafdab3d05f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a607f773b81f6bdde800c2a36627d7f84602e6c3c777f6531391668dd7c08e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2a607f773b81f6bdde800c2a36627d7f84602e6c3c777f6531391668dd7c08e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:22:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70fc6887f3dde42017d03aa3f247ca4eda56f93c3ec54a95ceabcc1c1b4651d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70fc6887f3dde42017d03aa3f247ca4eda56f93c3ec54a95ceabcc1c1b4651d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:22:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://95e6805719bbdb9859b827860fb87ed979a4397061f76b433189f463c1e7cc39\\\",\\\"image\\
\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://95e6805719bbdb9859b827860fb87ed979a4397061f76b433189f463c1e7cc39\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:22:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85068f88c57bccc42cc7234d13298a30c574379f4a212059a9924553fce8620e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85068f88c57bccc42cc7234d13298a30c574379f4a212059a9924553fce8620e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:22:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4355ef186193e8e0d9344faf8975065f9cd4576e7f6c79054670fda9acf3646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4355ef186193e8e0d9344faf8975065f9cd4576e7f6c79054670fda9acf3646\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:22:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kub
e-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-89fj9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:42Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:43 crc kubenswrapper[4672]: I0930 12:22:43.003087 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ztjlj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"25dc0f4c-76a8-49ff-bf68-c0718cfc3e62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5a3ba5101b35ce4981484637222a0c074b67e3998c273dc763ddbc6ca0c0c86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6ps67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ztjlj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:42Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:43 crc kubenswrapper[4672]: I0930 12:22:43.015622 4672 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-485qk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db981c53-85b2-4b3c-b025-da893771308f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57231120d83ed4e95272ab2b0bac2146b066dde2b5ca3afe469a3dc36d6187ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2t8xk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://503616f93ea6b36f5e3c95c88471c6d06c281a4d0be102b6cb0edf6a1513f78c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2t8xk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-485qk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-30T12:22:43Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:43 crc kubenswrapper[4672]: I0930 12:22:43.030346 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-n7wwp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"42618cd5-d9f9-45ba-8081-660ca47bebf4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78cfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78cfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:25Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-n7wwp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:43Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:43 crc kubenswrapper[4672]: I0930 12:22:43.045040 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e650bb6-f3f0-48f4-b169-1e56a907e88c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://824f50fb36d4c4aee43cc92af10bc43c441f2cee0ee736431dfdc52d254b819a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2786191730ceba9bbcf35f5355098bc52b09c37d20b923f7c6c65fd1b584cea0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e5b3d468173222d5ae851529dbc318b7d727ec4af468e4094ab08b4f29bf100\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://217947162b121381c06a210145feb0b9bd45d4ad595fcd83b1a566bb6448f0ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserve
r-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8d2b013e19f7ff8f3eb9c42338191b2da273ce85db8bcc6be8dd1f69e83c3be\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T12:22:10Z\\\",\\\"message\\\":\\\"falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0930 12:22:05.219065 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 12:22:05.219861 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-72394925/tls.crt::/tmp/serving-cert-72394925/tls.key\\\\\\\"\\\\nI0930 12:22:10.657058 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 12:22:10.659952 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 12:22:10.659972 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 12:22:10.659997 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 12:22:10.660003 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 12:22:10.667742 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0930 12:22:10.667775 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 12:22:10.667797 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 12:22:10.667801 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 12:22:10.667808 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 12:22:10.667812 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 12:22:10.667814 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 12:22:10.667817 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0930 12:22:10.670654 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T12:21:54Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c3f5020d6f8fba5f79d7659bedba02479794acb1f5f3d219802966c781fd10d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:52Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://089787496c46eb514e279677b1853010b21eac240fb059e43866714d2a6920e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://089787496c46eb514e279677b1853010b21eac240fb059e43866714d2a6920e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:21:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:21:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:21:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:43Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:43 crc kubenswrapper[4672]: I0930 12:22:43.046217 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:43 crc kubenswrapper[4672]: I0930 12:22:43.046249 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:43 crc kubenswrapper[4672]: I0930 12:22:43.046279 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:43 crc kubenswrapper[4672]: I0930 12:22:43.046296 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:43 crc kubenswrapper[4672]: I0930 12:22:43.046309 4672 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:43Z","lastTransitionTime":"2025-09-30T12:22:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:43 crc kubenswrapper[4672]: I0930 12:22:43.058219 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:43Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:43 crc kubenswrapper[4672]: I0930 12:22:43.071966 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"95794952-d817-48f2-8956-f7a310f8d1d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a5a1de465a4e5deca994f640e87574ebd406be2d7a92c0e54ed8bece4a8e6d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnk64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba615a53d8f50a3891080fd06384bb6e00eca07025bb60fdf1764d9c1d4a5f76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnk64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dpqrd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:43Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:43 crc kubenswrapper[4672]: I0930 12:22:43.089219 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:43Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:43 crc kubenswrapper[4672]: I0930 12:22:43.101616 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bh5lq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96949d92-2365-41eb-8f62-c264c8328c02\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6caf59dcbea4e0beb0879de377fe92cfb0b3ff55668aee3f10822adacf97fd8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkb7v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bh5lq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-09-30T12:22:43Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:43 crc kubenswrapper[4672]: I0930 12:22:43.125322 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8095997-9982-47c1-850a-260c9e369680\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f437372ebc379d35a2232ab47422ed6127cd29b0d752488dca512f67407d222\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c905db0424e34f487d2db657676de97a3e323ef4c9fbed5d25929164476f62bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb1c08c9fe886c57b5abe5091fa9845022bb33eefd97da1bf595d4b32016dfc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\
\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a9e2b46f485ed7a73177e89960925d6b0f9945bf0b7b5604a5b5e84d40f74e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a9e2b46f485ed7a73177e89960925d6b0f9945bf0b7b5604a5b5e84d40f74e3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:21:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:21:50Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:21:49Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:43Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:43 crc kubenswrapper[4672]: I0930 12:22:43.149644 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:43 crc kubenswrapper[4672]: I0930 12:22:43.149697 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:43 crc kubenswrapper[4672]: I0930 12:22:43.149709 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:43 crc kubenswrapper[4672]: I0930 12:22:43.149726 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:43 crc kubenswrapper[4672]: I0930 12:22:43.149739 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:43Z","lastTransitionTime":"2025-09-30T12:22:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:22:43 crc kubenswrapper[4672]: I0930 12:22:43.152516 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4339dd7c-5112-40c5-9b58-050ce364a2b9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7db3193c2afadb91d3220e67720ff33bc959f9c50cec6f243144a6e2a2e57e04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5f54820ce67d6e8f397d068e1fadcb4577d6685406812c4e29fe0274612697d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1351a4e8380fea9edb834239053b57384a50add562afe59e67bbd7159bf1bf6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b9741c5edaae364733b400bebf21a4674ec206194908ec6a942317235c62a14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://775be30891ac2947ef160e3fe75abad42b78fc39e35414f27756ce086fed4f7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27513f0766801d5155c6e5c376a2a52241030eca2171b3218710ca6ee006a0b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27513f0766801d5155c6e5c376a2a52241030eca2171b3218710ca6ee006a0b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:21:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:21:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0afdc82c21601c37c7d9b4897273ade1e8e35bd093d63df9a63e93a9c0c6228c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0afdc82c21601c37c7d9b4897273ade1e8e35bd093d63df9a63e93a9c0c6228c\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:21:51Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://52fa5ee34bc8e9e381d4288d30fce5db76ed5bde6f199ec13fe443481ed4c8e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52fa5ee34bc8e9e381d4288d30fce5db76ed5bde6f199ec13fe443481ed4c8e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:21:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:21:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:21:49Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:43Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:43 crc kubenswrapper[4672]: I0930 12:22:43.232066 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 12:22:43 crc kubenswrapper[4672]: I0930 12:22:43.232199 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 12:22:43 crc kubenswrapper[4672]: I0930 12:22:43.232234 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 12:22:43 crc kubenswrapper[4672]: E0930 12:22:43.232321 4672 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Sep 30 12:22:43 crc kubenswrapper[4672]: E0930 12:22:43.232346 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-09-30 12:23:15.232314352 +0000 UTC m=+86.501552018 (durationBeforeRetry 32s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 12:22:43 crc kubenswrapper[4672]: E0930 12:22:43.232391 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-30 12:23:15.232377974 +0000 UTC m=+86.501615610 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Sep 30 12:22:43 crc kubenswrapper[4672]: E0930 12:22:43.232403 4672 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 30 12:22:43 crc kubenswrapper[4672]: E0930 12:22:43.232424 4672 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 30 12:22:43 crc kubenswrapper[4672]: E0930 12:22:43.232438 4672 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 12:22:43 crc kubenswrapper[4672]: E0930 12:22:43.232504 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-09-30 12:23:15.232484946 +0000 UTC m=+86.501722592 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 12:22:43 crc kubenswrapper[4672]: I0930 12:22:43.232431 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 12:22:43 crc kubenswrapper[4672]: E0930 12:22:43.232561 4672 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 30 12:22:43 crc kubenswrapper[4672]: E0930 12:22:43.232615 4672 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 30 12:22:43 crc kubenswrapper[4672]: E0930 12:22:43.232639 4672 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 30 12:22:43 crc kubenswrapper[4672]: E0930 12:22:43.232650 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-30 12:23:15.23264304 +0000 UTC m=+86.501880686 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 30 12:22:43 crc kubenswrapper[4672]: I0930 12:22:43.232564 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 12:22:43 crc kubenswrapper[4672]: E0930 12:22:43.232664 4672 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 12:22:43 crc kubenswrapper[4672]: E0930 12:22:43.232813 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-09-30 12:23:15.232796944 +0000 UTC m=+86.502034780 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Sep 30 12:22:43 crc kubenswrapper[4672]: I0930 12:22:43.252462 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 12:22:43 crc kubenswrapper[4672]: I0930 12:22:43.252504 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 12:22:43 crc kubenswrapper[4672]: I0930 12:22:43.252516 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 12:22:43 crc kubenswrapper[4672]: I0930 12:22:43.252535 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 12:22:43 crc kubenswrapper[4672]: I0930 12:22:43.252547 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:43Z","lastTransitionTime":"2025-09-30T12:22:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 12:22:43 crc kubenswrapper[4672]: I0930 12:22:43.356072 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 12:22:43 crc kubenswrapper[4672]: I0930 12:22:43.356145 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 12:22:43 crc kubenswrapper[4672]: I0930 12:22:43.356170 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 12:22:43 crc kubenswrapper[4672]: I0930 12:22:43.356203 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 12:22:43 crc kubenswrapper[4672]: I0930 12:22:43.356233 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:43Z","lastTransitionTime":"2025-09-30T12:22:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 12:22:43 crc kubenswrapper[4672]: I0930 12:22:43.416073 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Sep 30 12:22:43 crc kubenswrapper[4672]: I0930 12:22:43.416172 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Sep 30 12:22:43 crc kubenswrapper[4672]: E0930 12:22:43.416254 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Sep 30 12:22:43 crc kubenswrapper[4672]: I0930 12:22:43.416357 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n7wwp"
Sep 30 12:22:43 crc kubenswrapper[4672]: E0930 12:22:43.416457 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Sep 30 12:22:43 crc kubenswrapper[4672]: E0930 12:22:43.416661 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n7wwp" podUID="42618cd5-d9f9-45ba-8081-660ca47bebf4"
Sep 30 12:22:43 crc kubenswrapper[4672]: I0930 12:22:43.459809 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 12:22:43 crc kubenswrapper[4672]: I0930 12:22:43.459860 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 12:22:43 crc kubenswrapper[4672]: I0930 12:22:43.459870 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 12:22:43 crc kubenswrapper[4672]: I0930 12:22:43.459904 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 12:22:43 crc kubenswrapper[4672]: I0930 12:22:43.459917 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:43Z","lastTransitionTime":"2025-09-30T12:22:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 12:22:43 crc kubenswrapper[4672]: I0930 12:22:43.563059 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 12:22:43 crc kubenswrapper[4672]: I0930 12:22:43.563140 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 12:22:43 crc kubenswrapper[4672]: I0930 12:22:43.563163 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 12:22:43 crc kubenswrapper[4672]: I0930 12:22:43.563190 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 12:22:43 crc kubenswrapper[4672]: I0930 12:22:43.563211 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:43Z","lastTransitionTime":"2025-09-30T12:22:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 12:22:43 crc kubenswrapper[4672]: I0930 12:22:43.665768 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 12:22:43 crc kubenswrapper[4672]: I0930 12:22:43.665838 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 12:22:43 crc kubenswrapper[4672]: I0930 12:22:43.665861 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 12:22:43 crc kubenswrapper[4672]: I0930 12:22:43.665892 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 12:22:43 crc kubenswrapper[4672]: I0930 12:22:43.665914 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:43Z","lastTransitionTime":"2025-09-30T12:22:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 12:22:43 crc kubenswrapper[4672]: I0930 12:22:43.769251 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 12:22:43 crc kubenswrapper[4672]: I0930 12:22:43.769401 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 12:22:43 crc kubenswrapper[4672]: I0930 12:22:43.769424 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 12:22:43 crc kubenswrapper[4672]: I0930 12:22:43.769448 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 12:22:43 crc kubenswrapper[4672]: I0930 12:22:43.769465 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:43Z","lastTransitionTime":"2025-09-30T12:22:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 12:22:43 crc kubenswrapper[4672]: I0930 12:22:43.820981 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nznsk_5da59bc9-84da-42f6-86e9-3399ecf31725/ovnkube-controller/2.log"
Sep 30 12:22:43 crc kubenswrapper[4672]: I0930 12:22:43.822698 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nznsk_5da59bc9-84da-42f6-86e9-3399ecf31725/ovnkube-controller/1.log"
Sep 30 12:22:43 crc kubenswrapper[4672]: I0930 12:22:43.828159 4672 generic.go:334] "Generic (PLEG): container finished" podID="5da59bc9-84da-42f6-86e9-3399ecf31725" containerID="b26be7f6864a66af6c7b4f7a000553fcfc870fb86c334481687a8a8bdfff54e8" exitCode=1
Sep 30 12:22:43 crc kubenswrapper[4672]: I0930 12:22:43.828215 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nznsk" event={"ID":"5da59bc9-84da-42f6-86e9-3399ecf31725","Type":"ContainerDied","Data":"b26be7f6864a66af6c7b4f7a000553fcfc870fb86c334481687a8a8bdfff54e8"}
Sep 30 12:22:43 crc kubenswrapper[4672]: I0930 12:22:43.828354 4672 scope.go:117] "RemoveContainer" containerID="43ba42d9f8b7b2e60651ffc15c2291e5c9274fd32383509b048bfb2ea381d29b"
Sep 30 12:22:43 crc kubenswrapper[4672]: I0930 12:22:43.829067 4672 scope.go:117] "RemoveContainer" containerID="b26be7f6864a66af6c7b4f7a000553fcfc870fb86c334481687a8a8bdfff54e8"
Sep 30 12:22:43 crc kubenswrapper[4672]: E0930 12:22:43.829346 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-nznsk_openshift-ovn-kubernetes(5da59bc9-84da-42f6-86e9-3399ecf31725)\"" pod="openshift-ovn-kubernetes/ovnkube-node-nznsk" podUID="5da59bc9-84da-42f6-86e9-3399ecf31725"
Sep 30 12:22:43 crc kubenswrapper[4672]: I0930 12:22:43.849176 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ece716e9efed328b509480f11b1a6aa253fe23776bf45f49d4110cf3b4acbab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:43Z is after 2025-08-24T17:21:41Z"
Sep 30 12:22:43 crc kubenswrapper[4672]: I0930 12:22:43.866941 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://534671434fa48a07e7836049331af60307cf651f4ddde83856ad98e006b32040\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfb0bceb3e99196adb442abedd5215938781b908c2aa92178c9db76672664620\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:43Z is after 2025-08-24T17:21:41Z"
Sep 30 12:22:43 crc kubenswrapper[4672]: I0930 12:22:43.873629 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 12:22:43 crc kubenswrapper[4672]: I0930 12:22:43.873692 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 12:22:43 crc kubenswrapper[4672]: I0930 12:22:43.873706 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 12:22:43 crc kubenswrapper[4672]: I0930 12:22:43.873733 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 12:22:43 crc kubenswrapper[4672]: I0930 12:22:43.873748 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:43Z","lastTransitionTime":"2025-09-30T12:22:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 12:22:43 crc kubenswrapper[4672]: I0930 12:22:43.890486 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nznsk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5da59bc9-84da-42f6-86e9-3399ecf31725\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8e8be814fc5bfb8db3632cda1fcf749a02b642c8e7ff479e59ce5d9f987a756\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2131429928dbbd60e2d17b7ada4689ff5d161d1df921d74a8b9f71df9e11899\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://114d67fea982a6c6475e1925f670ba7dae14ace593022f77ef7c6441dde49687\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b73fba4d62613a1c27b69a74ebcca91fa6ea43a5267efddf783bd0d243d711f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e10db0132d159755257aef8c0cee40e5de4a0d3727ce59883e1ff90650e4f35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb7a32827707f8654d36f56fef7e81aff6b80550ef51f7c090a37f53a0e53406\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b26be7f6864a66af6c7b4f7a000553fcfc870fb86c334481687a8a8bdfff54e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://43ba42d9f8b7b2e60651ffc15c2291e5c9274fd32383509b048bfb2ea381d29b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T12:22:25Z\\\",\\\"message\\\":\\\"ntroller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:25Z is after 2025-08-24T17:21:41Z]\\\\nI0930 12:22:25.795759 6165 services_controller.go:434] Service openshift-kube-controller-manager/kube-controller-manager retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{kube-controller-manager openshift-kube-controller-manager 90927ca1-43e2-420d-8485-a35952e82cd9 4812 0 2025-02-23 05:22:57 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[prometheus:kube-controller-manager] map[operator.openshift.io/spec-hash:bb05a56151ce98d11c8554843985ba99e0498dcafd98129435c2d982c5ea4c11 service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-secret-name:serving-cert service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [] [] []},Spec:ServiceSpec{Ports:[]ServicePort{Service\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:24Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b26be7f6864a66af6c7b4f7a000553fcfc870fb86c334481687a8a8bdfff54e8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T12:22:43Z\\\",\\\"message\\\":\\\"s server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0930 12:22:43.375122 6393 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-controller-manager-operator/metrics\\\\\\\"}\\\\nI0930 12:22:43.375136 6393 services_controller.go:360] Finished syncing service metrics on namespace openshift-controller-manager-operator for network=default : 1.633551ms\\\\nI0930 12:22:43.375146 6393 services_controller.go:356] Processing sync for service openshift-ingress-operator/metrics for network=default\\\\nI0930 12:22:43.374866 6393 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-config-operator/metrics]} name:Service_openshift-config-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.161:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f32857b5-f652-4313-a0d7-455c3156dd99}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0930 12:22:43.375233 6393 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7058516ce3ff1d1d2929107a66d739fccb24b4f5589a11510ab42fe728a33196\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4aa532f2ebe19e36977596bd41e8b116c84bcd7d5cd3470986e3192d4d62dea7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4aa532f2ebe19e36977596bd41e8b116c84bcd7d5cd3470986e3192d4d62dea7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nznsk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:43Z is after 2025-08-24T17:21:41Z"
Sep 30 12:22:43 crc kubenswrapper[4672]: I0930 12:22:43.912489 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"763dda13-8721-4cce-8b21-aa1f1b190978\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab6412d7fdf1976c75ccfc72d05f00e2997d72f3971c73cd0887e038d1cbc059\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5575589d0517d5a1c285c3e9e566fb5cd9b9591a1b311a4bb80e595e44044c33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b656dc1ad8af099828eaaeb654e6d52dbbd057023dd07583a6114316670aa334\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5082c3fa78648584ab9a29b4ab239ce0d52411666ca3fa598e9c6745b8ea067e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:21:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:43Z is after 2025-08-24T17:21:41Z"
Sep 30 12:22:43 crc kubenswrapper[4672]: I0930 12:22:43.939898 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2761ae4b8b1c7e1d00ee269219ff0ab04aedce9e1bddc6e0a029c496ee89c0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:43Z is after 2025-08-24T17:21:41Z"
Sep 30 12:22:43 crc kubenswrapper[4672]: I0930 12:22:43.960371 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:43Z is after 2025-08-24T17:21:41Z"
Sep 30 12:22:43 crc kubenswrapper[4672]: I0930 12:22:43.977461 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8q82q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6806ff3c-ab3a-402e-b1c5-cc37c0810a65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://687d353cc41530a0cbee2c4a782b6ae10e9ee58aae0bfe183537c96851611e2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pn7q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8q82q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:43Z is after 2025-08-24T17:21:41Z"
Sep 30 12:22:43 crc kubenswrapper[4672]: I0930 12:22:43.978553 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 12:22:43 crc kubenswrapper[4672]: I0930 12:22:43.978583 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 12:22:43 crc kubenswrapper[4672]: I0930 12:22:43.978591 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 12:22:43 crc kubenswrapper[4672]: I0930 12:22:43.978605 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 12:22:43 crc kubenswrapper[4672]: I0930 12:22:43.978617 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:43Z","lastTransitionTime":"2025-09-30T12:22:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 12:22:43 crc kubenswrapper[4672]: I0930 12:22:43.998738 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-89fj9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72003fad-a0fd-4493-9f3b-6efdab22d14c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd26dcf24863265c55a06cfc788e84363473a6ca5f1b835fa5dbb683ebb82d68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31087335de8b3b895d49c4677350e9ab6cfa0651188653dab05e7eafdab3d05f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31087335de8b3b895d49c4677350e9ab6cfa0651188653dab05e7eafdab3d05f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a607f773b81f6bdde800c2a36627d7f84602e6c3c777f6531391668dd7c08e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2a607f773b81f6bdde800c2a36627d7f84602e6c3c777f6531391668dd7c08e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:22:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70fc6887f3dde42017d03aa3f247ca4eda56f93c3ec54a95ceabcc1c1b4651d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70fc6887f3dde42017d03aa3f247ca4eda56f93c3ec54a95ceabcc1c1b4651d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:22:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://95e6805719bbdb9859b827860fb87ed979a4397061f76b433189f463c1e7cc39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://95e6805719bbdb9859b827860fb87ed979a4397061f76b433189f463c1e7cc39\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:22:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85068f88c57bccc42cc7234d13298a30c574379f4a212059a9924553fce8620e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85068f88c57bccc42cc7234d13298a30c574379f4a212059a9924553fce8620e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:22:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4355ef186193e8e0d9344faf8975065f9cd4576e7f6c79054670fda9acf3646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4355ef186193e8e0d9344faf8975065f9cd4576e7f6c79054670fda9acf3646\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:22:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-89fj9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:43Z is after 2025-08-24T17:21:41Z"
Sep 30 12:22:44 crc kubenswrapper[4672]: I0930 12:22:44.012821 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ztjlj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"25dc0f4c-76a8-49ff-bf68-c0718cfc3e62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5a3ba5101b35ce4981484637222a0c074b67e3998c273dc763ddbc6ca0c0c86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6ps67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ztjlj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:44Z is after 2025-08-24T17:21:41Z"
Sep 30 12:22:44 crc kubenswrapper[4672]: I0930 12:22:44.033084 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-485qk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db981c53-85b2-4b3c-b025-da893771308f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57231120d83ed4e95272ab2b0bac2146b066dde2b5ca3afe469a3dc36d6187ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2t8xk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://503616f93ea6b36f5e3c95c88471c6d06c281a4d0be102b6cb0edf6a1513f78c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2t8xk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disable
d\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-485qk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:44Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:44 crc kubenswrapper[4672]: I0930 12:22:44.058558 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-n7wwp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"42618cd5-d9f9-45ba-8081-660ca47bebf4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78cfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78cfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:25Z\\\"}}\" 
for pod \"openshift-multus\"/\"network-metrics-daemon-n7wwp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:44Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:44 crc kubenswrapper[4672]: I0930 12:22:44.081956 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:44 crc kubenswrapper[4672]: I0930 12:22:44.082024 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:44 crc kubenswrapper[4672]: I0930 12:22:44.082042 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:44 crc kubenswrapper[4672]: I0930 12:22:44.082068 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:44 crc kubenswrapper[4672]: I0930 12:22:44.082089 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:44Z","lastTransitionTime":"2025-09-30T12:22:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:44 crc kubenswrapper[4672]: I0930 12:22:44.082063 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e650bb6-f3f0-48f4-b169-1e56a907e88c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://824f50fb36d4c4aee43cc92af10bc43c441f2cee0ee736431dfdc52d254b819a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\
":\\\"cri-o://2786191730ceba9bbcf35f5355098bc52b09c37d20b923f7c6c65fd1b584cea0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e5b3d468173222d5ae851529dbc318b7d727ec4af468e4094ab08b4f29bf100\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://217947162b121381c06a210145feb0b9bd45d4ad595fcd83b1a566bb6448f0ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8d2b013e19f7ff8f3eb9c42338191b2da273ce85db8bcc6be8dd1f69e83c3be\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T12:22:10Z\\\",\\\"message\\\":\\\"falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0930 12:22:05.219065 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 12:22:05.219861 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-72394925/tls.crt::/tmp/serving-cert-72394925/tls.key\\\\\\\"\\\\nI0930 12:22:10.657058 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 12:22:10.659952 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 12:22:10.659972 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 12:22:10.659997 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 12:22:10.660003 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 12:22:10.667742 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0930 12:22:10.667775 1 secure_serving.go:57] Forcing use of http/1.1 
only\\\\nW0930 12:22:10.667797 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 12:22:10.667801 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 12:22:10.667808 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 12:22:10.667812 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 12:22:10.667814 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 12:22:10.667817 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0930 12:22:10.670654 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T12:21:54Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c3f5020d6f8fba5f79d7659bedba02479794acb1f5f3d219802966c781fd10d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:52Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://089787496c46eb514e279677b1853010b21eac240fb059e43866714d2a6920e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://089787496c46eb514e279677b1853010b21eac240fb059e43866714d2a6920e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:21:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:21:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:21:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:44Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:44 crc kubenswrapper[4672]: I0930 
12:22:44.103954 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:44Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:44 crc kubenswrapper[4672]: I0930 12:22:44.120551 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"95794952-d817-48f2-8956-f7a310f8d1d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a5a1de465a4e5deca994f640e87574ebd406be2d7a92c0e54ed8bece4a8e6d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnk64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba615a53d8f50a3891080fd06384bb6e00eca07025bb60fdf1764d9c1d4a5f76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnk64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dpqrd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:44Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:44 crc kubenswrapper[4672]: I0930 12:22:44.137695 4672 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:44Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:44 crc kubenswrapper[4672]: I0930 12:22:44.151158 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bh5lq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"96949d92-2365-41eb-8f62-c264c8328c02\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6caf59dcbea4e0beb0879de377fe92cfb0b3ff55668aee3f10822adacf97fd8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkb7v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bh5lq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:44Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:44 crc kubenswrapper[4672]: I0930 12:22:44.167211 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8095997-9982-47c1-850a-260c9e369680\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f437372ebc379d35a2232ab47422ed6127cd29b0d752488dca512f67407d222\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c905db0424e34f487d2db657676de97a3e323ef4c9fbed5d25929164476f62bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb1c08c9fe886c57b5abe5091fa9845022bb33eefd97da1bf595d4b32016dfc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a9e2b46f485ed7a73177e89960925d6b0f9945bf0b7b5604a5b5e84d40f74e3\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a9e2b46f485ed7a73177e89960925d6b0f9945bf0b7b5604a5b5e84d40f74e3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:21:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:21:50Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:21:49Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:44Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:44 crc kubenswrapper[4672]: I0930 12:22:44.185302 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:44 crc kubenswrapper[4672]: I0930 12:22:44.185367 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:44 crc kubenswrapper[4672]: I0930 12:22:44.185385 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:44 crc kubenswrapper[4672]: I0930 12:22:44.185407 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:44 crc kubenswrapper[4672]: I0930 12:22:44.185422 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:44Z","lastTransitionTime":"2025-09-30T12:22:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:22:44 crc kubenswrapper[4672]: I0930 12:22:44.195790 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4339dd7c-5112-40c5-9b58-050ce364a2b9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7db3193c2afadb91d3220e67720ff33bc959f9c50cec6f243144a6e2a2e57e04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5f54820ce67d6e8f397d068e1fadcb4577d6685406812c4e29fe0274612697d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1351a4e8380fea9edb834239053b57384a50add562afe59e67bbd7159bf1bf6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b9741c5edaae364733b400bebf21a4674ec206194908ec6a942317235c62a14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://775be30891ac2947ef160e3fe75abad42b78fc39e35414f27756ce086fed4f7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27513f0766801d5155c6e5c376a2a52241030eca2171b3218710ca6ee006a0b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27513f0766801d5155c6e5c376a2a52241030eca2171b3218710ca6ee006a0b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:21:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:21:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0afdc82c21601c37c7d9b4897273ade1e8e35bd093d63df9a63e93a9c0c6228c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0afdc82c21601c37c7d9b4897273ade1e8e35bd093d63df9a63e93a9c0c6228c\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:21:51Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://52fa5ee34bc8e9e381d4288d30fce5db76ed5bde6f199ec13fe443481ed4c8e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52fa5ee34bc8e9e381d4288d30fce5db76ed5bde6f199ec13fe443481ed4c8e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:21:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:21:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:21:49Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:44Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:44 crc kubenswrapper[4672]: I0930 12:22:44.293498 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:44 crc kubenswrapper[4672]: I0930 12:22:44.293552 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:44 crc kubenswrapper[4672]: I0930 12:22:44.293567 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:44 crc kubenswrapper[4672]: I0930 12:22:44.293585 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:44 crc kubenswrapper[4672]: I0930 12:22:44.293598 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:44Z","lastTransitionTime":"2025-09-30T12:22:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:22:44 crc kubenswrapper[4672]: I0930 12:22:44.397297 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:44 crc kubenswrapper[4672]: I0930 12:22:44.397536 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:44 crc kubenswrapper[4672]: I0930 12:22:44.397596 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:44 crc kubenswrapper[4672]: I0930 12:22:44.397657 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:44 crc kubenswrapper[4672]: I0930 12:22:44.397740 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:44Z","lastTransitionTime":"2025-09-30T12:22:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:44 crc kubenswrapper[4672]: I0930 12:22:44.417052 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 12:22:44 crc kubenswrapper[4672]: E0930 12:22:44.417251 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 12:22:44 crc kubenswrapper[4672]: I0930 12:22:44.501255 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:44 crc kubenswrapper[4672]: I0930 12:22:44.501392 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:44 crc kubenswrapper[4672]: I0930 12:22:44.501415 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:44 crc kubenswrapper[4672]: I0930 12:22:44.501451 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:44 crc kubenswrapper[4672]: I0930 12:22:44.501472 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:44Z","lastTransitionTime":"2025-09-30T12:22:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:22:44 crc kubenswrapper[4672]: I0930 12:22:44.606924 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:44 crc kubenswrapper[4672]: I0930 12:22:44.606983 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:44 crc kubenswrapper[4672]: I0930 12:22:44.606994 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:44 crc kubenswrapper[4672]: I0930 12:22:44.607019 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:44 crc kubenswrapper[4672]: I0930 12:22:44.607034 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:44Z","lastTransitionTime":"2025-09-30T12:22:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:44 crc kubenswrapper[4672]: I0930 12:22:44.709914 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:44 crc kubenswrapper[4672]: I0930 12:22:44.709950 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:44 crc kubenswrapper[4672]: I0930 12:22:44.709961 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:44 crc kubenswrapper[4672]: I0930 12:22:44.709978 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:44 crc kubenswrapper[4672]: I0930 12:22:44.709991 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:44Z","lastTransitionTime":"2025-09-30T12:22:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:44 crc kubenswrapper[4672]: I0930 12:22:44.813131 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:44 crc kubenswrapper[4672]: I0930 12:22:44.813241 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:44 crc kubenswrapper[4672]: I0930 12:22:44.813256 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:44 crc kubenswrapper[4672]: I0930 12:22:44.813483 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:44 crc kubenswrapper[4672]: I0930 12:22:44.813498 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:44Z","lastTransitionTime":"2025-09-30T12:22:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:22:44 crc kubenswrapper[4672]: I0930 12:22:44.833971 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nznsk_5da59bc9-84da-42f6-86e9-3399ecf31725/ovnkube-controller/2.log" Sep 30 12:22:44 crc kubenswrapper[4672]: I0930 12:22:44.838865 4672 scope.go:117] "RemoveContainer" containerID="b26be7f6864a66af6c7b4f7a000553fcfc870fb86c334481687a8a8bdfff54e8" Sep 30 12:22:44 crc kubenswrapper[4672]: E0930 12:22:44.839043 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-nznsk_openshift-ovn-kubernetes(5da59bc9-84da-42f6-86e9-3399ecf31725)\"" pod="openshift-ovn-kubernetes/ovnkube-node-nznsk" podUID="5da59bc9-84da-42f6-86e9-3399ecf31725" Sep 30 12:22:44 crc kubenswrapper[4672]: I0930 12:22:44.855897 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ece716e9efed328b509480f11b1a6aa253fe23776bf45f49d4110cf3b4acbab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:44Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:44 crc kubenswrapper[4672]: I0930 12:22:44.871740 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"763dda13-8721-4cce-8b21-aa1f1b190978\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab6412d7fdf1976c75ccfc72d05f00e2997d72f3971c73cd0887e038d1cbc059\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5575589d0517d5a1c285c3e9e566fb5cd9b9591a1b311a4bb80e595e44044c33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b656dc1ad8af099828eaaeb654e6d52dbbd057023dd07583a6114316670aa334\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5082c3fa78648584ab9a29b4ab239ce0d52411666ca3fa598e9c6745b8ea067e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:21:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:44Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:44 crc kubenswrapper[4672]: I0930 12:22:44.885757 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2761ae4b8b1c7e1d00ee269219ff0ab04aedce9e1bddc6e0a029c496ee89c0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:44Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:44 crc kubenswrapper[4672]: I0930 12:22:44.902352 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:44Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:44 crc kubenswrapper[4672]: I0930 12:22:44.916451 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:44 crc kubenswrapper[4672]: I0930 12:22:44.916528 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:44 crc kubenswrapper[4672]: I0930 12:22:44.916546 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:44 crc kubenswrapper[4672]: I0930 12:22:44.916568 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:44 crc kubenswrapper[4672]: I0930 12:22:44.916587 4672 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:44Z","lastTransitionTime":"2025-09-30T12:22:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:44 crc kubenswrapper[4672]: I0930 12:22:44.922164 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://534671434fa48a07e7836049331af60307cf651f4ddde83856ad98e006b32040\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfb0bceb3e99196adb442abedd5215938781b908c2aa92178c9db76672664620\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:44Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:44 crc kubenswrapper[4672]: I0930 12:22:44.947120 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nznsk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5da59bc9-84da-42f6-86e9-3399ecf31725\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8e8be814fc5bfb8db3632cda1fcf749a02b642c8e7ff479e59ce5d9f987a756\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2131429928dbbd60e2d17b7ada4689ff5d161d1df921d74a8b9f71df9e11899\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp
\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://114d67fea982a6c6475e1925f670ba7dae14ace593022f77ef7c6441dde49687\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b73fba4d62613a1c27b69a74ebcca91fa6ea43a5267efddf783bd0d243d711f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e10db0132d159755257aef8c0cee40e5de4a0d3727ce59883e1ff90650e4f35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb7a32827707f8654d36f56fef7e81aff6b80550ef51f7c090a37f53a0e53406\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174
f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b26be7f6864a66af6c7b4f7a000553fcfc870fb86c334481687a8a8bdfff54e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b26be7f6864a66af6c7b4f7a000553fcfc870fb86c334481687a8a8bdfff54e8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T12:22:43Z\\\",\\\"message\\\":\\\"s server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0930 12:22:43.375122 6393 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-controller-manager-operator/metrics\\\\\\\"}\\\\nI0930 12:22:43.375136 6393 services_controller.go:360] Finished syncing service metrics on namespace openshift-controller-manager-operator for network=default : 1.633551ms\\\\nI0930 12:22:43.375146 6393 services_controller.go:356] Processing sync for service openshift-ingress-operator/metrics for network=default\\\\nI0930 12:22:43.374866 6393 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-config-operator/metrics]} name:Service_openshift-config-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.161:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f32857b5-f652-4313-a0d7-455c3156dd99}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0930 12:22:43.375233 6393 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:42Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-nznsk_openshift-ovn-kubernetes(5da59bc9-84da-42f6-86e9-3399ecf31725)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7058516ce3ff1d1d2929107a66d739fccb24b4f5589a11510ab42fe728a33196\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recurs
iveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4aa532f2ebe19e36977596bd41e8b116c84bcd7d5cd3470986e3192d4d62dea7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4aa532f2ebe19e36977596bd41e8b116c84bcd7d5cd3470986e3192d4d62dea7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nznsk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:44Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:44 crc kubenswrapper[4672]: I0930 12:22:44.964998 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ztjlj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"25dc0f4c-76a8-49ff-bf68-c0718cfc3e62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5a3ba5101b35ce4981484637222a0c074b67e3998c273dc763ddbc6ca0c0c86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6ps67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ztjlj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:44Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:44 crc kubenswrapper[4672]: I0930 12:22:44.979517 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-485qk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"db981c53-85b2-4b3c-b025-da893771308f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57231120d83ed4e95272ab2b0bac2146b066dde2b5ca3afe469a3dc36d6187ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2t8xk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://503616f93ea6b36f5e3c95c88471c6d06c281a4d0be102b6cb0edf6a1513f78c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2t8xk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-485qk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:44Z is after 2025-08-24T17:21:41Z" Sep 30 
12:22:44 crc kubenswrapper[4672]: I0930 12:22:44.993345 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-n7wwp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"42618cd5-d9f9-45ba-8081-660ca47bebf4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78cfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78cfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:25Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-n7wwp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:44Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:45 crc kubenswrapper[4672]: I0930 12:22:45.008623 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e650bb6-f3f0-48f4-b169-1e56a907e88c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://824f50fb36d4c4aee43cc92af10bc43c441f2cee0ee736431dfdc52d254b819a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2786191730ceba9bbcf35f5355098bc52b09c37d20b923f7c6c65fd1b584cea0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e5b3d468173222d5ae851529dbc318b7d727ec4af468e4094ab08b4f29bf100\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://217947162b121381c06a210145feb0b9bd45d4ad595fcd83b1a566bb6448f0ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8d2b013e19f7ff8f3eb9c42338191b2da273ce85db8bcc6be8dd1f69e83c3be\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T12:22:10Z\\\",\\\"message\\\":\\\"falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0930 12:22:05.219065 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 12:22:05.219861 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-72394925/tls.crt::/tmp/serving-cert-72394925/tls.key\\\\\\\"\\\\nI0930 12:22:10.657058 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 12:22:10.659952 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 12:22:10.659972 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 12:22:10.659997 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 12:22:10.660003 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 12:22:10.667742 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0930 12:22:10.667775 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 12:22:10.667797 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 12:22:10.667801 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 12:22:10.667808 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 12:22:10.667812 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 12:22:10.667814 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 12:22:10.667817 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0930 12:22:10.670654 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T12:21:54Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c3f5020d6f8fba5f79d7659bedba02479794acb1f5f3d219802966c781fd10d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:52Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://089787496c46eb514e279677b1853010b21eac240fb059e43866714d2a6920e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://089787496c46eb514e279677b1853010b21eac240fb059e43866714d2a6920e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:21:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:21:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:21:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:45Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:45 crc kubenswrapper[4672]: I0930 12:22:45.018685 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:45 crc kubenswrapper[4672]: I0930 12:22:45.018729 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:45 crc kubenswrapper[4672]: I0930 12:22:45.018743 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:45 crc kubenswrapper[4672]: I0930 12:22:45.018764 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:45 crc kubenswrapper[4672]: I0930 12:22:45.018778 4672 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:45Z","lastTransitionTime":"2025-09-30T12:22:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:45 crc kubenswrapper[4672]: I0930 12:22:45.024798 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:45Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:45 crc kubenswrapper[4672]: I0930 12:22:45.040308 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"95794952-d817-48f2-8956-f7a310f8d1d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a5a1de465a4e5deca994f640e87574ebd406be2d7a92c0e54ed8bece4a8e6d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnk64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba615a53d8f50a3891080fd06384bb6e00eca07025bb60fdf1764d9c1d4a5f76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnk64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dpqrd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:45Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:45 crc kubenswrapper[4672]: I0930 12:22:45.056935 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8q82q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6806ff3c-ab3a-402e-b1c5-cc37c0810a65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://687d353cc41530a0cbee2c4a782b6ae10e9ee58aae0bfe183537c96851611e2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-l
ib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pn7q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8q82q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:45Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:45 crc kubenswrapper[4672]: I0930 12:22:45.075132 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-89fj9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72003fad-a0fd-4493-9f3b-6efdab22d14c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd26dcf24863265c55a06cfc788e84363473a6ca5f1b835fa5dbb683ebb82d68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\
"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31087335de8b3b895d49c4677350e9ab6cfa0651188653dab05e7eafdab3d05f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31087335de8b3b895d49c4677350e9ab6cfa0651188653dab05e7eafdab3d05f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a607f773b81f6bdde800c2a36627d7f84602e6c3c777f6531391668dd7c08e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2a607f773b81f6bdde800c2a36627d7f84602e6c3c777f6531391668dd7c08e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:22:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70fc6887f3dde42017d03aa3f247ca4eda56f93c3ec54a95ceabcc1c1b4651d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70fc6887f3dde42017d03aa3f247ca4eda56f93c3ec54a95ceabcc1c1b4651d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:22:15Z\\\",\\\"reason\\\":\\\"Complete
d\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://95e6805719bbdb9859b827860fb87ed979a4397061f76b433189f463c1e7cc39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://95e6805719bbdb9859b827860fb87ed979a4397061f76b433189f463c1e7cc39\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:22:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85068f88c57bccc42cc7234d13298a30c574379f4a212059a9924553fce8620e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85068f88c57bccc42cc7234d13298a30c574379f4a212059a9924553fce8620e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:22:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4355ef186193e8e0d9344faf8975065f9cd4576e7f6c79054670fda9acf3646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastS
tate\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4355ef186193e8e0d9344faf8975065f9cd4576e7f6c79054670fda9acf3646\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:22:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-89fj9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:45Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:45 crc kubenswrapper[4672]: I0930 12:22:45.086931 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8095997-9982-47c1-850a-260c9e369680\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f437372ebc379d35a2232ab47422ed6127cd29b0d752488dca512f67407d222\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c905db0424e34f487d2db657676de97a3e323ef4c9fbed5d25929164476f62bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b
3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb1c08c9fe886c57b5abe5091fa9845022bb33eefd97da1bf595d4b32016dfc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a9e2b46f485ed7a73177e89960925d6b0f9945bf0b7b5604a5b5e84d40f74e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a9e2b46f485ed7a73177e89960925d6b0f9945bf0b7b5604a5b5e84d40f74e3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:21:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:21:50Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:21:49Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:45Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:45 crc kubenswrapper[4672]: I0930 12:22:45.110640 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4339dd7c-5112-40c5-9b58-050ce364a2b9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7db3193c2afadb91d3220e67720ff33bc959f9c50cec6f243144a6e2a2e57e04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5f54820ce67d6e8f397d068e1fadcb4577d6685406812c4e29fe0274612697d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1351a4e8380fea9edb834239053b57384a50add562afe59e67bbd7159bf1bf6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b9741c5edaae364733b400bebf21a4674ec206
194908ec6a942317235c62a14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://775be30891ac2947ef160e3fe75abad42b78fc39e35414f27756ce086fed4f7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27513f0766801d5155c6e5c376a2a52241030eca2171b3218710ca6ee006a0b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27513f0766801d5155c6e5c376a2a52241030eca2171b3218710ca6ee006a0b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:21:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:21:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0afdc82c21601c37c7d9b4897273ade1e8e35bd093d63df9a63e93a9c0c6228c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0afdc82c21601c37c7d9b4897273ade1e8e35bd093d63df9a63e93a9c0c6228c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:21:51Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://52fa5ee34bc8e9e381d4288d30fce5db76ed5bde6f199ec13fe443481ed4c8e8\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52fa5ee34bc8e9e381d4288d30fce5db76ed5bde6f199ec13fe443481ed4c8e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:21:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:21:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:21:49Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:45Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:45 crc kubenswrapper[4672]: I0930 12:22:45.121107 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:45 crc kubenswrapper[4672]: I0930 12:22:45.121176 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:45 crc kubenswrapper[4672]: I0930 12:22:45.121200 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:45 crc kubenswrapper[4672]: I0930 12:22:45.121350 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:45 crc kubenswrapper[4672]: I0930 12:22:45.121446 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:45Z","lastTransitionTime":"2025-09-30T12:22:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:22:45 crc kubenswrapper[4672]: I0930 12:22:45.126877 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:45Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:45 crc kubenswrapper[4672]: I0930 12:22:45.138702 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bh5lq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"96949d92-2365-41eb-8f62-c264c8328c02\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6caf59dcbea4e0beb0879de377fe92cfb0b3ff55668aee3f10822adacf97fd8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkb7v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bh5lq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:45Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:45 crc kubenswrapper[4672]: I0930 12:22:45.224452 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:45 crc kubenswrapper[4672]: I0930 12:22:45.225034 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:45 crc kubenswrapper[4672]: I0930 12:22:45.225246 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:45 crc kubenswrapper[4672]: I0930 12:22:45.225456 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:45 crc kubenswrapper[4672]: I0930 12:22:45.225646 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:45Z","lastTransitionTime":"2025-09-30T12:22:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:45 crc kubenswrapper[4672]: I0930 12:22:45.328797 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:45 crc kubenswrapper[4672]: I0930 12:22:45.328848 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:45 crc kubenswrapper[4672]: I0930 12:22:45.328863 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:45 crc kubenswrapper[4672]: I0930 12:22:45.328881 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:45 crc kubenswrapper[4672]: I0930 12:22:45.328894 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:45Z","lastTransitionTime":"2025-09-30T12:22:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:45 crc kubenswrapper[4672]: I0930 12:22:45.416312 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 12:22:45 crc kubenswrapper[4672]: I0930 12:22:45.416312 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 12:22:45 crc kubenswrapper[4672]: I0930 12:22:45.416346 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n7wwp" Sep 30 12:22:45 crc kubenswrapper[4672]: E0930 12:22:45.416493 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 12:22:45 crc kubenswrapper[4672]: E0930 12:22:45.416678 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n7wwp" podUID="42618cd5-d9f9-45ba-8081-660ca47bebf4" Sep 30 12:22:45 crc kubenswrapper[4672]: E0930 12:22:45.416743 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 12:22:45 crc kubenswrapper[4672]: I0930 12:22:45.431143 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:45 crc kubenswrapper[4672]: I0930 12:22:45.431193 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:45 crc kubenswrapper[4672]: I0930 12:22:45.431206 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:45 crc kubenswrapper[4672]: I0930 12:22:45.431224 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:45 crc kubenswrapper[4672]: I0930 12:22:45.431235 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:45Z","lastTransitionTime":"2025-09-30T12:22:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:45 crc kubenswrapper[4672]: I0930 12:22:45.533991 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:45 crc kubenswrapper[4672]: I0930 12:22:45.534107 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:45 crc kubenswrapper[4672]: I0930 12:22:45.534121 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:45 crc kubenswrapper[4672]: I0930 12:22:45.534164 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:45 crc kubenswrapper[4672]: I0930 12:22:45.534179 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:45Z","lastTransitionTime":"2025-09-30T12:22:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:22:45 crc kubenswrapper[4672]: I0930 12:22:45.637628 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:45 crc kubenswrapper[4672]: I0930 12:22:45.637687 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:45 crc kubenswrapper[4672]: I0930 12:22:45.637702 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:45 crc kubenswrapper[4672]: I0930 12:22:45.637724 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:45 crc kubenswrapper[4672]: I0930 12:22:45.637741 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:45Z","lastTransitionTime":"2025-09-30T12:22:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:45 crc kubenswrapper[4672]: I0930 12:22:45.747482 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:45 crc kubenswrapper[4672]: I0930 12:22:45.747536 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:45 crc kubenswrapper[4672]: I0930 12:22:45.747551 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:45 crc kubenswrapper[4672]: I0930 12:22:45.747575 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:45 crc kubenswrapper[4672]: I0930 12:22:45.747592 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:45Z","lastTransitionTime":"2025-09-30T12:22:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:45 crc kubenswrapper[4672]: I0930 12:22:45.850101 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:45 crc kubenswrapper[4672]: I0930 12:22:45.850149 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:45 crc kubenswrapper[4672]: I0930 12:22:45.850165 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:45 crc kubenswrapper[4672]: I0930 12:22:45.850189 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:45 crc kubenswrapper[4672]: I0930 12:22:45.850205 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:45Z","lastTransitionTime":"2025-09-30T12:22:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:22:45 crc kubenswrapper[4672]: I0930 12:22:45.953024 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:45 crc kubenswrapper[4672]: I0930 12:22:45.953472 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:45 crc kubenswrapper[4672]: I0930 12:22:45.953507 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:45 crc kubenswrapper[4672]: I0930 12:22:45.953534 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:45 crc kubenswrapper[4672]: I0930 12:22:45.953552 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:45Z","lastTransitionTime":"2025-09-30T12:22:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:46 crc kubenswrapper[4672]: I0930 12:22:46.056530 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:46 crc kubenswrapper[4672]: I0930 12:22:46.056944 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:46 crc kubenswrapper[4672]: I0930 12:22:46.057118 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:46 crc kubenswrapper[4672]: I0930 12:22:46.057262 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:46 crc kubenswrapper[4672]: I0930 12:22:46.057482 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:46Z","lastTransitionTime":"2025-09-30T12:22:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:46 crc kubenswrapper[4672]: I0930 12:22:46.161180 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:46 crc kubenswrapper[4672]: I0930 12:22:46.161236 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:46 crc kubenswrapper[4672]: I0930 12:22:46.161254 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:46 crc kubenswrapper[4672]: I0930 12:22:46.161310 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:46 crc kubenswrapper[4672]: I0930 12:22:46.161329 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:46Z","lastTransitionTime":"2025-09-30T12:22:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:22:46 crc kubenswrapper[4672]: I0930 12:22:46.264914 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:46 crc kubenswrapper[4672]: I0930 12:22:46.264975 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:46 crc kubenswrapper[4672]: I0930 12:22:46.265001 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:46 crc kubenswrapper[4672]: I0930 12:22:46.265070 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:46 crc kubenswrapper[4672]: I0930 12:22:46.265097 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:46Z","lastTransitionTime":"2025-09-30T12:22:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:46 crc kubenswrapper[4672]: I0930 12:22:46.368056 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:46 crc kubenswrapper[4672]: I0930 12:22:46.368123 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:46 crc kubenswrapper[4672]: I0930 12:22:46.368134 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:46 crc kubenswrapper[4672]: I0930 12:22:46.368150 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:46 crc kubenswrapper[4672]: I0930 12:22:46.368159 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:46Z","lastTransitionTime":"2025-09-30T12:22:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:46 crc kubenswrapper[4672]: I0930 12:22:46.416876 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 12:22:46 crc kubenswrapper[4672]: E0930 12:22:46.417036 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 12:22:46 crc kubenswrapper[4672]: I0930 12:22:46.470984 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:46 crc kubenswrapper[4672]: I0930 12:22:46.471078 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:46 crc kubenswrapper[4672]: I0930 12:22:46.471104 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:46 crc kubenswrapper[4672]: I0930 12:22:46.471139 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:46 crc kubenswrapper[4672]: I0930 12:22:46.471162 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:46Z","lastTransitionTime":"2025-09-30T12:22:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:46 crc kubenswrapper[4672]: I0930 12:22:46.574089 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:46 crc kubenswrapper[4672]: I0930 12:22:46.574150 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:46 crc kubenswrapper[4672]: I0930 12:22:46.574167 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:46 crc kubenswrapper[4672]: I0930 12:22:46.574194 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:46 crc kubenswrapper[4672]: I0930 12:22:46.574211 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:46Z","lastTransitionTime":"2025-09-30T12:22:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:22:46 crc kubenswrapper[4672]: I0930 12:22:46.677327 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:46 crc kubenswrapper[4672]: I0930 12:22:46.677384 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:46 crc kubenswrapper[4672]: I0930 12:22:46.677403 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:46 crc kubenswrapper[4672]: I0930 12:22:46.677426 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:46 crc kubenswrapper[4672]: I0930 12:22:46.677446 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:46Z","lastTransitionTime":"2025-09-30T12:22:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:46 crc kubenswrapper[4672]: I0930 12:22:46.781158 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:46 crc kubenswrapper[4672]: I0930 12:22:46.781232 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:46 crc kubenswrapper[4672]: I0930 12:22:46.781255 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:46 crc kubenswrapper[4672]: I0930 12:22:46.781326 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:46 crc kubenswrapper[4672]: I0930 12:22:46.781352 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:46Z","lastTransitionTime":"2025-09-30T12:22:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:46 crc kubenswrapper[4672]: I0930 12:22:46.885671 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:46 crc kubenswrapper[4672]: I0930 12:22:46.885736 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:46 crc kubenswrapper[4672]: I0930 12:22:46.885758 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:46 crc kubenswrapper[4672]: I0930 12:22:46.885779 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:46 crc kubenswrapper[4672]: I0930 12:22:46.885791 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:46Z","lastTransitionTime":"2025-09-30T12:22:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:22:46 crc kubenswrapper[4672]: I0930 12:22:46.989026 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:46 crc kubenswrapper[4672]: I0930 12:22:46.989093 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:46 crc kubenswrapper[4672]: I0930 12:22:46.989105 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:46 crc kubenswrapper[4672]: I0930 12:22:46.989128 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:46 crc kubenswrapper[4672]: I0930 12:22:46.989146 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:46Z","lastTransitionTime":"2025-09-30T12:22:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:47 crc kubenswrapper[4672]: I0930 12:22:47.091489 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:47 crc kubenswrapper[4672]: I0930 12:22:47.091563 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:47 crc kubenswrapper[4672]: I0930 12:22:47.091576 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:47 crc kubenswrapper[4672]: I0930 12:22:47.091600 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:47 crc kubenswrapper[4672]: I0930 12:22:47.091624 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:47Z","lastTransitionTime":"2025-09-30T12:22:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:47 crc kubenswrapper[4672]: I0930 12:22:47.194155 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:47 crc kubenswrapper[4672]: I0930 12:22:47.194213 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:47 crc kubenswrapper[4672]: I0930 12:22:47.194225 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:47 crc kubenswrapper[4672]: I0930 12:22:47.194242 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:47 crc kubenswrapper[4672]: I0930 12:22:47.194676 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:47Z","lastTransitionTime":"2025-09-30T12:22:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:22:47 crc kubenswrapper[4672]: I0930 12:22:47.301512 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:47 crc kubenswrapper[4672]: I0930 12:22:47.301614 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:47 crc kubenswrapper[4672]: I0930 12:22:47.301643 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:47 crc kubenswrapper[4672]: I0930 12:22:47.301678 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:47 crc kubenswrapper[4672]: I0930 12:22:47.301705 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:47Z","lastTransitionTime":"2025-09-30T12:22:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:47 crc kubenswrapper[4672]: I0930 12:22:47.405231 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:47 crc kubenswrapper[4672]: I0930 12:22:47.405292 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:47 crc kubenswrapper[4672]: I0930 12:22:47.405304 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:47 crc kubenswrapper[4672]: I0930 12:22:47.405323 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:47 crc kubenswrapper[4672]: I0930 12:22:47.405337 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:47Z","lastTransitionTime":"2025-09-30T12:22:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:47 crc kubenswrapper[4672]: I0930 12:22:47.416688 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n7wwp" Sep 30 12:22:47 crc kubenswrapper[4672]: I0930 12:22:47.416705 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 12:22:47 crc kubenswrapper[4672]: E0930 12:22:47.416815 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n7wwp" podUID="42618cd5-d9f9-45ba-8081-660ca47bebf4" Sep 30 12:22:47 crc kubenswrapper[4672]: I0930 12:22:47.416876 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 12:22:47 crc kubenswrapper[4672]: E0930 12:22:47.417042 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 12:22:47 crc kubenswrapper[4672]: E0930 12:22:47.417191 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 12:22:47 crc kubenswrapper[4672]: I0930 12:22:47.509214 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:47 crc kubenswrapper[4672]: I0930 12:22:47.509306 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:47 crc kubenswrapper[4672]: I0930 12:22:47.509324 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:47 crc kubenswrapper[4672]: I0930 12:22:47.509348 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:47 crc kubenswrapper[4672]: I0930 12:22:47.509370 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:47Z","lastTransitionTime":"2025-09-30T12:22:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:47 crc kubenswrapper[4672]: I0930 12:22:47.611104 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:47 crc kubenswrapper[4672]: I0930 12:22:47.611142 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:47 crc kubenswrapper[4672]: I0930 12:22:47.611149 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:47 crc kubenswrapper[4672]: I0930 12:22:47.611187 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:47 crc kubenswrapper[4672]: I0930 12:22:47.611200 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:47Z","lastTransitionTime":"2025-09-30T12:22:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:22:47 crc kubenswrapper[4672]: I0930 12:22:47.714192 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:47 crc kubenswrapper[4672]: I0930 12:22:47.714344 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:47 crc kubenswrapper[4672]: I0930 12:22:47.714371 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:47 crc kubenswrapper[4672]: I0930 12:22:47.714424 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:47 crc kubenswrapper[4672]: I0930 12:22:47.714446 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:47Z","lastTransitionTime":"2025-09-30T12:22:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:47 crc kubenswrapper[4672]: I0930 12:22:47.770178 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:47 crc kubenswrapper[4672]: I0930 12:22:47.770251 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:47 crc kubenswrapper[4672]: I0930 12:22:47.770314 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:47 crc kubenswrapper[4672]: I0930 12:22:47.770349 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:47 crc kubenswrapper[4672]: I0930 12:22:47.770372 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:47Z","lastTransitionTime":"2025-09-30T12:22:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:22:47 crc kubenswrapper[4672]: E0930 12:22:47.790880 4672 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T12:22:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T12:22:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T12:22:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T12:22:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a240ac94-c7cd-47b9-85fc-82a9db2c4d67\\\",\\\"systemUUID\\\":\\\"9545f671-f742-45e6-b9f7-3b3404b22825\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:47Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:47 crc kubenswrapper[4672]: I0930 12:22:47.797561 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:47 crc kubenswrapper[4672]: I0930 12:22:47.797621 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Sep 30 12:22:47 crc kubenswrapper[4672]: I0930 12:22:47.797638 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:47 crc kubenswrapper[4672]: I0930 12:22:47.797661 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:47 crc kubenswrapper[4672]: I0930 12:22:47.797678 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:47Z","lastTransitionTime":"2025-09-30T12:22:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:47 crc kubenswrapper[4672]: E0930 12:22:47.814588 4672 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T12:22:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T12:22:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T12:22:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T12:22:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a240ac94-c7cd-47b9-85fc-82a9db2c4d67\\\",\\\"systemUUID\\\":\\\"9545f671-f742-45e6-b9f7-3b3404b22825\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:47Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:47 crc kubenswrapper[4672]: I0930 12:22:47.819869 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:47 crc kubenswrapper[4672]: I0930 12:22:47.819903 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Sep 30 12:22:47 crc kubenswrapper[4672]: I0930 12:22:47.819917 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:47 crc kubenswrapper[4672]: I0930 12:22:47.819935 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:47 crc kubenswrapper[4672]: I0930 12:22:47.819946 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:47Z","lastTransitionTime":"2025-09-30T12:22:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:47 crc kubenswrapper[4672]: E0930 12:22:47.842060 4672 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T12:22:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T12:22:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T12:22:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T12:22:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a240ac94-c7cd-47b9-85fc-82a9db2c4d67\\\",\\\"systemUUID\\\":\\\"9545f671-f742-45e6-b9f7-3b3404b22825\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:47Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:47 crc kubenswrapper[4672]: I0930 12:22:47.847172 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:47 crc kubenswrapper[4672]: I0930 12:22:47.847230 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Sep 30 12:22:47 crc kubenswrapper[4672]: I0930 12:22:47.847244 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:47 crc kubenswrapper[4672]: I0930 12:22:47.847292 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:47 crc kubenswrapper[4672]: I0930 12:22:47.847310 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:47Z","lastTransitionTime":"2025-09-30T12:22:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:47 crc kubenswrapper[4672]: E0930 12:22:47.868415 4672 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T12:22:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T12:22:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T12:22:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T12:22:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a240ac94-c7cd-47b9-85fc-82a9db2c4d67\\\",\\\"systemUUID\\\":\\\"9545f671-f742-45e6-b9f7-3b3404b22825\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:47Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:47 crc kubenswrapper[4672]: I0930 12:22:47.873335 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:47 crc kubenswrapper[4672]: I0930 12:22:47.873386 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Sep 30 12:22:47 crc kubenswrapper[4672]: I0930 12:22:47.873399 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:47 crc kubenswrapper[4672]: I0930 12:22:47.873423 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:47 crc kubenswrapper[4672]: I0930 12:22:47.873439 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:47Z","lastTransitionTime":"2025-09-30T12:22:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:47 crc kubenswrapper[4672]: E0930 12:22:47.889521 4672 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T12:22:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T12:22:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T12:22:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T12:22:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a240ac94-c7cd-47b9-85fc-82a9db2c4d67\\\",\\\"systemUUID\\\":\\\"9545f671-f742-45e6-b9f7-3b3404b22825\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:47Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:47 crc kubenswrapper[4672]: E0930 12:22:47.889654 4672 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Sep 30 12:22:47 crc kubenswrapper[4672]: I0930 12:22:47.891819 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Sep 30 12:22:47 crc kubenswrapper[4672]: I0930 12:22:47.891865 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:47 crc kubenswrapper[4672]: I0930 12:22:47.891877 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:47 crc kubenswrapper[4672]: I0930 12:22:47.891896 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:47 crc kubenswrapper[4672]: I0930 12:22:47.891910 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:47Z","lastTransitionTime":"2025-09-30T12:22:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:47 crc kubenswrapper[4672]: I0930 12:22:47.995182 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:47 crc kubenswrapper[4672]: I0930 12:22:47.995248 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:47 crc kubenswrapper[4672]: I0930 12:22:47.995315 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:47 crc kubenswrapper[4672]: I0930 12:22:47.995351 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:47 crc kubenswrapper[4672]: I0930 12:22:47.995378 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:47Z","lastTransitionTime":"2025-09-30T12:22:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:48 crc kubenswrapper[4672]: I0930 12:22:48.098234 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:48 crc kubenswrapper[4672]: I0930 12:22:48.098329 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:48 crc kubenswrapper[4672]: I0930 12:22:48.098344 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:48 crc kubenswrapper[4672]: I0930 12:22:48.098363 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:48 crc kubenswrapper[4672]: I0930 12:22:48.098376 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:48Z","lastTransitionTime":"2025-09-30T12:22:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:22:48 crc kubenswrapper[4672]: I0930 12:22:48.201569 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:48 crc kubenswrapper[4672]: I0930 12:22:48.201891 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:48 crc kubenswrapper[4672]: I0930 12:22:48.201981 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:48 crc kubenswrapper[4672]: I0930 12:22:48.202082 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:48 crc kubenswrapper[4672]: I0930 12:22:48.202174 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:48Z","lastTransitionTime":"2025-09-30T12:22:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:48 crc kubenswrapper[4672]: I0930 12:22:48.305498 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:48 crc kubenswrapper[4672]: I0930 12:22:48.305869 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:48 crc kubenswrapper[4672]: I0930 12:22:48.306027 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:48 crc kubenswrapper[4672]: I0930 12:22:48.306197 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:48 crc kubenswrapper[4672]: I0930 12:22:48.306384 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:48Z","lastTransitionTime":"2025-09-30T12:22:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:48 crc kubenswrapper[4672]: I0930 12:22:48.410484 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:48 crc kubenswrapper[4672]: I0930 12:22:48.410563 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:48 crc kubenswrapper[4672]: I0930 12:22:48.410574 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:48 crc kubenswrapper[4672]: I0930 12:22:48.410593 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:48 crc kubenswrapper[4672]: I0930 12:22:48.410624 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:48Z","lastTransitionTime":"2025-09-30T12:22:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:22:48 crc kubenswrapper[4672]: I0930 12:22:48.416891 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 12:22:48 crc kubenswrapper[4672]: E0930 12:22:48.417091 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 12:22:48 crc kubenswrapper[4672]: I0930 12:22:48.514298 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:48 crc kubenswrapper[4672]: I0930 12:22:48.514382 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:48 crc kubenswrapper[4672]: I0930 12:22:48.514408 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:48 crc kubenswrapper[4672]: I0930 12:22:48.514441 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:48 crc kubenswrapper[4672]: I0930 12:22:48.514466 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:48Z","lastTransitionTime":"2025-09-30T12:22:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:48 crc kubenswrapper[4672]: I0930 12:22:48.617858 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:48 crc kubenswrapper[4672]: I0930 12:22:48.617931 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:48 crc kubenswrapper[4672]: I0930 12:22:48.617949 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:48 crc kubenswrapper[4672]: I0930 12:22:48.617980 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:48 crc kubenswrapper[4672]: I0930 12:22:48.618000 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:48Z","lastTransitionTime":"2025-09-30T12:22:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:22:48 crc kubenswrapper[4672]: I0930 12:22:48.722191 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:48 crc kubenswrapper[4672]: I0930 12:22:48.722257 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:48 crc kubenswrapper[4672]: I0930 12:22:48.722353 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:48 crc kubenswrapper[4672]: I0930 12:22:48.722381 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:48 crc kubenswrapper[4672]: I0930 12:22:48.722401 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:48Z","lastTransitionTime":"2025-09-30T12:22:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:48 crc kubenswrapper[4672]: I0930 12:22:48.825845 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:48 crc kubenswrapper[4672]: I0930 12:22:48.825906 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:48 crc kubenswrapper[4672]: I0930 12:22:48.825922 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:48 crc kubenswrapper[4672]: I0930 12:22:48.825944 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:48 crc kubenswrapper[4672]: I0930 12:22:48.825960 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:48Z","lastTransitionTime":"2025-09-30T12:22:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:48 crc kubenswrapper[4672]: I0930 12:22:48.929572 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:48 crc kubenswrapper[4672]: I0930 12:22:48.929659 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:48 crc kubenswrapper[4672]: I0930 12:22:48.929680 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:48 crc kubenswrapper[4672]: I0930 12:22:48.929718 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:48 crc kubenswrapper[4672]: I0930 12:22:48.929736 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:48Z","lastTransitionTime":"2025-09-30T12:22:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:22:49 crc kubenswrapper[4672]: I0930 12:22:49.033381 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:49 crc kubenswrapper[4672]: I0930 12:22:49.033444 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:49 crc kubenswrapper[4672]: I0930 12:22:49.033455 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:49 crc kubenswrapper[4672]: I0930 12:22:49.033474 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:49 crc kubenswrapper[4672]: I0930 12:22:49.033486 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:49Z","lastTransitionTime":"2025-09-30T12:22:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:49 crc kubenswrapper[4672]: I0930 12:22:49.137087 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:49 crc kubenswrapper[4672]: I0930 12:22:49.137181 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:49 crc kubenswrapper[4672]: I0930 12:22:49.137209 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:49 crc kubenswrapper[4672]: I0930 12:22:49.137244 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:49 crc kubenswrapper[4672]: I0930 12:22:49.137326 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:49Z","lastTransitionTime":"2025-09-30T12:22:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:49 crc kubenswrapper[4672]: I0930 12:22:49.240444 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:49 crc kubenswrapper[4672]: I0930 12:22:49.240528 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:49 crc kubenswrapper[4672]: I0930 12:22:49.240552 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:49 crc kubenswrapper[4672]: I0930 12:22:49.240582 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:49 crc kubenswrapper[4672]: I0930 12:22:49.240604 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:49Z","lastTransitionTime":"2025-09-30T12:22:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:22:49 crc kubenswrapper[4672]: I0930 12:22:49.344071 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:49 crc kubenswrapper[4672]: I0930 12:22:49.344125 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:49 crc kubenswrapper[4672]: I0930 12:22:49.344143 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:49 crc kubenswrapper[4672]: I0930 12:22:49.344170 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:49 crc kubenswrapper[4672]: I0930 12:22:49.344188 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:49Z","lastTransitionTime":"2025-09-30T12:22:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:49 crc kubenswrapper[4672]: I0930 12:22:49.416226 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 12:22:49 crc kubenswrapper[4672]: I0930 12:22:49.417199 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n7wwp" Sep 30 12:22:49 crc kubenswrapper[4672]: I0930 12:22:49.417290 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 12:22:49 crc kubenswrapper[4672]: E0930 12:22:49.416573 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 12:22:49 crc kubenswrapper[4672]: E0930 12:22:49.417895 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n7wwp" podUID="42618cd5-d9f9-45ba-8081-660ca47bebf4" Sep 30 12:22:49 crc kubenswrapper[4672]: E0930 12:22:49.418515 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 12:22:49 crc kubenswrapper[4672]: I0930 12:22:49.444359 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"763dda13-8721-4cce-8b21-aa1f1b190978\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab6412d7fdf1976c75ccfc72d05f00e2997d72f3971c73cd0887e038d1cbc059\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5575589d0517d5a1c285c3e9e566fb5cd9b9591a1b311a4bb80e595e44044c33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b656dc1ad8af099828eaaeb654e6d52dbbd057023dd07583a6114316670aa334\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/k
ubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5082c3fa78648584ab9a29b4ab239ce0d52411666ca3fa598e9c6745b8ea067e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:21:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:49Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:49 crc kubenswrapper[4672]: I0930 12:22:49.447757 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:49 crc kubenswrapper[4672]: I0930 12:22:49.447968 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:49 crc kubenswrapper[4672]: I0930 12:22:49.448107 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:49 crc kubenswrapper[4672]: I0930 12:22:49.448255 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:49 crc kubenswrapper[4672]: I0930 12:22:49.448428 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:49Z","lastTransitionTime":"2025-09-30T12:22:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:22:49 crc kubenswrapper[4672]: I0930 12:22:49.464786 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2761ae4b8b1c7e1d00ee269219ff0ab04aedce9e1bddc6e0a029c496ee89c0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:49Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:49 crc kubenswrapper[4672]: I0930 12:22:49.480377 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:49Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:49 crc kubenswrapper[4672]: I0930 12:22:49.497824 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://534671434fa48a07e7836049331af60307cf651f4ddde83856ad98e006b32040\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfb0bceb3e99196adb442abedd5215938781b908c2aa92178c9db76672664620\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:49Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:49 crc kubenswrapper[4672]: I0930 12:22:49.521758 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nznsk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5da59bc9-84da-42f6-86e9-3399ecf31725\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8e8be814fc5bfb8db3632cda1fcf749a02b642c8e7ff479e59ce5d9f987a756\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2131429928dbbd60e2d17b7ada4689ff5d161d1df921d74a8b9f71df9e11899\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://114d67fea982a6c6475e1925f670ba7dae14ace593022f77ef7c6441dde49687\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b73fba4d62613a1c27b69a74ebcca91fa6ea43a5267efddf783bd0d243d711f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e10db0132d159755257aef8c0cee40e5de4a0d3727ce59883e1ff90650e4f35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb7a32827707f8654d36f56fef7e81aff6b80550ef51f7c090a37f53a0e53406\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b26be7f6864a66af6c7b4f7a000553fcfc870fb86c334481687a8a8bdfff54e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b26be7f6864a66af6c7b4f7a000553fcfc870fb86c334481687a8a8bdfff54e8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T12:22:43Z\\\",\\\"message\\\":\\\"s server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0930 12:22:43.375122 6393 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-controller-manager-operator/metrics\\\\\\\"}\\\\nI0930 12:22:43.375136 6393 services_controller.go:360] Finished syncing service metrics on namespace openshift-controller-manager-operator for network=default : 1.633551ms\\\\nI0930 12:22:43.375146 6393 services_controller.go:356] Processing sync for service openshift-ingress-operator/metrics for network=default\\\\nI0930 12:22:43.374866 6393 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-config-operator/metrics]} name:Service_openshift-config-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.161:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f32857b5-f652-4313-a0d7-455c3156dd99}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0930 12:22:43.375233 6393 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:42Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-nznsk_openshift-ovn-kubernetes(5da59bc9-84da-42f6-86e9-3399ecf31725)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7058516ce3ff1d1d2929107a66d739fccb24b4f5589a11510ab42fe728a33196\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4aa532f2ebe19e36977596bd41e8b116c84bcd7d5cd3470986e3192d4d62dea7\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4aa532f2ebe19e36977596bd41e8b116c84bcd7d5cd3470986e3192d4d62dea7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nznsk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:49Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:49 crc kubenswrapper[4672]: I0930 12:22:49.536550 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e650bb6-f3f0-48f4-b169-1e56a907e88c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://824f50fb36d4c4aee43cc92af10bc43c441f2cee0ee736431dfdc52d254b819a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2786191730ceba9bbcf35f5355098bc52b09c37d20b923f7c6c65fd1b584cea0\\\",\\\"i
mage\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e5b3d468173222d5ae851529dbc318b7d727ec4af468e4094ab08b4f29bf100\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://217947162b121381c06a210145feb0b9bd45d4ad595fcd83b1a566bb6448f0ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8d2b013e19f7ff8f3eb9c42338191b2da273ce85db8bcc6be8dd1f69e83c3be\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T12:22:10Z\\\",\\\"message\\\":\\\"falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0930 12:22:05.219065 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 12:22:05.219861 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-72394925/tls.crt::/tmp/serving-cert-72394925/tls.key\\\\\\\"\\\\nI0930 12:22:10.657058 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 12:22:10.659952 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 12:22:10.659972 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 12:22:10.659997 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 12:22:10.660003 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 12:22:10.667742 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0930 12:22:10.667775 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 12:22:10.667797 1 secure_serving.go:69] Use of insecure cipher 
'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 12:22:10.667801 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 12:22:10.667808 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 12:22:10.667812 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 12:22:10.667814 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 12:22:10.667817 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0930 12:22:10.670654 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T12:21:54Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c3f5020d6f8fba5f79d7659bedba02479794acb1f5f3d219802966c781fd10d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:52Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://089787496c46eb514e279677b1853010b21eac240fb059e43866714d2a6920e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://089787496c46eb514e279677b1853010b21eac240fb059e43866714d2a6920e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:21:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:21:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:21:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:49Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:49 crc kubenswrapper[4672]: I0930 12:22:49.550440 4672 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:49 crc kubenswrapper[4672]: I0930 12:22:49.550483 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:49 crc kubenswrapper[4672]: I0930 12:22:49.550500 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:49 crc kubenswrapper[4672]: I0930 12:22:49.550523 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:49 crc kubenswrapper[4672]: I0930 12:22:49.550540 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:49Z","lastTransitionTime":"2025-09-30T12:22:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:49 crc kubenswrapper[4672]: I0930 12:22:49.552379 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:49Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:49 crc kubenswrapper[4672]: I0930 12:22:49.563955 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"95794952-d817-48f2-8956-f7a310f8d1d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a5a1de465a4e5deca994f640e87574ebd406be2d7a92c0e54ed8bece4a8e6d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnk64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba615a53d8f50a3891080fd06384bb6e00eca07025bb60fdf1764d9c1d4a5f76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnk64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dpqrd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:49Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:49 crc kubenswrapper[4672]: I0930 12:22:49.579515 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8q82q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6806ff3c-ab3a-402e-b1c5-cc37c0810a65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://687d353cc41530a0cbee2c4a782b6ae10e9ee58aae0bfe183537c96851611e2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-l
ib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pn7q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8q82q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:49Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:49 crc kubenswrapper[4672]: I0930 12:22:49.593805 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-89fj9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72003fad-a0fd-4493-9f3b-6efdab22d14c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd26dcf24863265c55a06cfc788e84363473a6ca5f1b835fa5dbb683ebb82d68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\
"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31087335de8b3b895d49c4677350e9ab6cfa0651188653dab05e7eafdab3d05f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31087335de8b3b895d49c4677350e9ab6cfa0651188653dab05e7eafdab3d05f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a607f773b81f6bdde800c2a36627d7f84602e6c3c777f6531391668dd7c08e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2a607f773b81f6bdde800c2a36627d7f84602e6c3c777f6531391668dd7c08e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:22:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70fc6887f3dde42017d03aa3f247ca4eda56f93c3ec54a95ceabcc1c1b4651d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70fc6887f3dde42017d03aa3f247ca4eda56f93c3ec54a95ceabcc1c1b4651d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:22:15Z\\\",\\\"reason\\\":\\\"Complete
d\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://95e6805719bbdb9859b827860fb87ed979a4397061f76b433189f463c1e7cc39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://95e6805719bbdb9859b827860fb87ed979a4397061f76b433189f463c1e7cc39\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:22:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85068f88c57bccc42cc7234d13298a30c574379f4a212059a9924553fce8620e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85068f88c57bccc42cc7234d13298a30c574379f4a212059a9924553fce8620e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:22:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4355ef186193e8e0d9344faf8975065f9cd4576e7f6c79054670fda9acf3646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastS
tate\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4355ef186193e8e0d9344faf8975065f9cd4576e7f6c79054670fda9acf3646\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:22:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-89fj9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:49Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:49 crc kubenswrapper[4672]: I0930 12:22:49.608761 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ztjlj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"25dc0f4c-76a8-49ff-bf68-c0718cfc3e62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5a3ba5101b35ce4981484637222a0c074b67e3998c273dc763ddbc6ca0c0c86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6ps67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\
\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ztjlj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:49Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:49 crc kubenswrapper[4672]: I0930 12:22:49.620192 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-485qk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db981c53-85b2-4b3c-b025-da893771308f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57231120d83ed4e95272ab2b0bac2146b066dde2b5ca3afe469a3dc36d6187ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2t8xk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://503616f93ea6b36f5e3c95c88471c6d06c281a4d0be102b6cb0edf6a1513f78c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2t8xk\\\",\\\"readOnly\\\":
true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-485qk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:49Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:49 crc kubenswrapper[4672]: I0930 12:22:49.632523 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-n7wwp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"42618cd5-d9f9-45ba-8081-660ca47bebf4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78cfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78cfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:25Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-n7wwp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:49Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:49 crc kubenswrapper[4672]: I0930 12:22:49.645740 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8095997-9982-47c1-850a-260c9e369680\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f437372ebc379d35a2232ab47422ed6127cd29b0d752488dca512f67407d222\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c905db0424e34f487d2db657676de97a3e323ef4c9fbed5d25929164476f62bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb1c08c9fe886c57b5abe5091fa9845022bb33eefd97da1bf595d4b32016dfc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a9e2b46f485ed7a73177e89960925d6b0f9945bf0b7b5604a5b5e84d40f74e3\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a9e2b46f485ed7a73177e89960925d6b0f9945bf0b7b5604a5b5e84d40f74e3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:21:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:21:50Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:21:49Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:49Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:49 crc kubenswrapper[4672]: I0930 12:22:49.654070 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:49 crc kubenswrapper[4672]: I0930 12:22:49.654144 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:49 crc kubenswrapper[4672]: I0930 12:22:49.654170 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:49 crc kubenswrapper[4672]: I0930 12:22:49.654195 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:49 crc kubenswrapper[4672]: I0930 12:22:49.654212 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:49Z","lastTransitionTime":"2025-09-30T12:22:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:22:49 crc kubenswrapper[4672]: I0930 12:22:49.667668 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4339dd7c-5112-40c5-9b58-050ce364a2b9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7db3193c2afadb91d3220e67720ff33bc959f9c50cec6f243144a6e2a2e57e04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5f54820ce67d6e8f397d068e1fadcb4577d6685406812c4e29fe0274612697d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1351a4e8380fea9edb834239053b57384a50add562afe59e67bbd7159bf1bf6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b9741c5edaae364733b400bebf21a4674ec206194908ec6a942317235c62a14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://775be30891ac2947ef160e3fe75abad42b78fc39e35414f27756ce086fed4f7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27513f0766801d5155c6e5c376a2a52241030eca2171b3218710ca6ee006a0b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27513f0766801d5155c6e5c376a2a52241030eca2171b3218710ca6ee006a0b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:21:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:21:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0afdc82c21601c37c7d9b4897273ade1e8e35bd093d63df9a63e93a9c0c6228c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0afdc82c21601c37c7d9b4897273ade1e8e35bd093d63df9a63e93a9c0c6228c\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:21:51Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://52fa5ee34bc8e9e381d4288d30fce5db76ed5bde6f199ec13fe443481ed4c8e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52fa5ee34bc8e9e381d4288d30fce5db76ed5bde6f199ec13fe443481ed4c8e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:21:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:21:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:21:49Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:49Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:49 crc kubenswrapper[4672]: I0930 12:22:49.684696 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:49Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:49 crc kubenswrapper[4672]: I0930 12:22:49.700465 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bh5lq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96949d92-2365-41eb-8f62-c264c8328c02\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6caf59dcbea4e0beb0879de377fe92cfb0b3ff55668aee3f10822adacf97fd8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkb7v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bh5lq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-09-30T12:22:49Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:49 crc kubenswrapper[4672]: I0930 12:22:49.717625 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ece716e9efed328b509480f11b1a6aa253fe23776bf45f49d4110cf3b4acbab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:49Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:49 crc kubenswrapper[4672]: I0930 12:22:49.757659 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:49 crc kubenswrapper[4672]: I0930 12:22:49.757711 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:49 crc kubenswrapper[4672]: I0930 12:22:49.757728 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:49 crc kubenswrapper[4672]: I0930 12:22:49.757749 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:49 crc kubenswrapper[4672]: I0930 12:22:49.757765 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:49Z","lastTransitionTime":"2025-09-30T12:22:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:22:49 crc kubenswrapper[4672]: I0930 12:22:49.861684 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:49 crc kubenswrapper[4672]: I0930 12:22:49.861760 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:49 crc kubenswrapper[4672]: I0930 12:22:49.861787 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:49 crc kubenswrapper[4672]: I0930 12:22:49.861830 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:49 crc kubenswrapper[4672]: I0930 12:22:49.861854 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:49Z","lastTransitionTime":"2025-09-30T12:22:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:49 crc kubenswrapper[4672]: I0930 12:22:49.965370 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:49 crc kubenswrapper[4672]: I0930 12:22:49.965449 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:49 crc kubenswrapper[4672]: I0930 12:22:49.965476 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:49 crc kubenswrapper[4672]: I0930 12:22:49.965506 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:49 crc kubenswrapper[4672]: I0930 12:22:49.965532 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:49Z","lastTransitionTime":"2025-09-30T12:22:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:50 crc kubenswrapper[4672]: I0930 12:22:50.069353 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:50 crc kubenswrapper[4672]: I0930 12:22:50.069407 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:50 crc kubenswrapper[4672]: I0930 12:22:50.069419 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:50 crc kubenswrapper[4672]: I0930 12:22:50.069437 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:50 crc kubenswrapper[4672]: I0930 12:22:50.069450 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:50Z","lastTransitionTime":"2025-09-30T12:22:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:22:50 crc kubenswrapper[4672]: I0930 12:22:50.172094 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:50 crc kubenswrapper[4672]: I0930 12:22:50.172156 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:50 crc kubenswrapper[4672]: I0930 12:22:50.172174 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:50 crc kubenswrapper[4672]: I0930 12:22:50.172204 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:50 crc kubenswrapper[4672]: I0930 12:22:50.172223 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:50Z","lastTransitionTime":"2025-09-30T12:22:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:50 crc kubenswrapper[4672]: I0930 12:22:50.275552 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:50 crc kubenswrapper[4672]: I0930 12:22:50.275595 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:50 crc kubenswrapper[4672]: I0930 12:22:50.275607 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:50 crc kubenswrapper[4672]: I0930 12:22:50.275629 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:50 crc kubenswrapper[4672]: I0930 12:22:50.275641 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:50Z","lastTransitionTime":"2025-09-30T12:22:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:50 crc kubenswrapper[4672]: I0930 12:22:50.379216 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:50 crc kubenswrapper[4672]: I0930 12:22:50.379870 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:50 crc kubenswrapper[4672]: I0930 12:22:50.379884 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:50 crc kubenswrapper[4672]: I0930 12:22:50.379907 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:50 crc kubenswrapper[4672]: I0930 12:22:50.379923 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:50Z","lastTransitionTime":"2025-09-30T12:22:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:22:50 crc kubenswrapper[4672]: I0930 12:22:50.416287 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 12:22:50 crc kubenswrapper[4672]: E0930 12:22:50.416468 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 12:22:50 crc kubenswrapper[4672]: I0930 12:22:50.482628 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:50 crc kubenswrapper[4672]: I0930 12:22:50.482675 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:50 crc kubenswrapper[4672]: I0930 12:22:50.482685 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:50 crc kubenswrapper[4672]: I0930 12:22:50.482700 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:50 crc kubenswrapper[4672]: I0930 12:22:50.482713 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:50Z","lastTransitionTime":"2025-09-30T12:22:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:50 crc kubenswrapper[4672]: I0930 12:22:50.584442 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:50 crc kubenswrapper[4672]: I0930 12:22:50.584480 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:50 crc kubenswrapper[4672]: I0930 12:22:50.584489 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:50 crc kubenswrapper[4672]: I0930 12:22:50.584505 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:50 crc kubenswrapper[4672]: I0930 12:22:50.584517 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:50Z","lastTransitionTime":"2025-09-30T12:22:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:22:50 crc kubenswrapper[4672]: I0930 12:22:50.688094 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:50 crc kubenswrapper[4672]: I0930 12:22:50.688159 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:50 crc kubenswrapper[4672]: I0930 12:22:50.688176 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:50 crc kubenswrapper[4672]: I0930 12:22:50.688203 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:50 crc kubenswrapper[4672]: I0930 12:22:50.688222 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:50Z","lastTransitionTime":"2025-09-30T12:22:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:50 crc kubenswrapper[4672]: I0930 12:22:50.791421 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:50 crc kubenswrapper[4672]: I0930 12:22:50.791506 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:50 crc kubenswrapper[4672]: I0930 12:22:50.791530 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:50 crc kubenswrapper[4672]: I0930 12:22:50.791563 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:50 crc kubenswrapper[4672]: I0930 12:22:50.791590 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:50Z","lastTransitionTime":"2025-09-30T12:22:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:50 crc kubenswrapper[4672]: I0930 12:22:50.894650 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:50 crc kubenswrapper[4672]: I0930 12:22:50.894703 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:50 crc kubenswrapper[4672]: I0930 12:22:50.894713 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:50 crc kubenswrapper[4672]: I0930 12:22:50.894733 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:50 crc kubenswrapper[4672]: I0930 12:22:50.894747 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:50Z","lastTransitionTime":"2025-09-30T12:22:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:22:50 crc kubenswrapper[4672]: I0930 12:22:50.997901 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:50 crc kubenswrapper[4672]: I0930 12:22:50.998009 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:50 crc kubenswrapper[4672]: I0930 12:22:50.998042 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:50 crc kubenswrapper[4672]: I0930 12:22:50.998079 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:50 crc kubenswrapper[4672]: I0930 12:22:50.998106 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:50Z","lastTransitionTime":"2025-09-30T12:22:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:51 crc kubenswrapper[4672]: I0930 12:22:51.101906 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:51 crc kubenswrapper[4672]: I0930 12:22:51.101980 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:51 crc kubenswrapper[4672]: I0930 12:22:51.101999 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:51 crc kubenswrapper[4672]: I0930 12:22:51.102030 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:51 crc kubenswrapper[4672]: I0930 12:22:51.102054 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:51Z","lastTransitionTime":"2025-09-30T12:22:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:51 crc kubenswrapper[4672]: I0930 12:22:51.204860 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:51 crc kubenswrapper[4672]: I0930 12:22:51.204943 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:51 crc kubenswrapper[4672]: I0930 12:22:51.204967 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:51 crc kubenswrapper[4672]: I0930 12:22:51.205001 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:51 crc kubenswrapper[4672]: I0930 12:22:51.205024 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:51Z","lastTransitionTime":"2025-09-30T12:22:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:22:51 crc kubenswrapper[4672]: I0930 12:22:51.307804 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:51 crc kubenswrapper[4672]: I0930 12:22:51.307865 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:51 crc kubenswrapper[4672]: I0930 12:22:51.307882 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:51 crc kubenswrapper[4672]: I0930 12:22:51.307903 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:51 crc kubenswrapper[4672]: I0930 12:22:51.307918 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:51Z","lastTransitionTime":"2025-09-30T12:22:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:51 crc kubenswrapper[4672]: I0930 12:22:51.411091 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:51 crc kubenswrapper[4672]: I0930 12:22:51.411212 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:51 crc kubenswrapper[4672]: I0930 12:22:51.411238 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:51 crc kubenswrapper[4672]: I0930 12:22:51.411256 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:51 crc kubenswrapper[4672]: I0930 12:22:51.411305 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:51Z","lastTransitionTime":"2025-09-30T12:22:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:51 crc kubenswrapper[4672]: I0930 12:22:51.416875 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n7wwp" Sep 30 12:22:51 crc kubenswrapper[4672]: E0930 12:22:51.416998 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n7wwp" podUID="42618cd5-d9f9-45ba-8081-660ca47bebf4" Sep 30 12:22:51 crc kubenswrapper[4672]: I0930 12:22:51.417136 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 12:22:51 crc kubenswrapper[4672]: I0930 12:22:51.417136 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 12:22:51 crc kubenswrapper[4672]: E0930 12:22:51.417437 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 12:22:51 crc kubenswrapper[4672]: E0930 12:22:51.417585 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 12:22:51 crc kubenswrapper[4672]: I0930 12:22:51.514648 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:51 crc kubenswrapper[4672]: I0930 12:22:51.514710 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:51 crc kubenswrapper[4672]: I0930 12:22:51.514723 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:51 crc kubenswrapper[4672]: I0930 12:22:51.514745 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:51 crc kubenswrapper[4672]: I0930 12:22:51.514760 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:51Z","lastTransitionTime":"2025-09-30T12:22:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:51 crc kubenswrapper[4672]: I0930 12:22:51.618051 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:51 crc kubenswrapper[4672]: I0930 12:22:51.618136 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:51 crc kubenswrapper[4672]: I0930 12:22:51.618586 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:51 crc kubenswrapper[4672]: I0930 12:22:51.618608 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:51 crc kubenswrapper[4672]: I0930 12:22:51.618622 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:51Z","lastTransitionTime":"2025-09-30T12:22:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:22:51 crc kubenswrapper[4672]: I0930 12:22:51.721340 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:51 crc kubenswrapper[4672]: I0930 12:22:51.721765 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:51 crc kubenswrapper[4672]: I0930 12:22:51.721856 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:51 crc kubenswrapper[4672]: I0930 12:22:51.721960 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:51 crc kubenswrapper[4672]: I0930 12:22:51.722072 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:51Z","lastTransitionTime":"2025-09-30T12:22:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:51 crc kubenswrapper[4672]: I0930 12:22:51.825037 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:51 crc kubenswrapper[4672]: I0930 12:22:51.825511 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:51 crc kubenswrapper[4672]: I0930 12:22:51.825746 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:51 crc kubenswrapper[4672]: I0930 12:22:51.825945 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:51 crc kubenswrapper[4672]: I0930 12:22:51.826154 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:51Z","lastTransitionTime":"2025-09-30T12:22:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:51 crc kubenswrapper[4672]: I0930 12:22:51.929315 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:51 crc kubenswrapper[4672]: I0930 12:22:51.929359 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:51 crc kubenswrapper[4672]: I0930 12:22:51.929370 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:51 crc kubenswrapper[4672]: I0930 12:22:51.929387 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:51 crc kubenswrapper[4672]: I0930 12:22:51.929399 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:51Z","lastTransitionTime":"2025-09-30T12:22:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:22:52 crc kubenswrapper[4672]: I0930 12:22:52.032191 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:52 crc kubenswrapper[4672]: I0930 12:22:52.032353 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:52 crc kubenswrapper[4672]: I0930 12:22:52.032441 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:52 crc kubenswrapper[4672]: I0930 12:22:52.032477 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:52 crc kubenswrapper[4672]: I0930 12:22:52.032516 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:52Z","lastTransitionTime":"2025-09-30T12:22:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:52 crc kubenswrapper[4672]: I0930 12:22:52.136004 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:52 crc kubenswrapper[4672]: I0930 12:22:52.136063 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:52 crc kubenswrapper[4672]: I0930 12:22:52.136072 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:52 crc kubenswrapper[4672]: I0930 12:22:52.136093 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:52 crc kubenswrapper[4672]: I0930 12:22:52.136105 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:52Z","lastTransitionTime":"2025-09-30T12:22:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:52 crc kubenswrapper[4672]: I0930 12:22:52.239330 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:52 crc kubenswrapper[4672]: I0930 12:22:52.239396 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:52 crc kubenswrapper[4672]: I0930 12:22:52.239416 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:52 crc kubenswrapper[4672]: I0930 12:22:52.239447 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:52 crc kubenswrapper[4672]: I0930 12:22:52.239469 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:52Z","lastTransitionTime":"2025-09-30T12:22:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:22:52 crc kubenswrapper[4672]: I0930 12:22:52.342948 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:52 crc kubenswrapper[4672]: I0930 12:22:52.343007 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:52 crc kubenswrapper[4672]: I0930 12:22:52.343023 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:52 crc kubenswrapper[4672]: I0930 12:22:52.343045 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:52 crc kubenswrapper[4672]: I0930 12:22:52.343063 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:52Z","lastTransitionTime":"2025-09-30T12:22:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:52 crc kubenswrapper[4672]: I0930 12:22:52.416964 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 12:22:52 crc kubenswrapper[4672]: E0930 12:22:52.417157 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 12:22:52 crc kubenswrapper[4672]: I0930 12:22:52.445944 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:52 crc kubenswrapper[4672]: I0930 12:22:52.446036 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:52 crc kubenswrapper[4672]: I0930 12:22:52.446065 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:52 crc kubenswrapper[4672]: I0930 12:22:52.446634 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:52 crc kubenswrapper[4672]: I0930 12:22:52.446913 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:52Z","lastTransitionTime":"2025-09-30T12:22:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:22:52 crc kubenswrapper[4672]: I0930 12:22:52.549680 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:52 crc kubenswrapper[4672]: I0930 12:22:52.549729 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:52 crc kubenswrapper[4672]: I0930 12:22:52.549747 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:52 crc kubenswrapper[4672]: I0930 12:22:52.549770 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:52 crc kubenswrapper[4672]: I0930 12:22:52.549787 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:52Z","lastTransitionTime":"2025-09-30T12:22:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:52 crc kubenswrapper[4672]: I0930 12:22:52.652817 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:52 crc kubenswrapper[4672]: I0930 12:22:52.652865 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:52 crc kubenswrapper[4672]: I0930 12:22:52.652882 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:52 crc kubenswrapper[4672]: I0930 12:22:52.652907 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:52 crc kubenswrapper[4672]: I0930 12:22:52.652938 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:52Z","lastTransitionTime":"2025-09-30T12:22:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:52 crc kubenswrapper[4672]: I0930 12:22:52.755722 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:52 crc kubenswrapper[4672]: I0930 12:22:52.755781 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:52 crc kubenswrapper[4672]: I0930 12:22:52.755799 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:52 crc kubenswrapper[4672]: I0930 12:22:52.755821 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:52 crc kubenswrapper[4672]: I0930 12:22:52.755840 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:52Z","lastTransitionTime":"2025-09-30T12:22:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:22:52 crc kubenswrapper[4672]: I0930 12:22:52.858942 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:52 crc kubenswrapper[4672]: I0930 12:22:52.858986 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:52 crc kubenswrapper[4672]: I0930 12:22:52.858998 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:52 crc kubenswrapper[4672]: I0930 12:22:52.859015 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:52 crc kubenswrapper[4672]: I0930 12:22:52.859027 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:52Z","lastTransitionTime":"2025-09-30T12:22:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:52 crc kubenswrapper[4672]: I0930 12:22:52.961840 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:52 crc kubenswrapper[4672]: I0930 12:22:52.961903 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:52 crc kubenswrapper[4672]: I0930 12:22:52.961925 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:52 crc kubenswrapper[4672]: I0930 12:22:52.961957 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:52 crc kubenswrapper[4672]: I0930 12:22:52.961982 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:52Z","lastTransitionTime":"2025-09-30T12:22:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:53 crc kubenswrapper[4672]: I0930 12:22:53.065682 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:53 crc kubenswrapper[4672]: I0930 12:22:53.065750 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:53 crc kubenswrapper[4672]: I0930 12:22:53.065761 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:53 crc kubenswrapper[4672]: I0930 12:22:53.065785 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:53 crc kubenswrapper[4672]: I0930 12:22:53.065798 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:53Z","lastTransitionTime":"2025-09-30T12:22:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:22:53 crc kubenswrapper[4672]: I0930 12:22:53.169106 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:53 crc kubenswrapper[4672]: I0930 12:22:53.169169 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:53 crc kubenswrapper[4672]: I0930 12:22:53.169185 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:53 crc kubenswrapper[4672]: I0930 12:22:53.169210 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:53 crc kubenswrapper[4672]: I0930 12:22:53.169228 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:53Z","lastTransitionTime":"2025-09-30T12:22:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:53 crc kubenswrapper[4672]: I0930 12:22:53.273084 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:53 crc kubenswrapper[4672]: I0930 12:22:53.273147 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:53 crc kubenswrapper[4672]: I0930 12:22:53.273168 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:53 crc kubenswrapper[4672]: I0930 12:22:53.273194 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:53 crc kubenswrapper[4672]: I0930 12:22:53.273214 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:53Z","lastTransitionTime":"2025-09-30T12:22:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:53 crc kubenswrapper[4672]: I0930 12:22:53.375955 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:53 crc kubenswrapper[4672]: I0930 12:22:53.375998 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:53 crc kubenswrapper[4672]: I0930 12:22:53.376010 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:53 crc kubenswrapper[4672]: I0930 12:22:53.376029 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:53 crc kubenswrapper[4672]: I0930 12:22:53.376041 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:53Z","lastTransitionTime":"2025-09-30T12:22:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:22:53 crc kubenswrapper[4672]: I0930 12:22:53.416518 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 12:22:53 crc kubenswrapper[4672]: I0930 12:22:53.416627 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n7wwp" Sep 30 12:22:53 crc kubenswrapper[4672]: I0930 12:22:53.416517 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 12:22:53 crc kubenswrapper[4672]: E0930 12:22:53.416774 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 12:22:53 crc kubenswrapper[4672]: E0930 12:22:53.416930 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n7wwp" podUID="42618cd5-d9f9-45ba-8081-660ca47bebf4" Sep 30 12:22:53 crc kubenswrapper[4672]: E0930 12:22:53.417144 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 12:22:53 crc kubenswrapper[4672]: I0930 12:22:53.479151 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:53 crc kubenswrapper[4672]: I0930 12:22:53.479213 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:53 crc kubenswrapper[4672]: I0930 12:22:53.479229 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:53 crc kubenswrapper[4672]: I0930 12:22:53.479253 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:53 crc kubenswrapper[4672]: I0930 12:22:53.479330 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:53Z","lastTransitionTime":"2025-09-30T12:22:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:22:53 crc kubenswrapper[4672]: I0930 12:22:53.582565 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:53 crc kubenswrapper[4672]: I0930 12:22:53.582639 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:53 crc kubenswrapper[4672]: I0930 12:22:53.582661 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:53 crc kubenswrapper[4672]: I0930 12:22:53.582692 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:53 crc kubenswrapper[4672]: I0930 12:22:53.582715 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:53Z","lastTransitionTime":"2025-09-30T12:22:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:53 crc kubenswrapper[4672]: I0930 12:22:53.685695 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:53 crc kubenswrapper[4672]: I0930 12:22:53.685739 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:53 crc kubenswrapper[4672]: I0930 12:22:53.685750 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:53 crc kubenswrapper[4672]: I0930 12:22:53.685767 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:53 crc kubenswrapper[4672]: I0930 12:22:53.685782 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:53Z","lastTransitionTime":"2025-09-30T12:22:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:53 crc kubenswrapper[4672]: I0930 12:22:53.788443 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:53 crc kubenswrapper[4672]: I0930 12:22:53.788544 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:53 crc kubenswrapper[4672]: I0930 12:22:53.788566 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:53 crc kubenswrapper[4672]: I0930 12:22:53.788591 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:53 crc kubenswrapper[4672]: I0930 12:22:53.788609 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:53Z","lastTransitionTime":"2025-09-30T12:22:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:22:53 crc kubenswrapper[4672]: I0930 12:22:53.891790 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:53 crc kubenswrapper[4672]: I0930 12:22:53.892379 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:53 crc kubenswrapper[4672]: I0930 12:22:53.892408 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:53 crc kubenswrapper[4672]: I0930 12:22:53.892436 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:53 crc kubenswrapper[4672]: I0930 12:22:53.892451 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:53Z","lastTransitionTime":"2025-09-30T12:22:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:53 crc kubenswrapper[4672]: I0930 12:22:53.999339 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:53 crc kubenswrapper[4672]: I0930 12:22:53.999389 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:53 crc kubenswrapper[4672]: I0930 12:22:53.999401 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:53 crc kubenswrapper[4672]: I0930 12:22:53.999418 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:53 crc kubenswrapper[4672]: I0930 12:22:53.999432 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:53Z","lastTransitionTime":"2025-09-30T12:22:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:54 crc kubenswrapper[4672]: I0930 12:22:54.102816 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:54 crc kubenswrapper[4672]: I0930 12:22:54.102901 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:54 crc kubenswrapper[4672]: I0930 12:22:54.102917 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:54 crc kubenswrapper[4672]: I0930 12:22:54.102937 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:54 crc kubenswrapper[4672]: I0930 12:22:54.102953 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:54Z","lastTransitionTime":"2025-09-30T12:22:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:22:54 crc kubenswrapper[4672]: I0930 12:22:54.205802 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:54 crc kubenswrapper[4672]: I0930 12:22:54.205881 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:54 crc kubenswrapper[4672]: I0930 12:22:54.205905 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:54 crc kubenswrapper[4672]: I0930 12:22:54.205941 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:54 crc kubenswrapper[4672]: I0930 12:22:54.205966 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:54Z","lastTransitionTime":"2025-09-30T12:22:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:54 crc kubenswrapper[4672]: I0930 12:22:54.307916 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:54 crc kubenswrapper[4672]: I0930 12:22:54.307972 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:54 crc kubenswrapper[4672]: I0930 12:22:54.307989 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:54 crc kubenswrapper[4672]: I0930 12:22:54.308009 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:54 crc kubenswrapper[4672]: I0930 12:22:54.308026 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:54Z","lastTransitionTime":"2025-09-30T12:22:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:54 crc kubenswrapper[4672]: I0930 12:22:54.411389 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:54 crc kubenswrapper[4672]: I0930 12:22:54.411450 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:54 crc kubenswrapper[4672]: I0930 12:22:54.411469 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:54 crc kubenswrapper[4672]: I0930 12:22:54.411495 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:54 crc kubenswrapper[4672]: I0930 12:22:54.411514 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:54Z","lastTransitionTime":"2025-09-30T12:22:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:22:54 crc kubenswrapper[4672]: I0930 12:22:54.416608 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 12:22:54 crc kubenswrapper[4672]: E0930 12:22:54.416720 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 12:22:54 crc kubenswrapper[4672]: I0930 12:22:54.513990 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:54 crc kubenswrapper[4672]: I0930 12:22:54.514045 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:54 crc kubenswrapper[4672]: I0930 12:22:54.514062 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:54 crc kubenswrapper[4672]: I0930 12:22:54.514084 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:54 crc kubenswrapper[4672]: I0930 12:22:54.514100 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:54Z","lastTransitionTime":"2025-09-30T12:22:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:54 crc kubenswrapper[4672]: I0930 12:22:54.616566 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:54 crc kubenswrapper[4672]: I0930 12:22:54.616658 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:54 crc kubenswrapper[4672]: I0930 12:22:54.616675 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:54 crc kubenswrapper[4672]: I0930 12:22:54.616698 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:54 crc kubenswrapper[4672]: I0930 12:22:54.616736 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:54Z","lastTransitionTime":"2025-09-30T12:22:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:22:54 crc kubenswrapper[4672]: I0930 12:22:54.719618 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:54 crc kubenswrapper[4672]: I0930 12:22:54.719668 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:54 crc kubenswrapper[4672]: I0930 12:22:54.719702 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:54 crc kubenswrapper[4672]: I0930 12:22:54.719720 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:54 crc kubenswrapper[4672]: I0930 12:22:54.719732 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:54Z","lastTransitionTime":"2025-09-30T12:22:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:54 crc kubenswrapper[4672]: I0930 12:22:54.822776 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:54 crc kubenswrapper[4672]: I0930 12:22:54.822818 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:54 crc kubenswrapper[4672]: I0930 12:22:54.822831 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:54 crc kubenswrapper[4672]: I0930 12:22:54.822847 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:54 crc kubenswrapper[4672]: I0930 12:22:54.822858 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:54Z","lastTransitionTime":"2025-09-30T12:22:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:54 crc kubenswrapper[4672]: I0930 12:22:54.926011 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:54 crc kubenswrapper[4672]: I0930 12:22:54.926054 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:54 crc kubenswrapper[4672]: I0930 12:22:54.926063 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:54 crc kubenswrapper[4672]: I0930 12:22:54.926079 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:54 crc kubenswrapper[4672]: I0930 12:22:54.926088 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:54Z","lastTransitionTime":"2025-09-30T12:22:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:22:55 crc kubenswrapper[4672]: I0930 12:22:55.028280 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:55 crc kubenswrapper[4672]: I0930 12:22:55.028333 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:55 crc kubenswrapper[4672]: I0930 12:22:55.028347 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:55 crc kubenswrapper[4672]: I0930 12:22:55.028366 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:55 crc kubenswrapper[4672]: I0930 12:22:55.028382 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:55Z","lastTransitionTime":"2025-09-30T12:22:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:55 crc kubenswrapper[4672]: I0930 12:22:55.130759 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:55 crc kubenswrapper[4672]: I0930 12:22:55.130833 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:55 crc kubenswrapper[4672]: I0930 12:22:55.130853 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:55 crc kubenswrapper[4672]: I0930 12:22:55.130881 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:55 crc kubenswrapper[4672]: I0930 12:22:55.130897 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:55Z","lastTransitionTime":"2025-09-30T12:22:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:55 crc kubenswrapper[4672]: I0930 12:22:55.233928 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:55 crc kubenswrapper[4672]: I0930 12:22:55.233987 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:55 crc kubenswrapper[4672]: I0930 12:22:55.233996 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:55 crc kubenswrapper[4672]: I0930 12:22:55.234013 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:55 crc kubenswrapper[4672]: I0930 12:22:55.234023 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:55Z","lastTransitionTime":"2025-09-30T12:22:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:22:55 crc kubenswrapper[4672]: I0930 12:22:55.336543 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:55 crc kubenswrapper[4672]: I0930 12:22:55.336628 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:55 crc kubenswrapper[4672]: I0930 12:22:55.336662 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:55 crc kubenswrapper[4672]: I0930 12:22:55.336699 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:55 crc kubenswrapper[4672]: I0930 12:22:55.336721 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:55Z","lastTransitionTime":"2025-09-30T12:22:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:55 crc kubenswrapper[4672]: I0930 12:22:55.416312 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 12:22:55 crc kubenswrapper[4672]: I0930 12:22:55.416346 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n7wwp" Sep 30 12:22:55 crc kubenswrapper[4672]: I0930 12:22:55.416371 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 12:22:55 crc kubenswrapper[4672]: E0930 12:22:55.416484 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 12:22:55 crc kubenswrapper[4672]: E0930 12:22:55.416608 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n7wwp" podUID="42618cd5-d9f9-45ba-8081-660ca47bebf4" Sep 30 12:22:55 crc kubenswrapper[4672]: E0930 12:22:55.416758 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 12:22:55 crc kubenswrapper[4672]: I0930 12:22:55.439281 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:55 crc kubenswrapper[4672]: I0930 12:22:55.439318 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:55 crc kubenswrapper[4672]: I0930 12:22:55.439328 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:55 crc kubenswrapper[4672]: I0930 12:22:55.439341 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:55 crc kubenswrapper[4672]: I0930 12:22:55.439353 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:55Z","lastTransitionTime":"2025-09-30T12:22:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:55 crc kubenswrapper[4672]: I0930 12:22:55.542973 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:55 crc kubenswrapper[4672]: I0930 12:22:55.543044 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:55 crc kubenswrapper[4672]: I0930 12:22:55.543058 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:55 crc kubenswrapper[4672]: I0930 12:22:55.543079 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:55 crc kubenswrapper[4672]: I0930 12:22:55.543096 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:55Z","lastTransitionTime":"2025-09-30T12:22:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:22:55 crc kubenswrapper[4672]: I0930 12:22:55.645560 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:55 crc kubenswrapper[4672]: I0930 12:22:55.645597 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:55 crc kubenswrapper[4672]: I0930 12:22:55.645605 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:55 crc kubenswrapper[4672]: I0930 12:22:55.645619 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:55 crc kubenswrapper[4672]: I0930 12:22:55.645629 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:55Z","lastTransitionTime":"2025-09-30T12:22:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:55 crc kubenswrapper[4672]: I0930 12:22:55.748650 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:55 crc kubenswrapper[4672]: I0930 12:22:55.748718 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:55 crc kubenswrapper[4672]: I0930 12:22:55.748736 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:55 crc kubenswrapper[4672]: I0930 12:22:55.748761 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:55 crc kubenswrapper[4672]: I0930 12:22:55.748781 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:55Z","lastTransitionTime":"2025-09-30T12:22:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:55 crc kubenswrapper[4672]: I0930 12:22:55.850940 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:55 crc kubenswrapper[4672]: I0930 12:22:55.850974 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:55 crc kubenswrapper[4672]: I0930 12:22:55.850983 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:55 crc kubenswrapper[4672]: I0930 12:22:55.850996 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:55 crc kubenswrapper[4672]: I0930 12:22:55.851006 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:55Z","lastTransitionTime":"2025-09-30T12:22:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:22:55 crc kubenswrapper[4672]: I0930 12:22:55.954143 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:55 crc kubenswrapper[4672]: I0930 12:22:55.954186 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:55 crc kubenswrapper[4672]: I0930 12:22:55.954195 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:55 crc kubenswrapper[4672]: I0930 12:22:55.954208 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:55 crc kubenswrapper[4672]: I0930 12:22:55.954218 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:55Z","lastTransitionTime":"2025-09-30T12:22:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:56 crc kubenswrapper[4672]: I0930 12:22:56.056497 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:56 crc kubenswrapper[4672]: I0930 12:22:56.056540 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:56 crc kubenswrapper[4672]: I0930 12:22:56.056551 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:56 crc kubenswrapper[4672]: I0930 12:22:56.056567 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:56 crc kubenswrapper[4672]: I0930 12:22:56.056579 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:56Z","lastTransitionTime":"2025-09-30T12:22:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:56 crc kubenswrapper[4672]: I0930 12:22:56.158843 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:56 crc kubenswrapper[4672]: I0930 12:22:56.158885 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:56 crc kubenswrapper[4672]: I0930 12:22:56.158894 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:56 crc kubenswrapper[4672]: I0930 12:22:56.158907 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:56 crc kubenswrapper[4672]: I0930 12:22:56.158917 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:56Z","lastTransitionTime":"2025-09-30T12:22:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:22:56 crc kubenswrapper[4672]: I0930 12:22:56.261486 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:56 crc kubenswrapper[4672]: I0930 12:22:56.261540 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:56 crc kubenswrapper[4672]: I0930 12:22:56.261553 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:56 crc kubenswrapper[4672]: I0930 12:22:56.261571 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:56 crc kubenswrapper[4672]: I0930 12:22:56.261584 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:56Z","lastTransitionTime":"2025-09-30T12:22:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:56 crc kubenswrapper[4672]: I0930 12:22:56.364999 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:56 crc kubenswrapper[4672]: I0930 12:22:56.365061 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:56 crc kubenswrapper[4672]: I0930 12:22:56.365074 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:56 crc kubenswrapper[4672]: I0930 12:22:56.365095 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:56 crc kubenswrapper[4672]: I0930 12:22:56.365107 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:56Z","lastTransitionTime":"2025-09-30T12:22:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:56 crc kubenswrapper[4672]: I0930 12:22:56.417026 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 12:22:56 crc kubenswrapper[4672]: E0930 12:22:56.417193 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 12:22:56 crc kubenswrapper[4672]: I0930 12:22:56.468115 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:56 crc kubenswrapper[4672]: I0930 12:22:56.468191 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:56 crc kubenswrapper[4672]: I0930 12:22:56.468205 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:56 crc kubenswrapper[4672]: I0930 12:22:56.468229 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:56 crc kubenswrapper[4672]: I0930 12:22:56.468245 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:56Z","lastTransitionTime":"2025-09-30T12:22:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:56 crc kubenswrapper[4672]: I0930 12:22:56.571080 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:56 crc kubenswrapper[4672]: I0930 12:22:56.571123 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:56 crc kubenswrapper[4672]: I0930 12:22:56.571136 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:56 crc kubenswrapper[4672]: I0930 12:22:56.571155 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:56 crc kubenswrapper[4672]: I0930 12:22:56.571168 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:56Z","lastTransitionTime":"2025-09-30T12:22:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:22:56 crc kubenswrapper[4672]: I0930 12:22:56.674233 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:56 crc kubenswrapper[4672]: I0930 12:22:56.674304 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:56 crc kubenswrapper[4672]: I0930 12:22:56.674316 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:56 crc kubenswrapper[4672]: I0930 12:22:56.674334 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:56 crc kubenswrapper[4672]: I0930 12:22:56.674345 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:56Z","lastTransitionTime":"2025-09-30T12:22:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:56 crc kubenswrapper[4672]: I0930 12:22:56.776934 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:56 crc kubenswrapper[4672]: I0930 12:22:56.776974 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:56 crc kubenswrapper[4672]: I0930 12:22:56.776987 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:56 crc kubenswrapper[4672]: I0930 12:22:56.777002 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:56 crc kubenswrapper[4672]: I0930 12:22:56.777014 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:56Z","lastTransitionTime":"2025-09-30T12:22:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:56 crc kubenswrapper[4672]: I0930 12:22:56.879379 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:56 crc kubenswrapper[4672]: I0930 12:22:56.879415 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:56 crc kubenswrapper[4672]: I0930 12:22:56.879426 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:56 crc kubenswrapper[4672]: I0930 12:22:56.879441 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:56 crc kubenswrapper[4672]: I0930 12:22:56.879454 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:56Z","lastTransitionTime":"2025-09-30T12:22:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:22:56 crc kubenswrapper[4672]: I0930 12:22:56.981843 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:56 crc kubenswrapper[4672]: I0930 12:22:56.981880 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:56 crc kubenswrapper[4672]: I0930 12:22:56.981897 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:56 crc kubenswrapper[4672]: I0930 12:22:56.981914 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:56 crc kubenswrapper[4672]: I0930 12:22:56.981925 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:56Z","lastTransitionTime":"2025-09-30T12:22:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:57 crc kubenswrapper[4672]: I0930 12:22:57.085117 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:57 crc kubenswrapper[4672]: I0930 12:22:57.085207 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:57 crc kubenswrapper[4672]: I0930 12:22:57.085228 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:57 crc kubenswrapper[4672]: I0930 12:22:57.085256 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:57 crc kubenswrapper[4672]: I0930 12:22:57.085351 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:57Z","lastTransitionTime":"2025-09-30T12:22:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:57 crc kubenswrapper[4672]: I0930 12:22:57.187550 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:57 crc kubenswrapper[4672]: I0930 12:22:57.187594 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:57 crc kubenswrapper[4672]: I0930 12:22:57.187605 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:57 crc kubenswrapper[4672]: I0930 12:22:57.187618 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:57 crc kubenswrapper[4672]: I0930 12:22:57.187631 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:57Z","lastTransitionTime":"2025-09-30T12:22:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:22:57 crc kubenswrapper[4672]: I0930 12:22:57.290293 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:57 crc kubenswrapper[4672]: I0930 12:22:57.290337 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:57 crc kubenswrapper[4672]: I0930 12:22:57.290350 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:57 crc kubenswrapper[4672]: I0930 12:22:57.290386 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:57 crc kubenswrapper[4672]: I0930 12:22:57.290399 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:57Z","lastTransitionTime":"2025-09-30T12:22:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:57 crc kubenswrapper[4672]: I0930 12:22:57.392596 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:57 crc kubenswrapper[4672]: I0930 12:22:57.392654 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:57 crc kubenswrapper[4672]: I0930 12:22:57.392678 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:57 crc kubenswrapper[4672]: I0930 12:22:57.392702 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:57 crc kubenswrapper[4672]: I0930 12:22:57.392721 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:57Z","lastTransitionTime":"2025-09-30T12:22:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:57 crc kubenswrapper[4672]: I0930 12:22:57.416978 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 12:22:57 crc kubenswrapper[4672]: I0930 12:22:57.417035 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n7wwp" Sep 30 12:22:57 crc kubenswrapper[4672]: E0930 12:22:57.417144 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 12:22:57 crc kubenswrapper[4672]: I0930 12:22:57.417202 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 12:22:57 crc kubenswrapper[4672]: E0930 12:22:57.417334 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n7wwp" podUID="42618cd5-d9f9-45ba-8081-660ca47bebf4" Sep 30 12:22:57 crc kubenswrapper[4672]: E0930 12:22:57.417421 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 12:22:57 crc kubenswrapper[4672]: I0930 12:22:57.495600 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:57 crc kubenswrapper[4672]: I0930 12:22:57.495655 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:57 crc kubenswrapper[4672]: I0930 12:22:57.495668 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:57 crc kubenswrapper[4672]: I0930 12:22:57.495688 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:57 crc kubenswrapper[4672]: I0930 12:22:57.495702 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:57Z","lastTransitionTime":"2025-09-30T12:22:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:57 crc kubenswrapper[4672]: I0930 12:22:57.598436 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:57 crc kubenswrapper[4672]: I0930 12:22:57.598485 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:57 crc kubenswrapper[4672]: I0930 12:22:57.598500 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:57 crc kubenswrapper[4672]: I0930 12:22:57.598524 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:57 crc kubenswrapper[4672]: I0930 12:22:57.598541 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:57Z","lastTransitionTime":"2025-09-30T12:22:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:22:57 crc kubenswrapper[4672]: I0930 12:22:57.700306 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:57 crc kubenswrapper[4672]: I0930 12:22:57.700348 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:57 crc kubenswrapper[4672]: I0930 12:22:57.700358 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:57 crc kubenswrapper[4672]: I0930 12:22:57.700375 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:57 crc kubenswrapper[4672]: I0930 12:22:57.700387 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:57Z","lastTransitionTime":"2025-09-30T12:22:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:57 crc kubenswrapper[4672]: I0930 12:22:57.805785 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:57 crc kubenswrapper[4672]: I0930 12:22:57.805837 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:57 crc kubenswrapper[4672]: I0930 12:22:57.805856 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:57 crc kubenswrapper[4672]: I0930 12:22:57.805884 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:57 crc kubenswrapper[4672]: I0930 12:22:57.805902 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:57Z","lastTransitionTime":"2025-09-30T12:22:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:57 crc kubenswrapper[4672]: I0930 12:22:57.896286 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/42618cd5-d9f9-45ba-8081-660ca47bebf4-metrics-certs\") pod \"network-metrics-daemon-n7wwp\" (UID: \"42618cd5-d9f9-45ba-8081-660ca47bebf4\") " pod="openshift-multus/network-metrics-daemon-n7wwp" Sep 30 12:22:57 crc kubenswrapper[4672]: E0930 12:22:57.896424 4672 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Sep 30 12:22:57 crc kubenswrapper[4672]: E0930 12:22:57.896469 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/42618cd5-d9f9-45ba-8081-660ca47bebf4-metrics-certs podName:42618cd5-d9f9-45ba-8081-660ca47bebf4 nodeName:}" failed. No retries permitted until 2025-09-30 12:23:29.896456133 +0000 UTC m=+101.165693779 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/42618cd5-d9f9-45ba-8081-660ca47bebf4-metrics-certs") pod "network-metrics-daemon-n7wwp" (UID: "42618cd5-d9f9-45ba-8081-660ca47bebf4") : object "openshift-multus"/"metrics-daemon-secret" not registered Sep 30 12:22:57 crc kubenswrapper[4672]: I0930 12:22:57.908181 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:57 crc kubenswrapper[4672]: I0930 12:22:57.908221 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:57 crc kubenswrapper[4672]: I0930 12:22:57.908234 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:57 crc kubenswrapper[4672]: I0930 12:22:57.908292 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:57 crc kubenswrapper[4672]: I0930 12:22:57.908308 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:57Z","lastTransitionTime":"2025-09-30T12:22:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:58 crc kubenswrapper[4672]: I0930 12:22:58.011386 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:58 crc kubenswrapper[4672]: I0930 12:22:58.011444 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:58 crc kubenswrapper[4672]: I0930 12:22:58.011457 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:58 crc kubenswrapper[4672]: I0930 12:22:58.011478 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:58 crc kubenswrapper[4672]: I0930 12:22:58.011491 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:58Z","lastTransitionTime":"2025-09-30T12:22:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Sep 30 12:22:58 crc kubenswrapper[4672]: I0930 12:22:58.114284 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 12:22:58 crc kubenswrapper[4672]: I0930 12:22:58.114339 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 12:22:58 crc kubenswrapper[4672]: I0930 12:22:58.114354 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 12:22:58 crc kubenswrapper[4672]: I0930 12:22:58.114368 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 12:22:58 crc kubenswrapper[4672]: I0930 12:22:58.114378 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:58Z","lastTransitionTime":"2025-09-30T12:22:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 12:22:58 crc kubenswrapper[4672]: I0930 12:22:58.141818 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 12:22:58 crc kubenswrapper[4672]: I0930 12:22:58.141877 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 12:22:58 crc kubenswrapper[4672]: I0930 12:22:58.141892 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 12:22:58 crc kubenswrapper[4672]: I0930 12:22:58.141912 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 12:22:58 crc kubenswrapper[4672]: I0930 12:22:58.141927 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:58Z","lastTransitionTime":"2025-09-30T12:22:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"} Sep 30 12:22:58 crc kubenswrapper[4672]: E0930 12:22:58.154067 4672 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T12:22:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T12:22:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T12:22:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T12:22:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a240ac94-c7cd-47b9-85fc-82a9db2c4d67\\\",\\\"systemUUID\\\":\\\"9545f671-f742-45e6-b9f7-3b3404b22825\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:58Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:58 crc kubenswrapper[4672]: I0930 12:22:58.157640 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:58 crc kubenswrapper[4672]: I0930 12:22:58.157670 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure"
Sep 30 12:22:58 crc kubenswrapper[4672]: I0930 12:22:58.157680 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 12:22:58 crc kubenswrapper[4672]: I0930 12:22:58.157693 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 12:22:58 crc kubenswrapper[4672]: I0930 12:22:58.157703 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:58Z","lastTransitionTime":"2025-09-30T12:22:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 12:22:58 crc kubenswrapper[4672]: E0930 12:22:58.168721 4672 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status [status patch payload identical to the 12:22:58.154067 attempt above] for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:58Z is after 2025-08-24T17:21:41Z"
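Every status patch in this stretch is rejected for the same reason: the serving certificate behind the node.network-node-identity.openshift.io webhook at https://127.0.0.1:9743 expired on 2025-08-24, more than a month before the current time of 2025-09-30, so the TLS handshake can never succeed. A runnable Go sketch reproducing the validity-window check behind the "x509: certificate has expired" message, using a throwaway self-signed certificate with the same NotAfter:

package main

import (
	"crypto/ecdsa"
	"crypto/elliptic"
	"crypto/rand"
	"crypto/x509"
	"crypto/x509/pkix"
	"fmt"
	"math/big"
	"time"
)

func main() {
	// Throwaway self-signed certificate whose NotAfter matches the
	// expiry reported by the failed webhook call (2025-08-24T17:21:41Z).
	key, err := ecdsa.GenerateKey(elliptic.P256(), rand.Reader)
	if err != nil {
		panic(err)
	}
	tmpl := &x509.Certificate{
		SerialNumber: big.NewInt(1),
		Subject:      pkix.Name{CommonName: "network-node-identity.openshift.io"},
		NotBefore:    time.Date(2024, 8, 24, 17, 21, 41, 0, time.UTC),
		NotAfter:     time.Date(2025, 8, 24, 17, 21, 41, 0, time.UTC),
	}
	der, err := x509.CreateCertificate(rand.Reader, tmpl, tmpl, &key.PublicKey, key)
	if err != nil {
		panic(err)
	}
	cert, err := x509.ParseCertificate(der)
	if err != nil {
		panic(err)
	}

	// The same window check TLS verification applies, at the log's
	// "current time" of 2025-09-30T12:22:58Z.
	now := time.Date(2025, 9, 30, 12, 22, 58, 0, time.UTC)
	switch {
	case now.Before(cert.NotBefore):
		fmt.Println("x509: certificate is not yet valid")
	case now.After(cert.NotAfter):
		fmt.Printf("x509: certificate has expired: current time %s is after %s\n",
			now.Format(time.RFC3339), cert.NotAfter.Format(time.RFC3339))
	default:
		fmt.Println("certificate is within its validity window")
	}
}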
Sep 30 12:22:58 crc kubenswrapper[4672]: I0930 12:22:58.171928 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 12:22:58 crc kubenswrapper[4672]: I0930 12:22:58.171957 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
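The recurring NodeNotReady condition traces back to one fact: /etc/kubernetes/cni/net.d/ contains no CNI configuration, so the container runtime reports NetworkReady=false. A rough Go approximation of the conf-directory scan a CNI-aware runtime performs; the directory comes from the log, the extension list is the usual libcni convention, and the real discovery logic is more involved.

package main

import (
	"fmt"
	"os"
	"path/filepath"
)

// Scan the CNI configuration directory the way a runtime roughly does:
// the node only becomes network-ready once at least one usable
// .conf/.conflist/.json file appears.
func main() {
	confDir := "/etc/kubernetes/cni/net.d"
	entries, err := os.ReadDir(confDir)
	if err != nil {
		fmt.Println("cannot read CNI conf dir:", err)
		return
	}
	var confs []string
	for _, e := range entries {
		switch filepath.Ext(e.Name()) {
		case ".conf", ".conflist", ".json":
			confs = append(confs, e.Name())
		}
	}
	if len(confs) == 0 {
		fmt.Println("no CNI configuration file in", confDir, "- network plugin not ready")
		return
	}
	fmt.Println("found CNI configs:", confs)
}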
Sep 30 12:22:58 crc kubenswrapper[4672]: I0930 12:22:58.171965 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 12:22:58 crc kubenswrapper[4672]: I0930 12:22:58.171979 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 12:22:58 crc kubenswrapper[4672]: I0930 12:22:58.171988 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:58Z","lastTransitionTime":"2025-09-30T12:22:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 12:22:58 crc kubenswrapper[4672]: E0930 12:22:58.182903 4672 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status [status patch payload identical to the 12:22:58.154067 attempt above] for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:58Z is after 2025-08-24T17:21:41Z"
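The patch body the kubelet keeps retrying is a strategic merge patch: the $setElementOrder/conditions directive pins the order of the conditions list while each entry merges by its "type" key. A small Go sketch that builds the same patch shape, with the payload trimmed to a single condition for brevity:

package main

import (
	"encoding/json"
	"fmt"
	"os"
)

// Build the strategic-merge-patch shape seen in the failed status
// updates: $setElementOrder fixes list order, entries merge by "type".
func main() {
	patch := map[string]any{
		"status": map[string]any{
			"$setElementOrder/conditions": []map[string]string{
				{"type": "MemoryPressure"},
				{"type": "DiskPressure"},
				{"type": "PIDPressure"},
				{"type": "Ready"},
			},
			"conditions": []map[string]string{
				{"type": "Ready", "status": "False", "reason": "KubeletNotReady"},
			},
		},
	}
	enc := json.NewEncoder(os.Stdout)
	enc.SetIndent("", "  ")
	if err := enc.Encode(patch); err != nil {
		fmt.Fprintln(os.Stderr, "encode failed:", err)
	}
}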
Sep 30 12:22:58 crc kubenswrapper[4672]: I0930 12:22:58.186536 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 12:22:58 crc kubenswrapper[4672]: I0930 12:22:58.186581 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 12:22:58 crc kubenswrapper[4672]: I0930 12:22:58.186624 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 12:22:58 crc kubenswrapper[4672]: I0930 12:22:58.186642 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 12:22:58 crc kubenswrapper[4672]: I0930 12:22:58.186654 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:58Z","lastTransitionTime":"2025-09-30T12:22:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 12:22:58 crc kubenswrapper[4672]: E0930 12:22:58.197238 4672 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status [status patch payload identical to the 12:22:58.154067 attempt above] for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:58Z is after 2025-08-24T17:21:41Z"
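The back-to-back "Error updating node status, will retry" entries (12:22:58.154 through .213) are consistent with a bounded retry loop around the status patch; upstream kubelet retries a small fixed number of times before giving up until the next sync period. A sketch of that loop, with the count of five read off this log rather than verified against the kubelet source:

package main

import (
	"errors"
	"fmt"
)

// Bounded retry loop matching the five consecutive failures above.
func main() {
	const retries = 5 // assumption: inferred from the log, not the source
	patchStatus := func() error {
		// Stand-in for the PATCH that the expired webhook cert rejects.
		return errors.New(`failed calling webhook "node.network-node-identity.openshift.io": certificate has expired`)
	}
	for attempt := 1; attempt <= retries; attempt++ {
		if err := patchStatus(); err != nil {
			fmt.Printf("attempt %d: error updating node status, will retry: %v\n", attempt, err)
			continue
		}
		return
	}
	fmt.Println("unable to update node status after", retries, "attempts")
}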
Sep 30 12:22:58 crc kubenswrapper[4672]: I0930 12:22:58.200612 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 12:22:58 crc kubenswrapper[4672]: I0930 12:22:58.200650 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc"
event="NodeHasNoDiskPressure" Sep 30 12:22:58 crc kubenswrapper[4672]: I0930 12:22:58.200660 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:58 crc kubenswrapper[4672]: I0930 12:22:58.200677 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:58 crc kubenswrapper[4672]: I0930 12:22:58.200690 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:58Z","lastTransitionTime":"2025-09-30T12:22:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:58 crc kubenswrapper[4672]: E0930 12:22:58.213190 4672 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T12:22:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T12:22:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T12:22:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T12:22:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a240ac94-c7cd-47b9-85fc-82a9db2c4d67\\\",\\\"systemUUID\\\":\\\"9545f671-f742-45e6-b9f7-3b3404b22825\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:58Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:58 crc kubenswrapper[4672]: E0930 12:22:58.213317 4672 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Sep 30 12:22:58 crc kubenswrapper[4672]: I0930 12:22:58.216789 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Sep 30 12:22:58 crc kubenswrapper[4672]: I0930 12:22:58.216826 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:58 crc kubenswrapper[4672]: I0930 12:22:58.216840 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:58 crc kubenswrapper[4672]: I0930 12:22:58.216855 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:58 crc kubenswrapper[4672]: I0930 12:22:58.216870 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:58Z","lastTransitionTime":"2025-09-30T12:22:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:58 crc kubenswrapper[4672]: I0930 12:22:58.319812 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:58 crc kubenswrapper[4672]: I0930 12:22:58.319847 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:58 crc kubenswrapper[4672]: I0930 12:22:58.319862 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:58 crc kubenswrapper[4672]: I0930 12:22:58.319881 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:58 crc kubenswrapper[4672]: I0930 12:22:58.319893 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:58Z","lastTransitionTime":"2025-09-30T12:22:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:58 crc kubenswrapper[4672]: I0930 12:22:58.416928 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 12:22:58 crc kubenswrapper[4672]: E0930 12:22:58.417329 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 12:22:58 crc kubenswrapper[4672]: I0930 12:22:58.417509 4672 scope.go:117] "RemoveContainer" containerID="b26be7f6864a66af6c7b4f7a000553fcfc870fb86c334481687a8a8bdfff54e8" Sep 30 12:22:58 crc kubenswrapper[4672]: E0930 12:22:58.417748 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-nznsk_openshift-ovn-kubernetes(5da59bc9-84da-42f6-86e9-3399ecf31725)\"" pod="openshift-ovn-kubernetes/ovnkube-node-nznsk" podUID="5da59bc9-84da-42f6-86e9-3399ecf31725" Sep 30 12:22:58 crc kubenswrapper[4672]: I0930 12:22:58.421888 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:58 crc kubenswrapper[4672]: I0930 12:22:58.421922 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:58 crc kubenswrapper[4672]: I0930 12:22:58.421934 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:58 crc kubenswrapper[4672]: I0930 12:22:58.421952 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:58 crc kubenswrapper[4672]: I0930 12:22:58.421966 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:58Z","lastTransitionTime":"2025-09-30T12:22:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:58 crc kubenswrapper[4672]: I0930 12:22:58.525100 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:58 crc kubenswrapper[4672]: I0930 12:22:58.525139 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:58 crc kubenswrapper[4672]: I0930 12:22:58.525148 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:58 crc kubenswrapper[4672]: I0930 12:22:58.525164 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:58 crc kubenswrapper[4672]: I0930 12:22:58.525175 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:58Z","lastTransitionTime":"2025-09-30T12:22:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:22:58 crc kubenswrapper[4672]: I0930 12:22:58.628328 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:58 crc kubenswrapper[4672]: I0930 12:22:58.628374 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:58 crc kubenswrapper[4672]: I0930 12:22:58.628391 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:58 crc kubenswrapper[4672]: I0930 12:22:58.628412 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:58 crc kubenswrapper[4672]: I0930 12:22:58.628424 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:58Z","lastTransitionTime":"2025-09-30T12:22:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:58 crc kubenswrapper[4672]: I0930 12:22:58.732096 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:58 crc kubenswrapper[4672]: I0930 12:22:58.732142 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:58 crc kubenswrapper[4672]: I0930 12:22:58.732154 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:58 crc kubenswrapper[4672]: I0930 12:22:58.732176 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:58 crc kubenswrapper[4672]: I0930 12:22:58.732189 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:58Z","lastTransitionTime":"2025-09-30T12:22:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:58 crc kubenswrapper[4672]: I0930 12:22:58.835074 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:58 crc kubenswrapper[4672]: I0930 12:22:58.835311 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:58 crc kubenswrapper[4672]: I0930 12:22:58.835320 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:58 crc kubenswrapper[4672]: I0930 12:22:58.835333 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:58 crc kubenswrapper[4672]: I0930 12:22:58.835342 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:58Z","lastTransitionTime":"2025-09-30T12:22:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:22:58 crc kubenswrapper[4672]: I0930 12:22:58.937549 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:58 crc kubenswrapper[4672]: I0930 12:22:58.937583 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:58 crc kubenswrapper[4672]: I0930 12:22:58.937594 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:58 crc kubenswrapper[4672]: I0930 12:22:58.937609 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:58 crc kubenswrapper[4672]: I0930 12:22:58.937620 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:58Z","lastTransitionTime":"2025-09-30T12:22:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:59 crc kubenswrapper[4672]: I0930 12:22:59.040501 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:59 crc kubenswrapper[4672]: I0930 12:22:59.040547 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:59 crc kubenswrapper[4672]: I0930 12:22:59.040556 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:59 crc kubenswrapper[4672]: I0930 12:22:59.040573 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:59 crc kubenswrapper[4672]: I0930 12:22:59.040584 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:59Z","lastTransitionTime":"2025-09-30T12:22:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:59 crc kubenswrapper[4672]: I0930 12:22:59.143467 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:59 crc kubenswrapper[4672]: I0930 12:22:59.143514 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:59 crc kubenswrapper[4672]: I0930 12:22:59.143523 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:59 crc kubenswrapper[4672]: I0930 12:22:59.143541 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:59 crc kubenswrapper[4672]: I0930 12:22:59.143552 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:59Z","lastTransitionTime":"2025-09-30T12:22:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:22:59 crc kubenswrapper[4672]: I0930 12:22:59.245625 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:59 crc kubenswrapper[4672]: I0930 12:22:59.245657 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:59 crc kubenswrapper[4672]: I0930 12:22:59.245665 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:59 crc kubenswrapper[4672]: I0930 12:22:59.245680 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:59 crc kubenswrapper[4672]: I0930 12:22:59.245691 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:59Z","lastTransitionTime":"2025-09-30T12:22:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:59 crc kubenswrapper[4672]: I0930 12:22:59.348285 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:59 crc kubenswrapper[4672]: I0930 12:22:59.348323 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:59 crc kubenswrapper[4672]: I0930 12:22:59.348334 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:59 crc kubenswrapper[4672]: I0930 12:22:59.348352 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:59 crc kubenswrapper[4672]: I0930 12:22:59.348364 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:59Z","lastTransitionTime":"2025-09-30T12:22:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:59 crc kubenswrapper[4672]: I0930 12:22:59.416037 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n7wwp" Sep 30 12:22:59 crc kubenswrapper[4672]: I0930 12:22:59.416068 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 12:22:59 crc kubenswrapper[4672]: E0930 12:22:59.416563 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 12:22:59 crc kubenswrapper[4672]: I0930 12:22:59.416160 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 12:22:59 crc kubenswrapper[4672]: E0930 12:22:59.416662 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 12:22:59 crc kubenswrapper[4672]: E0930 12:22:59.416443 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n7wwp" podUID="42618cd5-d9f9-45ba-8081-660ca47bebf4" Sep 30 12:22:59 crc kubenswrapper[4672]: I0930 12:22:59.431330 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2761ae4b8b1c7e1d00ee269219ff0ab04aedce9e1bddc6e0a029c496ee89c0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:59Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:59 crc kubenswrapper[4672]: I0930 12:22:59.446812 4672 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:59Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:59 crc kubenswrapper[4672]: I0930 12:22:59.450626 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:59 crc kubenswrapper[4672]: I0930 12:22:59.450785 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:59 crc kubenswrapper[4672]: I0930 12:22:59.450876 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:59 crc kubenswrapper[4672]: I0930 12:22:59.450970 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:59 crc kubenswrapper[4672]: I0930 12:22:59.451058 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:59Z","lastTransitionTime":"2025-09-30T12:22:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:59 crc kubenswrapper[4672]: I0930 12:22:59.461514 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://534671434fa48a07e7836049331af60307cf651f4ddde83856ad98e006b32040\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfb0bceb3e99196adb442abedd5215938781b908c2aa92178c9db76672664620\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:59Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:59 crc kubenswrapper[4672]: I0930 12:22:59.479448 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nznsk" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5da59bc9-84da-42f6-86e9-3399ecf31725\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8e8be814fc5bfb8db3632cda1fcf749a02b642c8e7ff479e59ce5d9f987a756\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2131429928dbbd60e2d17b7ada4689ff5d161d1df921d74a8b9f71df9e11899\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://114d67fea982a6c6475e1925f670ba7dae14ace593022f77ef7c6441dde49687\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"image
ID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b73fba4d62613a1c27b69a74ebcca91fa6ea43a5267efddf783bd0d243d711f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e10db0132d159755257aef8c0cee40e5de4a0d3727ce59883e1ff90650e4f35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb7a32827707f8654d36f56fef7e81aff6b80550ef51f7c090a37f53a0e53406\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":t
rue,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b26be7f6864a66af6c7b4f7a000553fcfc870fb86c334481687a8a8bdfff54e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b26be7f6864a66af6c7b4f7a000553fcfc870fb86c334481687a8a8bdfff54e8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T12:22:43Z\\\",\\\"message\\\":\\\"s server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0930 12:22:43.375122 6393 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-controller-manager-operator/metrics\\\\\\\"}\\\\nI0930 12:22:43.375136 6393 services_controller.go:360] Finished syncing service metrics on namespace openshift-controller-manager-operator for network=default : 1.633551ms\\\\nI0930 12:22:43.375146 6393 services_controller.go:356] Processing sync for service openshift-ingress-operator/metrics for network=default\\\\nI0930 12:22:43.374866 6393 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-config-operator/metrics]} name:Service_openshift-config-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.161:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f32857b5-f652-4313-a0d7-455c3156dd99}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0930 12:22:43.375233 6393 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:42Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-nznsk_openshift-ovn-kubernetes(5da59bc9-84da-42f6-86e9-3399ecf31725)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7058516ce3ff1d1d2929107a66d739fccb24b4f5589a11510ab42fe728a33196\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4aa532f2ebe19e36977596bd41e8b116c84bcd7d5cd3470986e3192d4d62dea7\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4aa532f2ebe19e36977596bd41e8b116c84bcd7d5cd3470986e3192d4d62dea7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nznsk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:59Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:59 crc kubenswrapper[4672]: I0930 12:22:59.495427 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"763dda13-8721-4cce-8b21-aa1f1b190978\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab6412d7fdf1976c75ccfc72d05f00e2997d72f3971c73cd0887e038d1cbc059\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5575589d0517d5a1c285c3e9e566fb5cd9b9591a1b311a4bb80e595e44044c33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b656dc1ad8af099828eaaeb654e6d52dbbd057023dd07583a6114316670aa334\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5082c3fa78648584ab9a29b4ab239ce0d52411666ca3fa598e9c6745b8ea067e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:21:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:59Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:59 crc kubenswrapper[4672]: I0930 12:22:59.508636 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:59Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:59 crc kubenswrapper[4672]: I0930 12:22:59.521464 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"95794952-d817-48f2-8956-f7a310f8d1d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a5a1de465a4e5deca994f640e87574ebd406be2d7a92c0e54ed8bece4a8e6d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnk64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba615a53d8f50a3891080fd06384bb6e00eca07025bb60fdf1764d9c1d4a5f76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnk64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dpqrd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:59Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:59 crc kubenswrapper[4672]: I0930 12:22:59.534516 4672 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-8q82q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6806ff3c-ab3a-402e-b1c5-cc37c0810a65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://687d353cc41530a0cbee2c4a782b6ae10e9ee58aae0bfe183537c96851611e2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pn7q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:11Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-8q82q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:59Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:59 crc kubenswrapper[4672]: I0930 12:22:59.546525 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-89fj9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72003fad-a0fd-4493-9f3b-6efdab22d14c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd26dcf24863265c55a06cfc788e84363473a6ca5f1b835fa5dbb683ebb82d68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31087335de8b3b895d49c4677350e9ab6cfa0651188653dab05e7eafdab3d05f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31087335de8b3b895d49c4677350e9ab6cfa0651188653dab05e7eafdab3d05f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recu
rsiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a607f773b81f6bdde800c2a36627d7f84602e6c3c777f6531391668dd7c08e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2a607f773b81f6bdde800c2a36627d7f84602e6c3c777f6531391668dd7c08e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:22:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70fc6887f3dde42017d03aa3f247ca4eda56f93c3ec54a95ceabcc1c1b4651d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70fc6887f3dde42017d03aa3f247ca4eda56f93c3ec54a95ceabcc1c1b4651d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:22:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://95e6805719bbdb9859b827860fb87ed979a4397061f76b433189f463c1e7cc39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://95e6
805719bbdb9859b827860fb87ed979a4397061f76b433189f463c1e7cc39\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:22:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85068f88c57bccc42cc7234d13298a30c574379f4a212059a9924553fce8620e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85068f88c57bccc42cc7234d13298a30c574379f4a212059a9924553fce8620e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:22:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4355ef186193e8e0d9344faf8975065f9cd4576e7f6c79054670fda9acf3646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4355ef186193e8e0d9344faf8975065f9cd4576e7f6c79054670fda9acf3646\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:22:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-89fj9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:59Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:59 crc kubenswrapper[4672]: I0930 12:22:59.553234 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:59 crc kubenswrapper[4672]: I0930 12:22:59.553279 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:59 crc kubenswrapper[4672]: I0930 12:22:59.553291 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:59 crc kubenswrapper[4672]: I0930 12:22:59.553309 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:59 crc kubenswrapper[4672]: I0930 12:22:59.553323 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:59Z","lastTransitionTime":"2025-09-30T12:22:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:59 crc kubenswrapper[4672]: I0930 12:22:59.556364 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ztjlj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"25dc0f4c-76a8-49ff-bf68-c0718cfc3e62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5a3ba5101b35ce4981484637222a0c074b67e3998c273dc763ddbc6ca0c0c86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6ps67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168
.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ztjlj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:59Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:59 crc kubenswrapper[4672]: I0930 12:22:59.566892 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-485qk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db981c53-85b2-4b3c-b025-da893771308f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57231120d83ed4e95272ab2b0bac2146b066dde2b5ca3afe469a3dc36d6187ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2t8xk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://503616f93ea6b36f5e3c95c88471c6d06c281a4d0be102b6cb0edf6a1513f78c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\
\\"name\\\":\\\"kube-api-access-2t8xk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-485qk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:59Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:59 crc kubenswrapper[4672]: I0930 12:22:59.576789 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-n7wwp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"42618cd5-d9f9-45ba-8081-660ca47bebf4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78cfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78cfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:25Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-n7wwp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:59Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:59 crc kubenswrapper[4672]: I0930 12:22:59.588141 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e650bb6-f3f0-48f4-b169-1e56a907e88c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://824f50fb36d4c4aee43cc92af10bc43c441f2cee0ee736431dfdc52d254b819a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2786191730ceba9bbcf35f5355098bc52b09c37d20b923f7c6c65fd1b584cea0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e5b3d468173222d5ae851529dbc318b7d727ec4af468e4094ab08b4f29bf100\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://217947162b121381c06a210145feb0b9bd45d4ad595fcd83b1a566bb6448f0ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8d2b013e19f7ff8f3eb9c42338191b2da273ce85db8bcc6be8dd1f69e83c3be\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T12:22:10Z\\\",\\\"message\\\":\\\"falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0930 12:22:05.219065 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 12:22:05.219861 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-72394925/tls.crt::/tmp/serving-cert-72394925/tls.key\\\\\\\"\\\\nI0930 12:22:10.657058 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 12:22:10.659952 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 12:22:10.659972 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 12:22:10.659997 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 12:22:10.660003 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 12:22:10.667742 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0930 12:22:10.667775 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 12:22:10.667797 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 12:22:10.667801 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 12:22:10.667808 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 12:22:10.667812 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 12:22:10.667814 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 12:22:10.667817 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0930 12:22:10.670654 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T12:21:54Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c3f5020d6f8fba5f79d7659bedba02479794acb1f5f3d219802966c781fd10d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:52Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://089787496c46eb514e279677b1853010b21eac240fb059e43866714d2a6920e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://089787496c46eb514e279677b1853010b21eac240fb059e43866714d2a6920e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:21:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:21:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:21:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:59Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:59 crc kubenswrapper[4672]: I0930 12:22:59.606222 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4339dd7c-5112-40c5-9b58-050ce364a2b9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7db3193c2afadb91d3220e67720ff33bc959f9c50cec6f243144a6e2a2e57e04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5f54820ce67d6e8f397d068e1fadcb4577d6685406812c4e29fe0274612697d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1351a4e8380fea9edb834239053b57384a50add562afe59e67bbd7159bf1bf6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b9741c5edaae364733b400bebf21a4674ec206
194908ec6a942317235c62a14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://775be30891ac2947ef160e3fe75abad42b78fc39e35414f27756ce086fed4f7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27513f0766801d5155c6e5c376a2a52241030eca2171b3218710ca6ee006a0b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27513f0766801d5155c6e5c376a2a52241030eca2171b3218710ca6ee006a0b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:21:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:21:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0afdc82c21601c37c7d9b4897273ade1e8e35bd093d63df9a63e93a9c0c6228c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0afdc82c21601c37c7d9b4897273ade1e8e35bd093d63df9a63e93a9c0c6228c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:21:51Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://52fa5ee34bc8e9e381d4288d30fce5db76ed5bde6f199ec13fe443481ed4c8e8\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52fa5ee34bc8e9e381d4288d30fce5db76ed5bde6f199ec13fe443481ed4c8e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:21:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:21:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:21:49Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:59Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:59 crc kubenswrapper[4672]: I0930 12:22:59.619229 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:59Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:59 crc kubenswrapper[4672]: I0930 12:22:59.630809 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bh5lq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96949d92-2365-41eb-8f62-c264c8328c02\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6caf59dcbea4e0beb0879de377fe92cfb0b3ff55668aee3f10822adacf97fd8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkb7v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bh5lq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-09-30T12:22:59Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:59 crc kubenswrapper[4672]: I0930 12:22:59.642840 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8095997-9982-47c1-850a-260c9e369680\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f437372ebc379d35a2232ab47422ed6127cd29b0d752488dca512f67407d222\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c905db0424e34f487d2db657676de97a3e323ef4c9fbed5d25929164476f62bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb1c08c9fe886c57b5abe5091fa9845022bb33eefd97da1bf595d4b32016dfc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\
\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a9e2b46f485ed7a73177e89960925d6b0f9945bf0b7b5604a5b5e84d40f74e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a9e2b46f485ed7a73177e89960925d6b0f9945bf0b7b5604a5b5e84d40f74e3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:21:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:21:50Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:21:49Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:59Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:59 crc kubenswrapper[4672]: I0930 12:22:59.653301 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ece716e9efed328b509480f11b1a6aa253fe23776bf45f49d4110cf3b4acbab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:22:59Z is after 2025-08-24T17:21:41Z" Sep 30 12:22:59 crc kubenswrapper[4672]: I0930 12:22:59.655902 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:59 crc kubenswrapper[4672]: I0930 12:22:59.655953 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:59 crc kubenswrapper[4672]: I0930 12:22:59.655965 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:59 crc kubenswrapper[4672]: I0930 12:22:59.655982 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:59 crc kubenswrapper[4672]: I0930 12:22:59.655994 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:59Z","lastTransitionTime":"2025-09-30T12:22:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:59 crc kubenswrapper[4672]: I0930 12:22:59.758062 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:59 crc kubenswrapper[4672]: I0930 12:22:59.758108 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:59 crc kubenswrapper[4672]: I0930 12:22:59.758121 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:59 crc kubenswrapper[4672]: I0930 12:22:59.758142 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:59 crc kubenswrapper[4672]: I0930 12:22:59.758156 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:59Z","lastTransitionTime":"2025-09-30T12:22:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:22:59 crc kubenswrapper[4672]: I0930 12:22:59.860886 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:59 crc kubenswrapper[4672]: I0930 12:22:59.860959 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:59 crc kubenswrapper[4672]: I0930 12:22:59.860972 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:59 crc kubenswrapper[4672]: I0930 12:22:59.860988 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:59 crc kubenswrapper[4672]: I0930 12:22:59.860997 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:59Z","lastTransitionTime":"2025-09-30T12:22:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:22:59 crc kubenswrapper[4672]: I0930 12:22:59.964185 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:22:59 crc kubenswrapper[4672]: I0930 12:22:59.964226 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:22:59 crc kubenswrapper[4672]: I0930 12:22:59.964236 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:22:59 crc kubenswrapper[4672]: I0930 12:22:59.964278 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:22:59 crc kubenswrapper[4672]: I0930 12:22:59.964291 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:22:59Z","lastTransitionTime":"2025-09-30T12:22:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:23:00 crc kubenswrapper[4672]: I0930 12:23:00.066408 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:00 crc kubenswrapper[4672]: I0930 12:23:00.066479 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:00 crc kubenswrapper[4672]: I0930 12:23:00.066491 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:00 crc kubenswrapper[4672]: I0930 12:23:00.066510 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:00 crc kubenswrapper[4672]: I0930 12:23:00.066525 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:00Z","lastTransitionTime":"2025-09-30T12:23:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:23:00 crc kubenswrapper[4672]: I0930 12:23:00.168623 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:00 crc kubenswrapper[4672]: I0930 12:23:00.168673 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:00 crc kubenswrapper[4672]: I0930 12:23:00.168685 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:00 crc kubenswrapper[4672]: I0930 12:23:00.168704 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:00 crc kubenswrapper[4672]: I0930 12:23:00.168720 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:00Z","lastTransitionTime":"2025-09-30T12:23:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:23:00 crc kubenswrapper[4672]: I0930 12:23:00.271769 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:00 crc kubenswrapper[4672]: I0930 12:23:00.271830 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:00 crc kubenswrapper[4672]: I0930 12:23:00.271842 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:00 crc kubenswrapper[4672]: I0930 12:23:00.271858 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:00 crc kubenswrapper[4672]: I0930 12:23:00.271868 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:00Z","lastTransitionTime":"2025-09-30T12:23:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:23:00 crc kubenswrapper[4672]: I0930 12:23:00.374217 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:00 crc kubenswrapper[4672]: I0930 12:23:00.374256 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:00 crc kubenswrapper[4672]: I0930 12:23:00.374279 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:00 crc kubenswrapper[4672]: I0930 12:23:00.374294 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:00 crc kubenswrapper[4672]: I0930 12:23:00.374304 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:00Z","lastTransitionTime":"2025-09-30T12:23:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:23:00 crc kubenswrapper[4672]: I0930 12:23:00.416155 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 12:23:00 crc kubenswrapper[4672]: E0930 12:23:00.416300 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 12:23:00 crc kubenswrapper[4672]: I0930 12:23:00.477185 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:00 crc kubenswrapper[4672]: I0930 12:23:00.477235 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:00 crc kubenswrapper[4672]: I0930 12:23:00.477247 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:00 crc kubenswrapper[4672]: I0930 12:23:00.477296 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:00 crc kubenswrapper[4672]: I0930 12:23:00.477310 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:00Z","lastTransitionTime":"2025-09-30T12:23:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:23:00 crc kubenswrapper[4672]: I0930 12:23:00.580406 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:00 crc kubenswrapper[4672]: I0930 12:23:00.580479 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:00 crc kubenswrapper[4672]: I0930 12:23:00.580504 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:00 crc kubenswrapper[4672]: I0930 12:23:00.580534 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:00 crc kubenswrapper[4672]: I0930 12:23:00.580557 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:00Z","lastTransitionTime":"2025-09-30T12:23:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:23:00 crc kubenswrapper[4672]: I0930 12:23:00.682703 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:00 crc kubenswrapper[4672]: I0930 12:23:00.683401 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:00 crc kubenswrapper[4672]: I0930 12:23:00.683425 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:00 crc kubenswrapper[4672]: I0930 12:23:00.683445 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:00 crc kubenswrapper[4672]: I0930 12:23:00.683454 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:00Z","lastTransitionTime":"2025-09-30T12:23:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:23:00 crc kubenswrapper[4672]: I0930 12:23:00.787406 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:00 crc kubenswrapper[4672]: I0930 12:23:00.787441 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:00 crc kubenswrapper[4672]: I0930 12:23:00.787449 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:00 crc kubenswrapper[4672]: I0930 12:23:00.787462 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:00 crc kubenswrapper[4672]: I0930 12:23:00.787471 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:00Z","lastTransitionTime":"2025-09-30T12:23:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:23:00 crc kubenswrapper[4672]: I0930 12:23:00.889697 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:00 crc kubenswrapper[4672]: I0930 12:23:00.889748 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:00 crc kubenswrapper[4672]: I0930 12:23:00.889757 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:00 crc kubenswrapper[4672]: I0930 12:23:00.889772 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:00 crc kubenswrapper[4672]: I0930 12:23:00.889781 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:00Z","lastTransitionTime":"2025-09-30T12:23:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:23:00 crc kubenswrapper[4672]: I0930 12:23:00.901571 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-8q82q_6806ff3c-ab3a-402e-b1c5-cc37c0810a65/kube-multus/0.log" Sep 30 12:23:00 crc kubenswrapper[4672]: I0930 12:23:00.901619 4672 generic.go:334] "Generic (PLEG): container finished" podID="6806ff3c-ab3a-402e-b1c5-cc37c0810a65" containerID="687d353cc41530a0cbee2c4a782b6ae10e9ee58aae0bfe183537c96851611e2e" exitCode=1 Sep 30 12:23:00 crc kubenswrapper[4672]: I0930 12:23:00.901646 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-8q82q" event={"ID":"6806ff3c-ab3a-402e-b1c5-cc37c0810a65","Type":"ContainerDied","Data":"687d353cc41530a0cbee2c4a782b6ae10e9ee58aae0bfe183537c96851611e2e"} Sep 30 12:23:00 crc kubenswrapper[4672]: I0930 12:23:00.902042 4672 scope.go:117] "RemoveContainer" containerID="687d353cc41530a0cbee2c4a782b6ae10e9ee58aae0bfe183537c96851611e2e" Sep 30 12:23:00 crc kubenswrapper[4672]: I0930 12:23:00.915978 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ece716e9efed328b509480f11b1a6aa253fe23776bf45f49d4110cf3b4acbab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:23:00Z is after 2025-08-24T17:21:41Z" Sep 30 12:23:00 crc kubenswrapper[4672]: I0930 12:23:00.930298 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://534671434fa48a07e7836049331af60307cf651f4ddde83856ad98e006b32040\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfb0bceb3e99196adb442abedd5215938781b908c2aa92178c9db76672664620\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:23:00Z is after 2025-08-24T17:21:41Z" Sep 30 12:23:00 crc kubenswrapper[4672]: I0930 12:23:00.951318 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nznsk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5da59bc9-84da-42f6-86e9-3399ecf31725\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8e8be814fc5bfb8db3632cda1fcf749a02b642c8e7ff479e59ce5d9f987a756\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2131429928dbbd60e2d17b7ada4689ff5d161d1df921d74a8b9f71df9e11899\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://114d67fea982a6c6475e1925f670ba7dae14ace593022f77ef7c6441dde49687\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b73fba4d62613a1c27b69a74ebcca91fa6ea43a5267efddf783bd0d243d711f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e10db0132d159755257aef8c0cee40e5de4a0d3727ce59883e1ff90650e4f35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb7a32827707f8654d36f56fef7e81aff6b80550ef51f7c090a37f53a0e53406\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b26be7f6864a66af6c7b4f7a000553fcfc870fb86c334481687a8a8bdfff54e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b26be7f6864a66af6c7b4f7a000553fcfc870fb86c334481687a8a8bdfff54e8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T12:22:43Z\\\",\\\"message\\\":\\\"s server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0930 12:22:43.375122 6393 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-controller-manager-operator/metrics\\\\\\\"}\\\\nI0930 12:22:43.375136 6393 services_controller.go:360] Finished syncing service metrics on namespace openshift-controller-manager-operator for network=default : 1.633551ms\\\\nI0930 12:22:43.375146 6393 services_controller.go:356] Processing sync for service openshift-ingress-operator/metrics for network=default\\\\nI0930 12:22:43.374866 6393 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-config-operator/metrics]} name:Service_openshift-config-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.161:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f32857b5-f652-4313-a0d7-455c3156dd99}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0930 12:22:43.375233 6393 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:42Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-nznsk_openshift-ovn-kubernetes(5da59bc9-84da-42f6-86e9-3399ecf31725)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7058516ce3ff1d1d2929107a66d739fccb24b4f5589a11510ab42fe728a33196\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4aa532f2ebe19e36977596bd41e8b116c84bcd7d5cd3470986e3192d4d62dea7\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4aa532f2ebe19e36977596bd41e8b116c84bcd7d5cd3470986e3192d4d62dea7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nznsk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:23:00Z is after 2025-08-24T17:21:41Z" Sep 30 12:23:00 crc kubenswrapper[4672]: I0930 12:23:00.968084 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"763dda13-8721-4cce-8b21-aa1f1b190978\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab6412d7fdf1976c75ccfc72d05f00e2997d72f3971c73cd0887e038d1cbc059\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5575589d0517d5a1c285c3e9e566fb5cd9b9591a1b311a4bb80e595e44044c33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b656dc1ad8af099828eaaeb654e6d52dbbd057023dd07583a6114316670aa334\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5082c3fa78648584ab9a29b4ab239ce0d52411666ca3fa598e9c6745b8ea067e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:21:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:23:00Z is after 2025-08-24T17:21:41Z" Sep 30 12:23:00 crc kubenswrapper[4672]: I0930 12:23:00.981550 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2761ae4b8b1c7e1d00ee269219ff0ab04aedce9e1bddc6e0a029c496ee89c0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:23:00Z is after 2025-08-24T17:21:41Z" Sep 30 12:23:00 crc kubenswrapper[4672]: I0930 12:23:00.993020 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:00 crc kubenswrapper[4672]: I0930 12:23:00.993052 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:00 crc kubenswrapper[4672]: I0930 12:23:00.993060 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:00 crc kubenswrapper[4672]: I0930 12:23:00.993075 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:00 crc kubenswrapper[4672]: I0930 12:23:00.993085 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:00Z","lastTransitionTime":"2025-09-30T12:23:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:23:00 crc kubenswrapper[4672]: I0930 12:23:00.994794 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:23:00Z is after 2025-08-24T17:21:41Z" Sep 30 12:23:01 crc kubenswrapper[4672]: I0930 12:23:01.008221 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8q82q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6806ff3c-ab3a-402e-b1c5-cc37c0810a65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:23:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:23:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://687d353cc41530a0cbee2c4a782b6ae10e9ee58aae0bfe183537c96851611e2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://687d353cc41530a0cbee2c4a782b6ae10e9ee58aae0bfe183537c96851611e2e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T12:23:00Z\\\",\\\"message\\\":\\\"2025-09-30T12:22:14+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_3e4d0ad2-21ec-42dd-8da8-9cf62dde0193\\\\n2025-09-30T12:22:14+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_3e4d0ad2-21ec-42dd-8da8-9cf62dde0193 to /host/opt/cni/bin/\\\\n2025-09-30T12:22:15Z [verbose] multus-daemon started\\\\n2025-09-30T12:22:15Z [verbose] Readiness Indicator file check\\\\n2025-09-30T12:23:00Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the 
condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pn7q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8q82q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:23:01Z is after 2025-08-24T17:21:41Z" Sep 30 12:23:01 crc kubenswrapper[4672]: I0930 12:23:01.022469 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-89fj9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"72003fad-a0fd-4493-9f3b-6efdab22d14c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd26dcf24863265c55a06cfc788e84363473a6ca5f1b835fa5dbb683ebb82d68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31087335de8b3b895d49c4677350e9ab6cfa0651188653dab05e7eafdab3d05f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31087335de8b3b895d49c4677350e9ab6cfa0651188653dab05e7eafdab3d05f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a607f773b81f6bdde800c2a36627d7f84602e6c3c777f6531391668dd7c08e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2a607f773b81f6bdde800c2a36627d7f84602e6c3c777f6531391668dd7c08e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:22:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70fc6887f3dde42017d03aa3f247ca4eda56f93c3ec54a95ceabcc1c1b4651d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70fc6887f3dde42017d03aa3f247ca4eda56f93c3ec54a95ceabcc1c1b4651d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:22:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://95e6805719bbdb9859b827860fb87ed979a4397061f76b433189f463c1e7cc39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://95e6805719bbdb9859b827860fb87ed979a4397061f76b433189f463c1e7cc39\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:22:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85068f88c57bccc42cc7234d13298a30c574379f4a212059a9924553fce8620e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85068f88c57bccc42cc7234d13298a30c574379f4a212059a9924553fce8620e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:22:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4355ef186193e8e0d9344faf8975065f9cd4576e7f6c79054670fda9acf3646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4355ef186193e8e0d9344faf8975065f9cd4576e7f6c79054670fda9acf3646\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:22:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-89fj9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:23:01Z is after 2025-08-24T17:21:41Z" Sep 30 12:23:01 crc kubenswrapper[4672]: I0930 12:23:01.034411 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ztjlj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"25dc0f4c-76a8-49ff-bf68-c0718cfc3e62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5a3ba5101b35ce4981484637222a0c074b67e3998c273dc763ddbc6ca0c0c86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6ps67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ztjlj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:23:01Z is after 2025-08-24T17:21:41Z" Sep 30 12:23:01 crc kubenswrapper[4672]: I0930 12:23:01.045598 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-485qk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"db981c53-85b2-4b3c-b025-da893771308f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57231120d83ed4e95272ab2b0bac2146b066dde2b5ca3afe469a3dc36d6187ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2t8xk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://503616f93ea6b36f5e3c95c88471c6d06c281a4d0be102b6cb0edf6a1513f78c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2t8xk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-485qk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:23:01Z is after 2025-08-24T17:21:41Z" Sep 30 
12:23:01 crc kubenswrapper[4672]: I0930 12:23:01.057060 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-n7wwp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"42618cd5-d9f9-45ba-8081-660ca47bebf4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78cfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78cfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:25Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-n7wwp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:23:01Z is after 2025-08-24T17:21:41Z" Sep 30 12:23:01 crc kubenswrapper[4672]: I0930 12:23:01.070433 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e650bb6-f3f0-48f4-b169-1e56a907e88c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://824f50fb36d4c4aee43cc92af10bc43c441f2cee0ee736431dfdc52d254b819a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2786191730ceba9bbcf35f5355098bc52b09c37d20b923f7c6c65fd1b584cea0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e5b3d468173222d5ae851529dbc318b7d727ec4af468e4094ab08b4f29bf100\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://217947162b121381c06a210145feb0b9bd45d4ad595fcd83b1a566bb6448f0ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8d2b013e19f7ff8f3eb9c42338191b2da273ce85db8bcc6be8dd1f69e83c3be\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T12:22:10Z\\\",\\\"message\\\":\\\"falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0930 12:22:05.219065 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 12:22:05.219861 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-72394925/tls.crt::/tmp/serving-cert-72394925/tls.key\\\\\\\"\\\\nI0930 12:22:10.657058 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 12:22:10.659952 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 12:22:10.659972 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 12:22:10.659997 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 12:22:10.660003 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 12:22:10.667742 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0930 12:22:10.667775 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 12:22:10.667797 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 12:22:10.667801 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 12:22:10.667808 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 12:22:10.667812 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 12:22:10.667814 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 12:22:10.667817 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0930 12:22:10.670654 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T12:21:54Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c3f5020d6f8fba5f79d7659bedba02479794acb1f5f3d219802966c781fd10d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:52Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://089787496c46eb514e279677b1853010b21eac240fb059e43866714d2a6920e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://089787496c46eb514e279677b1853010b21eac240fb059e43866714d2a6920e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:21:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:21:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:21:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:23:01Z is after 2025-08-24T17:21:41Z" Sep 30 12:23:01 crc kubenswrapper[4672]: I0930 12:23:01.083736 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:23:01Z is after 2025-08-24T17:21:41Z" Sep 30 12:23:01 crc kubenswrapper[4672]: I0930 12:23:01.095981 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:01 crc kubenswrapper[4672]: I0930 12:23:01.096075 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:01 crc kubenswrapper[4672]: I0930 12:23:01.096091 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:01 crc kubenswrapper[4672]: I0930 12:23:01.096108 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:01 crc kubenswrapper[4672]: I0930 12:23:01.096121 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:01Z","lastTransitionTime":"2025-09-30T12:23:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:23:01 crc kubenswrapper[4672]: I0930 12:23:01.098318 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"95794952-d817-48f2-8956-f7a310f8d1d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a5a1de465a4e5deca994f640e87574ebd406be2d7a92c0e54ed8bece4a8e6d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnk64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba615a53d8f50a3891080fd06384bb6e00eca07025bb60fdf1764d9c1d4a5f76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnk64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dpqrd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:23:01Z is after 2025-08-24T17:21:41Z" Sep 30 12:23:01 crc kubenswrapper[4672]: I0930 12:23:01.114099 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:23:01Z is after 2025-08-24T17:21:41Z" Sep 30 12:23:01 crc kubenswrapper[4672]: I0930 12:23:01.124540 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bh5lq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"96949d92-2365-41eb-8f62-c264c8328c02\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6caf59dcbea4e0beb0879de377fe92cfb0b3ff55668aee3f10822adacf97fd8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkb7v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bh5lq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:23:01Z is after 2025-08-24T17:21:41Z" Sep 30 12:23:01 crc kubenswrapper[4672]: I0930 12:23:01.136622 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8095997-9982-47c1-850a-260c9e369680\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f437372ebc379d35a2232ab47422ed6127cd29b0d752488dca512f67407d222\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c905db0424e34f487d2db657676de97a3e323ef4c9fbed5d25929164476f62bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb1c08c9fe886c57b5abe5091fa9845022bb33eefd97da1bf595d4b32016dfc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a9e2b46f485ed7a73177e89960925d6b0f9945bf0b7b5604a5b5e84d40f74e3\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a9e2b46f485ed7a73177e89960925d6b0f9945bf0b7b5604a5b5e84d40f74e3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:21:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:21:50Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:21:49Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:23:01Z is after 2025-08-24T17:21:41Z" Sep 30 12:23:01 crc kubenswrapper[4672]: I0930 12:23:01.156941 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4339dd7c-5112-40c5-9b58-050ce364a2b9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7db3193c2afadb91d3220e67720ff33bc959f9c50cec6f243144a6e2a2e57e04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5f54820ce67d6e8f397d068e1fadcb4577d6685406812c4e29fe0274612697d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1351a4e8380fea9edb834239053b57384a50add562afe59e67bbd7159bf1bf6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b9741c5edaae364733b400bebf21a4674ec206194908ec6a942317235c62a14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://775be30891ac2947ef160e3fe75abad42b78fc39e35414f27756ce086fed4f7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27513f0766801d5155c6e5c376a2a52241030eca2171b3218710ca6ee006a0b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be
8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27513f0766801d5155c6e5c376a2a52241030eca2171b3218710ca6ee006a0b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:21:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:21:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0afdc82c21601c37c7d9b4897273ade1e8e35bd093d63df9a63e93a9c0c6228c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0afdc82c21601c37c7d9b4897273ade1e8e35bd093d63df9a63e93a9c0c6228c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:21:51Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://52fa5ee34bc8e9e381d4288d30fce5db76ed5bde6f199ec13fe443481ed4c8e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52fa5ee34bc8e9e381d4288d30fce5db76ed5bde6f199ec13fe443481ed4c8e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:21:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:21:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:21:49Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:23:01Z is after 2025-08-24T17:21:41Z" Sep 30 12:23:01 crc kubenswrapper[4672]: I0930 12:23:01.199342 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:01 crc kubenswrapper[4672]: I0930 12:23:01.199424 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:01 crc kubenswrapper[4672]: I0930 12:23:01.199439 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:01 crc kubenswrapper[4672]: I0930 12:23:01.199464 4672 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:01 crc kubenswrapper[4672]: I0930 12:23:01.199485 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:01Z","lastTransitionTime":"2025-09-30T12:23:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:23:01 crc kubenswrapper[4672]: I0930 12:23:01.302604 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:01 crc kubenswrapper[4672]: I0930 12:23:01.302651 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:01 crc kubenswrapper[4672]: I0930 12:23:01.302663 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:01 crc kubenswrapper[4672]: I0930 12:23:01.302681 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:01 crc kubenswrapper[4672]: I0930 12:23:01.302695 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:01Z","lastTransitionTime":"2025-09-30T12:23:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:23:01 crc kubenswrapper[4672]: I0930 12:23:01.405043 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:01 crc kubenswrapper[4672]: I0930 12:23:01.405093 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:01 crc kubenswrapper[4672]: I0930 12:23:01.405107 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:01 crc kubenswrapper[4672]: I0930 12:23:01.405129 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:01 crc kubenswrapper[4672]: I0930 12:23:01.405145 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:01Z","lastTransitionTime":"2025-09-30T12:23:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:23:01 crc kubenswrapper[4672]: I0930 12:23:01.416953 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n7wwp" Sep 30 12:23:01 crc kubenswrapper[4672]: I0930 12:23:01.416970 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 12:23:01 crc kubenswrapper[4672]: I0930 12:23:01.417073 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 12:23:01 crc kubenswrapper[4672]: E0930 12:23:01.417112 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n7wwp" podUID="42618cd5-d9f9-45ba-8081-660ca47bebf4" Sep 30 12:23:01 crc kubenswrapper[4672]: E0930 12:23:01.417157 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 12:23:01 crc kubenswrapper[4672]: E0930 12:23:01.417253 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 12:23:01 crc kubenswrapper[4672]: I0930 12:23:01.508888 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:01 crc kubenswrapper[4672]: I0930 12:23:01.508951 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:01 crc kubenswrapper[4672]: I0930 12:23:01.508968 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:01 crc kubenswrapper[4672]: I0930 12:23:01.508997 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:01 crc kubenswrapper[4672]: I0930 12:23:01.509015 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:01Z","lastTransitionTime":"2025-09-30T12:23:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:23:01 crc kubenswrapper[4672]: I0930 12:23:01.612427 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:01 crc kubenswrapper[4672]: I0930 12:23:01.612486 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:01 crc kubenswrapper[4672]: I0930 12:23:01.612505 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:01 crc kubenswrapper[4672]: I0930 12:23:01.612530 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:01 crc kubenswrapper[4672]: I0930 12:23:01.612548 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:01Z","lastTransitionTime":"2025-09-30T12:23:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:23:01 crc kubenswrapper[4672]: I0930 12:23:01.715471 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:01 crc kubenswrapper[4672]: I0930 12:23:01.715511 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:01 crc kubenswrapper[4672]: I0930 12:23:01.715523 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:01 crc kubenswrapper[4672]: I0930 12:23:01.715543 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:01 crc kubenswrapper[4672]: I0930 12:23:01.715555 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:01Z","lastTransitionTime":"2025-09-30T12:23:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:23:01 crc kubenswrapper[4672]: I0930 12:23:01.818511 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:01 crc kubenswrapper[4672]: I0930 12:23:01.818561 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:01 crc kubenswrapper[4672]: I0930 12:23:01.818570 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:01 crc kubenswrapper[4672]: I0930 12:23:01.818587 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:01 crc kubenswrapper[4672]: I0930 12:23:01.818598 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:01Z","lastTransitionTime":"2025-09-30T12:23:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:23:01 crc kubenswrapper[4672]: I0930 12:23:01.907723 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-8q82q_6806ff3c-ab3a-402e-b1c5-cc37c0810a65/kube-multus/0.log" Sep 30 12:23:01 crc kubenswrapper[4672]: I0930 12:23:01.907778 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-8q82q" event={"ID":"6806ff3c-ab3a-402e-b1c5-cc37c0810a65","Type":"ContainerStarted","Data":"5f37b2a15da4c06842d9df2eabe13974fe3e8f8da3ffd7bc297b6f32f446dbc9"} Sep 30 12:23:01 crc kubenswrapper[4672]: I0930 12:23:01.921498 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:01 crc kubenswrapper[4672]: I0930 12:23:01.921551 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:01 crc kubenswrapper[4672]: I0930 12:23:01.921563 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:01 crc kubenswrapper[4672]: I0930 12:23:01.921584 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:01 crc kubenswrapper[4672]: I0930 12:23:01.921596 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:01Z","lastTransitionTime":"2025-09-30T12:23:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:23:01 crc kubenswrapper[4672]: I0930 12:23:01.923037 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ztjlj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"25dc0f4c-76a8-49ff-bf68-c0718cfc3e62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5a3ba5101b35ce4981484637222a0c074b67e3998c273dc763ddbc6ca0c0c86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6ps67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ztjlj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:23:01Z is after 2025-08-24T17:21:41Z" Sep 30 12:23:01 crc kubenswrapper[4672]: I0930 12:23:01.936031 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-485qk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"db981c53-85b2-4b3c-b025-da893771308f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57231120d83ed4e95272ab2b0bac2146b066dde2b5ca3afe469a3dc36d6187ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2t8xk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://503616f93ea6b36f5e3c95c88471c6d06c281a4d0be102b6cb0edf6a1513f78c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2t8xk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-485qk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:23:01Z is after 2025-08-24T17:21:41Z" Sep 30 
12:23:01 crc kubenswrapper[4672]: I0930 12:23:01.950650 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-n7wwp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"42618cd5-d9f9-45ba-8081-660ca47bebf4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78cfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78cfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:25Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-n7wwp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:23:01Z is after 2025-08-24T17:21:41Z" Sep 30 12:23:01 crc kubenswrapper[4672]: I0930 12:23:01.968624 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e650bb6-f3f0-48f4-b169-1e56a907e88c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://824f50fb36d4c4aee43cc92af10bc43c441f2cee0ee736431dfdc52d254b819a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2786191730ceba9bbcf35f5355098bc52b09c37d20b923f7c6c65fd1b584cea0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e5b3d468173222d5ae851529dbc318b7d727ec4af468e4094ab08b4f29bf100\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://217947162b121381c06a210145feb0b9bd45d4ad595fcd83b1a566bb6448f0ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8d2b013e19f7ff8f3eb9c42338191b2da273ce85db8bcc6be8dd1f69e83c3be\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T12:22:10Z\\\",\\\"message\\\":\\\"falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0930 12:22:05.219065 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 12:22:05.219861 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-72394925/tls.crt::/tmp/serving-cert-72394925/tls.key\\\\\\\"\\\\nI0930 12:22:10.657058 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 12:22:10.659952 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 12:22:10.659972 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 12:22:10.659997 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 12:22:10.660003 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 12:22:10.667742 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0930 12:22:10.667775 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 12:22:10.667797 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 12:22:10.667801 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 12:22:10.667808 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 12:22:10.667812 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 12:22:10.667814 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 12:22:10.667817 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0930 12:22:10.670654 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T12:21:54Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c3f5020d6f8fba5f79d7659bedba02479794acb1f5f3d219802966c781fd10d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:52Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://089787496c46eb514e279677b1853010b21eac240fb059e43866714d2a6920e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://089787496c46eb514e279677b1853010b21eac240fb059e43866714d2a6920e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:21:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:21:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:21:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:23:01Z is after 2025-08-24T17:21:41Z" Sep 30 12:23:01 crc kubenswrapper[4672]: I0930 12:23:01.991628 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:23:01Z is after 2025-08-24T17:21:41Z" Sep 30 12:23:02 crc kubenswrapper[4672]: I0930 12:23:02.012395 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"95794952-d817-48f2-8956-f7a310f8d1d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a5a1de465a4e5deca994f640e87574ebd406be2d7a92c0e54ed8bece4a8e6d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnk64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba615a53d8f50a3891080fd06384bb6e00eca07025bb60fdf1764d9c1d4a5f76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnk64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dpqrd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:23:02Z is after 2025-08-24T17:21:41Z" Sep 30 12:23:02 crc kubenswrapper[4672]: I0930 12:23:02.023334 4672 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:02 crc kubenswrapper[4672]: I0930 12:23:02.023360 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:02 crc kubenswrapper[4672]: I0930 12:23:02.023369 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:02 crc kubenswrapper[4672]: I0930 12:23:02.023565 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:02 crc kubenswrapper[4672]: I0930 12:23:02.023583 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:02Z","lastTransitionTime":"2025-09-30T12:23:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:23:02 crc kubenswrapper[4672]: I0930 12:23:02.027602 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8q82q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6806ff3c-ab3a-402e-b1c5-cc37c0810a65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:23:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:23:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f37b2a15da4c06842d9df2eabe13974fe3e8f8da3ffd7bc297b6f32f446dbc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://687d353cc41530a0cbee2c4a782b6ae10e9ee58aae0bfe183537c96851611e2e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T12:23:00Z\\\",\\\"message\\\":\\\"2025-09-30T12:22:14+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_3e4d0ad2-21ec-42dd-8da8-9cf62dde0193\\\\n2025-09-30T12:22:14+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_3e4d0ad2-21ec-42dd-8da8-9cf62dde0193 to /host/opt/cni/bin/\\\\n2025-09-30T12:22:15Z [verbose] multus-daemon started\\\\n2025-09-30T12:22:15Z [verbose] Readiness Indicator file check\\\\n2025-09-30T12:23:00Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:12Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:23:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pn7q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8q82q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:23:02Z is after 2025-08-24T17:21:41Z" Sep 30 12:23:02 crc kubenswrapper[4672]: I0930 12:23:02.043679 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-89fj9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"72003fad-a0fd-4493-9f3b-6efdab22d14c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd26dcf24863265c55a06cfc788e84363473a6ca5f1b835fa5dbb683ebb82d68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31087335de8b3b895d49c4677350e9ab6cfa0651188653dab05e7eafdab3d05f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31087335de8b3b895d49c4677350e9ab6cfa0651188653dab05e7eafdab3d05f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a607f773b81f6bdde800c2a36627d7f84602e6c3c777f6531391668dd7c08e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2a607f773b81f6bdde800c2a36627d7f84602e6c3c777f6531391668dd7c08e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:22:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70fc6887f3dde42017d03aa3f247ca4eda56f93c3ec54a95ceabcc1c1b4651d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70fc6887f3dde42017d03aa3f247ca4eda56f93c3ec54a95ceabcc1c1b4651d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:22:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://95e6805719bbdb9859b827860fb87ed979a4397061f76b433189f463c1e7cc39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://95e6805719bbdb9859b827860fb87ed979a4397061f76b433189f463c1e7cc39\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:22:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85068f88c57bccc42cc7234d13298a30c574379f4a212059a9924553fce8620e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85068f88c57bccc42cc7234d13298a30c574379f4a212059a9924553fce8620e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:22:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4355ef186193e8e0d9344faf8975065f9cd4576e7f6c79054670fda9acf3646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4355ef186193e8e0d9344faf8975065f9cd4576e7f6c79054670fda9acf3646\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:22:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-89fj9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:23:02Z is after 2025-08-24T17:21:41Z" Sep 30 12:23:02 crc kubenswrapper[4672]: I0930 12:23:02.060500 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8095997-9982-47c1-850a-260c9e369680\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f437372ebc379d35a2232ab47422ed6127cd29b0d752488dca512f67407d222\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c905db0424e34f487d2db657676de97a3e323ef4c9fbed5d25929164476f62bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb1c08c9fe886c57b5abe5091fa9845022bb33eefd97da1bf595d4b32016dfc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a9e2b46f485ed7a73177e89960925d6b0f9945bf0b7b5604a5b5e84d40f74e3\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a9e2b46f485ed7a73177e89960925d6b0f9945bf0b7b5604a5b5e84d40f74e3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:21:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:21:50Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:21:49Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:23:02Z is after 2025-08-24T17:21:41Z" Sep 30 12:23:02 crc kubenswrapper[4672]: I0930 12:23:02.082801 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4339dd7c-5112-40c5-9b58-050ce364a2b9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7db3193c2afadb91d3220e67720ff33bc959f9c50cec6f243144a6e2a2e57e04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5f54820ce67d6e8f397d068e1fadcb4577d6685406812c4e29fe0274612697d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1351a4e8380fea9edb834239053b57384a50add562afe59e67bbd7159bf1bf6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b9741c5edaae364733b400bebf21a4674ec206194908ec6a942317235c62a14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://775be30891ac2947ef160e3fe75abad42b78fc39e35414f27756ce086fed4f7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27513f0766801d5155c6e5c376a2a52241030eca2171b3218710ca6ee006a0b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be
8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27513f0766801d5155c6e5c376a2a52241030eca2171b3218710ca6ee006a0b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:21:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:21:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0afdc82c21601c37c7d9b4897273ade1e8e35bd093d63df9a63e93a9c0c6228c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0afdc82c21601c37c7d9b4897273ade1e8e35bd093d63df9a63e93a9c0c6228c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:21:51Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://52fa5ee34bc8e9e381d4288d30fce5db76ed5bde6f199ec13fe443481ed4c8e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52fa5ee34bc8e9e381d4288d30fce5db76ed5bde6f199ec13fe443481ed4c8e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:21:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:21:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:21:49Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:23:02Z is after 2025-08-24T17:21:41Z" Sep 30 12:23:02 crc kubenswrapper[4672]: I0930 12:23:02.096422 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:23:02Z is after 2025-08-24T17:21:41Z" Sep 30 12:23:02 crc kubenswrapper[4672]: I0930 12:23:02.108357 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bh5lq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"96949d92-2365-41eb-8f62-c264c8328c02\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6caf59dcbea4e0beb0879de377fe92cfb0b3ff55668aee3f10822adacf97fd8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkb7v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bh5lq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:23:02Z is after 2025-08-24T17:21:41Z" Sep 30 12:23:02 crc kubenswrapper[4672]: I0930 12:23:02.123969 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ece716e9efed328b509480f11b1a6aa253fe23776bf45f49d4110cf3b4acbab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:23:02Z is after 2025-08-24T17:21:41Z" Sep 30 12:23:02 crc kubenswrapper[4672]: I0930 12:23:02.125792 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:02 crc kubenswrapper[4672]: I0930 12:23:02.125866 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:02 crc kubenswrapper[4672]: I0930 12:23:02.125890 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:02 crc kubenswrapper[4672]: I0930 12:23:02.125920 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:02 crc kubenswrapper[4672]: I0930 12:23:02.125947 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:02Z","lastTransitionTime":"2025-09-30T12:23:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:23:02 crc kubenswrapper[4672]: I0930 12:23:02.139222 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"763dda13-8721-4cce-8b21-aa1f1b190978\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab6412d7fdf1976c75ccfc72d05f00e2997d72f3971c73cd0887e038d1cbc059\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5575589d0517d5a1c285c3e9e566fb5cd9b9591a1b311a4bb80e595e44044c33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b656dc1ad8af099828eaaeb654e6d52dbbd057023dd07583a6114316670aa334\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5082c3fa78648584ab9a29b4ab239ce0d52411666ca3fa598e9c6745b8ea067e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:21:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:23:02Z is after 2025-08-24T17:21:41Z" Sep 30 12:23:02 crc kubenswrapper[4672]: I0930 12:23:02.155492 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2761ae4b8b1c7e1d00ee269219ff0ab04aedce9e1bddc6e0a029c496ee89c0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for 
pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:23:02Z is after 2025-08-24T17:21:41Z" Sep 30 12:23:02 crc kubenswrapper[4672]: I0930 12:23:02.173753 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:23:02Z is after 2025-08-24T17:21:41Z" Sep 30 12:23:02 crc kubenswrapper[4672]: I0930 12:23:02.189399 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://534671434fa48a07e7836049331af60307cf651f4ddde83856ad98e006b32040\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfb0bceb3e99196adb442abedd5215938781b908c2aa92178c9db76672664620\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/
webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:23:02Z is after 2025-08-24T17:21:41Z" Sep 30 12:23:02 crc kubenswrapper[4672]: I0930 12:23:02.211770 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nznsk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5da59bc9-84da-42f6-86e9-3399ecf31725\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8e8be814fc5bfb8db3632cda1fcf749a02b642c8e7ff479e59ce5d9f987a756\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2131429928dbbd60e2d17b7ada4689ff5d161d1df921d74a8b9f71df9e11899\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://114d67fea982a6c6475e1925f670ba7dae14ace593022f77ef7c6441dde49687\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b73fba4d62613a1c27b69a74ebcca91fa6ea43a5267efddf783bd0d243d711f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e10db0132d159755257aef8c0cee40e5de4a0d3727ce59883e1ff90650e4f35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb7a32827707f8654d36f56fef7e81aff6b80550ef51f7c090a37f53a0e53406\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b26be7f6864a66af6c7b4f7a000553fcfc870fb8
6c334481687a8a8bdfff54e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b26be7f6864a66af6c7b4f7a000553fcfc870fb86c334481687a8a8bdfff54e8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T12:22:43Z\\\",\\\"message\\\":\\\"s server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0930 12:22:43.375122 6393 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-controller-manager-operator/metrics\\\\\\\"}\\\\nI0930 12:22:43.375136 6393 services_controller.go:360] Finished syncing service metrics on namespace openshift-controller-manager-operator for network=default : 1.633551ms\\\\nI0930 12:22:43.375146 6393 services_controller.go:356] Processing sync for service openshift-ingress-operator/metrics for network=default\\\\nI0930 12:22:43.374866 6393 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-config-operator/metrics]} name:Service_openshift-config-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.161:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f32857b5-f652-4313-a0d7-455c3156dd99}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0930 12:22:43.375233 6393 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:42Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-nznsk_openshift-ovn-kubernetes(5da59bc9-84da-42f6-86e9-3399ecf31725)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7058516ce3ff1d1d2929107a66d739fccb24b4f5589a11510ab42fe728a33196\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4aa532f2ebe19e36977596bd41e8b116c84bcd7d5cd3470986e3192d4d62dea7\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4aa532f2ebe19e36977596bd41e8b116c84bcd7d5cd3470986e3192d4d62dea7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nznsk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:23:02Z is after 2025-08-24T17:21:41Z" Sep 30 12:23:02 crc kubenswrapper[4672]: I0930 12:23:02.228838 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:02 crc kubenswrapper[4672]: I0930 12:23:02.228895 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:02 crc kubenswrapper[4672]: I0930 12:23:02.228907 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:02 crc kubenswrapper[4672]: I0930 12:23:02.228926 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:02 crc kubenswrapper[4672]: I0930 12:23:02.228939 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:02Z","lastTransitionTime":"2025-09-30T12:23:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:23:02 crc kubenswrapper[4672]: I0930 12:23:02.330950 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:02 crc kubenswrapper[4672]: I0930 12:23:02.331003 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:02 crc kubenswrapper[4672]: I0930 12:23:02.331017 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:02 crc kubenswrapper[4672]: I0930 12:23:02.331064 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:02 crc kubenswrapper[4672]: I0930 12:23:02.331075 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:02Z","lastTransitionTime":"2025-09-30T12:23:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:23:02 crc kubenswrapper[4672]: I0930 12:23:02.416898 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 12:23:02 crc kubenswrapper[4672]: E0930 12:23:02.417033 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 12:23:02 crc kubenswrapper[4672]: I0930 12:23:02.434193 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:02 crc kubenswrapper[4672]: I0930 12:23:02.434463 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:02 crc kubenswrapper[4672]: I0930 12:23:02.434525 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:02 crc kubenswrapper[4672]: I0930 12:23:02.434610 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:02 crc kubenswrapper[4672]: I0930 12:23:02.434686 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:02Z","lastTransitionTime":"2025-09-30T12:23:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:23:02 crc kubenswrapper[4672]: I0930 12:23:02.537641 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:02 crc kubenswrapper[4672]: I0930 12:23:02.537691 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:02 crc kubenswrapper[4672]: I0930 12:23:02.537703 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:02 crc kubenswrapper[4672]: I0930 12:23:02.537721 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:02 crc kubenswrapper[4672]: I0930 12:23:02.537733 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:02Z","lastTransitionTime":"2025-09-30T12:23:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:23:02 crc kubenswrapper[4672]: I0930 12:23:02.640877 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:02 crc kubenswrapper[4672]: I0930 12:23:02.640929 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:02 crc kubenswrapper[4672]: I0930 12:23:02.640944 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:02 crc kubenswrapper[4672]: I0930 12:23:02.640968 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:02 crc kubenswrapper[4672]: I0930 12:23:02.640985 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:02Z","lastTransitionTime":"2025-09-30T12:23:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:23:02 crc kubenswrapper[4672]: I0930 12:23:02.743645 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:02 crc kubenswrapper[4672]: I0930 12:23:02.743723 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:02 crc kubenswrapper[4672]: I0930 12:23:02.743740 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:02 crc kubenswrapper[4672]: I0930 12:23:02.743760 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:02 crc kubenswrapper[4672]: I0930 12:23:02.743772 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:02Z","lastTransitionTime":"2025-09-30T12:23:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:23:02 crc kubenswrapper[4672]: I0930 12:23:02.846957 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:02 crc kubenswrapper[4672]: I0930 12:23:02.846997 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:02 crc kubenswrapper[4672]: I0930 12:23:02.847010 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:02 crc kubenswrapper[4672]: I0930 12:23:02.847031 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:02 crc kubenswrapper[4672]: I0930 12:23:02.847047 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:02Z","lastTransitionTime":"2025-09-30T12:23:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:23:02 crc kubenswrapper[4672]: I0930 12:23:02.949607 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:02 crc kubenswrapper[4672]: I0930 12:23:02.949686 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:02 crc kubenswrapper[4672]: I0930 12:23:02.949712 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:02 crc kubenswrapper[4672]: I0930 12:23:02.949737 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:02 crc kubenswrapper[4672]: I0930 12:23:02.949750 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:02Z","lastTransitionTime":"2025-09-30T12:23:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:23:03 crc kubenswrapper[4672]: I0930 12:23:03.052405 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:03 crc kubenswrapper[4672]: I0930 12:23:03.052480 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:03 crc kubenswrapper[4672]: I0930 12:23:03.052491 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:03 crc kubenswrapper[4672]: I0930 12:23:03.052512 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:03 crc kubenswrapper[4672]: I0930 12:23:03.052524 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:03Z","lastTransitionTime":"2025-09-30T12:23:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:23:03 crc kubenswrapper[4672]: I0930 12:23:03.155245 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:03 crc kubenswrapper[4672]: I0930 12:23:03.155326 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:03 crc kubenswrapper[4672]: I0930 12:23:03.155338 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:03 crc kubenswrapper[4672]: I0930 12:23:03.155365 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:03 crc kubenswrapper[4672]: I0930 12:23:03.155385 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:03Z","lastTransitionTime":"2025-09-30T12:23:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:23:03 crc kubenswrapper[4672]: I0930 12:23:03.258341 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:03 crc kubenswrapper[4672]: I0930 12:23:03.258401 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:03 crc kubenswrapper[4672]: I0930 12:23:03.258419 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:03 crc kubenswrapper[4672]: I0930 12:23:03.258441 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:03 crc kubenswrapper[4672]: I0930 12:23:03.258458 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:03Z","lastTransitionTime":"2025-09-30T12:23:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:23:03 crc kubenswrapper[4672]: I0930 12:23:03.361550 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:03 crc kubenswrapper[4672]: I0930 12:23:03.361629 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:03 crc kubenswrapper[4672]: I0930 12:23:03.361649 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:03 crc kubenswrapper[4672]: I0930 12:23:03.361753 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:03 crc kubenswrapper[4672]: I0930 12:23:03.361776 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:03Z","lastTransitionTime":"2025-09-30T12:23:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:23:03 crc kubenswrapper[4672]: I0930 12:23:03.416709 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 12:23:03 crc kubenswrapper[4672]: I0930 12:23:03.416850 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 12:23:03 crc kubenswrapper[4672]: E0930 12:23:03.417060 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 12:23:03 crc kubenswrapper[4672]: I0930 12:23:03.417294 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n7wwp" Sep 30 12:23:03 crc kubenswrapper[4672]: E0930 12:23:03.417341 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 12:23:03 crc kubenswrapper[4672]: E0930 12:23:03.417532 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n7wwp" podUID="42618cd5-d9f9-45ba-8081-660ca47bebf4" Sep 30 12:23:03 crc kubenswrapper[4672]: I0930 12:23:03.433359 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Sep 30 12:23:03 crc kubenswrapper[4672]: I0930 12:23:03.465336 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:03 crc kubenswrapper[4672]: I0930 12:23:03.465381 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:03 crc kubenswrapper[4672]: I0930 12:23:03.465398 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:03 crc kubenswrapper[4672]: I0930 12:23:03.465425 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:03 crc kubenswrapper[4672]: I0930 12:23:03.465444 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:03Z","lastTransitionTime":"2025-09-30T12:23:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:23:03 crc kubenswrapper[4672]: I0930 12:23:03.569075 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:03 crc kubenswrapper[4672]: I0930 12:23:03.569144 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:03 crc kubenswrapper[4672]: I0930 12:23:03.569165 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:03 crc kubenswrapper[4672]: I0930 12:23:03.569194 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:03 crc kubenswrapper[4672]: I0930 12:23:03.569213 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:03Z","lastTransitionTime":"2025-09-30T12:23:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:23:03 crc kubenswrapper[4672]: I0930 12:23:03.672709 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:03 crc kubenswrapper[4672]: I0930 12:23:03.672756 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:03 crc kubenswrapper[4672]: I0930 12:23:03.672767 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:03 crc kubenswrapper[4672]: I0930 12:23:03.672787 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:03 crc kubenswrapper[4672]: I0930 12:23:03.672797 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:03Z","lastTransitionTime":"2025-09-30T12:23:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:23:03 crc kubenswrapper[4672]: I0930 12:23:03.776829 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:03 crc kubenswrapper[4672]: I0930 12:23:03.776918 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:03 crc kubenswrapper[4672]: I0930 12:23:03.776945 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:03 crc kubenswrapper[4672]: I0930 12:23:03.776980 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:03 crc kubenswrapper[4672]: I0930 12:23:03.777004 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:03Z","lastTransitionTime":"2025-09-30T12:23:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:23:03 crc kubenswrapper[4672]: I0930 12:23:03.881221 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:03 crc kubenswrapper[4672]: I0930 12:23:03.881291 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:03 crc kubenswrapper[4672]: I0930 12:23:03.881304 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:03 crc kubenswrapper[4672]: I0930 12:23:03.881325 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:03 crc kubenswrapper[4672]: I0930 12:23:03.881349 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:03Z","lastTransitionTime":"2025-09-30T12:23:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:23:03 crc kubenswrapper[4672]: I0930 12:23:03.984423 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:03 crc kubenswrapper[4672]: I0930 12:23:03.984493 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:03 crc kubenswrapper[4672]: I0930 12:23:03.984507 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:03 crc kubenswrapper[4672]: I0930 12:23:03.984533 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:03 crc kubenswrapper[4672]: I0930 12:23:03.984549 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:03Z","lastTransitionTime":"2025-09-30T12:23:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:23:04 crc kubenswrapper[4672]: I0930 12:23:04.094451 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:04 crc kubenswrapper[4672]: I0930 12:23:04.094528 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:04 crc kubenswrapper[4672]: I0930 12:23:04.094549 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:04 crc kubenswrapper[4672]: I0930 12:23:04.094577 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:04 crc kubenswrapper[4672]: I0930 12:23:04.094609 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:04Z","lastTransitionTime":"2025-09-30T12:23:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:23:04 crc kubenswrapper[4672]: I0930 12:23:04.197857 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:04 crc kubenswrapper[4672]: I0930 12:23:04.197942 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:04 crc kubenswrapper[4672]: I0930 12:23:04.197956 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:04 crc kubenswrapper[4672]: I0930 12:23:04.197977 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:04 crc kubenswrapper[4672]: I0930 12:23:04.197993 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:04Z","lastTransitionTime":"2025-09-30T12:23:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:23:04 crc kubenswrapper[4672]: I0930 12:23:04.301325 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:04 crc kubenswrapper[4672]: I0930 12:23:04.301374 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:04 crc kubenswrapper[4672]: I0930 12:23:04.301385 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:04 crc kubenswrapper[4672]: I0930 12:23:04.301409 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:04 crc kubenswrapper[4672]: I0930 12:23:04.301433 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:04Z","lastTransitionTime":"2025-09-30T12:23:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:23:04 crc kubenswrapper[4672]: I0930 12:23:04.404643 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:04 crc kubenswrapper[4672]: I0930 12:23:04.404965 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:04 crc kubenswrapper[4672]: I0930 12:23:04.405090 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:04 crc kubenswrapper[4672]: I0930 12:23:04.405166 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:04 crc kubenswrapper[4672]: I0930 12:23:04.405237 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:04Z","lastTransitionTime":"2025-09-30T12:23:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:23:04 crc kubenswrapper[4672]: I0930 12:23:04.415976 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 12:23:04 crc kubenswrapper[4672]: E0930 12:23:04.416166 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 12:23:04 crc kubenswrapper[4672]: I0930 12:23:04.508679 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:04 crc kubenswrapper[4672]: I0930 12:23:04.508737 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:04 crc kubenswrapper[4672]: I0930 12:23:04.508756 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:04 crc kubenswrapper[4672]: I0930 12:23:04.508782 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:04 crc kubenswrapper[4672]: I0930 12:23:04.508802 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:04Z","lastTransitionTime":"2025-09-30T12:23:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:23:04 crc kubenswrapper[4672]: I0930 12:23:04.614707 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:04 crc kubenswrapper[4672]: I0930 12:23:04.614794 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:04 crc kubenswrapper[4672]: I0930 12:23:04.614813 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:04 crc kubenswrapper[4672]: I0930 12:23:04.614850 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:04 crc kubenswrapper[4672]: I0930 12:23:04.614865 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:04Z","lastTransitionTime":"2025-09-30T12:23:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:23:04 crc kubenswrapper[4672]: I0930 12:23:04.717981 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:04 crc kubenswrapper[4672]: I0930 12:23:04.718324 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:04 crc kubenswrapper[4672]: I0930 12:23:04.718458 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:04 crc kubenswrapper[4672]: I0930 12:23:04.718658 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:04 crc kubenswrapper[4672]: I0930 12:23:04.718922 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:04Z","lastTransitionTime":"2025-09-30T12:23:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:23:04 crc kubenswrapper[4672]: I0930 12:23:04.821338 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:04 crc kubenswrapper[4672]: I0930 12:23:04.821825 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:04 crc kubenswrapper[4672]: I0930 12:23:04.821918 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:04 crc kubenswrapper[4672]: I0930 12:23:04.821985 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:04 crc kubenswrapper[4672]: I0930 12:23:04.822051 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:04Z","lastTransitionTime":"2025-09-30T12:23:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:23:04 crc kubenswrapper[4672]: I0930 12:23:04.924872 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:04 crc kubenswrapper[4672]: I0930 12:23:04.924949 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:04 crc kubenswrapper[4672]: I0930 12:23:04.924973 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:04 crc kubenswrapper[4672]: I0930 12:23:04.925001 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:04 crc kubenswrapper[4672]: I0930 12:23:04.925023 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:04Z","lastTransitionTime":"2025-09-30T12:23:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:23:05 crc kubenswrapper[4672]: I0930 12:23:05.028810 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:05 crc kubenswrapper[4672]: I0930 12:23:05.028875 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:05 crc kubenswrapper[4672]: I0930 12:23:05.028888 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:05 crc kubenswrapper[4672]: I0930 12:23:05.028912 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:05 crc kubenswrapper[4672]: I0930 12:23:05.028929 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:05Z","lastTransitionTime":"2025-09-30T12:23:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:23:05 crc kubenswrapper[4672]: I0930 12:23:05.132522 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:05 crc kubenswrapper[4672]: I0930 12:23:05.132597 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:05 crc kubenswrapper[4672]: I0930 12:23:05.132612 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:05 crc kubenswrapper[4672]: I0930 12:23:05.132640 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:05 crc kubenswrapper[4672]: I0930 12:23:05.132659 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:05Z","lastTransitionTime":"2025-09-30T12:23:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:23:05 crc kubenswrapper[4672]: I0930 12:23:05.235687 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:05 crc kubenswrapper[4672]: I0930 12:23:05.235772 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:05 crc kubenswrapper[4672]: I0930 12:23:05.235791 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:05 crc kubenswrapper[4672]: I0930 12:23:05.235818 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:05 crc kubenswrapper[4672]: I0930 12:23:05.235838 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:05Z","lastTransitionTime":"2025-09-30T12:23:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:23:05 crc kubenswrapper[4672]: I0930 12:23:05.339063 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:05 crc kubenswrapper[4672]: I0930 12:23:05.339128 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:05 crc kubenswrapper[4672]: I0930 12:23:05.339147 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:05 crc kubenswrapper[4672]: I0930 12:23:05.339174 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:05 crc kubenswrapper[4672]: I0930 12:23:05.339198 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:05Z","lastTransitionTime":"2025-09-30T12:23:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:23:05 crc kubenswrapper[4672]: I0930 12:23:05.416252 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 12:23:05 crc kubenswrapper[4672]: I0930 12:23:05.416337 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 12:23:05 crc kubenswrapper[4672]: E0930 12:23:05.416459 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 12:23:05 crc kubenswrapper[4672]: I0930 12:23:05.416334 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n7wwp" Sep 30 12:23:05 crc kubenswrapper[4672]: E0930 12:23:05.417247 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 12:23:05 crc kubenswrapper[4672]: E0930 12:23:05.417348 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-n7wwp" podUID="42618cd5-d9f9-45ba-8081-660ca47bebf4" Sep 30 12:23:05 crc kubenswrapper[4672]: I0930 12:23:05.442602 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:05 crc kubenswrapper[4672]: I0930 12:23:05.442682 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:05 crc kubenswrapper[4672]: I0930 12:23:05.442709 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:05 crc kubenswrapper[4672]: I0930 12:23:05.442748 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:05 crc kubenswrapper[4672]: I0930 12:23:05.442776 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:05Z","lastTransitionTime":"2025-09-30T12:23:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:23:05 crc kubenswrapper[4672]: I0930 12:23:05.546347 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:05 crc kubenswrapper[4672]: I0930 12:23:05.546718 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:05 crc kubenswrapper[4672]: I0930 12:23:05.546803 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:05 crc kubenswrapper[4672]: I0930 12:23:05.546879 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:05 crc kubenswrapper[4672]: I0930 12:23:05.546947 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:05Z","lastTransitionTime":"2025-09-30T12:23:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:23:05 crc kubenswrapper[4672]: I0930 12:23:05.651436 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:05 crc kubenswrapper[4672]: I0930 12:23:05.651510 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:05 crc kubenswrapper[4672]: I0930 12:23:05.651528 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:05 crc kubenswrapper[4672]: I0930 12:23:05.651557 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:05 crc kubenswrapper[4672]: I0930 12:23:05.651575 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:05Z","lastTransitionTime":"2025-09-30T12:23:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:23:05 crc kubenswrapper[4672]: I0930 12:23:05.755335 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:05 crc kubenswrapper[4672]: I0930 12:23:05.755416 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:05 crc kubenswrapper[4672]: I0930 12:23:05.755436 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:05 crc kubenswrapper[4672]: I0930 12:23:05.755464 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:05 crc kubenswrapper[4672]: I0930 12:23:05.755487 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:05Z","lastTransitionTime":"2025-09-30T12:23:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:23:05 crc kubenswrapper[4672]: I0930 12:23:05.859068 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:05 crc kubenswrapper[4672]: I0930 12:23:05.859143 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:05 crc kubenswrapper[4672]: I0930 12:23:05.859162 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:05 crc kubenswrapper[4672]: I0930 12:23:05.859188 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:05 crc kubenswrapper[4672]: I0930 12:23:05.859208 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:05Z","lastTransitionTime":"2025-09-30T12:23:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:23:05 crc kubenswrapper[4672]: I0930 12:23:05.961509 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:05 crc kubenswrapper[4672]: I0930 12:23:05.961554 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:05 crc kubenswrapper[4672]: I0930 12:23:05.961563 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:05 crc kubenswrapper[4672]: I0930 12:23:05.961578 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:05 crc kubenswrapper[4672]: I0930 12:23:05.961589 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:05Z","lastTransitionTime":"2025-09-30T12:23:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:23:06 crc kubenswrapper[4672]: I0930 12:23:06.064490 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:06 crc kubenswrapper[4672]: I0930 12:23:06.064552 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:06 crc kubenswrapper[4672]: I0930 12:23:06.064564 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:06 crc kubenswrapper[4672]: I0930 12:23:06.064580 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:06 crc kubenswrapper[4672]: I0930 12:23:06.064591 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:06Z","lastTransitionTime":"2025-09-30T12:23:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:23:06 crc kubenswrapper[4672]: I0930 12:23:06.167752 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:06 crc kubenswrapper[4672]: I0930 12:23:06.167800 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:06 crc kubenswrapper[4672]: I0930 12:23:06.167810 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:06 crc kubenswrapper[4672]: I0930 12:23:06.167832 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:06 crc kubenswrapper[4672]: I0930 12:23:06.167844 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:06Z","lastTransitionTime":"2025-09-30T12:23:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:23:06 crc kubenswrapper[4672]: I0930 12:23:06.270767 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:06 crc kubenswrapper[4672]: I0930 12:23:06.271334 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:06 crc kubenswrapper[4672]: I0930 12:23:06.271556 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:06 crc kubenswrapper[4672]: I0930 12:23:06.271792 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:06 crc kubenswrapper[4672]: I0930 12:23:06.272002 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:06Z","lastTransitionTime":"2025-09-30T12:23:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:23:06 crc kubenswrapper[4672]: I0930 12:23:06.375962 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:06 crc kubenswrapper[4672]: I0930 12:23:06.376026 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:06 crc kubenswrapper[4672]: I0930 12:23:06.376040 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:06 crc kubenswrapper[4672]: I0930 12:23:06.376074 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:06 crc kubenswrapper[4672]: I0930 12:23:06.376088 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:06Z","lastTransitionTime":"2025-09-30T12:23:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:23:06 crc kubenswrapper[4672]: I0930 12:23:06.416887 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 12:23:06 crc kubenswrapper[4672]: E0930 12:23:06.417404 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 12:23:06 crc kubenswrapper[4672]: I0930 12:23:06.479701 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:06 crc kubenswrapper[4672]: I0930 12:23:06.479755 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:06 crc kubenswrapper[4672]: I0930 12:23:06.479764 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:06 crc kubenswrapper[4672]: I0930 12:23:06.479782 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:06 crc kubenswrapper[4672]: I0930 12:23:06.479793 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:06Z","lastTransitionTime":"2025-09-30T12:23:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:23:06 crc kubenswrapper[4672]: I0930 12:23:06.583893 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:06 crc kubenswrapper[4672]: I0930 12:23:06.583956 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:06 crc kubenswrapper[4672]: I0930 12:23:06.583975 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:06 crc kubenswrapper[4672]: I0930 12:23:06.584003 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:06 crc kubenswrapper[4672]: I0930 12:23:06.584020 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:06Z","lastTransitionTime":"2025-09-30T12:23:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:23:06 crc kubenswrapper[4672]: I0930 12:23:06.687755 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:06 crc kubenswrapper[4672]: I0930 12:23:06.687848 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:06 crc kubenswrapper[4672]: I0930 12:23:06.687866 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:06 crc kubenswrapper[4672]: I0930 12:23:06.687892 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:06 crc kubenswrapper[4672]: I0930 12:23:06.687913 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:06Z","lastTransitionTime":"2025-09-30T12:23:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:23:06 crc kubenswrapper[4672]: I0930 12:23:06.792031 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:06 crc kubenswrapper[4672]: I0930 12:23:06.792090 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:06 crc kubenswrapper[4672]: I0930 12:23:06.792105 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:06 crc kubenswrapper[4672]: I0930 12:23:06.792130 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:06 crc kubenswrapper[4672]: I0930 12:23:06.792147 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:06Z","lastTransitionTime":"2025-09-30T12:23:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:23:06 crc kubenswrapper[4672]: I0930 12:23:06.895781 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:06 crc kubenswrapper[4672]: I0930 12:23:06.895849 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:06 crc kubenswrapper[4672]: I0930 12:23:06.895864 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:06 crc kubenswrapper[4672]: I0930 12:23:06.895888 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:06 crc kubenswrapper[4672]: I0930 12:23:06.895901 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:06Z","lastTransitionTime":"2025-09-30T12:23:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:23:07 crc kubenswrapper[4672]: I0930 12:23:07.024149 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:07 crc kubenswrapper[4672]: I0930 12:23:07.024218 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:07 crc kubenswrapper[4672]: I0930 12:23:07.024236 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:07 crc kubenswrapper[4672]: I0930 12:23:07.024257 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:07 crc kubenswrapper[4672]: I0930 12:23:07.024288 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:07Z","lastTransitionTime":"2025-09-30T12:23:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:23:07 crc kubenswrapper[4672]: I0930 12:23:07.131569 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:07 crc kubenswrapper[4672]: I0930 12:23:07.131630 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:07 crc kubenswrapper[4672]: I0930 12:23:07.131649 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:07 crc kubenswrapper[4672]: I0930 12:23:07.131672 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:07 crc kubenswrapper[4672]: I0930 12:23:07.131690 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:07Z","lastTransitionTime":"2025-09-30T12:23:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:23:07 crc kubenswrapper[4672]: I0930 12:23:07.234490 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:07 crc kubenswrapper[4672]: I0930 12:23:07.234537 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:07 crc kubenswrapper[4672]: I0930 12:23:07.234549 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:07 crc kubenswrapper[4672]: I0930 12:23:07.234567 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:07 crc kubenswrapper[4672]: I0930 12:23:07.234579 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:07Z","lastTransitionTime":"2025-09-30T12:23:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:23:07 crc kubenswrapper[4672]: I0930 12:23:07.338058 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:07 crc kubenswrapper[4672]: I0930 12:23:07.338132 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:07 crc kubenswrapper[4672]: I0930 12:23:07.338157 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:07 crc kubenswrapper[4672]: I0930 12:23:07.338188 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:07 crc kubenswrapper[4672]: I0930 12:23:07.338213 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:07Z","lastTransitionTime":"2025-09-30T12:23:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:23:07 crc kubenswrapper[4672]: I0930 12:23:07.417086 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n7wwp" Sep 30 12:23:07 crc kubenswrapper[4672]: I0930 12:23:07.417119 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 12:23:07 crc kubenswrapper[4672]: E0930 12:23:07.417392 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n7wwp" podUID="42618cd5-d9f9-45ba-8081-660ca47bebf4" Sep 30 12:23:07 crc kubenswrapper[4672]: I0930 12:23:07.417431 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 12:23:07 crc kubenswrapper[4672]: E0930 12:23:07.417567 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 12:23:07 crc kubenswrapper[4672]: E0930 12:23:07.417707 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 12:23:07 crc kubenswrapper[4672]: I0930 12:23:07.440734 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:07 crc kubenswrapper[4672]: I0930 12:23:07.440781 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:07 crc kubenswrapper[4672]: I0930 12:23:07.440796 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:07 crc kubenswrapper[4672]: I0930 12:23:07.440818 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:07 crc kubenswrapper[4672]: I0930 12:23:07.440834 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:07Z","lastTransitionTime":"2025-09-30T12:23:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:23:07 crc kubenswrapper[4672]: I0930 12:23:07.543938 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:07 crc kubenswrapper[4672]: I0930 12:23:07.543988 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:07 crc kubenswrapper[4672]: I0930 12:23:07.544001 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:07 crc kubenswrapper[4672]: I0930 12:23:07.544023 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:07 crc kubenswrapper[4672]: I0930 12:23:07.544035 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:07Z","lastTransitionTime":"2025-09-30T12:23:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:23:07 crc kubenswrapper[4672]: I0930 12:23:07.647589 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:07 crc kubenswrapper[4672]: I0930 12:23:07.647710 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:07 crc kubenswrapper[4672]: I0930 12:23:07.647739 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:07 crc kubenswrapper[4672]: I0930 12:23:07.647816 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:07 crc kubenswrapper[4672]: I0930 12:23:07.647875 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:07Z","lastTransitionTime":"2025-09-30T12:23:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:23:07 crc kubenswrapper[4672]: I0930 12:23:07.751899 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:07 crc kubenswrapper[4672]: I0930 12:23:07.751956 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:07 crc kubenswrapper[4672]: I0930 12:23:07.751973 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:07 crc kubenswrapper[4672]: I0930 12:23:07.751998 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:07 crc kubenswrapper[4672]: I0930 12:23:07.752015 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:07Z","lastTransitionTime":"2025-09-30T12:23:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:23:07 crc kubenswrapper[4672]: I0930 12:23:07.855927 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:07 crc kubenswrapper[4672]: I0930 12:23:07.856005 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:07 crc kubenswrapper[4672]: I0930 12:23:07.856031 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:07 crc kubenswrapper[4672]: I0930 12:23:07.856064 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:07 crc kubenswrapper[4672]: I0930 12:23:07.856088 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:07Z","lastTransitionTime":"2025-09-30T12:23:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:23:07 crc kubenswrapper[4672]: I0930 12:23:07.958473 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:07 crc kubenswrapper[4672]: I0930 12:23:07.958521 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:07 crc kubenswrapper[4672]: I0930 12:23:07.958535 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:07 crc kubenswrapper[4672]: I0930 12:23:07.958556 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:07 crc kubenswrapper[4672]: I0930 12:23:07.958569 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:07Z","lastTransitionTime":"2025-09-30T12:23:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:23:08 crc kubenswrapper[4672]: I0930 12:23:08.060598 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:08 crc kubenswrapper[4672]: I0930 12:23:08.060650 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:08 crc kubenswrapper[4672]: I0930 12:23:08.060667 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:08 crc kubenswrapper[4672]: I0930 12:23:08.060693 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:08 crc kubenswrapper[4672]: I0930 12:23:08.060710 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:08Z","lastTransitionTime":"2025-09-30T12:23:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:23:08 crc kubenswrapper[4672]: I0930 12:23:08.163708 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:08 crc kubenswrapper[4672]: I0930 12:23:08.163778 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:08 crc kubenswrapper[4672]: I0930 12:23:08.163796 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:08 crc kubenswrapper[4672]: I0930 12:23:08.163820 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:08 crc kubenswrapper[4672]: I0930 12:23:08.163837 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:08Z","lastTransitionTime":"2025-09-30T12:23:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:23:08 crc kubenswrapper[4672]: I0930 12:23:08.268884 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:08 crc kubenswrapper[4672]: I0930 12:23:08.268949 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:08 crc kubenswrapper[4672]: I0930 12:23:08.268966 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:08 crc kubenswrapper[4672]: I0930 12:23:08.268991 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:08 crc kubenswrapper[4672]: I0930 12:23:08.269009 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:08Z","lastTransitionTime":"2025-09-30T12:23:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:23:08 crc kubenswrapper[4672]: I0930 12:23:08.372649 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:08 crc kubenswrapper[4672]: I0930 12:23:08.372703 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:08 crc kubenswrapper[4672]: I0930 12:23:08.372719 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:08 crc kubenswrapper[4672]: I0930 12:23:08.372748 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:08 crc kubenswrapper[4672]: I0930 12:23:08.372770 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:08Z","lastTransitionTime":"2025-09-30T12:23:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:23:08 crc kubenswrapper[4672]: I0930 12:23:08.416201 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 12:23:08 crc kubenswrapper[4672]: E0930 12:23:08.416399 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 12:23:08 crc kubenswrapper[4672]: I0930 12:23:08.476759 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:08 crc kubenswrapper[4672]: I0930 12:23:08.476834 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:08 crc kubenswrapper[4672]: I0930 12:23:08.476852 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:08 crc kubenswrapper[4672]: I0930 12:23:08.476874 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:08 crc kubenswrapper[4672]: I0930 12:23:08.476887 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:08Z","lastTransitionTime":"2025-09-30T12:23:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:23:08 crc kubenswrapper[4672]: I0930 12:23:08.563967 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:08 crc kubenswrapper[4672]: I0930 12:23:08.564020 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:08 crc kubenswrapper[4672]: I0930 12:23:08.564030 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:08 crc kubenswrapper[4672]: I0930 12:23:08.564049 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:08 crc kubenswrapper[4672]: I0930 12:23:08.564062 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:08Z","lastTransitionTime":"2025-09-30T12:23:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:23:08 crc kubenswrapper[4672]: E0930 12:23:08.578541 4672 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T12:23:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T12:23:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T12:23:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T12:23:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T12:23:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T12:23:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T12:23:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T12:23:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a240ac94-c7cd-47b9-85fc-82a9db2c4d67\\\",\\\"systemUUID\\\":\\\"9545f671-f742-45e6-b9f7-3b3404b22825\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:23:08Z is after 2025-08-24T17:21:41Z" Sep 30 12:23:08 crc kubenswrapper[4672]: I0930 12:23:08.583500 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:08 crc kubenswrapper[4672]: I0930 12:23:08.583575 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Sep 30 12:23:08 crc kubenswrapper[4672]: I0930 12:23:08.583594 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:08 crc kubenswrapper[4672]: I0930 12:23:08.583620 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:08 crc kubenswrapper[4672]: I0930 12:23:08.583637 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:08Z","lastTransitionTime":"2025-09-30T12:23:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:23:08 crc kubenswrapper[4672]: E0930 12:23:08.603670 4672 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T12:23:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T12:23:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T12:23:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T12:23:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T12:23:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T12:23:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T12:23:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T12:23:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a240ac94-c7cd-47b9-85fc-82a9db2c4d67\\\",\\\"systemUUID\\\":\\\"9545f671-f742-45e6-b9f7-3b3404b22825\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:23:08Z is after 2025-08-24T17:21:41Z" Sep 30 12:23:08 crc kubenswrapper[4672]: I0930 12:23:08.608883 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:08 crc kubenswrapper[4672]: I0930 12:23:08.608951 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Sep 30 12:23:08 crc kubenswrapper[4672]: I0930 12:23:08.608970 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:08 crc kubenswrapper[4672]: I0930 12:23:08.608999 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:08 crc kubenswrapper[4672]: I0930 12:23:08.609023 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:08Z","lastTransitionTime":"2025-09-30T12:23:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:23:08 crc kubenswrapper[4672]: E0930 12:23:08.627748 4672 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T12:23:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T12:23:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T12:23:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T12:23:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T12:23:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T12:23:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T12:23:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T12:23:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a240ac94-c7cd-47b9-85fc-82a9db2c4d67\\\",\\\"systemUUID\\\":\\\"9545f671-f742-45e6-b9f7-3b3404b22825\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:23:08Z is after 2025-08-24T17:21:41Z" Sep 30 12:23:08 crc kubenswrapper[4672]: I0930 12:23:08.631968 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:08 crc kubenswrapper[4672]: I0930 12:23:08.632005 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Sep 30 12:23:08 crc kubenswrapper[4672]: I0930 12:23:08.632016 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:08 crc kubenswrapper[4672]: I0930 12:23:08.632035 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:08 crc kubenswrapper[4672]: I0930 12:23:08.632049 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:08Z","lastTransitionTime":"2025-09-30T12:23:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:23:08 crc kubenswrapper[4672]: E0930 12:23:08.646932 4672 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T12:23:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T12:23:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T12:23:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T12:23:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T12:23:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T12:23:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T12:23:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T12:23:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a240ac94-c7cd-47b9-85fc-82a9db2c4d67\\\",\\\"systemUUID\\\":\\\"9545f671-f742-45e6-b9f7-3b3404b22825\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:23:08Z is after 2025-08-24T17:21:41Z" Sep 30 12:23:08 crc kubenswrapper[4672]: I0930 12:23:08.650851 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:08 crc kubenswrapper[4672]: I0930 12:23:08.650886 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Sep 30 12:23:08 crc kubenswrapper[4672]: I0930 12:23:08.650894 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:08 crc kubenswrapper[4672]: I0930 12:23:08.650909 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:08 crc kubenswrapper[4672]: I0930 12:23:08.650920 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:08Z","lastTransitionTime":"2025-09-30T12:23:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:23:08 crc kubenswrapper[4672]: E0930 12:23:08.663099 4672 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T12:23:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T12:23:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T12:23:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T12:23:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T12:23:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T12:23:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T12:23:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T12:23:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a240ac94-c7cd-47b9-85fc-82a9db2c4d67\\\",\\\"systemUUID\\\":\\\"9545f671-f742-45e6-b9f7-3b3404b22825\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:23:08Z is after 2025-08-24T17:21:41Z" Sep 30 12:23:08 crc kubenswrapper[4672]: E0930 12:23:08.663358 4672 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Sep 30 12:23:08 crc kubenswrapper[4672]: I0930 12:23:08.664645 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Sep 30 12:23:08 crc kubenswrapper[4672]: I0930 12:23:08.664683 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:08 crc kubenswrapper[4672]: I0930 12:23:08.664693 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:08 crc kubenswrapper[4672]: I0930 12:23:08.664708 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:08 crc kubenswrapper[4672]: I0930 12:23:08.664719 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:08Z","lastTransitionTime":"2025-09-30T12:23:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:23:08 crc kubenswrapper[4672]: I0930 12:23:08.767007 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:08 crc kubenswrapper[4672]: I0930 12:23:08.767067 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:08 crc kubenswrapper[4672]: I0930 12:23:08.767084 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:08 crc kubenswrapper[4672]: I0930 12:23:08.767111 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:08 crc kubenswrapper[4672]: I0930 12:23:08.767129 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:08Z","lastTransitionTime":"2025-09-30T12:23:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:23:08 crc kubenswrapper[4672]: I0930 12:23:08.869768 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:08 crc kubenswrapper[4672]: I0930 12:23:08.869837 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:08 crc kubenswrapper[4672]: I0930 12:23:08.869865 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:08 crc kubenswrapper[4672]: I0930 12:23:08.869890 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:08 crc kubenswrapper[4672]: I0930 12:23:08.869909 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:08Z","lastTransitionTime":"2025-09-30T12:23:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:23:08 crc kubenswrapper[4672]: I0930 12:23:08.973176 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:08 crc kubenswrapper[4672]: I0930 12:23:08.973320 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:08 crc kubenswrapper[4672]: I0930 12:23:08.973350 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:08 crc kubenswrapper[4672]: I0930 12:23:08.973385 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:08 crc kubenswrapper[4672]: I0930 12:23:08.973420 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:08Z","lastTransitionTime":"2025-09-30T12:23:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:23:09 crc kubenswrapper[4672]: I0930 12:23:09.076753 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:09 crc kubenswrapper[4672]: I0930 12:23:09.076800 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:09 crc kubenswrapper[4672]: I0930 12:23:09.076813 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:09 crc kubenswrapper[4672]: I0930 12:23:09.076832 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:09 crc kubenswrapper[4672]: I0930 12:23:09.076847 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:09Z","lastTransitionTime":"2025-09-30T12:23:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:23:09 crc kubenswrapper[4672]: I0930 12:23:09.179627 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:09 crc kubenswrapper[4672]: I0930 12:23:09.179686 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:09 crc kubenswrapper[4672]: I0930 12:23:09.179702 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:09 crc kubenswrapper[4672]: I0930 12:23:09.179721 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:09 crc kubenswrapper[4672]: I0930 12:23:09.179734 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:09Z","lastTransitionTime":"2025-09-30T12:23:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:23:09 crc kubenswrapper[4672]: I0930 12:23:09.283671 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:09 crc kubenswrapper[4672]: I0930 12:23:09.283772 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:09 crc kubenswrapper[4672]: I0930 12:23:09.283791 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:09 crc kubenswrapper[4672]: I0930 12:23:09.283828 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:09 crc kubenswrapper[4672]: I0930 12:23:09.283850 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:09Z","lastTransitionTime":"2025-09-30T12:23:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:23:09 crc kubenswrapper[4672]: I0930 12:23:09.387128 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:09 crc kubenswrapper[4672]: I0930 12:23:09.387216 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:09 crc kubenswrapper[4672]: I0930 12:23:09.387243 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:09 crc kubenswrapper[4672]: I0930 12:23:09.387279 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:09 crc kubenswrapper[4672]: I0930 12:23:09.387330 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:09Z","lastTransitionTime":"2025-09-30T12:23:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:23:09 crc kubenswrapper[4672]: I0930 12:23:09.415985 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 12:23:09 crc kubenswrapper[4672]: I0930 12:23:09.416141 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n7wwp" Sep 30 12:23:09 crc kubenswrapper[4672]: E0930 12:23:09.416181 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 12:23:09 crc kubenswrapper[4672]: I0930 12:23:09.416213 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 12:23:09 crc kubenswrapper[4672]: E0930 12:23:09.416390 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n7wwp" podUID="42618cd5-d9f9-45ba-8081-660ca47bebf4" Sep 30 12:23:09 crc kubenswrapper[4672]: E0930 12:23:09.416484 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 12:23:09 crc kubenswrapper[4672]: I0930 12:23:09.436587 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"763dda13-8721-4cce-8b21-aa1f1b190978\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab6412d7fdf1976c75ccfc72d05f00e2997d72f3971c73cd0887e038d1cbc059\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5575589d0517d5a1c285c3e9e566fb5cd9b9591a1b311a4bb80e595e44044c33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:50Z\\\"
}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b656dc1ad8af099828eaaeb654e6d52dbbd057023dd07583a6114316670aa334\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5082c3fa78648584ab9a29b4ab239ce0d52411666ca3fa598e9c6745b8ea067e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:21:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:23:09Z is after 2025-08-24T17:21:41Z" Sep 30 12:23:09 crc kubenswrapper[4672]: I0930 12:23:09.458787 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2761ae4b8b1c7e1d00ee269219ff0ab04aedce9e1bddc6e0a029c496ee89c0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:23:09Z is after 2025-08-24T17:21:41Z" Sep 30 12:23:09 crc kubenswrapper[4672]: I0930 12:23:09.475206 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:23:09Z is after 2025-08-24T17:21:41Z" Sep 30 12:23:09 crc kubenswrapper[4672]: I0930 12:23:09.489442 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:09 crc kubenswrapper[4672]: I0930 12:23:09.489480 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:09 crc kubenswrapper[4672]: I0930 12:23:09.489492 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:09 crc kubenswrapper[4672]: I0930 12:23:09.489510 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:09 crc kubenswrapper[4672]: I0930 12:23:09.489521 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:09Z","lastTransitionTime":"2025-09-30T12:23:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:23:09 crc kubenswrapper[4672]: I0930 12:23:09.494876 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://534671434fa48a07e7836049331af60307cf651f4ddde83856ad98e006b32040\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfb0bceb3e99196adb442abedd5215938781b908c2aa92178c9db76672664620\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:23:09Z is after 2025-08-24T17:21:41Z" Sep 30 12:23:09 crc kubenswrapper[4672]: I0930 12:23:09.529668 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nznsk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5da59bc9-84da-42f6-86e9-3399ecf31725\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8e8be814fc5bfb8db3632cda1fcf749a02b642c8e7ff479e59ce5d9f987a756\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2131429928dbbd60e2d17b7ada4689ff5d161d1df921d74a8b9f71df9e11899\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://114d67fea982a6c6475e1925f670ba7dae14ace593022f77ef7c6441dde49687\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b73fba4d62613a1c27b69a74ebcca91fa6ea43a5267efddf783bd0d243d711f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e10db0132d159755257aef8c0cee40e5de4a0d3727ce59883e1ff90650e4f35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb7a32827707f8654d36f56fef7e81aff6b80550ef51f7c090a37f53a0e53406\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b26be7f6864a66af6c7b4f7a000553fcfc870fb86c334481687a8a8bdfff54e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b26be7f6864a66af6c7b4f7a000553fcfc870fb86c334481687a8a8bdfff54e8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T12:22:43Z\\\",\\\"message\\\":\\\"s server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0930 12:22:43.375122 6393 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-controller-manager-operator/metrics\\\\\\\"}\\\\nI0930 12:22:43.375136 6393 services_controller.go:360] Finished syncing service metrics on namespace openshift-controller-manager-operator for network=default : 1.633551ms\\\\nI0930 12:22:43.375146 6393 services_controller.go:356] Processing sync for service openshift-ingress-operator/metrics for network=default\\\\nI0930 12:22:43.374866 6393 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-config-operator/metrics]} name:Service_openshift-config-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.161:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f32857b5-f652-4313-a0d7-455c3156dd99}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0930 12:22:43.375233 6393 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:42Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-nznsk_openshift-ovn-kubernetes(5da59bc9-84da-42f6-86e9-3399ecf31725)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7058516ce3ff1d1d2929107a66d739fccb24b4f5589a11510ab42fe728a33196\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4aa532f2ebe19e36977596bd41e8b116c84bcd7d5cd3470986e3192d4d62dea7\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4aa532f2ebe19e36977596bd41e8b116c84bcd7d5cd3470986e3192d4d62dea7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nznsk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:23:09Z is after 2025-08-24T17:21:41Z" Sep 30 12:23:09 crc kubenswrapper[4672]: I0930 12:23:09.546882 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ztjlj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"25dc0f4c-76a8-49ff-bf68-c0718cfc3e62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5a3ba5101b35ce4981484637222a0c074b67e3998c273dc763ddbc6ca0c0c86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6ps67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\
"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ztjlj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:23:09Z is after 2025-08-24T17:21:41Z" Sep 30 12:23:09 crc kubenswrapper[4672]: I0930 12:23:09.562920 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-485qk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db981c53-85b2-4b3c-b025-da893771308f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57231120d83ed4e95272ab2b0bac2146b066dde2b5ca3afe469a3dc36d6187ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2t8xk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://503616f93ea6b36f5e3c95c88471c6d06c281a4d0be102b6cb0edf6a1513f78c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\
\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2t8xk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-485qk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:23:09Z is after 2025-08-24T17:21:41Z" Sep 30 12:23:09 crc kubenswrapper[4672]: I0930 12:23:09.576918 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-n7wwp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"42618cd5-d9f9-45ba-8081-660ca47bebf4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78cfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78cfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:25Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-n7wwp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:23:09Z is after 2025-08-24T17:21:41Z" Sep 30 12:23:09 crc kubenswrapper[4672]: I0930 12:23:09.592831 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:09 crc kubenswrapper[4672]: I0930 12:23:09.592884 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:09 crc kubenswrapper[4672]: I0930 12:23:09.592899 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:09 crc kubenswrapper[4672]: I0930 12:23:09.592923 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:09 crc kubenswrapper[4672]: I0930 12:23:09.592935 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:09Z","lastTransitionTime":"2025-09-30T12:23:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:23:09 crc kubenswrapper[4672]: I0930 12:23:09.597817 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e650bb6-f3f0-48f4-b169-1e56a907e88c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://824f50fb36d4c4aee43cc92af10bc43c441f2cee0ee736431dfdc52d254b819a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2786191730ceba9bbcf35f5355098bc52b09c37d20b923f7c6c65fd1b584cea0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e5b3d468173222d5ae851529dbc318b7d727ec4af468e4094ab08b4f29bf100\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/ku
bernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://217947162b121381c06a210145feb0b9bd45d4ad595fcd83b1a566bb6448f0ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8d2b013e19f7ff8f3eb9c42338191b2da273ce85db8bcc6be8dd1f69e83c3be\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T12:22:10Z\\\",\\\"message\\\":\\\"falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0930 12:22:05.219065 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 12:22:05.219861 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-72394925/tls.crt::/tmp/serving-cert-72394925/tls.key\\\\\\\"\\\\nI0930 12:22:10.657058 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 12:22:10.659952 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 12:22:10.659972 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 12:22:10.659997 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 12:22:10.660003 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 12:22:10.667742 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0930 12:22:10.667775 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 12:22:10.667797 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 12:22:10.667801 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 12:22:10.667808 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 12:22:10.667812 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 12:22:10.667814 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 12:22:10.667817 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0930 12:22:10.670654 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T12:21:54Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c3f5020d6f8fba5f79d7659bedba02479794acb1f5f3d219802966c781fd10d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:52Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://089787496c46eb514e279677b1853010b21eac240fb059e43866714d2a6920e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://089787496c46eb514e279677b1853010b21eac240fb059e43866714d2a6920e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:21:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:21:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:21:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:23:09Z is after 2025-08-24T17:21:41Z" Sep 30 12:23:09 crc kubenswrapper[4672]: I0930 12:23:09.612650 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:23:09Z is after 2025-08-24T17:21:41Z" Sep 30 12:23:09 crc kubenswrapper[4672]: I0930 12:23:09.629114 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"95794952-d817-48f2-8956-f7a310f8d1d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a5a1de465a4e5deca994f640e87574ebd406be2d7a92c0e54ed8bece4a8e6d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnk64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba615a53d8f50a3891080fd06384bb6e00eca07025bb60fdf1764d9c1d4a5f76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnk64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dpqrd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:23:09Z is after 2025-08-24T17:21:41Z" Sep 30 12:23:09 crc kubenswrapper[4672]: I0930 12:23:09.646560 4672 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-8q82q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6806ff3c-ab3a-402e-b1c5-cc37c0810a65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:23:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:23:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f37b2a15da4c06842d9df2eabe13974fe3e8f8da3ffd7bc297b6f32f446dbc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://687d353cc41530a0cbee2c4a782b6ae10e9ee58aae0bfe183537c96851611e2e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T12:23:00Z\\\",\\\"message\\\":\\\"2025-09-30T12:22:14+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_3e4d0ad2-21ec-42dd-8da8-9cf62dde0193\\\\n2025-09-30T12:22:14+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_3e4d0ad2-21ec-42dd-8da8-9cf62dde0193 to /host/opt/cni/bin/\\\\n2025-09-30T12:22:15Z [verbose] multus-daemon started\\\\n2025-09-30T12:22:15Z [verbose] Readiness Indicator file check\\\\n2025-09-30T12:23:00Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:12Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:23:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pn7q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8q82q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:23:09Z is after 2025-08-24T17:21:41Z" Sep 30 12:23:09 crc kubenswrapper[4672]: I0930 12:23:09.665090 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-89fj9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"72003fad-a0fd-4493-9f3b-6efdab22d14c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd26dcf24863265c55a06cfc788e84363473a6ca5f1b835fa5dbb683ebb82d68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31087335de8b3b895d49c4677350e9ab6cfa0651188653dab05e7eafdab3d05f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31087335de8b3b895d49c4677350e9ab6cfa0651188653dab05e7eafdab3d05f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a607f773b81f6bdde800c2a36627d7f84602e6c3c777f6531391668dd7c08e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2a607f773b81f6bdde800c2a36627d7f84602e6c3c777f6531391668dd7c08e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:22:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70fc6887f3dde42017d03aa3f247ca4eda56f93c3ec54a95ceabcc1c1b4651d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70fc6887f3dde42017d03aa3f247ca4eda56f93c3ec54a95ceabcc1c1b4651d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:22:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://95e6805719bbdb9859b827860fb87ed979a4397061f76b433189f463c1e7cc39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://95e6805719bbdb9859b827860fb87ed979a4397061f76b433189f463c1e7cc39\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:22:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85068f88c57bccc42cc7234d13298a30c574379f4a212059a9924553fce8620e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85068f88c57bccc42cc7234d13298a30c574379f4a212059a9924553fce8620e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:22:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4355ef186193e8e0d9344faf8975065f9cd4576e7f6c79054670fda9acf3646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4355ef186193e8e0d9344faf8975065f9cd4576e7f6c79054670fda9acf3646\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:22:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-89fj9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:23:09Z is after 2025-08-24T17:21:41Z" Sep 30 12:23:09 crc kubenswrapper[4672]: I0930 12:23:09.687820 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8095997-9982-47c1-850a-260c9e369680\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f437372ebc379d35a2232ab47422ed6127cd29b0d752488dca512f67407d222\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c905db0424e34f487d2db657676de97a3e323ef4c9fbed5d25929164476f62bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb1c08c9fe886c57b5abe5091fa9845022bb33eefd97da1bf595d4b32016dfc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a9e2b46f485ed7a73177e89960925d6b0f9945bf0b7b5604a5b5e84d40f74e3\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a9e2b46f485ed7a73177e89960925d6b0f9945bf0b7b5604a5b5e84d40f74e3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:21:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:21:50Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:21:49Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:23:09Z is after 2025-08-24T17:21:41Z" Sep 30 12:23:09 crc kubenswrapper[4672]: I0930 12:23:09.697006 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:09 crc kubenswrapper[4672]: I0930 12:23:09.697057 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:09 crc kubenswrapper[4672]: I0930 12:23:09.697074 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:09 crc kubenswrapper[4672]: I0930 12:23:09.697095 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:09 crc kubenswrapper[4672]: I0930 12:23:09.697110 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:09Z","lastTransitionTime":"2025-09-30T12:23:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:23:09 crc kubenswrapper[4672]: I0930 12:23:09.706335 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1404e16a-17d2-446b-8708-44d4b41c9f96\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2424d1bc8a625cebd192f83733dc4bc01056e6166c918d21ded15aa3843b34d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://184b9352d3463790085ac81457e5ad33d08ea4663346d96f4c4034dcca6a7dd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://184b9352d3463790085ac81457e5ad33d08ea4663346d96f4c4034dcca6a7dd5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:21:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:21:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:21:49Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:23:09Z is after 2025-08-24T17:21:41Z" Sep 30 12:23:09 crc kubenswrapper[4672]: I0930 12:23:09.734122 4672 
status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4339dd7c-5112-40c5-9b58-050ce364a2b9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7db3193c2afadb91d3220e67720ff33bc959f9c50cec6f243144a6e2a2e57e04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5f54820ce67d6e8f397d068e1fadcb4577d6685406812c4e29fe0274612697d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1351a4e8380fea9edb834239053b57384a50add562afe59e67bbd7159bf1bf6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-
certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b9741c5edaae364733b400bebf21a4674ec206194908ec6a942317235c62a14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://775be30891ac2947ef160e3fe75abad42b78fc39e35414f27756ce086fed4f7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27513f0766801d5155c6e5c376a2a52241030eca2171b3218710ca6ee006a0b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27513f0766801d5155c6e5c376a2a52241030eca2171b3218710ca6ee006a0b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:21:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:21:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0afdc82c21601c37c7d9b4897273ade1e8e35bd093d63df9a63e93a9c0c6228c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0afdc82c21601c37c7d9b4897273ade1e8e35bd093d63df9a63e93a9c0c6228c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:
21:51Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://52fa5ee34bc8e9e381d4288d30fce5db76ed5bde6f199ec13fe443481ed4c8e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52fa5ee34bc8e9e381d4288d30fce5db76ed5bde6f199ec13fe443481ed4c8e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:21:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:21:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:21:49Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:23:09Z is after 2025-08-24T17:21:41Z" Sep 30 12:23:09 crc kubenswrapper[4672]: I0930 12:23:09.756326 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:23:09Z is after 2025-08-24T17:21:41Z" Sep 30 12:23:09 crc kubenswrapper[4672]: I0930 12:23:09.775262 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bh5lq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96949d92-2365-41eb-8f62-c264c8328c02\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6caf59dcbea4e0beb0879de377fe92cfb0b3ff55668aee3f10822adacf97fd8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkb7v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bh5lq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-09-30T12:23:09Z is after 2025-08-24T17:21:41Z" Sep 30 12:23:09 crc kubenswrapper[4672]: I0930 12:23:09.791652 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ece716e9efed328b509480f11b1a6aa253fe23776bf45f49d4110cf3b4acbab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:23:09Z is after 2025-08-24T17:21:41Z" Sep 30 12:23:09 crc kubenswrapper[4672]: I0930 12:23:09.799897 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:09 crc kubenswrapper[4672]: I0930 12:23:09.799951 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:09 crc kubenswrapper[4672]: I0930 12:23:09.799962 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:09 crc kubenswrapper[4672]: I0930 12:23:09.799983 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:09 crc kubenswrapper[4672]: I0930 12:23:09.799998 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:09Z","lastTransitionTime":"2025-09-30T12:23:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:23:09 crc kubenswrapper[4672]: I0930 12:23:09.902533 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:09 crc kubenswrapper[4672]: I0930 12:23:09.902579 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:09 crc kubenswrapper[4672]: I0930 12:23:09.902589 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:09 crc kubenswrapper[4672]: I0930 12:23:09.902616 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:09 crc kubenswrapper[4672]: I0930 12:23:09.902636 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:09Z","lastTransitionTime":"2025-09-30T12:23:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:23:10 crc kubenswrapper[4672]: I0930 12:23:10.005133 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:10 crc kubenswrapper[4672]: I0930 12:23:10.005180 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:10 crc kubenswrapper[4672]: I0930 12:23:10.005191 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:10 crc kubenswrapper[4672]: I0930 12:23:10.005208 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:10 crc kubenswrapper[4672]: I0930 12:23:10.005221 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:10Z","lastTransitionTime":"2025-09-30T12:23:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:23:10 crc kubenswrapper[4672]: I0930 12:23:10.107721 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:10 crc kubenswrapper[4672]: I0930 12:23:10.107750 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:10 crc kubenswrapper[4672]: I0930 12:23:10.107757 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:10 crc kubenswrapper[4672]: I0930 12:23:10.107770 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:10 crc kubenswrapper[4672]: I0930 12:23:10.107779 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:10Z","lastTransitionTime":"2025-09-30T12:23:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:23:10 crc kubenswrapper[4672]: I0930 12:23:10.215909 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:10 crc kubenswrapper[4672]: I0930 12:23:10.215959 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:10 crc kubenswrapper[4672]: I0930 12:23:10.215970 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:10 crc kubenswrapper[4672]: I0930 12:23:10.215988 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:10 crc kubenswrapper[4672]: I0930 12:23:10.216001 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:10Z","lastTransitionTime":"2025-09-30T12:23:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:23:10 crc kubenswrapper[4672]: I0930 12:23:10.319347 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:10 crc kubenswrapper[4672]: I0930 12:23:10.319397 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:10 crc kubenswrapper[4672]: I0930 12:23:10.319408 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:10 crc kubenswrapper[4672]: I0930 12:23:10.319426 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:10 crc kubenswrapper[4672]: I0930 12:23:10.319438 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:10Z","lastTransitionTime":"2025-09-30T12:23:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:23:10 crc kubenswrapper[4672]: I0930 12:23:10.416961 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 12:23:10 crc kubenswrapper[4672]: E0930 12:23:10.417095 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 12:23:10 crc kubenswrapper[4672]: I0930 12:23:10.422016 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:10 crc kubenswrapper[4672]: I0930 12:23:10.422417 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:10 crc kubenswrapper[4672]: I0930 12:23:10.422437 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:10 crc kubenswrapper[4672]: I0930 12:23:10.422461 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:10 crc kubenswrapper[4672]: I0930 12:23:10.422479 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:10Z","lastTransitionTime":"2025-09-30T12:23:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:23:10 crc kubenswrapper[4672]: I0930 12:23:10.525798 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:10 crc kubenswrapper[4672]: I0930 12:23:10.525847 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:10 crc kubenswrapper[4672]: I0930 12:23:10.525857 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:10 crc kubenswrapper[4672]: I0930 12:23:10.525874 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:10 crc kubenswrapper[4672]: I0930 12:23:10.525885 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:10Z","lastTransitionTime":"2025-09-30T12:23:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:23:10 crc kubenswrapper[4672]: I0930 12:23:10.629458 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:10 crc kubenswrapper[4672]: I0930 12:23:10.629501 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:10 crc kubenswrapper[4672]: I0930 12:23:10.629512 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:10 crc kubenswrapper[4672]: I0930 12:23:10.629532 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:10 crc kubenswrapper[4672]: I0930 12:23:10.629548 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:10Z","lastTransitionTime":"2025-09-30T12:23:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:23:10 crc kubenswrapper[4672]: I0930 12:23:10.733671 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:10 crc kubenswrapper[4672]: I0930 12:23:10.733768 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:10 crc kubenswrapper[4672]: I0930 12:23:10.733785 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:10 crc kubenswrapper[4672]: I0930 12:23:10.733841 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:10 crc kubenswrapper[4672]: I0930 12:23:10.733860 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:10Z","lastTransitionTime":"2025-09-30T12:23:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:23:10 crc kubenswrapper[4672]: I0930 12:23:10.837530 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:10 crc kubenswrapper[4672]: I0930 12:23:10.837596 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:10 crc kubenswrapper[4672]: I0930 12:23:10.837621 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:10 crc kubenswrapper[4672]: I0930 12:23:10.837652 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:10 crc kubenswrapper[4672]: I0930 12:23:10.837676 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:10Z","lastTransitionTime":"2025-09-30T12:23:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:23:10 crc kubenswrapper[4672]: I0930 12:23:10.940609 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:10 crc kubenswrapper[4672]: I0930 12:23:10.940640 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:10 crc kubenswrapper[4672]: I0930 12:23:10.940647 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:10 crc kubenswrapper[4672]: I0930 12:23:10.940659 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:10 crc kubenswrapper[4672]: I0930 12:23:10.940668 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:10Z","lastTransitionTime":"2025-09-30T12:23:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:23:11 crc kubenswrapper[4672]: I0930 12:23:11.043562 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:11 crc kubenswrapper[4672]: I0930 12:23:11.043672 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:11 crc kubenswrapper[4672]: I0930 12:23:11.043693 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:11 crc kubenswrapper[4672]: I0930 12:23:11.043714 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:11 crc kubenswrapper[4672]: I0930 12:23:11.043726 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:11Z","lastTransitionTime":"2025-09-30T12:23:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:23:11 crc kubenswrapper[4672]: I0930 12:23:11.147441 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:11 crc kubenswrapper[4672]: I0930 12:23:11.147529 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:11 crc kubenswrapper[4672]: I0930 12:23:11.147570 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:11 crc kubenswrapper[4672]: I0930 12:23:11.147605 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:11 crc kubenswrapper[4672]: I0930 12:23:11.147617 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:11Z","lastTransitionTime":"2025-09-30T12:23:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:23:11 crc kubenswrapper[4672]: I0930 12:23:11.250857 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:11 crc kubenswrapper[4672]: I0930 12:23:11.250905 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:11 crc kubenswrapper[4672]: I0930 12:23:11.250915 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:11 crc kubenswrapper[4672]: I0930 12:23:11.250938 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:11 crc kubenswrapper[4672]: I0930 12:23:11.250948 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:11Z","lastTransitionTime":"2025-09-30T12:23:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:23:11 crc kubenswrapper[4672]: I0930 12:23:11.354791 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:11 crc kubenswrapper[4672]: I0930 12:23:11.354879 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:11 crc kubenswrapper[4672]: I0930 12:23:11.354892 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:11 crc kubenswrapper[4672]: I0930 12:23:11.354913 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:11 crc kubenswrapper[4672]: I0930 12:23:11.354926 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:11Z","lastTransitionTime":"2025-09-30T12:23:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:23:11 crc kubenswrapper[4672]: I0930 12:23:11.416000 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 12:23:11 crc kubenswrapper[4672]: I0930 12:23:11.416207 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 12:23:11 crc kubenswrapper[4672]: I0930 12:23:11.416210 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n7wwp" Sep 30 12:23:11 crc kubenswrapper[4672]: E0930 12:23:11.416421 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 12:23:11 crc kubenswrapper[4672]: E0930 12:23:11.416685 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 12:23:11 crc kubenswrapper[4672]: E0930 12:23:11.416936 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n7wwp" podUID="42618cd5-d9f9-45ba-8081-660ca47bebf4" Sep 30 12:23:11 crc kubenswrapper[4672]: I0930 12:23:11.458649 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:11 crc kubenswrapper[4672]: I0930 12:23:11.458696 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:11 crc kubenswrapper[4672]: I0930 12:23:11.458710 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:11 crc kubenswrapper[4672]: I0930 12:23:11.458729 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:11 crc kubenswrapper[4672]: I0930 12:23:11.458743 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:11Z","lastTransitionTime":"2025-09-30T12:23:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:23:11 crc kubenswrapper[4672]: I0930 12:23:11.561996 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:11 crc kubenswrapper[4672]: I0930 12:23:11.562042 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:11 crc kubenswrapper[4672]: I0930 12:23:11.562057 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:11 crc kubenswrapper[4672]: I0930 12:23:11.562077 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:11 crc kubenswrapper[4672]: I0930 12:23:11.562096 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:11Z","lastTransitionTime":"2025-09-30T12:23:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:23:11 crc kubenswrapper[4672]: I0930 12:23:11.664941 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:11 crc kubenswrapper[4672]: I0930 12:23:11.664987 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:11 crc kubenswrapper[4672]: I0930 12:23:11.665003 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:11 crc kubenswrapper[4672]: I0930 12:23:11.665026 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:11 crc kubenswrapper[4672]: I0930 12:23:11.665041 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:11Z","lastTransitionTime":"2025-09-30T12:23:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:23:11 crc kubenswrapper[4672]: I0930 12:23:11.768753 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:11 crc kubenswrapper[4672]: I0930 12:23:11.768805 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:11 crc kubenswrapper[4672]: I0930 12:23:11.768814 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:11 crc kubenswrapper[4672]: I0930 12:23:11.768831 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:11 crc kubenswrapper[4672]: I0930 12:23:11.768841 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:11Z","lastTransitionTime":"2025-09-30T12:23:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:23:11 crc kubenswrapper[4672]: I0930 12:23:11.871715 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:11 crc kubenswrapper[4672]: I0930 12:23:11.871774 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:11 crc kubenswrapper[4672]: I0930 12:23:11.871792 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:11 crc kubenswrapper[4672]: I0930 12:23:11.871817 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:11 crc kubenswrapper[4672]: I0930 12:23:11.871838 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:11Z","lastTransitionTime":"2025-09-30T12:23:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:23:11 crc kubenswrapper[4672]: I0930 12:23:11.974181 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:11 crc kubenswrapper[4672]: I0930 12:23:11.974227 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:11 crc kubenswrapper[4672]: I0930 12:23:11.974241 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:11 crc kubenswrapper[4672]: I0930 12:23:11.974338 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:11 crc kubenswrapper[4672]: I0930 12:23:11.974476 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:11Z","lastTransitionTime":"2025-09-30T12:23:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:23:12 crc kubenswrapper[4672]: I0930 12:23:12.077569 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:12 crc kubenswrapper[4672]: I0930 12:23:12.077640 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:12 crc kubenswrapper[4672]: I0930 12:23:12.077665 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:12 crc kubenswrapper[4672]: I0930 12:23:12.077698 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:12 crc kubenswrapper[4672]: I0930 12:23:12.077719 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:12Z","lastTransitionTime":"2025-09-30T12:23:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:23:12 crc kubenswrapper[4672]: I0930 12:23:12.182181 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:12 crc kubenswrapper[4672]: I0930 12:23:12.182230 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:12 crc kubenswrapper[4672]: I0930 12:23:12.182246 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:12 crc kubenswrapper[4672]: I0930 12:23:12.182295 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:12 crc kubenswrapper[4672]: I0930 12:23:12.182311 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:12Z","lastTransitionTime":"2025-09-30T12:23:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:23:12 crc kubenswrapper[4672]: I0930 12:23:12.285910 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:12 crc kubenswrapper[4672]: I0930 12:23:12.286379 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:12 crc kubenswrapper[4672]: I0930 12:23:12.286601 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:12 crc kubenswrapper[4672]: I0930 12:23:12.286848 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:12 crc kubenswrapper[4672]: I0930 12:23:12.287041 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:12Z","lastTransitionTime":"2025-09-30T12:23:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:23:12 crc kubenswrapper[4672]: I0930 12:23:12.390572 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:12 crc kubenswrapper[4672]: I0930 12:23:12.390645 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:12 crc kubenswrapper[4672]: I0930 12:23:12.390665 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:12 crc kubenswrapper[4672]: I0930 12:23:12.390694 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:12 crc kubenswrapper[4672]: I0930 12:23:12.390712 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:12Z","lastTransitionTime":"2025-09-30T12:23:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:23:12 crc kubenswrapper[4672]: I0930 12:23:12.416458 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 12:23:12 crc kubenswrapper[4672]: E0930 12:23:12.416620 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 12:23:12 crc kubenswrapper[4672]: I0930 12:23:12.493015 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:12 crc kubenswrapper[4672]: I0930 12:23:12.493088 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:12 crc kubenswrapper[4672]: I0930 12:23:12.493106 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:12 crc kubenswrapper[4672]: I0930 12:23:12.493133 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:12 crc kubenswrapper[4672]: I0930 12:23:12.493151 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:12Z","lastTransitionTime":"2025-09-30T12:23:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:23:12 crc kubenswrapper[4672]: I0930 12:23:12.602761 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:12 crc kubenswrapper[4672]: I0930 12:23:12.602925 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:12 crc kubenswrapper[4672]: I0930 12:23:12.602956 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:12 crc kubenswrapper[4672]: I0930 12:23:12.603714 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:12 crc kubenswrapper[4672]: I0930 12:23:12.603758 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:12Z","lastTransitionTime":"2025-09-30T12:23:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:23:12 crc kubenswrapper[4672]: I0930 12:23:12.705922 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:12 crc kubenswrapper[4672]: I0930 12:23:12.705971 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:12 crc kubenswrapper[4672]: I0930 12:23:12.705983 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:12 crc kubenswrapper[4672]: I0930 12:23:12.706000 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:12 crc kubenswrapper[4672]: I0930 12:23:12.706011 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:12Z","lastTransitionTime":"2025-09-30T12:23:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:23:12 crc kubenswrapper[4672]: I0930 12:23:12.809010 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:12 crc kubenswrapper[4672]: I0930 12:23:12.809082 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:12 crc kubenswrapper[4672]: I0930 12:23:12.809102 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:12 crc kubenswrapper[4672]: I0930 12:23:12.809131 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:12 crc kubenswrapper[4672]: I0930 12:23:12.809153 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:12Z","lastTransitionTime":"2025-09-30T12:23:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:23:12 crc kubenswrapper[4672]: I0930 12:23:12.912677 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:12 crc kubenswrapper[4672]: I0930 12:23:12.912730 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:12 crc kubenswrapper[4672]: I0930 12:23:12.912746 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:12 crc kubenswrapper[4672]: I0930 12:23:12.912771 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:12 crc kubenswrapper[4672]: I0930 12:23:12.912785 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:12Z","lastTransitionTime":"2025-09-30T12:23:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:23:13 crc kubenswrapper[4672]: I0930 12:23:13.016461 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:13 crc kubenswrapper[4672]: I0930 12:23:13.016544 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:13 crc kubenswrapper[4672]: I0930 12:23:13.016567 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:13 crc kubenswrapper[4672]: I0930 12:23:13.016600 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:13 crc kubenswrapper[4672]: I0930 12:23:13.016628 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:13Z","lastTransitionTime":"2025-09-30T12:23:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:23:13 crc kubenswrapper[4672]: I0930 12:23:13.119374 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:13 crc kubenswrapper[4672]: I0930 12:23:13.119440 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:13 crc kubenswrapper[4672]: I0930 12:23:13.119453 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:13 crc kubenswrapper[4672]: I0930 12:23:13.119474 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:13 crc kubenswrapper[4672]: I0930 12:23:13.119490 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:13Z","lastTransitionTime":"2025-09-30T12:23:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:23:13 crc kubenswrapper[4672]: I0930 12:23:13.222562 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:13 crc kubenswrapper[4672]: I0930 12:23:13.222637 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:13 crc kubenswrapper[4672]: I0930 12:23:13.222661 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:13 crc kubenswrapper[4672]: I0930 12:23:13.222697 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:13 crc kubenswrapper[4672]: I0930 12:23:13.222759 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:13Z","lastTransitionTime":"2025-09-30T12:23:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:23:13 crc kubenswrapper[4672]: I0930 12:23:13.325633 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:13 crc kubenswrapper[4672]: I0930 12:23:13.325702 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:13 crc kubenswrapper[4672]: I0930 12:23:13.325722 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:13 crc kubenswrapper[4672]: I0930 12:23:13.325746 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:13 crc kubenswrapper[4672]: I0930 12:23:13.325764 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:13Z","lastTransitionTime":"2025-09-30T12:23:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:23:13 crc kubenswrapper[4672]: I0930 12:23:13.416931 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 12:23:13 crc kubenswrapper[4672]: I0930 12:23:13.416941 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 12:23:13 crc kubenswrapper[4672]: I0930 12:23:13.417362 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n7wwp" Sep 30 12:23:13 crc kubenswrapper[4672]: E0930 12:23:13.417573 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 12:23:13 crc kubenswrapper[4672]: E0930 12:23:13.417652 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 12:23:13 crc kubenswrapper[4672]: E0930 12:23:13.417724 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-n7wwp" podUID="42618cd5-d9f9-45ba-8081-660ca47bebf4" Sep 30 12:23:13 crc kubenswrapper[4672]: I0930 12:23:13.417858 4672 scope.go:117] "RemoveContainer" containerID="b26be7f6864a66af6c7b4f7a000553fcfc870fb86c334481687a8a8bdfff54e8" Sep 30 12:23:13 crc kubenswrapper[4672]: I0930 12:23:13.428649 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:13 crc kubenswrapper[4672]: I0930 12:23:13.428711 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:13 crc kubenswrapper[4672]: I0930 12:23:13.428731 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:13 crc kubenswrapper[4672]: I0930 12:23:13.428757 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:13 crc kubenswrapper[4672]: I0930 12:23:13.428777 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:13Z","lastTransitionTime":"2025-09-30T12:23:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:23:13 crc kubenswrapper[4672]: I0930 12:23:13.534769 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:13 crc kubenswrapper[4672]: I0930 12:23:13.534811 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:13 crc kubenswrapper[4672]: I0930 12:23:13.534826 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:13 crc kubenswrapper[4672]: I0930 12:23:13.534844 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:13 crc kubenswrapper[4672]: I0930 12:23:13.534857 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:13Z","lastTransitionTime":"2025-09-30T12:23:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:23:13 crc kubenswrapper[4672]: I0930 12:23:13.638070 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:13 crc kubenswrapper[4672]: I0930 12:23:13.638156 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:13 crc kubenswrapper[4672]: I0930 12:23:13.638172 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:13 crc kubenswrapper[4672]: I0930 12:23:13.638203 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:13 crc kubenswrapper[4672]: I0930 12:23:13.638221 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:13Z","lastTransitionTime":"2025-09-30T12:23:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:23:13 crc kubenswrapper[4672]: I0930 12:23:13.741144 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:13 crc kubenswrapper[4672]: I0930 12:23:13.741236 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:13 crc kubenswrapper[4672]: I0930 12:23:13.741287 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:13 crc kubenswrapper[4672]: I0930 12:23:13.741308 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:13 crc kubenswrapper[4672]: I0930 12:23:13.741322 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:13Z","lastTransitionTime":"2025-09-30T12:23:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:23:13 crc kubenswrapper[4672]: I0930 12:23:13.845913 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:13 crc kubenswrapper[4672]: I0930 12:23:13.845976 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:13 crc kubenswrapper[4672]: I0930 12:23:13.845984 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:13 crc kubenswrapper[4672]: I0930 12:23:13.846000 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:13 crc kubenswrapper[4672]: I0930 12:23:13.846013 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:13Z","lastTransitionTime":"2025-09-30T12:23:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:23:13 crc kubenswrapper[4672]: I0930 12:23:13.949547 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:13 crc kubenswrapper[4672]: I0930 12:23:13.949741 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:13 crc kubenswrapper[4672]: I0930 12:23:13.949759 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:13 crc kubenswrapper[4672]: I0930 12:23:13.949775 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:13 crc kubenswrapper[4672]: I0930 12:23:13.949786 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:13Z","lastTransitionTime":"2025-09-30T12:23:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:23:13 crc kubenswrapper[4672]: I0930 12:23:13.951569 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nznsk_5da59bc9-84da-42f6-86e9-3399ecf31725/ovnkube-controller/2.log" Sep 30 12:23:13 crc kubenswrapper[4672]: I0930 12:23:13.954999 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nznsk" event={"ID":"5da59bc9-84da-42f6-86e9-3399ecf31725","Type":"ContainerStarted","Data":"fc00b1a05de9fc4d6f328ce800afd75975c71d8f64d79a84798a19dcd9882e63"} Sep 30 12:23:13 crc kubenswrapper[4672]: I0930 12:23:13.955462 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-nznsk" Sep 30 12:23:13 crc kubenswrapper[4672]: I0930 12:23:13.970501 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ztjlj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"25dc0f4c-76a8-49ff-bf68-c0718cfc3e62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5a3ba5101b35ce4981484637222a0c074b67e3998c273dc763ddbc6ca0c0c86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6ps67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ztjlj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:23:13Z is after 2025-08-24T17:21:41Z" Sep 30 12:23:13 crc kubenswrapper[4672]: I0930 12:23:13.987948 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-485qk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"db981c53-85b2-4b3c-b025-da893771308f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57231120d83ed4e95272ab2b0bac2146b066dde2b5ca3afe469a3dc36d6187ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2t8xk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://503616f93ea6b36f5e3c95c88471c6d06c281a4d0be102b6cb0edf6a1513f78c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2t8xk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-485qk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:23:13Z is after 2025-08-24T17:21:41Z" Sep 30 
12:23:14 crc kubenswrapper[4672]: I0930 12:23:14.002212 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-n7wwp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"42618cd5-d9f9-45ba-8081-660ca47bebf4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78cfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78cfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:25Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-n7wwp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:23:14Z is after 2025-08-24T17:21:41Z" Sep 30 12:23:14 crc kubenswrapper[4672]: I0930 12:23:14.017403 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e650bb6-f3f0-48f4-b169-1e56a907e88c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://824f50fb36d4c4aee43cc92af10bc43c441f2cee0ee736431dfdc52d254b819a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2786191730ceba9bbcf35f5355098bc52b09c37d20b923f7c6c65fd1b584cea0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e5b3d468173222d5ae851529dbc318b7d727ec4af468e4094ab08b4f29bf100\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://217947162b121381c06a210145feb0b9bd45d4ad595fcd83b1a566bb6448f0ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8d2b013e19f7ff8f3eb9c42338191b2da273ce85db8bcc6be8dd1f69e83c3be\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T12:22:10Z\\\",\\\"message\\\":\\\"falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0930 12:22:05.219065 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 12:22:05.219861 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-72394925/tls.crt::/tmp/serving-cert-72394925/tls.key\\\\\\\"\\\\nI0930 12:22:10.657058 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 12:22:10.659952 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 12:22:10.659972 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 12:22:10.659997 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 12:22:10.660003 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 12:22:10.667742 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0930 12:22:10.667775 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 12:22:10.667797 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 12:22:10.667801 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 12:22:10.667808 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 12:22:10.667812 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 12:22:10.667814 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 12:22:10.667817 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0930 12:22:10.670654 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T12:21:54Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c3f5020d6f8fba5f79d7659bedba02479794acb1f5f3d219802966c781fd10d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:52Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://089787496c46eb514e279677b1853010b21eac240fb059e43866714d2a6920e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://089787496c46eb514e279677b1853010b21eac240fb059e43866714d2a6920e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:21:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:21:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:21:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:23:14Z is after 2025-08-24T17:21:41Z" Sep 30 12:23:14 crc kubenswrapper[4672]: I0930 12:23:14.038907 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:23:14Z is after 2025-08-24T17:21:41Z" Sep 30 12:23:14 crc kubenswrapper[4672]: I0930 12:23:14.052165 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:14 crc kubenswrapper[4672]: I0930 12:23:14.052198 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:14 crc kubenswrapper[4672]: I0930 12:23:14.052209 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:14 crc kubenswrapper[4672]: I0930 12:23:14.052224 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:14 crc kubenswrapper[4672]: I0930 12:23:14.052236 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:14Z","lastTransitionTime":"2025-09-30T12:23:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:23:14 crc kubenswrapper[4672]: I0930 12:23:14.056125 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"95794952-d817-48f2-8956-f7a310f8d1d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a5a1de465a4e5deca994f640e87574ebd406be2d7a92c0e54ed8bece4a8e6d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnk64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba615a53d8f50a3891080fd06384bb6e00eca07025bb60fdf1764d9c1d4a5f76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnk64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dpqrd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:23:14Z is after 2025-08-24T17:21:41Z" Sep 30 12:23:14 crc kubenswrapper[4672]: I0930 12:23:14.072711 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8q82q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6806ff3c-ab3a-402e-b1c5-cc37c0810a65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:23:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:23:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f37b2a15da4c06842d9df2eabe13974fe3e8f8da3ffd7bc297b6f32f446dbc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://687d353cc41530a0cbee2c4a782b6ae10e9ee58aae0bfe183537c96851611e2e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T12:23:00Z\\\",\\\"message\\\":\\\"2025-09-30T12:22:14+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_3e4d0ad2-21ec-42dd-8da8-9cf62dde0193\\\\n2025-09-30T12:22:14+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_3e4d0ad2-21ec-42dd-8da8-9cf62dde0193 to /host/opt/cni/bin/\\\\n2025-09-30T12:22:15Z [verbose] multus-daemon started\\\\n2025-09-30T12:22:15Z [verbose] Readiness Indicator file check\\\\n2025-09-30T12:23:00Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:12Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:23:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pn7q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8q82q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:23:14Z is after 2025-08-24T17:21:41Z" Sep 30 12:23:14 crc kubenswrapper[4672]: I0930 12:23:14.096839 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-89fj9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"72003fad-a0fd-4493-9f3b-6efdab22d14c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd26dcf24863265c55a06cfc788e84363473a6ca5f1b835fa5dbb683ebb82d68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31087335de8b3b895d49c4677350e9ab6cfa0651188653dab05e7eafdab3d05f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31087335de8b3b895d49c4677350e9ab6cfa0651188653dab05e7eafdab3d05f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a607f773b81f6bdde800c2a36627d7f84602e6c3c777f6531391668dd7c08e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2a607f773b81f6bdde800c2a36627d7f84602e6c3c777f6531391668dd7c08e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:22:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70fc6887f3dde42017d03aa3f247ca4eda56f93c3ec54a95ceabcc1c1b4651d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70fc6887f3dde42017d03aa3f247ca4eda56f93c3ec54a95ceabcc1c1b4651d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:22:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://95e6805719bbdb9859b827860fb87ed979a4397061f76b433189f463c1e7cc39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://95e6805719bbdb9859b827860fb87ed979a4397061f76b433189f463c1e7cc39\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:22:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85068f88c57bccc42cc7234d13298a30c574379f4a212059a9924553fce8620e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85068f88c57bccc42cc7234d13298a30c574379f4a212059a9924553fce8620e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:22:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4355ef186193e8e0d9344faf8975065f9cd4576e7f6c79054670fda9acf3646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4355ef186193e8e0d9344faf8975065f9cd4576e7f6c79054670fda9acf3646\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:22:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-89fj9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:23:14Z is after 2025-08-24T17:21:41Z" Sep 30 12:23:14 crc kubenswrapper[4672]: I0930 12:23:14.118383 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8095997-9982-47c1-850a-260c9e369680\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f437372ebc379d35a2232ab47422ed6127cd29b0d752488dca512f67407d222\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c905db0424e34f487d2db657676de97a3e323ef4c9fbed5d25929164476f62bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb1c08c9fe886c57b5abe5091fa9845022bb33eefd97da1bf595d4b32016dfc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a9e2b46f485ed7a73177e89960925d6b0f9945bf0b7b5604a5b5e84d40f74e3\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a9e2b46f485ed7a73177e89960925d6b0f9945bf0b7b5604a5b5e84d40f74e3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:21:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:21:50Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:21:49Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:23:14Z is after 2025-08-24T17:21:41Z" Sep 30 12:23:14 crc kubenswrapper[4672]: I0930 12:23:14.128724 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1404e16a-17d2-446b-8708-44d4b41c9f96\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2424d1bc8a625cebd192f83733dc4bc01056e6166c918d21ded15aa3843b34d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://184b9352d3463790085ac81457e5ad33d08ea4663346d96f4c4034dcca6a7dd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://184b9352d3463790085ac81457e5ad33d08ea4663346d96f4c4034dcca6a7dd5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:21:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:21:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:21:49Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:23:14Z is after 2025-08-24T17:21:41Z" Sep 30 12:23:14 crc kubenswrapper[4672]: I0930 12:23:14.154817 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:14 crc kubenswrapper[4672]: I0930 12:23:14.154862 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:14 crc kubenswrapper[4672]: I0930 12:23:14.154872 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:14 crc kubenswrapper[4672]: I0930 12:23:14.154887 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:14 crc kubenswrapper[4672]: I0930 12:23:14.154898 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:14Z","lastTransitionTime":"2025-09-30T12:23:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:23:14 crc kubenswrapper[4672]: I0930 12:23:14.157610 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4339dd7c-5112-40c5-9b58-050ce364a2b9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7db3193c2afadb91d3220e67720ff33bc959f9c50cec6f243144a6e2a2e57e04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5f54820ce67d6e8f397d068e1fadcb4577d6685406812c4e29fe0274612697d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1351a4e8380fea9edb834239053b57384a50add562afe59e67bbd7159bf1bf6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b9741c5edaae364733b400bebf21a4674ec206194908ec6a942317235c62a14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://775be30891ac2947ef160e3fe75abad42b78fc39e35414f27756ce086fed4f7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27513f0766801d5155c6e5c376a2a52241030eca2171b3218710ca6ee006a0b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27513f0766801d5155c6e5c376a2a52241030eca2171b3218710ca6ee006a0b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:21:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:21:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0afdc82c21601c37c7d9b4897273ade1e8e35bd093d63df9a63e93a9c0c6228c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0afdc82c21601c37c7d9b4897273ade1e8e35bd093d63df9a63e93a9c0c6228c\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:21:51Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://52fa5ee34bc8e9e381d4288d30fce5db76ed5bde6f199ec13fe443481ed4c8e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52fa5ee34bc8e9e381d4288d30fce5db76ed5bde6f199ec13fe443481ed4c8e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:21:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:21:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:21:49Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:23:14Z is after 2025-08-24T17:21:41Z" Sep 30 12:23:14 crc kubenswrapper[4672]: I0930 12:23:14.185180 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:23:14Z is after 2025-08-24T17:21:41Z" Sep 30 12:23:14 crc kubenswrapper[4672]: I0930 12:23:14.201604 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bh5lq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96949d92-2365-41eb-8f62-c264c8328c02\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6caf59dcbea4e0beb0879de377fe92cfb0b3ff55668aee3f10822adacf97fd8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkb7v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bh5lq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-09-30T12:23:14Z is after 2025-08-24T17:21:41Z" Sep 30 12:23:14 crc kubenswrapper[4672]: I0930 12:23:14.223969 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ece716e9efed328b509480f11b1a6aa253fe23776bf45f49d4110cf3b4acbab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:23:14Z is after 2025-08-24T17:21:41Z" Sep 30 12:23:14 crc kubenswrapper[4672]: I0930 12:23:14.238583 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"763dda13-8721-4cce-8b21-aa1f1b190978\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab6412d7fdf1976c75ccfc72d05f00e2997d72f3971c73cd0887e038d1cbc059\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5575589d0517d5a1c285c3e9e566fb5cd9b9591a1b311a4bb80e595e44044c33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b656dc1ad8af099828eaaeb654e6d52dbbd057023dd07583a6114316670aa334\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5082c3fa78648584ab9a29b4ab239ce0d52411666ca3fa598e9c6745b8ea067e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:21:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:23:14Z is after 2025-08-24T17:21:41Z" Sep 30 12:23:14 crc kubenswrapper[4672]: I0930 12:23:14.255857 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2761ae4b8b1c7e1d00ee269219ff0ab04aedce9e1bddc6e0a029c496ee89c0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:23:14Z is after 2025-08-24T17:21:41Z" Sep 30 12:23:14 crc kubenswrapper[4672]: I0930 12:23:14.257161 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:14 crc kubenswrapper[4672]: I0930 12:23:14.257188 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:14 crc kubenswrapper[4672]: I0930 12:23:14.257197 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:14 crc kubenswrapper[4672]: I0930 12:23:14.257215 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:14 crc kubenswrapper[4672]: I0930 12:23:14.257241 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:14Z","lastTransitionTime":"2025-09-30T12:23:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:23:14 crc kubenswrapper[4672]: I0930 12:23:14.267128 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:23:14Z is after 2025-08-24T17:21:41Z" Sep 30 12:23:14 crc kubenswrapper[4672]: I0930 12:23:14.277948 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://534671434fa48a07e7836049331af60307cf651f4ddde83856ad98e006b32040\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfb0bceb3e99196adb442abedd5215938781b908c2aa92178c9db76672664620\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/
webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:23:14Z is after 2025-08-24T17:21:41Z" Sep 30 12:23:14 crc kubenswrapper[4672]: I0930 12:23:14.296662 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nznsk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5da59bc9-84da-42f6-86e9-3399ecf31725\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8e8be814fc5bfb8db3632cda1fcf749a02b642c8e7ff479e59ce5d9f987a756\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2131429928dbbd60e2d17b7ada4689ff5d161d1df921d74a8b9f71df9e11899\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://114d67fea982a6c6475e1925f670ba7dae14ace593022f77ef7c6441dde49687\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b73fba4d62613a1c27b69a74ebcca91fa6ea43a5267efddf783bd0d243d711f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e10db0132d159755257aef8c0cee40e5de4a0d3727ce59883e1ff90650e4f35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb7a32827707f8654d36f56fef7e81aff6b80550ef51f7c090a37f53a0e53406\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc00b1a05de9fc4d6f328ce800afd75975c71d8f
64d79a84798a19dcd9882e63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b26be7f6864a66af6c7b4f7a000553fcfc870fb86c334481687a8a8bdfff54e8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T12:22:43Z\\\",\\\"message\\\":\\\"s server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0930 12:22:43.375122 6393 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-controller-manager-operator/metrics\\\\\\\"}\\\\nI0930 12:22:43.375136 6393 services_controller.go:360] Finished syncing service metrics on namespace openshift-controller-manager-operator for network=default : 1.633551ms\\\\nI0930 12:22:43.375146 6393 services_controller.go:356] Processing sync for service openshift-ingress-operator/metrics for network=default\\\\nI0930 12:22:43.374866 6393 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-config-operator/metrics]} name:Service_openshift-config-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.161:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f32857b5-f652-4313-a0d7-455c3156dd99}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0930 12:22:43.375233 6393 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:42Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:23:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7058516ce3ff1d1d2929107a66d739fccb24b4f5589a11510ab42fe728a33196\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\
"containerID\\\":\\\"cri-o://4aa532f2ebe19e36977596bd41e8b116c84bcd7d5cd3470986e3192d4d62dea7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4aa532f2ebe19e36977596bd41e8b116c84bcd7d5cd3470986e3192d4d62dea7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nznsk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:23:14Z is after 2025-08-24T17:21:41Z" Sep 30 12:23:14 crc kubenswrapper[4672]: I0930 12:23:14.359309 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:14 crc kubenswrapper[4672]: I0930 12:23:14.359348 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:14 crc kubenswrapper[4672]: I0930 12:23:14.359359 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:14 crc kubenswrapper[4672]: I0930 12:23:14.359376 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:14 crc kubenswrapper[4672]: I0930 12:23:14.359388 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:14Z","lastTransitionTime":"2025-09-30T12:23:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:23:14 crc kubenswrapper[4672]: I0930 12:23:14.417040 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 12:23:14 crc kubenswrapper[4672]: E0930 12:23:14.417156 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 12:23:14 crc kubenswrapper[4672]: I0930 12:23:14.461987 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:14 crc kubenswrapper[4672]: I0930 12:23:14.462057 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:14 crc kubenswrapper[4672]: I0930 12:23:14.462068 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:14 crc kubenswrapper[4672]: I0930 12:23:14.462091 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:14 crc kubenswrapper[4672]: I0930 12:23:14.462104 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:14Z","lastTransitionTime":"2025-09-30T12:23:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:23:14 crc kubenswrapper[4672]: I0930 12:23:14.565046 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:14 crc kubenswrapper[4672]: I0930 12:23:14.565093 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:14 crc kubenswrapper[4672]: I0930 12:23:14.565104 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:14 crc kubenswrapper[4672]: I0930 12:23:14.565119 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:14 crc kubenswrapper[4672]: I0930 12:23:14.565132 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:14Z","lastTransitionTime":"2025-09-30T12:23:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:23:14 crc kubenswrapper[4672]: I0930 12:23:14.667037 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:14 crc kubenswrapper[4672]: I0930 12:23:14.667087 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:14 crc kubenswrapper[4672]: I0930 12:23:14.667099 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:14 crc kubenswrapper[4672]: I0930 12:23:14.667117 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:14 crc kubenswrapper[4672]: I0930 12:23:14.667134 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:14Z","lastTransitionTime":"2025-09-30T12:23:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:23:14 crc kubenswrapper[4672]: I0930 12:23:14.769399 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:14 crc kubenswrapper[4672]: I0930 12:23:14.769444 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:14 crc kubenswrapper[4672]: I0930 12:23:14.769454 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:14 crc kubenswrapper[4672]: I0930 12:23:14.769468 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:14 crc kubenswrapper[4672]: I0930 12:23:14.769480 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:14Z","lastTransitionTime":"2025-09-30T12:23:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:23:14 crc kubenswrapper[4672]: I0930 12:23:14.872225 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:14 crc kubenswrapper[4672]: I0930 12:23:14.872316 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:14 crc kubenswrapper[4672]: I0930 12:23:14.872327 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:14 crc kubenswrapper[4672]: I0930 12:23:14.872347 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:14 crc kubenswrapper[4672]: I0930 12:23:14.872358 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:14Z","lastTransitionTime":"2025-09-30T12:23:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:23:14 crc kubenswrapper[4672]: I0930 12:23:14.960600 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nznsk_5da59bc9-84da-42f6-86e9-3399ecf31725/ovnkube-controller/3.log" Sep 30 12:23:14 crc kubenswrapper[4672]: I0930 12:23:14.961329 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nznsk_5da59bc9-84da-42f6-86e9-3399ecf31725/ovnkube-controller/2.log" Sep 30 12:23:14 crc kubenswrapper[4672]: I0930 12:23:14.964323 4672 generic.go:334] "Generic (PLEG): container finished" podID="5da59bc9-84da-42f6-86e9-3399ecf31725" containerID="fc00b1a05de9fc4d6f328ce800afd75975c71d8f64d79a84798a19dcd9882e63" exitCode=1 Sep 30 12:23:14 crc kubenswrapper[4672]: I0930 12:23:14.964392 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nznsk" event={"ID":"5da59bc9-84da-42f6-86e9-3399ecf31725","Type":"ContainerDied","Data":"fc00b1a05de9fc4d6f328ce800afd75975c71d8f64d79a84798a19dcd9882e63"} Sep 30 12:23:14 crc kubenswrapper[4672]: I0930 12:23:14.964449 4672 scope.go:117] "RemoveContainer" containerID="b26be7f6864a66af6c7b4f7a000553fcfc870fb86c334481687a8a8bdfff54e8" Sep 30 12:23:14 crc kubenswrapper[4672]: I0930 12:23:14.965542 4672 scope.go:117] "RemoveContainer" containerID="fc00b1a05de9fc4d6f328ce800afd75975c71d8f64d79a84798a19dcd9882e63" Sep 30 12:23:14 crc kubenswrapper[4672]: E0930 12:23:14.965840 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-nznsk_openshift-ovn-kubernetes(5da59bc9-84da-42f6-86e9-3399ecf31725)\"" pod="openshift-ovn-kubernetes/ovnkube-node-nznsk" podUID="5da59bc9-84da-42f6-86e9-3399ecf31725" Sep 30 12:23:14 crc kubenswrapper[4672]: I0930 12:23:14.975102 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:14 crc kubenswrapper[4672]: I0930 12:23:14.975141 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:14 crc kubenswrapper[4672]: I0930 12:23:14.975155 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:14 crc kubenswrapper[4672]: I0930 12:23:14.975175 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:14 crc kubenswrapper[4672]: I0930 12:23:14.975189 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:14Z","lastTransitionTime":"2025-09-30T12:23:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:23:14 crc kubenswrapper[4672]: I0930 12:23:14.982584 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:23:14Z is after 2025-08-24T17:21:41Z" Sep 30 12:23:14 crc kubenswrapper[4672]: I0930 12:23:14.996334 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://534671434fa48a07e7836049331af60307cf651f4ddde83856ad98e006b32040\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfb0bceb3e99196adb442abedd5215938781b908c2aa92178c9db76672664620\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:23:14Z is after 2025-08-24T17:21:41Z" Sep 30 12:23:15 crc kubenswrapper[4672]: I0930 12:23:15.018725 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nznsk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5da59bc9-84da-42f6-86e9-3399ecf31725\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8e8be814fc5bfb8db3632cda1fcf749a02b642c8e7ff479e59ce5d9f987a756\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2131429928dbbd60e2d17b7ada4689ff5d161d1df921d74a8b9f71df9e11899\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://114d67fea982a6c6475e1925f670ba7dae14ace593022f77ef7c6441dde49687\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b73fba4d62613a1c27b69a74ebcca91fa6ea43a5267efddf783bd0d243d711f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e10db0132d159755257aef8c0cee40e5de4a0d3727ce59883e1ff90650e4f35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb7a32827707f8654d36f56fef7e81aff6b80550ef51f7c090a37f53a0e53406\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc00b1a05de9fc4d6f328ce800afd75975c71d8f64d79a84798a19dcd9882e63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b26be7f6864a66af6c7b4f7a000553fcfc870fb86c334481687a8a8bdfff54e8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T12:22:43Z\\\",\\\"message\\\":\\\"s server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0930 12:22:43.375122 6393 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-controller-manager-operator/metrics\\\\\\\"}\\\\nI0930 12:22:43.375136 6393 services_controller.go:360] Finished syncing service metrics on namespace openshift-controller-manager-operator for network=default : 1.633551ms\\\\nI0930 12:22:43.375146 6393 services_controller.go:356] Processing sync for service openshift-ingress-operator/metrics for network=default\\\\nI0930 12:22:43.374866 6393 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-config-operator/metrics]} name:Service_openshift-config-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.161:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f32857b5-f652-4313-a0d7-455c3156dd99}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0930 12:22:43.375233 6393 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:42Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc00b1a05de9fc4d6f328ce800afd75975c71d8f64d79a84798a19dcd9882e63\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T12:23:14Z\\\",\\\"message\\\":\\\"usterIP,ExternalIPs:[],SessionAffinity:None,LoadBalancerIP:,LoadBalancerSourceRanges:[],ExternalName:,ExternalTrafficPolicy:,HealthCheckNodePort:0,PublishNotReadyAddresses:false,SessionAffinityConfig:nil,IPFamilyPolicy:*SingleStack,ClusterIPs:[10.217.4.183],IPFamilies:[IPv4],AllocateLoadBalancerNodePorts:nil,LoadBalancerClass:nil,InternalTrafficPolicy:*Cluster,TrafficDistribution:nil,},Status:ServiceStatus{LoadBalancer:LoadBalancerStatus{Ingress:[]LoadBalancerIngress{},},Conditions:[]Condition{},},}\\\\nI0930 12:23:14.610518 6809 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf after 0 failed attempt(s)\\\\nI0930 12:23:14.610520 6809 ovn.go:134] Ensuring zone local for Pod openshift-network-node-identity/network-node-identity-vrzqb in node crc\\\\nI0930 12:23:14.610530 6809 default_network_controller.go:776] Recording success event on pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI0930 12:23:14.610528 6809 ovnkube_controller.go:1292] Config duration recorder: kind/namespace/name service/openshift-ovn-kubernetes/ovn-kubernetes-control-plane. OVN-Kubernetes controller took 0.120054282 seconds. No OVN measurement.\\\\nI0930 12:23:14.610460 6809 ovn.go:134] Ensuring zone local for Pod openshift-network-operator/iptables-alerter-4ln5h in node crc\\\\nI0930 12:23:14.610588 
6\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T12:23:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7058516ce3ff1d1d2929107a66d739fccb24b4f5589a11510ab42fe728a33196\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4aa532f2ebe19e36977596bd41e8b116c84bcd7d5cd3470986e3192d4d62dea7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d209
9482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4aa532f2ebe19e36977596bd41e8b116c84bcd7d5cd3470986e3192d4d62dea7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nznsk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:23:15Z is after 2025-08-24T17:21:41Z" Sep 30 12:23:15 crc kubenswrapper[4672]: I0930 12:23:15.034770 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"763dda13-8721-4cce-8b21-aa1f1b190978\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab6412d7fdf1976c75ccfc72d05f00e2997d72f3971c73cd0887e038d1cbc059\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5575589d0517d5a1c285c3e9e566fb5cd9b9591a1b311a4bb80e595e44044c33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba
8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b656dc1ad8af099828eaaeb654e6d52dbbd057023dd07583a6114316670aa334\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5082c3fa78648584ab9a29b4ab239ce0d52411666ca3fa598e9c6745b8ea067e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:21:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:23:15Z is after 2025-08-24T17:21:41Z" Sep 30 12:23:15 crc kubenswrapper[4672]: I0930 12:23:15.050978 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2761ae4b8b1c7e1d00ee269219ff0ab04aedce9e1bddc6e0a029c496ee89c0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:23:15Z is after 2025-08-24T17:21:41Z" Sep 30 12:23:15 crc kubenswrapper[4672]: I0930 12:23:15.063182 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"95794952-d817-48f2-8956-f7a310f8d1d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a5a1de465a4e5deca994f640e87574ebd406be2d7a92c0e54ed8bece4a8e6d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnk64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba615a53d8f50a3891080fd06384bb6e00eca07025bb60fdf1764d9c1d4a5f76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnk64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dpqrd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:23:15Z is after 2025-08-24T17:21:41Z" Sep 30 12:23:15 crc kubenswrapper[4672]: I0930 12:23:15.077694 4672 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:15 crc kubenswrapper[4672]: I0930 12:23:15.077740 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:15 crc kubenswrapper[4672]: I0930 12:23:15.077753 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:15 crc kubenswrapper[4672]: I0930 12:23:15.077970 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:15 crc kubenswrapper[4672]: I0930 12:23:15.078220 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:15Z","lastTransitionTime":"2025-09-30T12:23:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:23:15 crc kubenswrapper[4672]: I0930 12:23:15.079688 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8q82q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6806ff3c-ab3a-402e-b1c5-cc37c0810a65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:23:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:23:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f37b2a15da4c06842d9df2eabe13974fe3e8f8da3ffd7bc297b6f32f446dbc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://687d353cc41530a0cbee2c4a782b6ae10e9ee58aae0bfe183537c96851611e2e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T12:23:00Z\\\",\\\"message\\\":\\\"2025-09-30T12:22:14+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_3e4d0ad2-21ec-42dd-8da8-9cf62dde0193\\\\n2025-09-30T12:22:14+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_3e4d0ad2-21ec-42dd-8da8-9cf62dde0193 to /host/opt/cni/bin/\\\\n2025-09-30T12:22:15Z [verbose] multus-daemon started\\\\n2025-09-30T12:22:15Z [verbose] Readiness Indicator file check\\\\n2025-09-30T12:23:00Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:12Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:23:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pn7q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8q82q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:23:15Z is after 2025-08-24T17:21:41Z" Sep 30 12:23:15 crc kubenswrapper[4672]: I0930 12:23:15.099162 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-89fj9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"72003fad-a0fd-4493-9f3b-6efdab22d14c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd26dcf24863265c55a06cfc788e84363473a6ca5f1b835fa5dbb683ebb82d68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31087335de8b3b895d49c4677350e9ab6cfa0651188653dab05e7eafdab3d05f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31087335de8b3b895d49c4677350e9ab6cfa0651188653dab05e7eafdab3d05f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a607f773b81f6bdde800c2a36627d7f84602e6c3c777f6531391668dd7c08e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2a607f773b81f6bdde800c2a36627d7f84602e6c3c777f6531391668dd7c08e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:22:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70fc6887f3dde42017d03aa3f247ca4eda56f93c3ec54a95ceabcc1c1b4651d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70fc6887f3dde42017d03aa3f247ca4eda56f93c3ec54a95ceabcc1c1b4651d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:22:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://95e6805719bbdb9859b827860fb87ed979a4397061f76b433189f463c1e7cc39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://95e6805719bbdb9859b827860fb87ed979a4397061f76b433189f463c1e7cc39\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:22:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85068f88c57bccc42cc7234d13298a30c574379f4a212059a9924553fce8620e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85068f88c57bccc42cc7234d13298a30c574379f4a212059a9924553fce8620e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:22:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4355ef186193e8e0d9344faf8975065f9cd4576e7f6c79054670fda9acf3646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4355ef186193e8e0d9344faf8975065f9cd4576e7f6c79054670fda9acf3646\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:22:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-89fj9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:23:15Z is after 2025-08-24T17:21:41Z" Sep 30 12:23:15 crc kubenswrapper[4672]: I0930 12:23:15.112083 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ztjlj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"25dc0f4c-76a8-49ff-bf68-c0718cfc3e62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5a3ba5101b35ce4981484637222a0c074b67e3998c273dc763ddbc6ca0c0c86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6ps67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ztjlj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:23:15Z is after 2025-08-24T17:21:41Z" Sep 30 12:23:15 crc kubenswrapper[4672]: I0930 12:23:15.124633 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-485qk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"db981c53-85b2-4b3c-b025-da893771308f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57231120d83ed4e95272ab2b0bac2146b066dde2b5ca3afe469a3dc36d6187ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2t8xk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://503616f93ea6b36f5e3c95c88471c6d06c281a4d0be102b6cb0edf6a1513f78c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2t8xk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-485qk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:23:15Z is after 2025-08-24T17:21:41Z" Sep 30 
12:23:15 crc kubenswrapper[4672]: I0930 12:23:15.137619 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-n7wwp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"42618cd5-d9f9-45ba-8081-660ca47bebf4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78cfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78cfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:25Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-n7wwp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:23:15Z is after 2025-08-24T17:21:41Z" Sep 30 12:23:15 crc kubenswrapper[4672]: I0930 12:23:15.152531 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e650bb6-f3f0-48f4-b169-1e56a907e88c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://824f50fb36d4c4aee43cc92af10bc43c441f2cee0ee736431dfdc52d254b819a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2786191730ceba9bbcf35f5355098bc52b09c37d20b923f7c6c65fd1b584cea0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e5b3d468173222d5ae851529dbc318b7d727ec4af468e4094ab08b4f29bf100\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://217947162b121381c06a210145feb0b9bd45d4ad595fcd83b1a566bb6448f0ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8d2b013e19f7ff8f3eb9c42338191b2da273ce85db8bcc6be8dd1f69e83c3be\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T12:22:10Z\\\",\\\"message\\\":\\\"falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0930 12:22:05.219065 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 12:22:05.219861 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-72394925/tls.crt::/tmp/serving-cert-72394925/tls.key\\\\\\\"\\\\nI0930 12:22:10.657058 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 12:22:10.659952 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 12:22:10.659972 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 12:22:10.659997 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 12:22:10.660003 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 12:22:10.667742 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0930 12:22:10.667775 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 12:22:10.667797 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 12:22:10.667801 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 12:22:10.667808 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 12:22:10.667812 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 12:22:10.667814 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 12:22:10.667817 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0930 12:22:10.670654 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T12:21:54Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c3f5020d6f8fba5f79d7659bedba02479794acb1f5f3d219802966c781fd10d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:52Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://089787496c46eb514e279677b1853010b21eac240fb059e43866714d2a6920e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://089787496c46eb514e279677b1853010b21eac240fb059e43866714d2a6920e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:21:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:21:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:21:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:23:15Z is after 2025-08-24T17:21:41Z" Sep 30 12:23:15 crc kubenswrapper[4672]: I0930 12:23:15.167180 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:23:15Z is after 2025-08-24T17:21:41Z" Sep 30 12:23:15 crc kubenswrapper[4672]: I0930 12:23:15.181130 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:15 crc kubenswrapper[4672]: I0930 12:23:15.181188 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:15 crc kubenswrapper[4672]: I0930 12:23:15.181203 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:15 crc kubenswrapper[4672]: I0930 12:23:15.181225 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:15 crc kubenswrapper[4672]: I0930 12:23:15.181238 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:15Z","lastTransitionTime":"2025-09-30T12:23:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Sep 30 12:23:15 crc kubenswrapper[4672]: I0930 12:23:15.190162 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4339dd7c-5112-40c5-9b58-050ce364a2b9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7db3193c2afadb91d3220e67720ff33bc959f9c50cec6f243144a6e2a2e57e04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5f54820ce67d6e8f397d068e1fadcb4577d6685406812c4e29fe0274612697d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1351a4e8380fea9edb834239053b57384a50add562afe59e67bbd7159bf1bf6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b9741c5edaae364733b400bebf21a4674ec206194908ec6a942317235c62a14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://775be30891ac2947ef160e3fe75abad42b78fc39e35414f27756ce086fed4f7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27513f0766801d5155c6e5c376a2a52241030eca2171b3218710ca6ee006a0b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27513f0766801d5155c6e5c376a2a52241030eca2171b3218710ca6ee006a0b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:21:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:21:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0afdc82c21601c37c7d9b4897273ade1e8e35bd093d63df9a63e93a9c0c6228c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0afdc82c21601c37c7d9b4897273ade1e8e35bd093d63df9a63e93a9c0c6228c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:21:51Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://52fa5ee34bc8e9e381d4288d30fce5db76ed5bde6f199ec13fe443481ed4c8e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52fa5ee34bc8e9e381d4288d30fce5db76ed5bde6f199ec13fe443481ed4c8e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:21:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:21:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:21:49Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:23:15Z is after 2025-08-24T17:21:41Z"
Sep 30 12:23:15 crc kubenswrapper[4672]: I0930 12:23:15.207762 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted.
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:23:15Z is after 2025-08-24T17:21:41Z" Sep 30 12:23:15 crc kubenswrapper[4672]: I0930 12:23:15.218150 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bh5lq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96949d92-2365-41eb-8f62-c264c8328c02\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6caf59dcbea4e0beb0879de377fe92cfb0b3ff55668aee3f10822adacf97fd8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkb7v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bh5lq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-09-30T12:23:15Z is after 2025-08-24T17:21:41Z"
Sep 30 12:23:15 crc kubenswrapper[4672]: I0930 12:23:15.228525 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8095997-9982-47c1-850a-260c9e369680\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f437372ebc379d35a2232ab47422ed6127cd29b0d752488dca512f67407d222\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c905db0424e34f487d2db657676de97a3e323ef4c9fbed5d25929164476f62bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb1c08c9fe886c57b5abe5091fa9845022bb33eefd97da1bf595d4b32016dfc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a9e2b46f485ed7a73177e89960925d6b0f9945bf0b7b5604a5b5e84d40f74e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a9e2b46f485ed7a73177e89960925d6b0f9945bf0b7b5604a5b5e84d40f74e3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:21:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:21:50Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:21:49Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:23:15Z is after 2025-08-24T17:21:41Z"
Sep 30 12:23:15 crc kubenswrapper[4672]: I0930 12:23:15.239398 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1404e16a-17d2-446b-8708-44d4b41c9f96\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2424d1bc8a625cebd192f83733dc4bc01056e6166c918d21ded15aa3843b34d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://184b9352d3463790085ac81457e5ad33d08ea4663346d96f4c4034dcca6a7dd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://184b9352d3463790085ac81457e5ad33d08ea4663346d96f4c4034dcca6a7dd5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:21:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:21:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:21:49Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:23:15Z is after 2025-08-24T17:21:41Z"
Sep 30 12:23:15 crc kubenswrapper[4672]: I0930 12:23:15.250150 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ece716e9efed328b509480f11b1a6aa253fe23776bf45f49d4110cf3b4acbab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed
to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:23:15Z is after 2025-08-24T17:21:41Z" Sep 30 12:23:15 crc kubenswrapper[4672]: I0930 12:23:15.283937 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:15 crc kubenswrapper[4672]: I0930 12:23:15.283979 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:15 crc kubenswrapper[4672]: I0930 12:23:15.283990 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:15 crc kubenswrapper[4672]: I0930 12:23:15.284010 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:15 crc kubenswrapper[4672]: I0930 12:23:15.284023 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:15Z","lastTransitionTime":"2025-09-30T12:23:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:23:15 crc kubenswrapper[4672]: I0930 12:23:15.287405 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 12:23:15 crc kubenswrapper[4672]: I0930 12:23:15.287536 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 12:23:15 crc kubenswrapper[4672]: I0930 12:23:15.287617 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 12:23:15 crc kubenswrapper[4672]: I0930 12:23:15.287654 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 12:23:15 crc kubenswrapper[4672]: E0930 12:23:15.287690 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 12:24:19.287673231 +0000 UTC m=+150.556910877 (durationBeforeRetry 1m4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 12:23:15 crc kubenswrapper[4672]: I0930 12:23:15.287725 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 12:23:15 crc kubenswrapper[4672]: E0930 12:23:15.287690 4672 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 30 12:23:15 crc kubenswrapper[4672]: E0930 12:23:15.287778 4672 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 30 12:23:15 crc kubenswrapper[4672]: E0930 12:23:15.287800 4672 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 30 12:23:15 crc kubenswrapper[4672]: E0930 12:23:15.287816 4672 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 12:23:15 crc kubenswrapper[4672]: E0930 12:23:15.287748 4672 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Sep 30 12:23:15 crc kubenswrapper[4672]: E0930 12:23:15.287803 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-30 12:24:19.287793944 +0000 UTC m=+150.557031710 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 30 12:23:15 crc kubenswrapper[4672]: E0930 12:23:15.287822 4672 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 30 12:23:15 crc kubenswrapper[4672]: E0930 12:23:15.287883 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. 
No retries permitted until 2025-09-30 12:24:19.287860976 +0000 UTC m=+150.557098622 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 12:23:15 crc kubenswrapper[4672]: E0930 12:23:15.287889 4672 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 30 12:23:15 crc kubenswrapper[4672]: E0930 12:23:15.287900 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-30 12:24:19.287890526 +0000 UTC m=+150.557128282 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Sep 30 12:23:15 crc kubenswrapper[4672]: E0930 12:23:15.287901 4672 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 12:23:15 crc kubenswrapper[4672]: E0930 12:23:15.287939 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-09-30 12:24:19.287928337 +0000 UTC m=+150.557166063 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 12:23:15 crc kubenswrapper[4672]: I0930 12:23:15.387009 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:15 crc kubenswrapper[4672]: I0930 12:23:15.387051 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:15 crc kubenswrapper[4672]: I0930 12:23:15.387061 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:15 crc kubenswrapper[4672]: I0930 12:23:15.387076 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:15 crc kubenswrapper[4672]: I0930 12:23:15.387086 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:15Z","lastTransitionTime":"2025-09-30T12:23:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:23:15 crc kubenswrapper[4672]: I0930 12:23:15.417553 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 12:23:15 crc kubenswrapper[4672]: E0930 12:23:15.417688 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 12:23:15 crc kubenswrapper[4672]: I0930 12:23:15.417825 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 12:23:15 crc kubenswrapper[4672]: I0930 12:23:15.417836 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n7wwp" Sep 30 12:23:15 crc kubenswrapper[4672]: E0930 12:23:15.417901 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 12:23:15 crc kubenswrapper[4672]: E0930 12:23:15.418076 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-n7wwp" podUID="42618cd5-d9f9-45ba-8081-660ca47bebf4" Sep 30 12:23:15 crc kubenswrapper[4672]: I0930 12:23:15.490204 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:15 crc kubenswrapper[4672]: I0930 12:23:15.490308 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:15 crc kubenswrapper[4672]: I0930 12:23:15.490324 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:15 crc kubenswrapper[4672]: I0930 12:23:15.490341 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:15 crc kubenswrapper[4672]: I0930 12:23:15.490358 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:15Z","lastTransitionTime":"2025-09-30T12:23:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:23:15 crc kubenswrapper[4672]: I0930 12:23:15.593800 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:15 crc kubenswrapper[4672]: I0930 12:23:15.593875 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:15 crc kubenswrapper[4672]: I0930 12:23:15.593897 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:15 crc kubenswrapper[4672]: I0930 12:23:15.593925 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:15 crc kubenswrapper[4672]: I0930 12:23:15.593948 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:15Z","lastTransitionTime":"2025-09-30T12:23:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:23:15 crc kubenswrapper[4672]: I0930 12:23:15.697471 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:15 crc kubenswrapper[4672]: I0930 12:23:15.697548 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:15 crc kubenswrapper[4672]: I0930 12:23:15.697568 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:15 crc kubenswrapper[4672]: I0930 12:23:15.697599 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:15 crc kubenswrapper[4672]: I0930 12:23:15.697617 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:15Z","lastTransitionTime":"2025-09-30T12:23:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:23:15 crc kubenswrapper[4672]: I0930 12:23:15.801051 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:15 crc kubenswrapper[4672]: I0930 12:23:15.801093 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:15 crc kubenswrapper[4672]: I0930 12:23:15.801102 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:15 crc kubenswrapper[4672]: I0930 12:23:15.801120 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:15 crc kubenswrapper[4672]: I0930 12:23:15.801131 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:15Z","lastTransitionTime":"2025-09-30T12:23:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:23:15 crc kubenswrapper[4672]: I0930 12:23:15.904405 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:15 crc kubenswrapper[4672]: I0930 12:23:15.904453 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:15 crc kubenswrapper[4672]: I0930 12:23:15.904463 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:15 crc kubenswrapper[4672]: I0930 12:23:15.904483 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:15 crc kubenswrapper[4672]: I0930 12:23:15.904504 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:15Z","lastTransitionTime":"2025-09-30T12:23:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Sep 30 12:23:15 crc kubenswrapper[4672]: I0930 12:23:15.970784 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nznsk_5da59bc9-84da-42f6-86e9-3399ecf31725/ovnkube-controller/3.log"
Sep 30 12:23:15 crc kubenswrapper[4672]: I0930 12:23:15.974980 4672 scope.go:117] "RemoveContainer" containerID="fc00b1a05de9fc4d6f328ce800afd75975c71d8f64d79a84798a19dcd9882e63"
Sep 30 12:23:15 crc kubenswrapper[4672]: E0930 12:23:15.975129 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-nznsk_openshift-ovn-kubernetes(5da59bc9-84da-42f6-86e9-3399ecf31725)\"" pod="openshift-ovn-kubernetes/ovnkube-node-nznsk" podUID="5da59bc9-84da-42f6-86e9-3399ecf31725"
Sep 30 12:23:15 crc kubenswrapper[4672]: I0930 12:23:15.995625 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e650bb6-f3f0-48f4-b169-1e56a907e88c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://824f50fb36d4c4aee43cc92af10bc43c441f2cee0ee736431dfdc52d254b819a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2786191730ceba9bbcf35f5355098bc52b09c37d20b923f7c6c65fd1b584cea0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e5b3d468173222d5ae851529dbc318b7d727ec4af468e4094ab08b4f29bf100\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://217947162b121381c06a210145feb0b9bd45d4ad595fcd83b1a566bb6448f0ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8d2b013e19f7ff8f3eb9c42338191b2da273ce85db8bcc6be8dd1f69e83c3be\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T12:22:10Z\\\",\\\"message\\\":\\\"falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0930 12:22:05.219065 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 12:22:05.219861 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-72394925/tls.crt::/tmp/serving-cert-72394925/tls.key\\\\\\\"\\\\nI0930 12:22:10.657058 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 12:22:10.659952 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 12:22:10.659972 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 12:22:10.659997 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 12:22:10.660003 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 12:22:10.667742 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0930 12:22:10.667775 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 12:22:10.667797 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 12:22:10.667801 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 12:22:10.667808 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 12:22:10.667812 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 12:22:10.667814 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 12:22:10.667817 1 secure_serving.go:69] Use of
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0930 12:22:10.670654 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T12:21:54Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c3f5020d6f8fba5f79d7659bedba02479794acb1f5f3d219802966c781fd10d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:52Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://089787496c46eb514e279677b1853010b21eac240fb059e43866714d2a6920e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://089787496c46eb514e279677b1853010b21eac240fb059e43866714d2a6920e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:21:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:21:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:21:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:23:15Z is after 2025-08-24T17:21:41Z" Sep 30 12:23:16 crc kubenswrapper[4672]: I0930 12:23:16.006965 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:16 crc kubenswrapper[4672]: I0930 12:23:16.007010 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:16 crc kubenswrapper[4672]: I0930 12:23:16.007023 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:16 crc kubenswrapper[4672]: I0930 12:23:16.007043 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:16 crc 
kubenswrapper[4672]: I0930 12:23:16.007056 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:16Z","lastTransitionTime":"2025-09-30T12:23:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:23:16 crc kubenswrapper[4672]: I0930 12:23:16.009613 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:23:16Z is after 2025-08-24T17:21:41Z" Sep 30 12:23:16 crc kubenswrapper[4672]: I0930 12:23:16.025388 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"95794952-d817-48f2-8956-f7a310f8d1d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a5a1de465a4e5deca994f640e87574ebd406be2d7a92c0e54ed8bece4a8e6d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnk64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba615a53d8f50a3891080fd06384bb6e00eca07025bb60fdf1764d9c1d4a5f76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnk64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dpqrd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:23:16Z is after 2025-08-24T17:21:41Z" Sep 30 12:23:16 crc kubenswrapper[4672]: I0930 12:23:16.040710 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8q82q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6806ff3c-ab3a-402e-b1c5-cc37c0810a65\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:23:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:23:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f37b2a15da4c06842d9df2eabe13974fe3e8f8da3ffd7bc297b6f32f446dbc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://687d353cc41530a0cbee2c4a782b6ae10e9ee58aae0bfe183537c96851611e2e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T12:23:00Z\\\",\\\"message\\\":\\\"2025-09-30T12:22:14+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_3e4d0ad2-21ec-42dd-8da8-9cf62dde0193\\\\n2025-09-30T12:22:14+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_3e4d0ad2-21ec-42dd-8da8-9cf62dde0193 to /host/opt/cni/bin/\\\\n2025-09-30T12:22:15Z [verbose] multus-daemon started\\\\n2025-09-30T12:22:15Z [verbose] Readiness Indicator file check\\\\n2025-09-30T12:23:00Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:12Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:23:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pn7q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:11Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8q82q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:23:16Z is after 2025-08-24T17:21:41Z" Sep 30 12:23:16 crc kubenswrapper[4672]: I0930 12:23:16.063875 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-89fj9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"72003fad-a0fd-4493-9f3b-6efdab22d14c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd26dcf24863265c55a06cfc788e84363473a6ca5f1b835fa5dbb683ebb82d68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31087335de8b3b895d49c4677350e9ab6cfa0651188653dab05e7eafdab3d05f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31087335de8b3b895d49c4677350e9ab6cfa0651188653dab05e7eafdab3d05f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a607f773b81f6bdde800c2a36627d7f84602e6c3c777f6531391668dd7c08e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2a607f773b81f6bdde800c2a36627d7f84602e6c3c777f6531391668dd7c08e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:22:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70fc6887f3dde42017d03aa3f247ca4eda56f93c3ec54a95ceabcc1c1b4651d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70fc6887f3dde42017d03aa3f247ca4eda56f93c3ec54a95ceabcc1c1b4651d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:22:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://95e6805719bbdb9859b827860fb87ed979a4397061f76b433189f463c1e7cc39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://95e6805719bbdb9859b827860fb87ed979a4397061f76b433189f463c1e7cc39\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:22:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85068f88c57bccc42cc7234d13298a30c574379f4a212059a9924553fce8620e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85068f88c57bccc42cc7234d13298a30c574379f4a212059a9924553fce8620e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:22:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4355ef186193e8e0d9344faf8975065f9cd4576e7f6c79054670fda9acf3646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4355ef186193e8e0d9344faf8975065f9cd4576e7f6c79054670fda9acf3646\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:22:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w62gk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-89fj9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:23:16Z is after 2025-08-24T17:21:41Z" Sep 30 12:23:16 crc kubenswrapper[4672]: I0930 12:23:16.078764 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ztjlj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"25dc0f4c-76a8-49ff-bf68-c0718cfc3e62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5a3ba5101b35ce4981484637222a0c074b67e3998c273dc763ddbc6ca0c0c86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6ps67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ztjlj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:23:16Z is after 2025-08-24T17:21:41Z" Sep 30 12:23:16 crc kubenswrapper[4672]: I0930 12:23:16.095145 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-485qk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"db981c53-85b2-4b3c-b025-da893771308f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57231120d83ed4e95272ab2b0bac2146b066dde2b5ca3afe469a3dc36d6187ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2t8xk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://503616f93ea6b36f5e3c95c88471c6d06c281a4d0be102b6cb0edf6a1513f78c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2t8xk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-485qk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:23:16Z is after 2025-08-24T17:21:41Z" Sep 30 
12:23:16 crc kubenswrapper[4672]: I0930 12:23:16.108698 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-n7wwp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"42618cd5-d9f9-45ba-8081-660ca47bebf4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78cfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78cfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:25Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-n7wwp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:23:16Z is after 2025-08-24T17:21:41Z" Sep 30 12:23:16 crc kubenswrapper[4672]: I0930 12:23:16.109668 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:16 crc kubenswrapper[4672]: I0930 12:23:16.109720 4672 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:16 crc kubenswrapper[4672]: I0930 12:23:16.109751 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:16 crc kubenswrapper[4672]: I0930 12:23:16.109776 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:16 crc kubenswrapper[4672]: I0930 12:23:16.109796 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:16Z","lastTransitionTime":"2025-09-30T12:23:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:23:16 crc kubenswrapper[4672]: I0930 12:23:16.123100 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8095997-9982-47c1-850a-260c9e369680\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f437372ebc379d35a2232ab47422ed6127cd29b0d752488dca512f67407d222\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c905db0424e34f487d2db657676de97a3e323ef4c9fbed5d25929164476f62bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-
pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb1c08c9fe886c57b5abe5091fa9845022bb33eefd97da1bf595d4b32016dfc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a9e2b46f485ed7a73177e89960925d6b0f9945bf0b7b5604a5b5e84d40f74e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a9e2b46f485ed7a73177e89960925d6b0f9945bf0b7b5604a5b5e84d40f74e3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:21:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:21:50Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:21:49Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:23:16Z is after 2025-08-24T17:21:41Z" Sep 30 12:23:16 crc kubenswrapper[4672]: I0930 12:23:16.133707 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1404e16a-17d2-446b-8708-44d4b41c9f96\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2424d1bc8a625cebd192f83733dc4bc01056e6166c918d21ded15aa3843b34d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://184b9352d3463790085ac81457e5ad33d08ea4663346d96f4c4034dcca6a7dd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://184b9352d3463790085ac81457e5ad33d08ea4663346d96f4c4034dcca6a7dd5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:21:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:21:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:21:49Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:23:16Z is after 2025-08-24T17:21:41Z" Sep 30 12:23:16 crc kubenswrapper[4672]: I0930 12:23:16.156626 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4339dd7c-5112-40c5-9b58-050ce364a2b9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7db3193c2afadb91d3220e67720ff33bc959f9c50cec6f243144a6e2a2e57e04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5f54820ce67d6e8f397d068e1fadcb4577d6685406812c4e29fe0274612697d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1351a4e8380fea9edb834239053b57384a50add562afe59e67bbd7159bf1bf6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b9741c5edaae364733b400bebf21a4674ec206
194908ec6a942317235c62a14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://775be30891ac2947ef160e3fe75abad42b78fc39e35414f27756ce086fed4f7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27513f0766801d5155c6e5c376a2a52241030eca2171b3218710ca6ee006a0b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27513f0766801d5155c6e5c376a2a52241030eca2171b3218710ca6ee006a0b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:21:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:21:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0afdc82c21601c37c7d9b4897273ade1e8e35bd093d63df9a63e93a9c0c6228c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0afdc82c21601c37c7d9b4897273ade1e8e35bd093d63df9a63e93a9c0c6228c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:21:51Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://52fa5ee34bc8e9e381d4288d30fce5db76ed5bde6f199ec13fe443481ed4c8e8\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52fa5ee34bc8e9e381d4288d30fce5db76ed5bde6f199ec13fe443481ed4c8e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:21:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:21:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:21:49Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:23:16Z is after 2025-08-24T17:21:41Z" Sep 30 12:23:16 crc kubenswrapper[4672]: I0930 12:23:16.169565 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:23:16Z is after 2025-08-24T17:21:41Z" Sep 30 12:23:16 crc kubenswrapper[4672]: I0930 12:23:16.182378 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bh5lq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96949d92-2365-41eb-8f62-c264c8328c02\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6caf59dcbea4e0beb0879de377fe92cfb0b3ff55668aee3f10822adacf97fd8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkb7v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bh5lq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-09-30T12:23:16Z is after 2025-08-24T17:21:41Z" Sep 30 12:23:16 crc kubenswrapper[4672]: I0930 12:23:16.197973 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ece716e9efed328b509480f11b1a6aa253fe23776bf45f49d4110cf3b4acbab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:23:16Z is after 2025-08-24T17:21:41Z" Sep 30 12:23:16 crc kubenswrapper[4672]: I0930 12:23:16.212722 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:16 crc kubenswrapper[4672]: I0930 12:23:16.212831 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:16 crc kubenswrapper[4672]: I0930 12:23:16.212850 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:16 crc kubenswrapper[4672]: I0930 12:23:16.212882 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:16 crc kubenswrapper[4672]: I0930 12:23:16.212903 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:16Z","lastTransitionTime":"2025-09-30T12:23:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:23:16 crc kubenswrapper[4672]: I0930 12:23:16.215593 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"763dda13-8721-4cce-8b21-aa1f1b190978\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab6412d7fdf1976c75ccfc72d05f00e2997d72f3971c73cd0887e038d1cbc059\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5575589d0517d5a1c285c3e9e566fb5cd9b9591a1b311a4bb80e595e44044c33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b656dc1ad8af099828eaaeb654e6d52dbbd057023dd07583a6114316670aa334\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5082c3fa78648584ab9a29b4ab239ce0d52411666ca3fa598e9c6745b8ea067e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:21:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:23:16Z is after 2025-08-24T17:21:41Z" Sep 30 12:23:16 crc kubenswrapper[4672]: I0930 12:23:16.232294 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2761ae4b8b1c7e1d00ee269219ff0ab04aedce9e1bddc6e0a029c496ee89c0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for 
pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:23:16Z is after 2025-08-24T17:21:41Z" Sep 30 12:23:16 crc kubenswrapper[4672]: I0930 12:23:16.246424 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:23:16Z is after 2025-08-24T17:21:41Z" Sep 30 12:23:16 crc kubenswrapper[4672]: I0930 12:23:16.259634 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://534671434fa48a07e7836049331af60307cf651f4ddde83856ad98e006b32040\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfb0bceb3e99196adb442abedd5215938781b908c2aa92178c9db76672664620\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/
webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:23:16Z is after 2025-08-24T17:21:41Z" Sep 30 12:23:16 crc kubenswrapper[4672]: I0930 12:23:16.285203 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nznsk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5da59bc9-84da-42f6-86e9-3399ecf31725\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8e8be814fc5bfb8db3632cda1fcf749a02b642c8e7ff479e59ce5d9f987a756\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2131429928dbbd60e2d17b7ada4689ff5d161d1df921d74a8b9f71df9e11899\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://114d67fea982a6c6475e1925f670ba7dae14ace593022f77ef7c6441dde49687\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b73fba4d62613a1c27b69a74ebcca91fa6ea43a5267efddf783bd0d243d711f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e10db0132d159755257aef8c0cee40e5de4a0d3727ce59883e1ff90650e4f35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb7a32827707f8654d36f56fef7e81aff6b80550ef51f7c090a37f53a0e53406\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc00b1a05de9fc4d6f328ce800afd75975c71d8f
64d79a84798a19dcd9882e63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc00b1a05de9fc4d6f328ce800afd75975c71d8f64d79a84798a19dcd9882e63\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T12:23:14Z\\\",\\\"message\\\":\\\"usterIP,ExternalIPs:[],SessionAffinity:None,LoadBalancerIP:,LoadBalancerSourceRanges:[],ExternalName:,ExternalTrafficPolicy:,HealthCheckNodePort:0,PublishNotReadyAddresses:false,SessionAffinityConfig:nil,IPFamilyPolicy:*SingleStack,ClusterIPs:[10.217.4.183],IPFamilies:[IPv4],AllocateLoadBalancerNodePorts:nil,LoadBalancerClass:nil,InternalTrafficPolicy:*Cluster,TrafficDistribution:nil,},Status:ServiceStatus{LoadBalancer:LoadBalancerStatus{Ingress:[]LoadBalancerIngress{},},Conditions:[]Condition{},},}\\\\nI0930 12:23:14.610518 6809 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf after 0 failed attempt(s)\\\\nI0930 12:23:14.610520 6809 ovn.go:134] Ensuring zone local for Pod openshift-network-node-identity/network-node-identity-vrzqb in node crc\\\\nI0930 12:23:14.610530 6809 default_network_controller.go:776] Recording success event on pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI0930 12:23:14.610528 6809 ovnkube_controller.go:1292] Config duration recorder: kind/namespace/name service/openshift-ovn-kubernetes/ovn-kubernetes-control-plane. OVN-Kubernetes controller took 0.120054282 seconds. No OVN measurement.\\\\nI0930 12:23:14.610460 6809 ovn.go:134] Ensuring zone local for Pod openshift-network-operator/iptables-alerter-4ln5h in node crc\\\\nI0930 12:23:14.610588 6\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T12:23:13Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-nznsk_openshift-ovn-kubernetes(5da59bc9-84da-42f6-86e9-3399ecf31725)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7058516ce3ff1d1d2929107a66d739fccb24b4f5589a11510ab42fe728a33196\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4aa532f2ebe19e36977596bd41e8b116c84bcd7d5cd3470986e3192d4d62dea7\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4aa532f2ebe19e36977596bd41e8b116c84bcd7d5cd3470986e3192d4d62dea7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-llgtp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nznsk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:23:16Z is after 2025-08-24T17:21:41Z" Sep 30 12:23:16 crc kubenswrapper[4672]: I0930 12:23:16.316525 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:16 crc kubenswrapper[4672]: I0930 12:23:16.316602 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:16 crc kubenswrapper[4672]: I0930 12:23:16.316629 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:16 crc kubenswrapper[4672]: I0930 12:23:16.316660 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:16 crc kubenswrapper[4672]: I0930 12:23:16.316680 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:16Z","lastTransitionTime":"2025-09-30T12:23:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:23:16 crc kubenswrapper[4672]: I0930 12:23:16.416310 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 12:23:16 crc kubenswrapper[4672]: E0930 12:23:16.416548 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 12:23:16 crc kubenswrapper[4672]: I0930 12:23:16.420051 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:16 crc kubenswrapper[4672]: I0930 12:23:16.420110 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:16 crc kubenswrapper[4672]: I0930 12:23:16.420128 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:16 crc kubenswrapper[4672]: I0930 12:23:16.420204 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:16 crc kubenswrapper[4672]: I0930 12:23:16.420225 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:16Z","lastTransitionTime":"2025-09-30T12:23:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:23:16 crc kubenswrapper[4672]: I0930 12:23:16.524215 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:16 crc kubenswrapper[4672]: I0930 12:23:16.524312 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:16 crc kubenswrapper[4672]: I0930 12:23:16.524368 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:16 crc kubenswrapper[4672]: I0930 12:23:16.524419 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:16 crc kubenswrapper[4672]: I0930 12:23:16.524446 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:16Z","lastTransitionTime":"2025-09-30T12:23:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:23:16 crc kubenswrapper[4672]: I0930 12:23:16.627820 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:16 crc kubenswrapper[4672]: I0930 12:23:16.627863 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:16 crc kubenswrapper[4672]: I0930 12:23:16.627878 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:16 crc kubenswrapper[4672]: I0930 12:23:16.627904 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:16 crc kubenswrapper[4672]: I0930 12:23:16.627919 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:16Z","lastTransitionTime":"2025-09-30T12:23:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:23:16 crc kubenswrapper[4672]: I0930 12:23:16.731750 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:16 crc kubenswrapper[4672]: I0930 12:23:16.731800 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:16 crc kubenswrapper[4672]: I0930 12:23:16.731816 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:16 crc kubenswrapper[4672]: I0930 12:23:16.731833 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:16 crc kubenswrapper[4672]: I0930 12:23:16.731848 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:16Z","lastTransitionTime":"2025-09-30T12:23:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:23:16 crc kubenswrapper[4672]: I0930 12:23:16.834313 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:16 crc kubenswrapper[4672]: I0930 12:23:16.834360 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:16 crc kubenswrapper[4672]: I0930 12:23:16.834377 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:16 crc kubenswrapper[4672]: I0930 12:23:16.834418 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:16 crc kubenswrapper[4672]: I0930 12:23:16.834431 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:16Z","lastTransitionTime":"2025-09-30T12:23:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:23:16 crc kubenswrapper[4672]: I0930 12:23:16.937198 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:16 crc kubenswrapper[4672]: I0930 12:23:16.937228 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:16 crc kubenswrapper[4672]: I0930 12:23:16.937236 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:16 crc kubenswrapper[4672]: I0930 12:23:16.937251 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:16 crc kubenswrapper[4672]: I0930 12:23:16.937279 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:16Z","lastTransitionTime":"2025-09-30T12:23:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:23:17 crc kubenswrapper[4672]: I0930 12:23:17.041038 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:17 crc kubenswrapper[4672]: I0930 12:23:17.041073 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:17 crc kubenswrapper[4672]: I0930 12:23:17.041082 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:17 crc kubenswrapper[4672]: I0930 12:23:17.041096 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:17 crc kubenswrapper[4672]: I0930 12:23:17.041108 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:17Z","lastTransitionTime":"2025-09-30T12:23:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:23:17 crc kubenswrapper[4672]: I0930 12:23:17.144725 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:17 crc kubenswrapper[4672]: I0930 12:23:17.144781 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:17 crc kubenswrapper[4672]: I0930 12:23:17.144791 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:17 crc kubenswrapper[4672]: I0930 12:23:17.144809 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:17 crc kubenswrapper[4672]: I0930 12:23:17.144820 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:17Z","lastTransitionTime":"2025-09-30T12:23:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:23:17 crc kubenswrapper[4672]: I0930 12:23:17.254034 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:17 crc kubenswrapper[4672]: I0930 12:23:17.254080 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:17 crc kubenswrapper[4672]: I0930 12:23:17.254092 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:17 crc kubenswrapper[4672]: I0930 12:23:17.254111 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:17 crc kubenswrapper[4672]: I0930 12:23:17.254124 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:17Z","lastTransitionTime":"2025-09-30T12:23:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:23:17 crc kubenswrapper[4672]: I0930 12:23:17.357471 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:17 crc kubenswrapper[4672]: I0930 12:23:17.357532 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:17 crc kubenswrapper[4672]: I0930 12:23:17.357551 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:17 crc kubenswrapper[4672]: I0930 12:23:17.357576 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:17 crc kubenswrapper[4672]: I0930 12:23:17.357596 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:17Z","lastTransitionTime":"2025-09-30T12:23:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:23:17 crc kubenswrapper[4672]: I0930 12:23:17.416888 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 12:23:17 crc kubenswrapper[4672]: E0930 12:23:17.417131 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 12:23:17 crc kubenswrapper[4672]: I0930 12:23:17.416907 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 12:23:17 crc kubenswrapper[4672]: E0930 12:23:17.417396 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 12:23:17 crc kubenswrapper[4672]: I0930 12:23:17.417431 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n7wwp" Sep 30 12:23:17 crc kubenswrapper[4672]: E0930 12:23:17.417590 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n7wwp" podUID="42618cd5-d9f9-45ba-8081-660ca47bebf4" Sep 30 12:23:17 crc kubenswrapper[4672]: I0930 12:23:17.461423 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:17 crc kubenswrapper[4672]: I0930 12:23:17.461494 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:17 crc kubenswrapper[4672]: I0930 12:23:17.461517 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:17 crc kubenswrapper[4672]: I0930 12:23:17.461546 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:17 crc kubenswrapper[4672]: I0930 12:23:17.461566 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:17Z","lastTransitionTime":"2025-09-30T12:23:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:23:17 crc kubenswrapper[4672]: I0930 12:23:17.565075 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:17 crc kubenswrapper[4672]: I0930 12:23:17.565121 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:17 crc kubenswrapper[4672]: I0930 12:23:17.565134 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:17 crc kubenswrapper[4672]: I0930 12:23:17.565152 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:17 crc kubenswrapper[4672]: I0930 12:23:17.565166 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:17Z","lastTransitionTime":"2025-09-30T12:23:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:23:17 crc kubenswrapper[4672]: I0930 12:23:17.668862 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:17 crc kubenswrapper[4672]: I0930 12:23:17.668927 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:17 crc kubenswrapper[4672]: I0930 12:23:17.668960 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:17 crc kubenswrapper[4672]: I0930 12:23:17.668999 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:17 crc kubenswrapper[4672]: I0930 12:23:17.669024 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:17Z","lastTransitionTime":"2025-09-30T12:23:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:23:17 crc kubenswrapper[4672]: I0930 12:23:17.772439 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:17 crc kubenswrapper[4672]: I0930 12:23:17.772535 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:17 crc kubenswrapper[4672]: I0930 12:23:17.772553 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:17 crc kubenswrapper[4672]: I0930 12:23:17.772578 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:17 crc kubenswrapper[4672]: I0930 12:23:17.772600 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:17Z","lastTransitionTime":"2025-09-30T12:23:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:23:17 crc kubenswrapper[4672]: I0930 12:23:17.876714 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:17 crc kubenswrapper[4672]: I0930 12:23:17.876776 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:17 crc kubenswrapper[4672]: I0930 12:23:17.876793 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:17 crc kubenswrapper[4672]: I0930 12:23:17.876816 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:17 crc kubenswrapper[4672]: I0930 12:23:17.876834 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:17Z","lastTransitionTime":"2025-09-30T12:23:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:23:17 crc kubenswrapper[4672]: I0930 12:23:17.983257 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:17 crc kubenswrapper[4672]: I0930 12:23:17.983358 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:17 crc kubenswrapper[4672]: I0930 12:23:17.983378 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:17 crc kubenswrapper[4672]: I0930 12:23:17.983413 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:17 crc kubenswrapper[4672]: I0930 12:23:17.983442 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:17Z","lastTransitionTime":"2025-09-30T12:23:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:23:18 crc kubenswrapper[4672]: I0930 12:23:18.086917 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:18 crc kubenswrapper[4672]: I0930 12:23:18.086996 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:18 crc kubenswrapper[4672]: I0930 12:23:18.087020 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:18 crc kubenswrapper[4672]: I0930 12:23:18.087054 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:18 crc kubenswrapper[4672]: I0930 12:23:18.087075 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:18Z","lastTransitionTime":"2025-09-30T12:23:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:23:18 crc kubenswrapper[4672]: I0930 12:23:18.190298 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:18 crc kubenswrapper[4672]: I0930 12:23:18.190357 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:18 crc kubenswrapper[4672]: I0930 12:23:18.190370 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:18 crc kubenswrapper[4672]: I0930 12:23:18.190388 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:18 crc kubenswrapper[4672]: I0930 12:23:18.190401 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:18Z","lastTransitionTime":"2025-09-30T12:23:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:23:18 crc kubenswrapper[4672]: I0930 12:23:18.294629 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:18 crc kubenswrapper[4672]: I0930 12:23:18.294739 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:18 crc kubenswrapper[4672]: I0930 12:23:18.294768 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:18 crc kubenswrapper[4672]: I0930 12:23:18.294804 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:18 crc kubenswrapper[4672]: I0930 12:23:18.294828 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:18Z","lastTransitionTime":"2025-09-30T12:23:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:23:18 crc kubenswrapper[4672]: I0930 12:23:18.398836 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:18 crc kubenswrapper[4672]: I0930 12:23:18.398911 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:18 crc kubenswrapper[4672]: I0930 12:23:18.398929 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:18 crc kubenswrapper[4672]: I0930 12:23:18.398956 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:18 crc kubenswrapper[4672]: I0930 12:23:18.398979 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:18Z","lastTransitionTime":"2025-09-30T12:23:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:23:18 crc kubenswrapper[4672]: I0930 12:23:18.416913 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 12:23:18 crc kubenswrapper[4672]: E0930 12:23:18.417140 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 12:23:18 crc kubenswrapper[4672]: I0930 12:23:18.503702 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:18 crc kubenswrapper[4672]: I0930 12:23:18.503775 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:18 crc kubenswrapper[4672]: I0930 12:23:18.503795 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:18 crc kubenswrapper[4672]: I0930 12:23:18.503823 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:18 crc kubenswrapper[4672]: I0930 12:23:18.503844 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:18Z","lastTransitionTime":"2025-09-30T12:23:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:23:18 crc kubenswrapper[4672]: I0930 12:23:18.608434 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:18 crc kubenswrapper[4672]: I0930 12:23:18.608504 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:18 crc kubenswrapper[4672]: I0930 12:23:18.608526 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:18 crc kubenswrapper[4672]: I0930 12:23:18.608556 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:18 crc kubenswrapper[4672]: I0930 12:23:18.608574 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:18Z","lastTransitionTime":"2025-09-30T12:23:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:23:18 crc kubenswrapper[4672]: I0930 12:23:18.712305 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:18 crc kubenswrapper[4672]: I0930 12:23:18.712383 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:18 crc kubenswrapper[4672]: I0930 12:23:18.712402 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:18 crc kubenswrapper[4672]: I0930 12:23:18.712432 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:18 crc kubenswrapper[4672]: I0930 12:23:18.712453 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:18Z","lastTransitionTime":"2025-09-30T12:23:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:23:18 crc kubenswrapper[4672]: I0930 12:23:18.816070 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:18 crc kubenswrapper[4672]: I0930 12:23:18.816123 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:18 crc kubenswrapper[4672]: I0930 12:23:18.816136 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:18 crc kubenswrapper[4672]: I0930 12:23:18.816158 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:18 crc kubenswrapper[4672]: I0930 12:23:18.816173 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:18Z","lastTransitionTime":"2025-09-30T12:23:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:23:18 crc kubenswrapper[4672]: I0930 12:23:18.918702 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:18 crc kubenswrapper[4672]: I0930 12:23:18.918774 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:18 crc kubenswrapper[4672]: I0930 12:23:18.918792 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:18 crc kubenswrapper[4672]: I0930 12:23:18.918818 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:18 crc kubenswrapper[4672]: I0930 12:23:18.918836 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:18Z","lastTransitionTime":"2025-09-30T12:23:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:23:19 crc kubenswrapper[4672]: I0930 12:23:19.022139 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:19 crc kubenswrapper[4672]: I0930 12:23:19.022213 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:19 crc kubenswrapper[4672]: I0930 12:23:19.022232 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:19 crc kubenswrapper[4672]: I0930 12:23:19.022260 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:19 crc kubenswrapper[4672]: I0930 12:23:19.022384 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:19Z","lastTransitionTime":"2025-09-30T12:23:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:23:19 crc kubenswrapper[4672]: I0930 12:23:19.053511 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:19 crc kubenswrapper[4672]: I0930 12:23:19.053571 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:19 crc kubenswrapper[4672]: I0930 12:23:19.053583 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:19 crc kubenswrapper[4672]: I0930 12:23:19.053604 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:19 crc kubenswrapper[4672]: I0930 12:23:19.053617 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:19Z","lastTransitionTime":"2025-09-30T12:23:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:23:19 crc kubenswrapper[4672]: E0930 12:23:19.075424 4672 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T12:23:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T12:23:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T12:23:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T12:23:19Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T12:23:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T12:23:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T12:23:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T12:23:19Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a240ac94-c7cd-47b9-85fc-82a9db2c4d67\\\",\\\"systemUUID\\\":\\\"9545f671-f742-45e6-b9f7-3b3404b22825\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:23:19Z is after 2025-08-24T17:21:41Z" Sep 30 12:23:19 crc kubenswrapper[4672]: I0930 12:23:19.080980 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:19 crc kubenswrapper[4672]: I0930 12:23:19.081047 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Sep 30 12:23:19 crc kubenswrapper[4672]: I0930 12:23:19.081064 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:19 crc kubenswrapper[4672]: I0930 12:23:19.081089 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:19 crc kubenswrapper[4672]: I0930 12:23:19.081108 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:19Z","lastTransitionTime":"2025-09-30T12:23:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:23:19 crc kubenswrapper[4672]: E0930 12:23:19.103630 4672 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T12:23:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T12:23:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T12:23:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T12:23:19Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T12:23:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T12:23:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T12:23:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T12:23:19Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a240ac94-c7cd-47b9-85fc-82a9db2c4d67\\\",\\\"systemUUID\\\":\\\"9545f671-f742-45e6-b9f7-3b3404b22825\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:23:19Z is after 2025-08-24T17:21:41Z" Sep 30 12:23:19 crc kubenswrapper[4672]: I0930 12:23:19.108582 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:19 crc kubenswrapper[4672]: I0930 12:23:19.108651 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Sep 30 12:23:19 crc kubenswrapper[4672]: I0930 12:23:19.108669 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:19 crc kubenswrapper[4672]: I0930 12:23:19.108696 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:19 crc kubenswrapper[4672]: I0930 12:23:19.108716 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:19Z","lastTransitionTime":"2025-09-30T12:23:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:23:19 crc kubenswrapper[4672]: E0930 12:23:19.126791 4672 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T12:23:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T12:23:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T12:23:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T12:23:19Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T12:23:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T12:23:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T12:23:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T12:23:19Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a240ac94-c7cd-47b9-85fc-82a9db2c4d67\\\",\\\"systemUUID\\\":\\\"9545f671-f742-45e6-b9f7-3b3404b22825\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:23:19Z is after 2025-08-24T17:21:41Z" Sep 30 12:23:19 crc kubenswrapper[4672]: I0930 12:23:19.131329 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:19 crc kubenswrapper[4672]: I0930 12:23:19.131378 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Sep 30 12:23:19 crc kubenswrapper[4672]: I0930 12:23:19.131389 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:19 crc kubenswrapper[4672]: I0930 12:23:19.131405 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:19 crc kubenswrapper[4672]: I0930 12:23:19.131418 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:19Z","lastTransitionTime":"2025-09-30T12:23:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:23:19 crc kubenswrapper[4672]: E0930 12:23:19.145967 4672 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T12:23:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T12:23:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T12:23:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T12:23:19Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T12:23:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T12:23:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T12:23:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T12:23:19Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a240ac94-c7cd-47b9-85fc-82a9db2c4d67\\\",\\\"systemUUID\\\":\\\"9545f671-f742-45e6-b9f7-3b3404b22825\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:23:19Z is after 2025-08-24T17:21:41Z" Sep 30 12:23:19 crc kubenswrapper[4672]: I0930 12:23:19.151431 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:19 crc kubenswrapper[4672]: I0930 12:23:19.151464 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Sep 30 12:23:19 crc kubenswrapper[4672]: I0930 12:23:19.151475 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:19 crc kubenswrapper[4672]: I0930 12:23:19.151488 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:19 crc kubenswrapper[4672]: I0930 12:23:19.151497 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:19Z","lastTransitionTime":"2025-09-30T12:23:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:23:19 crc kubenswrapper[4672]: E0930 12:23:19.171925 4672 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T12:23:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T12:23:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T12:23:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T12:23:19Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T12:23:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T12:23:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T12:23:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T12:23:19Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a240ac94-c7cd-47b9-85fc-82a9db2c4d67\\\",\\\"systemUUID\\\":\\\"9545f671-f742-45e6-b9f7-3b3404b22825\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:23:19Z is after 2025-08-24T17:21:41Z" Sep 30 12:23:19 crc kubenswrapper[4672]: E0930 12:23:19.172059 4672 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Sep 30 12:23:19 crc kubenswrapper[4672]: I0930 12:23:19.173586 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Sep 30 12:23:19 crc kubenswrapper[4672]: I0930 12:23:19.173686 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:19 crc kubenswrapper[4672]: I0930 12:23:19.173705 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:19 crc kubenswrapper[4672]: I0930 12:23:19.173768 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:19 crc kubenswrapper[4672]: I0930 12:23:19.173787 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:19Z","lastTransitionTime":"2025-09-30T12:23:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:23:19 crc kubenswrapper[4672]: I0930 12:23:19.277409 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:19 crc kubenswrapper[4672]: I0930 12:23:19.277462 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:19 crc kubenswrapper[4672]: I0930 12:23:19.277480 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:19 crc kubenswrapper[4672]: I0930 12:23:19.277511 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:19 crc kubenswrapper[4672]: I0930 12:23:19.277532 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:19Z","lastTransitionTime":"2025-09-30T12:23:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:23:19 crc kubenswrapper[4672]: I0930 12:23:19.380674 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:19 crc kubenswrapper[4672]: I0930 12:23:19.380754 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:19 crc kubenswrapper[4672]: I0930 12:23:19.380777 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:19 crc kubenswrapper[4672]: I0930 12:23:19.380805 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:19 crc kubenswrapper[4672]: I0930 12:23:19.380827 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:19Z","lastTransitionTime":"2025-09-30T12:23:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:23:19 crc kubenswrapper[4672]: I0930 12:23:19.416572 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 12:23:19 crc kubenswrapper[4672]: E0930 12:23:19.416783 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 12:23:19 crc kubenswrapper[4672]: I0930 12:23:19.417130 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 12:23:19 crc kubenswrapper[4672]: E0930 12:23:19.417241 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 12:23:19 crc kubenswrapper[4672]: I0930 12:23:19.417550 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n7wwp" Sep 30 12:23:19 crc kubenswrapper[4672]: E0930 12:23:19.417766 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-n7wwp" podUID="42618cd5-d9f9-45ba-8081-660ca47bebf4" Sep 30 12:23:19 crc kubenswrapper[4672]: I0930 12:23:19.455613 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4339dd7c-5112-40c5-9b58-050ce364a2b9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7db3193c2afadb91d3220e67720ff33bc959f9c50cec6f243144a6e2a2e57e04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5f54820ce67d6e8f397d068e1fadcb4577d6685406812c4e29fe0274612697d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1351a4e8380fea9edb834239053b57384a50add562afe59e67bbd7159bf1bf6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-
09-30T12:21:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b9741c5edaae364733b400bebf21a4674ec206194908ec6a942317235c62a14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://775be30891ac2947ef160e3fe75abad42b78fc39e35414f27756ce086fed4f7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27513f0766801d5155c6e5c376a2a52241030eca2171b3218710ca6ee006a0b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27513f0766801d5155c6e5c376a2a52241030eca2171b3218710ca6ee006a0b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:21:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:21:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0afdc82c21601c37c7d9b4897273ade1e8e35bd093d63df9a63e93a9c0c6228c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0afdc82c21601c37c7d9b4897273ade1e8e35bd
093d63df9a63e93a9c0c6228c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:21:51Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://52fa5ee34bc8e9e381d4288d30fce5db76ed5bde6f199ec13fe443481ed4c8e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52fa5ee34bc8e9e381d4288d30fce5db76ed5bde6f199ec13fe443481ed4c8e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:21:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:21:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:21:49Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:23:19Z is after 2025-08-24T17:21:41Z" Sep 30 12:23:19 crc kubenswrapper[4672]: I0930 12:23:19.469143 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:23:19Z is after 2025-08-24T17:21:41Z" Sep 30 12:23:19 crc kubenswrapper[4672]: I0930 12:23:19.483834 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:19 crc kubenswrapper[4672]: I0930 12:23:19.483926 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:19 crc kubenswrapper[4672]: I0930 12:23:19.483951 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:19 crc kubenswrapper[4672]: I0930 12:23:19.483980 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:19 crc kubenswrapper[4672]: I0930 12:23:19.483999 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:19Z","lastTransitionTime":"2025-09-30T12:23:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:23:19 crc kubenswrapper[4672]: I0930 12:23:19.483994 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bh5lq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96949d92-2365-41eb-8f62-c264c8328c02\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6caf59dcbea4e0beb0879de377fe92cfb0b3ff55668aee3f10822adacf97fd8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkb7v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:22:11Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bh5lq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:23:19Z is after 2025-08-24T17:21:41Z" Sep 30 12:23:19 crc kubenswrapper[4672]: I0930 12:23:19.503409 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8095997-9982-47c1-850a-260c9e369680\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f437372ebc379d35a2232ab47422ed6127cd29b0d752488dca512f67407d222\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c905db0424e34f487d2db657676de97a3e323ef4c9fbed5d25929164476f62bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb1c08c9fe886c57b5abe5091fa9845022bb33eefd97da1bf595d4b32016dfc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a9e2b46f485ed7a73177e89960925d6b0f9945bf0b7b5604a5b5e84d40f74e3\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a9e2b46f485ed7a73177e89960925d6b0f9945bf0b7b5604a5b5e84d40f74e3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:21:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:21:50Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:21:49Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:23:19Z is after 2025-08-24T17:21:41Z" Sep 30 12:23:19 crc kubenswrapper[4672]: I0930 12:23:19.516675 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1404e16a-17d2-446b-8708-44d4b41c9f96\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T12:21:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2424d1bc8a625cebd192f83733dc4bc01056e6166c918d21ded15aa3843b34d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:21:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://184b9352d3463790085ac81457e5ad33d08ea4663346d96f4c4034dcca6a7dd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://184b9352d3463790085ac81457e5ad33d08ea4663346d96f4c4034dcca6a7dd5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T12:21:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T12:21:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T12:21:49Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:23:19Z is after 2025-08-24T17:21:41Z" Sep 30 12:23:19 crc kubenswrapper[4672]: I0930 12:23:19.533336 4672 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ece716e9efed328b509480f11b1a6aa253fe23776bf45f49d4110cf3b4acbab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T12:22:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:23:19Z is after 2025-08-24T17:21:41Z" Sep 30 12:23:19 crc kubenswrapper[4672]: I0930 12:23:19.546171 4672 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T12:22:11Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T12:23:19Z is after 2025-08-24T17:21:41Z" Sep 30 12:23:19 crc kubenswrapper[4672]: I0930 12:23:19.587151 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:19 crc kubenswrapper[4672]: I0930 12:23:19.587200 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:19 crc kubenswrapper[4672]: I0930 12:23:19.587209 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:19 crc kubenswrapper[4672]: I0930 12:23:19.587223 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:19 crc kubenswrapper[4672]: I0930 12:23:19.587235 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:19Z","lastTransitionTime":"2025-09-30T12:23:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:23:19 crc kubenswrapper[4672]: I0930 12:23:19.641192 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=67.641175431 podStartE2EDuration="1m7.641175431s" podCreationTimestamp="2025-09-30 12:22:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 12:23:19.624393463 +0000 UTC m=+90.893631179" watchObservedRunningTime="2025-09-30 12:23:19.641175431 +0000 UTC m=+90.910413077" Sep 30 12:23:19 crc kubenswrapper[4672]: I0930 12:23:19.655762 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podStartSLOduration=69.655744193 podStartE2EDuration="1m9.655744193s" podCreationTimestamp="2025-09-30 12:22:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 12:23:19.655450275 +0000 UTC m=+90.924687931" watchObservedRunningTime="2025-09-30 12:23:19.655744193 +0000 UTC m=+90.924981839" Sep 30 12:23:19 crc kubenswrapper[4672]: I0930 12:23:19.669037 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-8q82q" podStartSLOduration=69.669014971 podStartE2EDuration="1m9.669014971s" podCreationTimestamp="2025-09-30 12:22:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 12:23:19.66895418 +0000 UTC m=+90.938191846" watchObservedRunningTime="2025-09-30 12:23:19.669014971 +0000 UTC m=+90.938252617" Sep 30 12:23:19 crc kubenswrapper[4672]: I0930 12:23:19.689149 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:19 crc kubenswrapper[4672]: I0930 12:23:19.689184 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:19 crc kubenswrapper[4672]: I0930 12:23:19.689197 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:19 crc kubenswrapper[4672]: I0930 12:23:19.689215 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:19 crc kubenswrapper[4672]: I0930 12:23:19.689229 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:19Z","lastTransitionTime":"2025-09-30T12:23:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:23:19 crc kubenswrapper[4672]: I0930 12:23:19.703808 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-89fj9" podStartSLOduration=69.703785438 podStartE2EDuration="1m9.703785438s" podCreationTimestamp="2025-09-30 12:22:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 12:23:19.688315793 +0000 UTC m=+90.957553449" watchObservedRunningTime="2025-09-30 12:23:19.703785438 +0000 UTC m=+90.973023094" Sep 30 12:23:19 crc kubenswrapper[4672]: I0930 12:23:19.717096 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-ztjlj" podStartSLOduration=69.717078947 podStartE2EDuration="1m9.717078947s" podCreationTimestamp="2025-09-30 12:22:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 12:23:19.704233299 +0000 UTC m=+90.973470945" watchObservedRunningTime="2025-09-30 12:23:19.717078947 +0000 UTC m=+90.986316603" Sep 30 12:23:19 crc kubenswrapper[4672]: I0930 12:23:19.717195 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-485qk" podStartSLOduration=68.7171885 podStartE2EDuration="1m8.7171885s" podCreationTimestamp="2025-09-30 12:22:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 12:23:19.716469762 +0000 UTC m=+90.985707418" watchObservedRunningTime="2025-09-30 12:23:19.7171885 +0000 UTC m=+90.986426156" Sep 30 12:23:19 crc kubenswrapper[4672]: I0930 12:23:19.753608 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=68.753589928 podStartE2EDuration="1m8.753589928s" podCreationTimestamp="2025-09-30 12:22:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 12:23:19.753434294 +0000 UTC m=+91.022671950" watchObservedRunningTime="2025-09-30 12:23:19.753589928 +0000 UTC m=+91.022827584" Sep 30 12:23:19 crc kubenswrapper[4672]: I0930 12:23:19.791577 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:19 crc kubenswrapper[4672]: I0930 12:23:19.791626 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:19 crc kubenswrapper[4672]: I0930 12:23:19.791636 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:19 crc kubenswrapper[4672]: I0930 12:23:19.791657 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:19 crc kubenswrapper[4672]: I0930 12:23:19.791668 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:19Z","lastTransitionTime":"2025-09-30T12:23:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:23:19 crc kubenswrapper[4672]: I0930 12:23:19.893700 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:19 crc kubenswrapper[4672]: I0930 12:23:19.894105 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:19 crc kubenswrapper[4672]: I0930 12:23:19.894121 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:19 crc kubenswrapper[4672]: I0930 12:23:19.894144 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:19 crc kubenswrapper[4672]: I0930 12:23:19.894166 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:19Z","lastTransitionTime":"2025-09-30T12:23:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:23:19 crc kubenswrapper[4672]: I0930 12:23:19.996999 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:19 crc kubenswrapper[4672]: I0930 12:23:19.997067 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:19 crc kubenswrapper[4672]: I0930 12:23:19.997082 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:19 crc kubenswrapper[4672]: I0930 12:23:19.997129 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:19 crc kubenswrapper[4672]: I0930 12:23:19.997143 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:19Z","lastTransitionTime":"2025-09-30T12:23:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:23:20 crc kubenswrapper[4672]: I0930 12:23:20.100509 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:20 crc kubenswrapper[4672]: I0930 12:23:20.100574 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:20 crc kubenswrapper[4672]: I0930 12:23:20.100588 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:20 crc kubenswrapper[4672]: I0930 12:23:20.100610 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:20 crc kubenswrapper[4672]: I0930 12:23:20.100623 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:20Z","lastTransitionTime":"2025-09-30T12:23:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:23:20 crc kubenswrapper[4672]: I0930 12:23:20.203774 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:20 crc kubenswrapper[4672]: I0930 12:23:20.203855 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:20 crc kubenswrapper[4672]: I0930 12:23:20.203882 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:20 crc kubenswrapper[4672]: I0930 12:23:20.203911 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:20 crc kubenswrapper[4672]: I0930 12:23:20.203931 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:20Z","lastTransitionTime":"2025-09-30T12:23:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:23:20 crc kubenswrapper[4672]: I0930 12:23:20.307022 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:20 crc kubenswrapper[4672]: I0930 12:23:20.307084 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:20 crc kubenswrapper[4672]: I0930 12:23:20.307101 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:20 crc kubenswrapper[4672]: I0930 12:23:20.307125 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:20 crc kubenswrapper[4672]: I0930 12:23:20.307145 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:20Z","lastTransitionTime":"2025-09-30T12:23:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:23:20 crc kubenswrapper[4672]: I0930 12:23:20.410452 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:20 crc kubenswrapper[4672]: I0930 12:23:20.410529 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:20 crc kubenswrapper[4672]: I0930 12:23:20.410541 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:20 crc kubenswrapper[4672]: I0930 12:23:20.410558 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:20 crc kubenswrapper[4672]: I0930 12:23:20.410572 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:20Z","lastTransitionTime":"2025-09-30T12:23:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:23:20 crc kubenswrapper[4672]: I0930 12:23:20.416797 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 12:23:20 crc kubenswrapper[4672]: E0930 12:23:20.417038 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 12:23:20 crc kubenswrapper[4672]: I0930 12:23:20.513639 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:20 crc kubenswrapper[4672]: I0930 12:23:20.513695 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:20 crc kubenswrapper[4672]: I0930 12:23:20.513709 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:20 crc kubenswrapper[4672]: I0930 12:23:20.513728 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:20 crc kubenswrapper[4672]: I0930 12:23:20.513742 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:20Z","lastTransitionTime":"2025-09-30T12:23:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:23:20 crc kubenswrapper[4672]: I0930 12:23:20.616786 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:20 crc kubenswrapper[4672]: I0930 12:23:20.616828 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:20 crc kubenswrapper[4672]: I0930 12:23:20.616843 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:20 crc kubenswrapper[4672]: I0930 12:23:20.616864 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:20 crc kubenswrapper[4672]: I0930 12:23:20.616879 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:20Z","lastTransitionTime":"2025-09-30T12:23:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:23:20 crc kubenswrapper[4672]: I0930 12:23:20.720666 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:20 crc kubenswrapper[4672]: I0930 12:23:20.720714 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:20 crc kubenswrapper[4672]: I0930 12:23:20.720724 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:20 crc kubenswrapper[4672]: I0930 12:23:20.720744 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:20 crc kubenswrapper[4672]: I0930 12:23:20.720760 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:20Z","lastTransitionTime":"2025-09-30T12:23:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:23:20 crc kubenswrapper[4672]: I0930 12:23:20.823711 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:20 crc kubenswrapper[4672]: I0930 12:23:20.823770 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:20 crc kubenswrapper[4672]: I0930 12:23:20.823779 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:20 crc kubenswrapper[4672]: I0930 12:23:20.823796 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:20 crc kubenswrapper[4672]: I0930 12:23:20.823810 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:20Z","lastTransitionTime":"2025-09-30T12:23:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:23:20 crc kubenswrapper[4672]: I0930 12:23:20.926629 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:20 crc kubenswrapper[4672]: I0930 12:23:20.926673 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:20 crc kubenswrapper[4672]: I0930 12:23:20.926682 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:20 crc kubenswrapper[4672]: I0930 12:23:20.926697 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:20 crc kubenswrapper[4672]: I0930 12:23:20.926709 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:20Z","lastTransitionTime":"2025-09-30T12:23:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:23:21 crc kubenswrapper[4672]: I0930 12:23:21.029904 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:21 crc kubenswrapper[4672]: I0930 12:23:21.029958 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:21 crc kubenswrapper[4672]: I0930 12:23:21.029972 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:21 crc kubenswrapper[4672]: I0930 12:23:21.029993 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:21 crc kubenswrapper[4672]: I0930 12:23:21.030009 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:21Z","lastTransitionTime":"2025-09-30T12:23:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:23:21 crc kubenswrapper[4672]: I0930 12:23:21.133659 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:21 crc kubenswrapper[4672]: I0930 12:23:21.133732 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:21 crc kubenswrapper[4672]: I0930 12:23:21.133755 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:21 crc kubenswrapper[4672]: I0930 12:23:21.133785 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:21 crc kubenswrapper[4672]: I0930 12:23:21.133809 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:21Z","lastTransitionTime":"2025-09-30T12:23:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:23:21 crc kubenswrapper[4672]: I0930 12:23:21.236606 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:21 crc kubenswrapper[4672]: I0930 12:23:21.236674 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:21 crc kubenswrapper[4672]: I0930 12:23:21.236694 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:21 crc kubenswrapper[4672]: I0930 12:23:21.236725 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:21 crc kubenswrapper[4672]: I0930 12:23:21.236748 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:21Z","lastTransitionTime":"2025-09-30T12:23:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:23:21 crc kubenswrapper[4672]: I0930 12:23:21.339172 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:21 crc kubenswrapper[4672]: I0930 12:23:21.339215 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:21 crc kubenswrapper[4672]: I0930 12:23:21.339226 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:21 crc kubenswrapper[4672]: I0930 12:23:21.339242 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:21 crc kubenswrapper[4672]: I0930 12:23:21.339254 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:21Z","lastTransitionTime":"2025-09-30T12:23:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:23:21 crc kubenswrapper[4672]: I0930 12:23:21.416140 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 12:23:21 crc kubenswrapper[4672]: I0930 12:23:21.416308 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n7wwp" Sep 30 12:23:21 crc kubenswrapper[4672]: E0930 12:23:21.416400 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 12:23:21 crc kubenswrapper[4672]: E0930 12:23:21.416480 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n7wwp" podUID="42618cd5-d9f9-45ba-8081-660ca47bebf4" Sep 30 12:23:21 crc kubenswrapper[4672]: I0930 12:23:21.416153 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 12:23:21 crc kubenswrapper[4672]: E0930 12:23:21.416760 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 12:23:21 crc kubenswrapper[4672]: I0930 12:23:21.441229 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:21 crc kubenswrapper[4672]: I0930 12:23:21.441290 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:21 crc kubenswrapper[4672]: I0930 12:23:21.441302 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:21 crc kubenswrapper[4672]: I0930 12:23:21.441317 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:21 crc kubenswrapper[4672]: I0930 12:23:21.441329 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:21Z","lastTransitionTime":"2025-09-30T12:23:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:23:21 crc kubenswrapper[4672]: I0930 12:23:21.544174 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:21 crc kubenswrapper[4672]: I0930 12:23:21.544239 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:21 crc kubenswrapper[4672]: I0930 12:23:21.544259 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:21 crc kubenswrapper[4672]: I0930 12:23:21.544313 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:21 crc kubenswrapper[4672]: I0930 12:23:21.544335 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:21Z","lastTransitionTime":"2025-09-30T12:23:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:23:21 crc kubenswrapper[4672]: I0930 12:23:21.647812 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:21 crc kubenswrapper[4672]: I0930 12:23:21.647892 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:21 crc kubenswrapper[4672]: I0930 12:23:21.647916 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:21 crc kubenswrapper[4672]: I0930 12:23:21.647945 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:21 crc kubenswrapper[4672]: I0930 12:23:21.647964 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:21Z","lastTransitionTime":"2025-09-30T12:23:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:23:21 crc kubenswrapper[4672]: I0930 12:23:21.751330 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:21 crc kubenswrapper[4672]: I0930 12:23:21.751370 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:21 crc kubenswrapper[4672]: I0930 12:23:21.751383 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:21 crc kubenswrapper[4672]: I0930 12:23:21.751401 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:21 crc kubenswrapper[4672]: I0930 12:23:21.751414 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:21Z","lastTransitionTime":"2025-09-30T12:23:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:23:21 crc kubenswrapper[4672]: I0930 12:23:21.855209 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:21 crc kubenswrapper[4672]: I0930 12:23:21.855303 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:21 crc kubenswrapper[4672]: I0930 12:23:21.855332 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:21 crc kubenswrapper[4672]: I0930 12:23:21.855363 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:21 crc kubenswrapper[4672]: I0930 12:23:21.855382 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:21Z","lastTransitionTime":"2025-09-30T12:23:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:23:21 crc kubenswrapper[4672]: I0930 12:23:21.958887 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:21 crc kubenswrapper[4672]: I0930 12:23:21.958995 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:21 crc kubenswrapper[4672]: I0930 12:23:21.959017 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:21 crc kubenswrapper[4672]: I0930 12:23:21.959076 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:21 crc kubenswrapper[4672]: I0930 12:23:21.959103 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:21Z","lastTransitionTime":"2025-09-30T12:23:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:23:22 crc kubenswrapper[4672]: I0930 12:23:22.064924 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:22 crc kubenswrapper[4672]: I0930 12:23:22.064992 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:22 crc kubenswrapper[4672]: I0930 12:23:22.065040 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:22 crc kubenswrapper[4672]: I0930 12:23:22.065069 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:22 crc kubenswrapper[4672]: I0930 12:23:22.065087 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:22Z","lastTransitionTime":"2025-09-30T12:23:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:23:22 crc kubenswrapper[4672]: I0930 12:23:22.168098 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:22 crc kubenswrapper[4672]: I0930 12:23:22.168155 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:22 crc kubenswrapper[4672]: I0930 12:23:22.168174 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:22 crc kubenswrapper[4672]: I0930 12:23:22.168205 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:22 crc kubenswrapper[4672]: I0930 12:23:22.168225 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:22Z","lastTransitionTime":"2025-09-30T12:23:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:23:22 crc kubenswrapper[4672]: I0930 12:23:22.272127 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:22 crc kubenswrapper[4672]: I0930 12:23:22.272186 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:22 crc kubenswrapper[4672]: I0930 12:23:22.272205 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:22 crc kubenswrapper[4672]: I0930 12:23:22.272233 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:22 crc kubenswrapper[4672]: I0930 12:23:22.272252 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:22Z","lastTransitionTime":"2025-09-30T12:23:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:23:22 crc kubenswrapper[4672]: I0930 12:23:22.375815 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:22 crc kubenswrapper[4672]: I0930 12:23:22.375876 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:22 crc kubenswrapper[4672]: I0930 12:23:22.375895 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:22 crc kubenswrapper[4672]: I0930 12:23:22.375927 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:22 crc kubenswrapper[4672]: I0930 12:23:22.375948 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:22Z","lastTransitionTime":"2025-09-30T12:23:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:23:22 crc kubenswrapper[4672]: I0930 12:23:22.416015 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 12:23:22 crc kubenswrapper[4672]: E0930 12:23:22.416217 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 12:23:22 crc kubenswrapper[4672]: I0930 12:23:22.479567 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:22 crc kubenswrapper[4672]: I0930 12:23:22.479633 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:22 crc kubenswrapper[4672]: I0930 12:23:22.479648 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:22 crc kubenswrapper[4672]: I0930 12:23:22.479674 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:22 crc kubenswrapper[4672]: I0930 12:23:22.479694 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:22Z","lastTransitionTime":"2025-09-30T12:23:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:23:22 crc kubenswrapper[4672]: I0930 12:23:22.584239 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:22 crc kubenswrapper[4672]: I0930 12:23:22.584384 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:22 crc kubenswrapper[4672]: I0930 12:23:22.584404 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:22 crc kubenswrapper[4672]: I0930 12:23:22.584429 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:22 crc kubenswrapper[4672]: I0930 12:23:22.584448 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:22Z","lastTransitionTime":"2025-09-30T12:23:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:23:22 crc kubenswrapper[4672]: I0930 12:23:22.688486 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:22 crc kubenswrapper[4672]: I0930 12:23:22.688564 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:22 crc kubenswrapper[4672]: I0930 12:23:22.688584 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:22 crc kubenswrapper[4672]: I0930 12:23:22.688611 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:22 crc kubenswrapper[4672]: I0930 12:23:22.688632 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:22Z","lastTransitionTime":"2025-09-30T12:23:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:23:22 crc kubenswrapper[4672]: I0930 12:23:22.793768 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:22 crc kubenswrapper[4672]: I0930 12:23:22.793844 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:22 crc kubenswrapper[4672]: I0930 12:23:22.793863 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:22 crc kubenswrapper[4672]: I0930 12:23:22.793894 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:22 crc kubenswrapper[4672]: I0930 12:23:22.793916 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:22Z","lastTransitionTime":"2025-09-30T12:23:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:23:22 crc kubenswrapper[4672]: I0930 12:23:22.897621 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:22 crc kubenswrapper[4672]: I0930 12:23:22.897674 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:22 crc kubenswrapper[4672]: I0930 12:23:22.897687 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:22 crc kubenswrapper[4672]: I0930 12:23:22.897710 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:22 crc kubenswrapper[4672]: I0930 12:23:22.897724 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:22Z","lastTransitionTime":"2025-09-30T12:23:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:23:23 crc kubenswrapper[4672]: I0930 12:23:23.000153 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:23 crc kubenswrapper[4672]: I0930 12:23:23.000239 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:23 crc kubenswrapper[4672]: I0930 12:23:23.000295 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:23 crc kubenswrapper[4672]: I0930 12:23:23.000333 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:23 crc kubenswrapper[4672]: I0930 12:23:23.000356 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:23Z","lastTransitionTime":"2025-09-30T12:23:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:23:23 crc kubenswrapper[4672]: I0930 12:23:23.104222 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:23 crc kubenswrapper[4672]: I0930 12:23:23.104285 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:23 crc kubenswrapper[4672]: I0930 12:23:23.104304 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:23 crc kubenswrapper[4672]: I0930 12:23:23.104323 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:23 crc kubenswrapper[4672]: I0930 12:23:23.104334 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:23Z","lastTransitionTime":"2025-09-30T12:23:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:23:23 crc kubenswrapper[4672]: I0930 12:23:23.207687 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:23 crc kubenswrapper[4672]: I0930 12:23:23.207745 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:23 crc kubenswrapper[4672]: I0930 12:23:23.207757 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:23 crc kubenswrapper[4672]: I0930 12:23:23.207783 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:23 crc kubenswrapper[4672]: I0930 12:23:23.207796 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:23Z","lastTransitionTime":"2025-09-30T12:23:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:23:23 crc kubenswrapper[4672]: I0930 12:23:23.311098 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:23 crc kubenswrapper[4672]: I0930 12:23:23.311185 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:23 crc kubenswrapper[4672]: I0930 12:23:23.311207 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:23 crc kubenswrapper[4672]: I0930 12:23:23.311239 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:23 crc kubenswrapper[4672]: I0930 12:23:23.311309 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:23Z","lastTransitionTime":"2025-09-30T12:23:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:23:23 crc kubenswrapper[4672]: I0930 12:23:23.415068 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:23 crc kubenswrapper[4672]: I0930 12:23:23.415119 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:23 crc kubenswrapper[4672]: I0930 12:23:23.415132 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:23 crc kubenswrapper[4672]: I0930 12:23:23.415155 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:23 crc kubenswrapper[4672]: I0930 12:23:23.415169 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:23Z","lastTransitionTime":"2025-09-30T12:23:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:23:23 crc kubenswrapper[4672]: I0930 12:23:23.416001 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 12:23:23 crc kubenswrapper[4672]: I0930 12:23:23.416055 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n7wwp" Sep 30 12:23:23 crc kubenswrapper[4672]: E0930 12:23:23.416119 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 12:23:23 crc kubenswrapper[4672]: I0930 12:23:23.416160 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 12:23:23 crc kubenswrapper[4672]: E0930 12:23:23.416362 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n7wwp" podUID="42618cd5-d9f9-45ba-8081-660ca47bebf4" Sep 30 12:23:23 crc kubenswrapper[4672]: E0930 12:23:23.416627 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 12:23:23 crc kubenswrapper[4672]: I0930 12:23:23.518017 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:23 crc kubenswrapper[4672]: I0930 12:23:23.518171 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:23 crc kubenswrapper[4672]: I0930 12:23:23.518191 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:23 crc kubenswrapper[4672]: I0930 12:23:23.518217 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:23 crc kubenswrapper[4672]: I0930 12:23:23.518239 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:23Z","lastTransitionTime":"2025-09-30T12:23:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:23:23 crc kubenswrapper[4672]: I0930 12:23:23.621932 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:23 crc kubenswrapper[4672]: I0930 12:23:23.621995 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:23 crc kubenswrapper[4672]: I0930 12:23:23.622013 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:23 crc kubenswrapper[4672]: I0930 12:23:23.622040 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:23 crc kubenswrapper[4672]: I0930 12:23:23.622059 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:23Z","lastTransitionTime":"2025-09-30T12:23:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:23:23 crc kubenswrapper[4672]: I0930 12:23:23.725024 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:23 crc kubenswrapper[4672]: I0930 12:23:23.725091 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:23 crc kubenswrapper[4672]: I0930 12:23:23.725112 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:23 crc kubenswrapper[4672]: I0930 12:23:23.725141 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:23 crc kubenswrapper[4672]: I0930 12:23:23.725162 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:23Z","lastTransitionTime":"2025-09-30T12:23:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:23:23 crc kubenswrapper[4672]: I0930 12:23:23.830096 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:23 crc kubenswrapper[4672]: I0930 12:23:23.830170 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:23 crc kubenswrapper[4672]: I0930 12:23:23.830200 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:23 crc kubenswrapper[4672]: I0930 12:23:23.830240 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:23 crc kubenswrapper[4672]: I0930 12:23:23.830309 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:23Z","lastTransitionTime":"2025-09-30T12:23:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:23:23 crc kubenswrapper[4672]: I0930 12:23:23.934185 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:23 crc kubenswrapper[4672]: I0930 12:23:23.934231 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:23 crc kubenswrapper[4672]: I0930 12:23:23.934242 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:23 crc kubenswrapper[4672]: I0930 12:23:23.934261 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:23 crc kubenswrapper[4672]: I0930 12:23:23.934313 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:23Z","lastTransitionTime":"2025-09-30T12:23:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:23:24 crc kubenswrapper[4672]: I0930 12:23:24.037992 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:24 crc kubenswrapper[4672]: I0930 12:23:24.038583 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:24 crc kubenswrapper[4672]: I0930 12:23:24.038603 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:24 crc kubenswrapper[4672]: I0930 12:23:24.038633 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:24 crc kubenswrapper[4672]: I0930 12:23:24.038653 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:24Z","lastTransitionTime":"2025-09-30T12:23:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:23:24 crc kubenswrapper[4672]: I0930 12:23:24.142537 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:24 crc kubenswrapper[4672]: I0930 12:23:24.142603 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:24 crc kubenswrapper[4672]: I0930 12:23:24.142622 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:24 crc kubenswrapper[4672]: I0930 12:23:24.142655 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:24 crc kubenswrapper[4672]: I0930 12:23:24.142672 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:24Z","lastTransitionTime":"2025-09-30T12:23:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:23:24 crc kubenswrapper[4672]: I0930 12:23:24.246024 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:24 crc kubenswrapper[4672]: I0930 12:23:24.246126 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:24 crc kubenswrapper[4672]: I0930 12:23:24.246151 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:24 crc kubenswrapper[4672]: I0930 12:23:24.246195 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:24 crc kubenswrapper[4672]: I0930 12:23:24.246223 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:24Z","lastTransitionTime":"2025-09-30T12:23:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:23:24 crc kubenswrapper[4672]: I0930 12:23:24.350101 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:24 crc kubenswrapper[4672]: I0930 12:23:24.350661 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:24 crc kubenswrapper[4672]: I0930 12:23:24.350840 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:24 crc kubenswrapper[4672]: I0930 12:23:24.350977 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:24 crc kubenswrapper[4672]: I0930 12:23:24.351111 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:24Z","lastTransitionTime":"2025-09-30T12:23:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:23:24 crc kubenswrapper[4672]: I0930 12:23:24.415957 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 12:23:24 crc kubenswrapper[4672]: E0930 12:23:24.416105 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 12:23:24 crc kubenswrapper[4672]: I0930 12:23:24.454526 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:24 crc kubenswrapper[4672]: I0930 12:23:24.454592 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:24 crc kubenswrapper[4672]: I0930 12:23:24.454610 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:24 crc kubenswrapper[4672]: I0930 12:23:24.454636 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:24 crc kubenswrapper[4672]: I0930 12:23:24.454654 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:24Z","lastTransitionTime":"2025-09-30T12:23:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:23:24 crc kubenswrapper[4672]: I0930 12:23:24.557579 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:24 crc kubenswrapper[4672]: I0930 12:23:24.557668 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:24 crc kubenswrapper[4672]: I0930 12:23:24.557690 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:24 crc kubenswrapper[4672]: I0930 12:23:24.557716 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:24 crc kubenswrapper[4672]: I0930 12:23:24.557734 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:24Z","lastTransitionTime":"2025-09-30T12:23:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:23:24 crc kubenswrapper[4672]: I0930 12:23:24.661632 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:24 crc kubenswrapper[4672]: I0930 12:23:24.661702 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:24 crc kubenswrapper[4672]: I0930 12:23:24.661722 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:24 crc kubenswrapper[4672]: I0930 12:23:24.661752 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:24 crc kubenswrapper[4672]: I0930 12:23:24.661771 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:24Z","lastTransitionTime":"2025-09-30T12:23:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:23:24 crc kubenswrapper[4672]: I0930 12:23:24.765099 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:24 crc kubenswrapper[4672]: I0930 12:23:24.765170 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:24 crc kubenswrapper[4672]: I0930 12:23:24.765194 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:24 crc kubenswrapper[4672]: I0930 12:23:24.765223 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:24 crc kubenswrapper[4672]: I0930 12:23:24.765245 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:24Z","lastTransitionTime":"2025-09-30T12:23:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:23:24 crc kubenswrapper[4672]: I0930 12:23:24.868891 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:24 crc kubenswrapper[4672]: I0930 12:23:24.868935 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:24 crc kubenswrapper[4672]: I0930 12:23:24.868945 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:24 crc kubenswrapper[4672]: I0930 12:23:24.868961 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:24 crc kubenswrapper[4672]: I0930 12:23:24.868972 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:24Z","lastTransitionTime":"2025-09-30T12:23:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:23:24 crc kubenswrapper[4672]: I0930 12:23:24.972973 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:24 crc kubenswrapper[4672]: I0930 12:23:24.973066 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:24 crc kubenswrapper[4672]: I0930 12:23:24.973095 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:24 crc kubenswrapper[4672]: I0930 12:23:24.973134 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:24 crc kubenswrapper[4672]: I0930 12:23:24.973159 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:24Z","lastTransitionTime":"2025-09-30T12:23:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:23:25 crc kubenswrapper[4672]: I0930 12:23:25.076708 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:25 crc kubenswrapper[4672]: I0930 12:23:25.076784 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:25 crc kubenswrapper[4672]: I0930 12:23:25.076810 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:25 crc kubenswrapper[4672]: I0930 12:23:25.076838 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:25 crc kubenswrapper[4672]: I0930 12:23:25.076859 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:25Z","lastTransitionTime":"2025-09-30T12:23:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:23:25 crc kubenswrapper[4672]: I0930 12:23:25.180655 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:25 crc kubenswrapper[4672]: I0930 12:23:25.180739 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:25 crc kubenswrapper[4672]: I0930 12:23:25.180754 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:25 crc kubenswrapper[4672]: I0930 12:23:25.180780 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:25 crc kubenswrapper[4672]: I0930 12:23:25.180797 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:25Z","lastTransitionTime":"2025-09-30T12:23:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:23:25 crc kubenswrapper[4672]: I0930 12:23:25.284597 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:25 crc kubenswrapper[4672]: I0930 12:23:25.284679 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:25 crc kubenswrapper[4672]: I0930 12:23:25.284703 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:25 crc kubenswrapper[4672]: I0930 12:23:25.284733 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:25 crc kubenswrapper[4672]: I0930 12:23:25.284763 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:25Z","lastTransitionTime":"2025-09-30T12:23:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:23:25 crc kubenswrapper[4672]: I0930 12:23:25.387819 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:25 crc kubenswrapper[4672]: I0930 12:23:25.387865 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:25 crc kubenswrapper[4672]: I0930 12:23:25.387877 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:25 crc kubenswrapper[4672]: I0930 12:23:25.387897 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:25 crc kubenswrapper[4672]: I0930 12:23:25.387910 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:25Z","lastTransitionTime":"2025-09-30T12:23:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:23:25 crc kubenswrapper[4672]: I0930 12:23:25.416952 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n7wwp" Sep 30 12:23:25 crc kubenswrapper[4672]: I0930 12:23:25.416978 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 12:23:25 crc kubenswrapper[4672]: E0930 12:23:25.417096 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n7wwp" podUID="42618cd5-d9f9-45ba-8081-660ca47bebf4" Sep 30 12:23:25 crc kubenswrapper[4672]: E0930 12:23:25.417351 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 12:23:25 crc kubenswrapper[4672]: I0930 12:23:25.417427 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 12:23:25 crc kubenswrapper[4672]: E0930 12:23:25.417822 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 12:23:25 crc kubenswrapper[4672]: I0930 12:23:25.491577 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:25 crc kubenswrapper[4672]: I0930 12:23:25.491644 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:25 crc kubenswrapper[4672]: I0930 12:23:25.491659 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:25 crc kubenswrapper[4672]: I0930 12:23:25.491681 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:25 crc kubenswrapper[4672]: I0930 12:23:25.491698 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:25Z","lastTransitionTime":"2025-09-30T12:23:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:23:25 crc kubenswrapper[4672]: I0930 12:23:25.595653 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:25 crc kubenswrapper[4672]: I0930 12:23:25.595726 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:25 crc kubenswrapper[4672]: I0930 12:23:25.595751 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:25 crc kubenswrapper[4672]: I0930 12:23:25.595778 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:25 crc kubenswrapper[4672]: I0930 12:23:25.595795 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:25Z","lastTransitionTime":"2025-09-30T12:23:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:23:25 crc kubenswrapper[4672]: I0930 12:23:25.699061 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:25 crc kubenswrapper[4672]: I0930 12:23:25.699112 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:25 crc kubenswrapper[4672]: I0930 12:23:25.699124 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:25 crc kubenswrapper[4672]: I0930 12:23:25.699143 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:25 crc kubenswrapper[4672]: I0930 12:23:25.699156 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:25Z","lastTransitionTime":"2025-09-30T12:23:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:23:25 crc kubenswrapper[4672]: I0930 12:23:25.803188 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:25 crc kubenswrapper[4672]: I0930 12:23:25.803307 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:25 crc kubenswrapper[4672]: I0930 12:23:25.803334 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:25 crc kubenswrapper[4672]: I0930 12:23:25.803367 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:25 crc kubenswrapper[4672]: I0930 12:23:25.803390 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:25Z","lastTransitionTime":"2025-09-30T12:23:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:23:25 crc kubenswrapper[4672]: I0930 12:23:25.907255 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:25 crc kubenswrapper[4672]: I0930 12:23:25.907341 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:25 crc kubenswrapper[4672]: I0930 12:23:25.907357 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:25 crc kubenswrapper[4672]: I0930 12:23:25.907382 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:25 crc kubenswrapper[4672]: I0930 12:23:25.907401 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:25Z","lastTransitionTime":"2025-09-30T12:23:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:23:26 crc kubenswrapper[4672]: I0930 12:23:26.010612 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:26 crc kubenswrapper[4672]: I0930 12:23:26.010655 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:26 crc kubenswrapper[4672]: I0930 12:23:26.010666 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:26 crc kubenswrapper[4672]: I0930 12:23:26.010686 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:26 crc kubenswrapper[4672]: I0930 12:23:26.010699 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:26Z","lastTransitionTime":"2025-09-30T12:23:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:23:26 crc kubenswrapper[4672]: I0930 12:23:26.113612 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:26 crc kubenswrapper[4672]: I0930 12:23:26.113677 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:26 crc kubenswrapper[4672]: I0930 12:23:26.113691 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:26 crc kubenswrapper[4672]: I0930 12:23:26.113715 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:26 crc kubenswrapper[4672]: I0930 12:23:26.113727 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:26Z","lastTransitionTime":"2025-09-30T12:23:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:23:26 crc kubenswrapper[4672]: I0930 12:23:26.217015 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:26 crc kubenswrapper[4672]: I0930 12:23:26.217065 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:26 crc kubenswrapper[4672]: I0930 12:23:26.217088 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:26 crc kubenswrapper[4672]: I0930 12:23:26.217108 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:26 crc kubenswrapper[4672]: I0930 12:23:26.217162 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:26Z","lastTransitionTime":"2025-09-30T12:23:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:23:26 crc kubenswrapper[4672]: I0930 12:23:26.320104 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:26 crc kubenswrapper[4672]: I0930 12:23:26.320172 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:26 crc kubenswrapper[4672]: I0930 12:23:26.320193 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:26 crc kubenswrapper[4672]: I0930 12:23:26.320222 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:26 crc kubenswrapper[4672]: I0930 12:23:26.320244 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:26Z","lastTransitionTime":"2025-09-30T12:23:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:23:26 crc kubenswrapper[4672]: I0930 12:23:26.416962 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 12:23:26 crc kubenswrapper[4672]: E0930 12:23:26.417181 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 12:23:26 crc kubenswrapper[4672]: I0930 12:23:26.423213 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:26 crc kubenswrapper[4672]: I0930 12:23:26.423305 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:26 crc kubenswrapper[4672]: I0930 12:23:26.423334 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:26 crc kubenswrapper[4672]: I0930 12:23:26.423362 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:26 crc kubenswrapper[4672]: I0930 12:23:26.423383 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:26Z","lastTransitionTime":"2025-09-30T12:23:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:23:26 crc kubenswrapper[4672]: I0930 12:23:26.525961 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:26 crc kubenswrapper[4672]: I0930 12:23:26.526009 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:26 crc kubenswrapper[4672]: I0930 12:23:26.526024 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:26 crc kubenswrapper[4672]: I0930 12:23:26.526043 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:26 crc kubenswrapper[4672]: I0930 12:23:26.526056 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:26Z","lastTransitionTime":"2025-09-30T12:23:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:23:26 crc kubenswrapper[4672]: I0930 12:23:26.629092 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:26 crc kubenswrapper[4672]: I0930 12:23:26.629137 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:26 crc kubenswrapper[4672]: I0930 12:23:26.629146 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:26 crc kubenswrapper[4672]: I0930 12:23:26.629162 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:26 crc kubenswrapper[4672]: I0930 12:23:26.629173 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:26Z","lastTransitionTime":"2025-09-30T12:23:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:23:26 crc kubenswrapper[4672]: I0930 12:23:26.731985 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:26 crc kubenswrapper[4672]: I0930 12:23:26.732035 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:26 crc kubenswrapper[4672]: I0930 12:23:26.732049 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:26 crc kubenswrapper[4672]: I0930 12:23:26.732066 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:26 crc kubenswrapper[4672]: I0930 12:23:26.732078 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:26Z","lastTransitionTime":"2025-09-30T12:23:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:23:26 crc kubenswrapper[4672]: I0930 12:23:26.834478 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:26 crc kubenswrapper[4672]: I0930 12:23:26.834548 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:26 crc kubenswrapper[4672]: I0930 12:23:26.834568 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:26 crc kubenswrapper[4672]: I0930 12:23:26.834591 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:26 crc kubenswrapper[4672]: I0930 12:23:26.834605 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:26Z","lastTransitionTime":"2025-09-30T12:23:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:23:26 crc kubenswrapper[4672]: I0930 12:23:26.937257 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:26 crc kubenswrapper[4672]: I0930 12:23:26.937374 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:26 crc kubenswrapper[4672]: I0930 12:23:26.937399 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:26 crc kubenswrapper[4672]: I0930 12:23:26.937432 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:26 crc kubenswrapper[4672]: I0930 12:23:26.937457 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:26Z","lastTransitionTime":"2025-09-30T12:23:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:23:27 crc kubenswrapper[4672]: I0930 12:23:27.040776 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:27 crc kubenswrapper[4672]: I0930 12:23:27.040819 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:27 crc kubenswrapper[4672]: I0930 12:23:27.040829 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:27 crc kubenswrapper[4672]: I0930 12:23:27.040844 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:27 crc kubenswrapper[4672]: I0930 12:23:27.040856 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:27Z","lastTransitionTime":"2025-09-30T12:23:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:23:27 crc kubenswrapper[4672]: I0930 12:23:27.144005 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:27 crc kubenswrapper[4672]: I0930 12:23:27.144045 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:27 crc kubenswrapper[4672]: I0930 12:23:27.144063 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:27 crc kubenswrapper[4672]: I0930 12:23:27.144086 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:27 crc kubenswrapper[4672]: I0930 12:23:27.144101 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:27Z","lastTransitionTime":"2025-09-30T12:23:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:23:27 crc kubenswrapper[4672]: I0930 12:23:27.247220 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:27 crc kubenswrapper[4672]: I0930 12:23:27.247317 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:27 crc kubenswrapper[4672]: I0930 12:23:27.247337 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:27 crc kubenswrapper[4672]: I0930 12:23:27.247366 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:27 crc kubenswrapper[4672]: I0930 12:23:27.247386 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:27Z","lastTransitionTime":"2025-09-30T12:23:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:23:27 crc kubenswrapper[4672]: I0930 12:23:27.351032 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:27 crc kubenswrapper[4672]: I0930 12:23:27.351106 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:27 crc kubenswrapper[4672]: I0930 12:23:27.351131 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:27 crc kubenswrapper[4672]: I0930 12:23:27.351161 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:27 crc kubenswrapper[4672]: I0930 12:23:27.351183 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:27Z","lastTransitionTime":"2025-09-30T12:23:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:23:27 crc kubenswrapper[4672]: I0930 12:23:27.417067 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n7wwp" Sep 30 12:23:27 crc kubenswrapper[4672]: I0930 12:23:27.417090 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 12:23:27 crc kubenswrapper[4672]: I0930 12:23:27.417232 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 12:23:27 crc kubenswrapper[4672]: E0930 12:23:27.417439 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 12:23:27 crc kubenswrapper[4672]: E0930 12:23:27.417599 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 12:23:27 crc kubenswrapper[4672]: E0930 12:23:27.417730 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n7wwp" podUID="42618cd5-d9f9-45ba-8081-660ca47bebf4" Sep 30 12:23:27 crc kubenswrapper[4672]: I0930 12:23:27.454937 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:27 crc kubenswrapper[4672]: I0930 12:23:27.455015 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:27 crc kubenswrapper[4672]: I0930 12:23:27.455082 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:27 crc kubenswrapper[4672]: I0930 12:23:27.455113 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:27 crc kubenswrapper[4672]: I0930 12:23:27.455209 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:27Z","lastTransitionTime":"2025-09-30T12:23:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:23:27 crc kubenswrapper[4672]: I0930 12:23:27.558767 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:27 crc kubenswrapper[4672]: I0930 12:23:27.558823 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:27 crc kubenswrapper[4672]: I0930 12:23:27.558836 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:27 crc kubenswrapper[4672]: I0930 12:23:27.558860 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:27 crc kubenswrapper[4672]: I0930 12:23:27.558875 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:27Z","lastTransitionTime":"2025-09-30T12:23:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:23:27 crc kubenswrapper[4672]: I0930 12:23:27.662324 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:27 crc kubenswrapper[4672]: I0930 12:23:27.662365 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:27 crc kubenswrapper[4672]: I0930 12:23:27.662376 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:27 crc kubenswrapper[4672]: I0930 12:23:27.662396 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:27 crc kubenswrapper[4672]: I0930 12:23:27.662508 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:27Z","lastTransitionTime":"2025-09-30T12:23:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:23:27 crc kubenswrapper[4672]: I0930 12:23:27.766165 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:27 crc kubenswrapper[4672]: I0930 12:23:27.766610 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:27 crc kubenswrapper[4672]: I0930 12:23:27.766820 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:27 crc kubenswrapper[4672]: I0930 12:23:27.766973 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:27 crc kubenswrapper[4672]: I0930 12:23:27.767124 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:27Z","lastTransitionTime":"2025-09-30T12:23:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:23:27 crc kubenswrapper[4672]: I0930 12:23:27.870495 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:27 crc kubenswrapper[4672]: I0930 12:23:27.870538 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:27 crc kubenswrapper[4672]: I0930 12:23:27.870547 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:27 crc kubenswrapper[4672]: I0930 12:23:27.870563 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:27 crc kubenswrapper[4672]: I0930 12:23:27.870580 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:27Z","lastTransitionTime":"2025-09-30T12:23:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:23:27 crc kubenswrapper[4672]: I0930 12:23:27.973805 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:27 crc kubenswrapper[4672]: I0930 12:23:27.973867 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:27 crc kubenswrapper[4672]: I0930 12:23:27.973886 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:27 crc kubenswrapper[4672]: I0930 12:23:27.973911 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:27 crc kubenswrapper[4672]: I0930 12:23:27.973929 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:27Z","lastTransitionTime":"2025-09-30T12:23:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:23:28 crc kubenswrapper[4672]: I0930 12:23:28.077226 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:28 crc kubenswrapper[4672]: I0930 12:23:28.077313 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:28 crc kubenswrapper[4672]: I0930 12:23:28.077332 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:28 crc kubenswrapper[4672]: I0930 12:23:28.077357 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:28 crc kubenswrapper[4672]: I0930 12:23:28.077374 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:28Z","lastTransitionTime":"2025-09-30T12:23:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:23:28 crc kubenswrapper[4672]: I0930 12:23:28.180086 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:28 crc kubenswrapper[4672]: I0930 12:23:28.180132 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:28 crc kubenswrapper[4672]: I0930 12:23:28.180225 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:28 crc kubenswrapper[4672]: I0930 12:23:28.180249 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:28 crc kubenswrapper[4672]: I0930 12:23:28.180354 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:28Z","lastTransitionTime":"2025-09-30T12:23:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:23:28 crc kubenswrapper[4672]: I0930 12:23:28.283390 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:28 crc kubenswrapper[4672]: I0930 12:23:28.283455 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:28 crc kubenswrapper[4672]: I0930 12:23:28.283474 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:28 crc kubenswrapper[4672]: I0930 12:23:28.283499 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:28 crc kubenswrapper[4672]: I0930 12:23:28.283519 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:28Z","lastTransitionTime":"2025-09-30T12:23:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:23:28 crc kubenswrapper[4672]: I0930 12:23:28.386322 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:28 crc kubenswrapper[4672]: I0930 12:23:28.386389 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:28 crc kubenswrapper[4672]: I0930 12:23:28.386420 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:28 crc kubenswrapper[4672]: I0930 12:23:28.386454 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:28 crc kubenswrapper[4672]: I0930 12:23:28.386472 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:28Z","lastTransitionTime":"2025-09-30T12:23:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:23:28 crc kubenswrapper[4672]: I0930 12:23:28.416605 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 12:23:28 crc kubenswrapper[4672]: E0930 12:23:28.417004 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 12:23:28 crc kubenswrapper[4672]: I0930 12:23:28.490652 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:28 crc kubenswrapper[4672]: I0930 12:23:28.490719 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:28 crc kubenswrapper[4672]: I0930 12:23:28.490737 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:28 crc kubenswrapper[4672]: I0930 12:23:28.490766 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:28 crc kubenswrapper[4672]: I0930 12:23:28.490790 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:28Z","lastTransitionTime":"2025-09-30T12:23:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:23:28 crc kubenswrapper[4672]: I0930 12:23:28.595562 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:28 crc kubenswrapper[4672]: I0930 12:23:28.595651 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:28 crc kubenswrapper[4672]: I0930 12:23:28.595685 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:28 crc kubenswrapper[4672]: I0930 12:23:28.595723 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:28 crc kubenswrapper[4672]: I0930 12:23:28.595746 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:28Z","lastTransitionTime":"2025-09-30T12:23:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:23:28 crc kubenswrapper[4672]: I0930 12:23:28.698467 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:28 crc kubenswrapper[4672]: I0930 12:23:28.698525 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:28 crc kubenswrapper[4672]: I0930 12:23:28.698543 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:28 crc kubenswrapper[4672]: I0930 12:23:28.698566 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:28 crc kubenswrapper[4672]: I0930 12:23:28.698580 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:28Z","lastTransitionTime":"2025-09-30T12:23:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:23:28 crc kubenswrapper[4672]: I0930 12:23:28.801414 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:28 crc kubenswrapper[4672]: I0930 12:23:28.801474 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:28 crc kubenswrapper[4672]: I0930 12:23:28.801493 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:28 crc kubenswrapper[4672]: I0930 12:23:28.801521 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:28 crc kubenswrapper[4672]: I0930 12:23:28.801540 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:28Z","lastTransitionTime":"2025-09-30T12:23:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:23:28 crc kubenswrapper[4672]: I0930 12:23:28.903975 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:28 crc kubenswrapper[4672]: I0930 12:23:28.904057 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:28 crc kubenswrapper[4672]: I0930 12:23:28.904080 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:28 crc kubenswrapper[4672]: I0930 12:23:28.904109 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:28 crc kubenswrapper[4672]: I0930 12:23:28.904132 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:28Z","lastTransitionTime":"2025-09-30T12:23:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:23:29 crc kubenswrapper[4672]: I0930 12:23:29.007384 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:29 crc kubenswrapper[4672]: I0930 12:23:29.007455 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:29 crc kubenswrapper[4672]: I0930 12:23:29.007476 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:29 crc kubenswrapper[4672]: I0930 12:23:29.007503 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:29 crc kubenswrapper[4672]: I0930 12:23:29.007525 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:29Z","lastTransitionTime":"2025-09-30T12:23:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:23:29 crc kubenswrapper[4672]: I0930 12:23:29.112417 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:29 crc kubenswrapper[4672]: I0930 12:23:29.112501 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:29 crc kubenswrapper[4672]: I0930 12:23:29.112525 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:29 crc kubenswrapper[4672]: I0930 12:23:29.112555 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:29 crc kubenswrapper[4672]: I0930 12:23:29.112574 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:29Z","lastTransitionTime":"2025-09-30T12:23:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:23:29 crc kubenswrapper[4672]: I0930 12:23:29.215440 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:29 crc kubenswrapper[4672]: I0930 12:23:29.215522 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:29 crc kubenswrapper[4672]: I0930 12:23:29.215553 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:29 crc kubenswrapper[4672]: I0930 12:23:29.215601 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:29 crc kubenswrapper[4672]: I0930 12:23:29.215622 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:29Z","lastTransitionTime":"2025-09-30T12:23:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 12:23:29 crc kubenswrapper[4672]: I0930 12:23:29.318172 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:29 crc kubenswrapper[4672]: I0930 12:23:29.318235 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:29 crc kubenswrapper[4672]: I0930 12:23:29.318255 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:29 crc kubenswrapper[4672]: I0930 12:23:29.318312 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:29 crc kubenswrapper[4672]: I0930 12:23:29.318328 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:29Z","lastTransitionTime":"2025-09-30T12:23:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:23:29 crc kubenswrapper[4672]: I0930 12:23:29.361631 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 12:23:29 crc kubenswrapper[4672]: I0930 12:23:29.361695 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 12:23:29 crc kubenswrapper[4672]: I0930 12:23:29.361708 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 12:23:29 crc kubenswrapper[4672]: I0930 12:23:29.361729 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 12:23:29 crc kubenswrapper[4672]: I0930 12:23:29.361749 4672 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T12:23:29Z","lastTransitionTime":"2025-09-30T12:23:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 12:23:29 crc kubenswrapper[4672]: I0930 12:23:29.417494 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n7wwp" Sep 30 12:23:29 crc kubenswrapper[4672]: I0930 12:23:29.417506 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 12:23:29 crc kubenswrapper[4672]: E0930 12:23:29.417714 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n7wwp" podUID="42618cd5-d9f9-45ba-8081-660ca47bebf4" Sep 30 12:23:29 crc kubenswrapper[4672]: I0930 12:23:29.417782 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 12:23:29 crc kubenswrapper[4672]: E0930 12:23:29.417936 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 12:23:29 crc kubenswrapper[4672]: E0930 12:23:29.418168 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 12:23:29 crc kubenswrapper[4672]: I0930 12:23:29.434565 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-mww8d"] Sep 30 12:23:29 crc kubenswrapper[4672]: I0930 12:23:29.435070 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-mww8d" Sep 30 12:23:29 crc kubenswrapper[4672]: I0930 12:23:29.438614 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Sep 30 12:23:29 crc kubenswrapper[4672]: I0930 12:23:29.438963 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Sep 30 12:23:29 crc kubenswrapper[4672]: I0930 12:23:29.439246 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Sep 30 12:23:29 crc kubenswrapper[4672]: I0930 12:23:29.439601 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Sep 30 12:23:29 crc kubenswrapper[4672]: I0930 12:23:29.458890 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1523a596-5d97-45bb-9c6c-92b24e609a98-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-mww8d\" (UID: \"1523a596-5d97-45bb-9c6c-92b24e609a98\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-mww8d" Sep 30 12:23:29 crc kubenswrapper[4672]: I0930 12:23:29.458982 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1523a596-5d97-45bb-9c6c-92b24e609a98-service-ca\") pod \"cluster-version-operator-5c965bbfc6-mww8d\" (UID: \"1523a596-5d97-45bb-9c6c-92b24e609a98\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-mww8d" Sep 30 12:23:29 crc kubenswrapper[4672]: I0930 12:23:29.459037 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1523a596-5d97-45bb-9c6c-92b24e609a98-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-mww8d\" (UID: \"1523a596-5d97-45bb-9c6c-92b24e609a98\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-mww8d" 
Sep 30 12:23:29 crc kubenswrapper[4672]: I0930 12:23:29.459077 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/1523a596-5d97-45bb-9c6c-92b24e609a98-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-mww8d\" (UID: \"1523a596-5d97-45bb-9c6c-92b24e609a98\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-mww8d" Sep 30 12:23:29 crc kubenswrapper[4672]: I0930 12:23:29.459191 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/1523a596-5d97-45bb-9c6c-92b24e609a98-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-mww8d\" (UID: \"1523a596-5d97-45bb-9c6c-92b24e609a98\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-mww8d" Sep 30 12:23:29 crc kubenswrapper[4672]: I0930 12:23:29.479133 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=76.479089858 podStartE2EDuration="1m16.479089858s" podCreationTimestamp="2025-09-30 12:22:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 12:23:29.474389028 +0000 UTC m=+100.743626774" watchObservedRunningTime="2025-09-30 12:23:29.479089858 +0000 UTC m=+100.748327554" Sep 30 12:23:29 crc kubenswrapper[4672]: I0930 12:23:29.541694 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=50.541663104 podStartE2EDuration="50.541663104s" podCreationTimestamp="2025-09-30 12:22:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 12:23:29.540977326 +0000 UTC m=+100.810215022" watchObservedRunningTime="2025-09-30 12:23:29.541663104 +0000 UTC m=+100.810900790" Sep 30 12:23:29 crc kubenswrapper[4672]: I0930 12:23:29.542063 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-bh5lq" podStartSLOduration=79.542052364 podStartE2EDuration="1m19.542052364s" podCreationTimestamp="2025-09-30 12:22:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 12:23:29.514159412 +0000 UTC m=+100.783397068" watchObservedRunningTime="2025-09-30 12:23:29.542052364 +0000 UTC m=+100.811290040" Sep 30 12:23:29 crc kubenswrapper[4672]: I0930 12:23:29.558595 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=26.558565755 podStartE2EDuration="26.558565755s" podCreationTimestamp="2025-09-30 12:23:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 12:23:29.557123508 +0000 UTC m=+100.826361194" watchObservedRunningTime="2025-09-30 12:23:29.558565755 +0000 UTC m=+100.827803411" Sep 30 12:23:29 crc kubenswrapper[4672]: I0930 12:23:29.559725 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1523a596-5d97-45bb-9c6c-92b24e609a98-service-ca\") pod \"cluster-version-operator-5c965bbfc6-mww8d\" (UID: 
\"1523a596-5d97-45bb-9c6c-92b24e609a98\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-mww8d" Sep 30 12:23:29 crc kubenswrapper[4672]: I0930 12:23:29.559771 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1523a596-5d97-45bb-9c6c-92b24e609a98-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-mww8d\" (UID: \"1523a596-5d97-45bb-9c6c-92b24e609a98\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-mww8d" Sep 30 12:23:29 crc kubenswrapper[4672]: I0930 12:23:29.559809 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1523a596-5d97-45bb-9c6c-92b24e609a98-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-mww8d\" (UID: \"1523a596-5d97-45bb-9c6c-92b24e609a98\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-mww8d" Sep 30 12:23:29 crc kubenswrapper[4672]: I0930 12:23:29.559837 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/1523a596-5d97-45bb-9c6c-92b24e609a98-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-mww8d\" (UID: \"1523a596-5d97-45bb-9c6c-92b24e609a98\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-mww8d" Sep 30 12:23:29 crc kubenswrapper[4672]: I0930 12:23:29.559905 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/1523a596-5d97-45bb-9c6c-92b24e609a98-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-mww8d\" (UID: \"1523a596-5d97-45bb-9c6c-92b24e609a98\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-mww8d" Sep 30 12:23:29 crc kubenswrapper[4672]: I0930 12:23:29.559972 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/1523a596-5d97-45bb-9c6c-92b24e609a98-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-mww8d\" (UID: \"1523a596-5d97-45bb-9c6c-92b24e609a98\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-mww8d" Sep 30 12:23:29 crc kubenswrapper[4672]: I0930 12:23:29.560353 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/1523a596-5d97-45bb-9c6c-92b24e609a98-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-mww8d\" (UID: \"1523a596-5d97-45bb-9c6c-92b24e609a98\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-mww8d" Sep 30 12:23:29 crc kubenswrapper[4672]: I0930 12:23:29.561257 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1523a596-5d97-45bb-9c6c-92b24e609a98-service-ca\") pod \"cluster-version-operator-5c965bbfc6-mww8d\" (UID: \"1523a596-5d97-45bb-9c6c-92b24e609a98\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-mww8d" Sep 30 12:23:29 crc kubenswrapper[4672]: I0930 12:23:29.571008 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1523a596-5d97-45bb-9c6c-92b24e609a98-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-mww8d\" (UID: \"1523a596-5d97-45bb-9c6c-92b24e609a98\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-mww8d" Sep 30 
12:23:29 crc kubenswrapper[4672]: I0930 12:23:29.580280 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1523a596-5d97-45bb-9c6c-92b24e609a98-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-mww8d\" (UID: \"1523a596-5d97-45bb-9c6c-92b24e609a98\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-mww8d" Sep 30 12:23:29 crc kubenswrapper[4672]: I0930 12:23:29.765202 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-mww8d" Sep 30 12:23:29 crc kubenswrapper[4672]: W0930 12:23:29.789706 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1523a596_5d97_45bb_9c6c_92b24e609a98.slice/crio-43a362dddc944ef45ab5cc6c08a622ddb6ccca7c767ed1df8bb02cf0591cb6fe WatchSource:0}: Error finding container 43a362dddc944ef45ab5cc6c08a622ddb6ccca7c767ed1df8bb02cf0591cb6fe: Status 404 returned error can't find the container with id 43a362dddc944ef45ab5cc6c08a622ddb6ccca7c767ed1df8bb02cf0591cb6fe Sep 30 12:23:29 crc kubenswrapper[4672]: I0930 12:23:29.964506 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/42618cd5-d9f9-45ba-8081-660ca47bebf4-metrics-certs\") pod \"network-metrics-daemon-n7wwp\" (UID: \"42618cd5-d9f9-45ba-8081-660ca47bebf4\") " pod="openshift-multus/network-metrics-daemon-n7wwp" Sep 30 12:23:29 crc kubenswrapper[4672]: E0930 12:23:29.964730 4672 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Sep 30 12:23:29 crc kubenswrapper[4672]: E0930 12:23:29.964847 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/42618cd5-d9f9-45ba-8081-660ca47bebf4-metrics-certs podName:42618cd5-d9f9-45ba-8081-660ca47bebf4 nodeName:}" failed. No retries permitted until 2025-09-30 12:24:33.964814838 +0000 UTC m=+165.234052514 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/42618cd5-d9f9-45ba-8081-660ca47bebf4-metrics-certs") pod "network-metrics-daemon-n7wwp" (UID: "42618cd5-d9f9-45ba-8081-660ca47bebf4") : object "openshift-multus"/"metrics-daemon-secret" not registered Sep 30 12:23:30 crc kubenswrapper[4672]: I0930 12:23:30.030517 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-mww8d" event={"ID":"1523a596-5d97-45bb-9c6c-92b24e609a98","Type":"ContainerStarted","Data":"71687a207e9a97fbfd738a32a53cc440273edd564d896e056e12bdf9a083c24e"} Sep 30 12:23:30 crc kubenswrapper[4672]: I0930 12:23:30.030601 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-mww8d" event={"ID":"1523a596-5d97-45bb-9c6c-92b24e609a98","Type":"ContainerStarted","Data":"43a362dddc944ef45ab5cc6c08a622ddb6ccca7c767ed1df8bb02cf0591cb6fe"} Sep 30 12:23:30 crc kubenswrapper[4672]: I0930 12:23:30.050722 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-mww8d" podStartSLOduration=80.050698389 podStartE2EDuration="1m20.050698389s" podCreationTimestamp="2025-09-30 12:22:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 12:23:30.04918984 +0000 UTC m=+101.318427516" watchObservedRunningTime="2025-09-30 12:23:30.050698389 +0000 UTC m=+101.319936045" Sep 30 12:23:30 crc kubenswrapper[4672]: I0930 12:23:30.416199 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 12:23:30 crc kubenswrapper[4672]: E0930 12:23:30.416683 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 12:23:30 crc kubenswrapper[4672]: I0930 12:23:30.418431 4672 scope.go:117] "RemoveContainer" containerID="fc00b1a05de9fc4d6f328ce800afd75975c71d8f64d79a84798a19dcd9882e63" Sep 30 12:23:30 crc kubenswrapper[4672]: E0930 12:23:30.419042 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-nznsk_openshift-ovn-kubernetes(5da59bc9-84da-42f6-86e9-3399ecf31725)\"" pod="openshift-ovn-kubernetes/ovnkube-node-nznsk" podUID="5da59bc9-84da-42f6-86e9-3399ecf31725" Sep 30 12:23:31 crc kubenswrapper[4672]: I0930 12:23:31.416356 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n7wwp" Sep 30 12:23:31 crc kubenswrapper[4672]: I0930 12:23:31.416381 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 12:23:31 crc kubenswrapper[4672]: I0930 12:23:31.416389 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 12:23:31 crc kubenswrapper[4672]: E0930 12:23:31.417476 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n7wwp" podUID="42618cd5-d9f9-45ba-8081-660ca47bebf4" Sep 30 12:23:31 crc kubenswrapper[4672]: E0930 12:23:31.418012 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 12:23:31 crc kubenswrapper[4672]: E0930 12:23:31.418082 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 12:23:32 crc kubenswrapper[4672]: I0930 12:23:32.416541 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 12:23:32 crc kubenswrapper[4672]: E0930 12:23:32.416776 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 12:23:33 crc kubenswrapper[4672]: I0930 12:23:33.416340 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n7wwp" Sep 30 12:23:33 crc kubenswrapper[4672]: I0930 12:23:33.416424 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 12:23:33 crc kubenswrapper[4672]: I0930 12:23:33.416374 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 12:23:33 crc kubenswrapper[4672]: E0930 12:23:33.417236 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-n7wwp" podUID="42618cd5-d9f9-45ba-8081-660ca47bebf4" Sep 30 12:23:33 crc kubenswrapper[4672]: E0930 12:23:33.417556 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 12:23:33 crc kubenswrapper[4672]: E0930 12:23:33.417807 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 12:23:34 crc kubenswrapper[4672]: I0930 12:23:34.416336 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 12:23:34 crc kubenswrapper[4672]: E0930 12:23:34.416785 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 12:23:35 crc kubenswrapper[4672]: I0930 12:23:35.416754 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 12:23:35 crc kubenswrapper[4672]: I0930 12:23:35.416805 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n7wwp" Sep 30 12:23:35 crc kubenswrapper[4672]: I0930 12:23:35.416936 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 12:23:35 crc kubenswrapper[4672]: E0930 12:23:35.417129 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 12:23:35 crc kubenswrapper[4672]: E0930 12:23:35.417374 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-n7wwp" podUID="42618cd5-d9f9-45ba-8081-660ca47bebf4" Sep 30 12:23:35 crc kubenswrapper[4672]: E0930 12:23:35.417502 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 12:23:36 crc kubenswrapper[4672]: I0930 12:23:36.417041 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 12:23:36 crc kubenswrapper[4672]: E0930 12:23:36.417235 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 12:23:37 crc kubenswrapper[4672]: I0930 12:23:37.416974 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n7wwp" Sep 30 12:23:37 crc kubenswrapper[4672]: I0930 12:23:37.416984 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 12:23:37 crc kubenswrapper[4672]: E0930 12:23:37.417184 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n7wwp" podUID="42618cd5-d9f9-45ba-8081-660ca47bebf4" Sep 30 12:23:37 crc kubenswrapper[4672]: I0930 12:23:37.417007 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 12:23:37 crc kubenswrapper[4672]: E0930 12:23:37.417733 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 12:23:37 crc kubenswrapper[4672]: E0930 12:23:37.417916 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 12:23:38 crc kubenswrapper[4672]: I0930 12:23:38.416302 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 12:23:38 crc kubenswrapper[4672]: E0930 12:23:38.416499 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 12:23:39 crc kubenswrapper[4672]: I0930 12:23:39.416856 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n7wwp" Sep 30 12:23:39 crc kubenswrapper[4672]: I0930 12:23:39.416899 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 12:23:39 crc kubenswrapper[4672]: E0930 12:23:39.419111 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n7wwp" podUID="42618cd5-d9f9-45ba-8081-660ca47bebf4" Sep 30 12:23:39 crc kubenswrapper[4672]: I0930 12:23:39.419160 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 12:23:39 crc kubenswrapper[4672]: E0930 12:23:39.419316 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 12:23:39 crc kubenswrapper[4672]: E0930 12:23:39.419447 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 12:23:40 crc kubenswrapper[4672]: I0930 12:23:40.417010 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 12:23:40 crc kubenswrapper[4672]: E0930 12:23:40.417309 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 12:23:41 crc kubenswrapper[4672]: I0930 12:23:41.416417 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 12:23:41 crc kubenswrapper[4672]: I0930 12:23:41.416601 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n7wwp" Sep 30 12:23:41 crc kubenswrapper[4672]: I0930 12:23:41.416690 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 12:23:41 crc kubenswrapper[4672]: E0930 12:23:41.416748 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 12:23:41 crc kubenswrapper[4672]: E0930 12:23:41.416854 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 12:23:41 crc kubenswrapper[4672]: E0930 12:23:41.416929 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n7wwp" podUID="42618cd5-d9f9-45ba-8081-660ca47bebf4" Sep 30 12:23:42 crc kubenswrapper[4672]: I0930 12:23:42.415940 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 12:23:42 crc kubenswrapper[4672]: E0930 12:23:42.416136 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 12:23:43 crc kubenswrapper[4672]: I0930 12:23:43.416693 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 12:23:43 crc kubenswrapper[4672]: I0930 12:23:43.416736 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 12:23:43 crc kubenswrapper[4672]: E0930 12:23:43.416827 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 12:23:43 crc kubenswrapper[4672]: I0930 12:23:43.416693 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n7wwp" Sep 30 12:23:43 crc kubenswrapper[4672]: E0930 12:23:43.416959 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 12:23:43 crc kubenswrapper[4672]: E0930 12:23:43.417083 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n7wwp" podUID="42618cd5-d9f9-45ba-8081-660ca47bebf4" Sep 30 12:23:44 crc kubenswrapper[4672]: I0930 12:23:44.416792 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 12:23:44 crc kubenswrapper[4672]: E0930 12:23:44.417090 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 12:23:44 crc kubenswrapper[4672]: I0930 12:23:44.418509 4672 scope.go:117] "RemoveContainer" containerID="fc00b1a05de9fc4d6f328ce800afd75975c71d8f64d79a84798a19dcd9882e63" Sep 30 12:23:44 crc kubenswrapper[4672]: E0930 12:23:44.418831 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-nznsk_openshift-ovn-kubernetes(5da59bc9-84da-42f6-86e9-3399ecf31725)\"" pod="openshift-ovn-kubernetes/ovnkube-node-nznsk" podUID="5da59bc9-84da-42f6-86e9-3399ecf31725" Sep 30 12:23:45 crc kubenswrapper[4672]: I0930 12:23:45.415994 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 12:23:45 crc kubenswrapper[4672]: I0930 12:23:45.416024 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n7wwp" Sep 30 12:23:45 crc kubenswrapper[4672]: I0930 12:23:45.416306 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 12:23:45 crc kubenswrapper[4672]: E0930 12:23:45.416455 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 12:23:45 crc kubenswrapper[4672]: E0930 12:23:45.416625 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n7wwp" podUID="42618cd5-d9f9-45ba-8081-660ca47bebf4" Sep 30 12:23:45 crc kubenswrapper[4672]: E0930 12:23:45.416695 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 12:23:46 crc kubenswrapper[4672]: I0930 12:23:46.417125 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 12:23:46 crc kubenswrapper[4672]: E0930 12:23:46.417467 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 12:23:47 crc kubenswrapper[4672]: I0930 12:23:47.095617 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-8q82q_6806ff3c-ab3a-402e-b1c5-cc37c0810a65/kube-multus/1.log" Sep 30 12:23:47 crc kubenswrapper[4672]: I0930 12:23:47.096557 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-8q82q_6806ff3c-ab3a-402e-b1c5-cc37c0810a65/kube-multus/0.log" Sep 30 12:23:47 crc kubenswrapper[4672]: I0930 12:23:47.096683 4672 generic.go:334] "Generic (PLEG): container finished" podID="6806ff3c-ab3a-402e-b1c5-cc37c0810a65" containerID="5f37b2a15da4c06842d9df2eabe13974fe3e8f8da3ffd7bc297b6f32f446dbc9" exitCode=1 Sep 30 12:23:47 crc kubenswrapper[4672]: I0930 12:23:47.096741 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-8q82q" event={"ID":"6806ff3c-ab3a-402e-b1c5-cc37c0810a65","Type":"ContainerDied","Data":"5f37b2a15da4c06842d9df2eabe13974fe3e8f8da3ffd7bc297b6f32f446dbc9"} Sep 30 12:23:47 crc kubenswrapper[4672]: I0930 12:23:47.096797 4672 scope.go:117] "RemoveContainer" containerID="687d353cc41530a0cbee2c4a782b6ae10e9ee58aae0bfe183537c96851611e2e" Sep 30 12:23:47 crc kubenswrapper[4672]: I0930 12:23:47.098457 4672 scope.go:117] "RemoveContainer" containerID="5f37b2a15da4c06842d9df2eabe13974fe3e8f8da3ffd7bc297b6f32f446dbc9" Sep 30 12:23:47 crc kubenswrapper[4672]: E0930 12:23:47.102118 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-8q82q_openshift-multus(6806ff3c-ab3a-402e-b1c5-cc37c0810a65)\"" pod="openshift-multus/multus-8q82q" podUID="6806ff3c-ab3a-402e-b1c5-cc37c0810a65" Sep 30 12:23:47 crc kubenswrapper[4672]: I0930 12:23:47.416691 4672 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 12:23:47 crc kubenswrapper[4672]: I0930 12:23:47.416829 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 12:23:47 crc kubenswrapper[4672]: E0930 12:23:47.416910 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 12:23:47 crc kubenswrapper[4672]: I0930 12:23:47.416952 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n7wwp" Sep 30 12:23:47 crc kubenswrapper[4672]: E0930 12:23:47.417062 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 12:23:47 crc kubenswrapper[4672]: E0930 12:23:47.417238 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n7wwp" podUID="42618cd5-d9f9-45ba-8081-660ca47bebf4" Sep 30 12:23:48 crc kubenswrapper[4672]: I0930 12:23:48.103460 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-8q82q_6806ff3c-ab3a-402e-b1c5-cc37c0810a65/kube-multus/1.log" Sep 30 12:23:48 crc kubenswrapper[4672]: I0930 12:23:48.416959 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 12:23:48 crc kubenswrapper[4672]: E0930 12:23:48.417141 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 12:23:49 crc kubenswrapper[4672]: E0930 12:23:49.353938 4672 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Sep 30 12:23:49 crc kubenswrapper[4672]: I0930 12:23:49.416303 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 12:23:49 crc kubenswrapper[4672]: I0930 12:23:49.416357 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 12:23:49 crc kubenswrapper[4672]: I0930 12:23:49.416421 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-n7wwp" Sep 30 12:23:49 crc kubenswrapper[4672]: E0930 12:23:49.418673 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 12:23:49 crc kubenswrapper[4672]: E0930 12:23:49.418916 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 12:23:49 crc kubenswrapper[4672]: E0930 12:23:49.419033 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n7wwp" podUID="42618cd5-d9f9-45ba-8081-660ca47bebf4" Sep 30 12:23:49 crc kubenswrapper[4672]: E0930 12:23:49.521255 4672 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Sep 30 12:23:50 crc kubenswrapper[4672]: I0930 12:23:50.416056 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 12:23:50 crc kubenswrapper[4672]: E0930 12:23:50.416293 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 12:23:51 crc kubenswrapper[4672]: I0930 12:23:51.415959 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n7wwp" Sep 30 12:23:51 crc kubenswrapper[4672]: E0930 12:23:51.416159 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n7wwp" podUID="42618cd5-d9f9-45ba-8081-660ca47bebf4" Sep 30 12:23:51 crc kubenswrapper[4672]: I0930 12:23:51.416321 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 12:23:51 crc kubenswrapper[4672]: I0930 12:23:51.416356 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 12:23:51 crc kubenswrapper[4672]: E0930 12:23:51.416593 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 12:23:51 crc kubenswrapper[4672]: E0930 12:23:51.416687 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 12:23:52 crc kubenswrapper[4672]: I0930 12:23:52.416186 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 12:23:52 crc kubenswrapper[4672]: E0930 12:23:52.416466 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 12:23:53 crc kubenswrapper[4672]: I0930 12:23:53.417059 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n7wwp" Sep 30 12:23:53 crc kubenswrapper[4672]: I0930 12:23:53.417196 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 12:23:53 crc kubenswrapper[4672]: E0930 12:23:53.417301 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n7wwp" podUID="42618cd5-d9f9-45ba-8081-660ca47bebf4" Sep 30 12:23:53 crc kubenswrapper[4672]: I0930 12:23:53.417386 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 12:23:53 crc kubenswrapper[4672]: E0930 12:23:53.417491 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 12:23:53 crc kubenswrapper[4672]: E0930 12:23:53.417649 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 12:23:54 crc kubenswrapper[4672]: I0930 12:23:54.416899 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 12:23:54 crc kubenswrapper[4672]: E0930 12:23:54.417020 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 12:23:54 crc kubenswrapper[4672]: E0930 12:23:54.523800 4672 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Sep 30 12:23:55 crc kubenswrapper[4672]: I0930 12:23:55.417649 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n7wwp" Sep 30 12:23:55 crc kubenswrapper[4672]: I0930 12:23:55.417739 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 12:23:55 crc kubenswrapper[4672]: I0930 12:23:55.417820 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 12:23:55 crc kubenswrapper[4672]: E0930 12:23:55.418022 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n7wwp" podUID="42618cd5-d9f9-45ba-8081-660ca47bebf4" Sep 30 12:23:55 crc kubenswrapper[4672]: E0930 12:23:55.418183 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 12:23:55 crc kubenswrapper[4672]: E0930 12:23:55.418352 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 12:23:56 crc kubenswrapper[4672]: I0930 12:23:56.416891 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 12:23:56 crc kubenswrapper[4672]: E0930 12:23:56.417120 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 12:23:56 crc kubenswrapper[4672]: I0930 12:23:56.418226 4672 scope.go:117] "RemoveContainer" containerID="fc00b1a05de9fc4d6f328ce800afd75975c71d8f64d79a84798a19dcd9882e63" Sep 30 12:23:57 crc kubenswrapper[4672]: I0930 12:23:57.139131 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nznsk_5da59bc9-84da-42f6-86e9-3399ecf31725/ovnkube-controller/3.log" Sep 30 12:23:57 crc kubenswrapper[4672]: I0930 12:23:57.143097 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nznsk" event={"ID":"5da59bc9-84da-42f6-86e9-3399ecf31725","Type":"ContainerStarted","Data":"6022fd1dd9491a4d7867913e1da93c0ff79f6dd19b56b4a53c65da9f0f690cb0"} Sep 30 12:23:57 crc kubenswrapper[4672]: I0930 12:23:57.143527 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-nznsk" Sep 30 12:23:57 crc kubenswrapper[4672]: I0930 12:23:57.179447 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-nznsk" podStartSLOduration=107.179420463 podStartE2EDuration="1m47.179420463s" podCreationTimestamp="2025-09-30 12:22:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 12:23:57.17616674 +0000 UTC m=+128.445404396" watchObservedRunningTime="2025-09-30 12:23:57.179420463 +0000 UTC m=+128.448658109" Sep 30 12:23:57 crc kubenswrapper[4672]: I0930 12:23:57.366099 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-n7wwp"] Sep 30 12:23:57 crc kubenswrapper[4672]: I0930 12:23:57.366246 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n7wwp" Sep 30 12:23:57 crc kubenswrapper[4672]: E0930 12:23:57.366431 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n7wwp" podUID="42618cd5-d9f9-45ba-8081-660ca47bebf4" Sep 30 12:23:57 crc kubenswrapper[4672]: I0930 12:23:57.416404 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 12:23:57 crc kubenswrapper[4672]: I0930 12:23:57.416457 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 12:23:57 crc kubenswrapper[4672]: E0930 12:23:57.416582 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 12:23:57 crc kubenswrapper[4672]: E0930 12:23:57.416830 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 12:23:58 crc kubenswrapper[4672]: I0930 12:23:58.416943 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 12:23:58 crc kubenswrapper[4672]: E0930 12:23:58.417497 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 12:23:59 crc kubenswrapper[4672]: I0930 12:23:59.416504 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 12:23:59 crc kubenswrapper[4672]: I0930 12:23:59.416917 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 12:23:59 crc kubenswrapper[4672]: I0930 12:23:59.418694 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n7wwp" Sep 30 12:23:59 crc kubenswrapper[4672]: E0930 12:23:59.418698 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 12:23:59 crc kubenswrapper[4672]: E0930 12:23:59.418902 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 12:23:59 crc kubenswrapper[4672]: E0930 12:23:59.419511 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n7wwp" podUID="42618cd5-d9f9-45ba-8081-660ca47bebf4" Sep 30 12:23:59 crc kubenswrapper[4672]: E0930 12:23:59.525185 4672 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Sep 30 12:24:00 crc kubenswrapper[4672]: I0930 12:24:00.416498 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 12:24:00 crc kubenswrapper[4672]: E0930 12:24:00.416667 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 12:24:01 crc kubenswrapper[4672]: I0930 12:24:01.416947 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n7wwp" Sep 30 12:24:01 crc kubenswrapper[4672]: I0930 12:24:01.417019 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 12:24:01 crc kubenswrapper[4672]: E0930 12:24:01.417115 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n7wwp" podUID="42618cd5-d9f9-45ba-8081-660ca47bebf4" Sep 30 12:24:01 crc kubenswrapper[4672]: E0930 12:24:01.417470 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 12:24:01 crc kubenswrapper[4672]: I0930 12:24:01.417816 4672 scope.go:117] "RemoveContainer" containerID="5f37b2a15da4c06842d9df2eabe13974fe3e8f8da3ffd7bc297b6f32f446dbc9" Sep 30 12:24:01 crc kubenswrapper[4672]: I0930 12:24:01.417981 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 12:24:01 crc kubenswrapper[4672]: E0930 12:24:01.418215 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 12:24:02 crc kubenswrapper[4672]: I0930 12:24:02.165707 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-8q82q_6806ff3c-ab3a-402e-b1c5-cc37c0810a65/kube-multus/1.log" Sep 30 12:24:02 crc kubenswrapper[4672]: I0930 12:24:02.165844 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-8q82q" event={"ID":"6806ff3c-ab3a-402e-b1c5-cc37c0810a65","Type":"ContainerStarted","Data":"c88d61620e022b57cdffca5d4746f467e2a5f011df271eb6002e734d1db576ce"} Sep 30 12:24:02 crc kubenswrapper[4672]: I0930 12:24:02.415950 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 12:24:02 crc kubenswrapper[4672]: E0930 12:24:02.416123 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 12:24:03 crc kubenswrapper[4672]: I0930 12:24:03.416791 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n7wwp" Sep 30 12:24:03 crc kubenswrapper[4672]: E0930 12:24:03.417533 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n7wwp" podUID="42618cd5-d9f9-45ba-8081-660ca47bebf4" Sep 30 12:24:03 crc kubenswrapper[4672]: I0930 12:24:03.417059 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 12:24:03 crc kubenswrapper[4672]: E0930 12:24:03.417731 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 12:24:03 crc kubenswrapper[4672]: I0930 12:24:03.417022 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 12:24:03 crc kubenswrapper[4672]: E0930 12:24:03.417886 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 12:24:04 crc kubenswrapper[4672]: I0930 12:24:04.416211 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 12:24:04 crc kubenswrapper[4672]: E0930 12:24:04.416441 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 12:24:05 crc kubenswrapper[4672]: I0930 12:24:05.416641 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n7wwp" Sep 30 12:24:05 crc kubenswrapper[4672]: I0930 12:24:05.416651 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 12:24:05 crc kubenswrapper[4672]: I0930 12:24:05.416680 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 12:24:05 crc kubenswrapper[4672]: I0930 12:24:05.420711 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Sep 30 12:24:05 crc kubenswrapper[4672]: I0930 12:24:05.420967 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Sep 30 12:24:05 crc kubenswrapper[4672]: I0930 12:24:05.421130 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Sep 30 12:24:05 crc kubenswrapper[4672]: I0930 12:24:05.421333 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Sep 30 12:24:05 crc kubenswrapper[4672]: I0930 12:24:05.421375 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Sep 30 12:24:05 crc kubenswrapper[4672]: I0930 12:24:05.421502 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Sep 30 12:24:06 crc kubenswrapper[4672]: I0930 12:24:06.415991 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.253782 4672 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.319378 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-675r8"] Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.319860 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-675r8" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.320641 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-kxhzb"] Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.320955 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-kxhzb" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.321508 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-qrbqf"] Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.321908 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-qrbqf" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.328470 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-ppzw6"] Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.329399 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-ppzw6" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.335072 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.350328 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-bj7dq"] Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.354157 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.354236 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.374859 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.375072 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.375151 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.375244 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.375357 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Sep 30 12:24:10 crc 
kubenswrapper[4672]: I0930 12:24:10.375460 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.376179 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-zgl84"] Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.376655 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.376737 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zgl84" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.377105 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-bj7dq" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.377464 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.378061 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gncd7"] Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.378915 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.379139 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.379314 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.379529 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.379616 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.379737 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.379787 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.379873 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.379896 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.380090 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.380129 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.380320 4672 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-route-controller-manager"/"serving-cert" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.380819 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-vs8kw"] Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.380913 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.381184 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-vr2sj"] Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.381625 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-rj74q"] Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.382120 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-rj74q" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.382163 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-vs8kw" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.382170 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gncd7" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.382124 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-vr2sj" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.385443 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.386118 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5506efd0-d2d7-4db7-a8ed-6b47313b59a2-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-ppzw6\" (UID: \"5506efd0-d2d7-4db7-a8ed-6b47313b59a2\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ppzw6" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.386284 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/a7555c95-5534-45dc-a212-4262554a0c0b-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-qrbqf\" (UID: \"a7555c95-5534-45dc-a212-4262554a0c0b\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-qrbqf" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.386421 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rgq5d\" (UniqueName: \"kubernetes.io/projected/a7555c95-5534-45dc-a212-4262554a0c0b-kube-api-access-rgq5d\") pod \"machine-api-operator-5694c8668f-qrbqf\" (UID: \"a7555c95-5534-45dc-a212-4262554a0c0b\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-qrbqf" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.386541 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/afcb62e7-0ff7-4af1-bebc-f32bfc50d94a-audit\") pod \"apiserver-76f77b778f-bj7dq\" (UID: 
\"afcb62e7-0ff7-4af1-bebc-f32bfc50d94a\") " pod="openshift-apiserver/apiserver-76f77b778f-bj7dq" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.386659 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/afcb62e7-0ff7-4af1-bebc-f32bfc50d94a-encryption-config\") pod \"apiserver-76f77b778f-bj7dq\" (UID: \"afcb62e7-0ff7-4af1-bebc-f32bfc50d94a\") " pod="openshift-apiserver/apiserver-76f77b778f-bj7dq" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.386764 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/afcb62e7-0ff7-4af1-bebc-f32bfc50d94a-image-import-ca\") pod \"apiserver-76f77b778f-bj7dq\" (UID: \"afcb62e7-0ff7-4af1-bebc-f32bfc50d94a\") " pod="openshift-apiserver/apiserver-76f77b778f-bj7dq" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.386900 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/20530f49-28c3-4983-8301-cb4275d5a129-audit-policies\") pod \"apiserver-7bbb656c7d-zgl84\" (UID: \"20530f49-28c3-4983-8301-cb4275d5a129\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zgl84" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.387005 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/20530f49-28c3-4983-8301-cb4275d5a129-audit-dir\") pod \"apiserver-7bbb656c7d-zgl84\" (UID: \"20530f49-28c3-4983-8301-cb4275d5a129\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zgl84" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.387124 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5506efd0-d2d7-4db7-a8ed-6b47313b59a2-serving-cert\") pod \"authentication-operator-69f744f599-ppzw6\" (UID: \"5506efd0-d2d7-4db7-a8ed-6b47313b59a2\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ppzw6" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.387238 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8ffw7\" (UniqueName: \"kubernetes.io/projected/5506efd0-d2d7-4db7-a8ed-6b47313b59a2-kube-api-access-8ffw7\") pod \"authentication-operator-69f744f599-ppzw6\" (UID: \"5506efd0-d2d7-4db7-a8ed-6b47313b59a2\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ppzw6" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.387410 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/afcb62e7-0ff7-4af1-bebc-f32bfc50d94a-etcd-client\") pod \"apiserver-76f77b778f-bj7dq\" (UID: \"afcb62e7-0ff7-4af1-bebc-f32bfc50d94a\") " pod="openshift-apiserver/apiserver-76f77b778f-bj7dq" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.387520 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mhzkd\" (UniqueName: \"kubernetes.io/projected/afcb62e7-0ff7-4af1-bebc-f32bfc50d94a-kube-api-access-mhzkd\") pod \"apiserver-76f77b778f-bj7dq\" (UID: \"afcb62e7-0ff7-4af1-bebc-f32bfc50d94a\") " pod="openshift-apiserver/apiserver-76f77b778f-bj7dq" Sep 30 12:24:10 crc kubenswrapper[4672]: 
I0930 12:24:10.387634 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bdc37a72-d709-408f-b636-dd62ad023b8d-config\") pod \"controller-manager-879f6c89f-kxhzb\" (UID: \"bdc37a72-d709-408f-b636-dd62ad023b8d\") " pod="openshift-controller-manager/controller-manager-879f6c89f-kxhzb" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.387783 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/20530f49-28c3-4983-8301-cb4275d5a129-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-zgl84\" (UID: \"20530f49-28c3-4983-8301-cb4275d5a129\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zgl84" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.387890 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/afcb62e7-0ff7-4af1-bebc-f32bfc50d94a-audit-dir\") pod \"apiserver-76f77b778f-bj7dq\" (UID: \"afcb62e7-0ff7-4af1-bebc-f32bfc50d94a\") " pod="openshift-apiserver/apiserver-76f77b778f-bj7dq" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.388010 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/01f3df1a-e96a-4e9d-9af8-334a144d7cc4-client-ca\") pod \"route-controller-manager-6576b87f9c-675r8\" (UID: \"01f3df1a-e96a-4e9d-9af8-334a144d7cc4\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-675r8" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.388131 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/afcb62e7-0ff7-4af1-bebc-f32bfc50d94a-node-pullsecrets\") pod \"apiserver-76f77b778f-bj7dq\" (UID: \"afcb62e7-0ff7-4af1-bebc-f32bfc50d94a\") " pod="openshift-apiserver/apiserver-76f77b778f-bj7dq" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.388253 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01f3df1a-e96a-4e9d-9af8-334a144d7cc4-serving-cert\") pod \"route-controller-manager-6576b87f9c-675r8\" (UID: \"01f3df1a-e96a-4e9d-9af8-334a144d7cc4\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-675r8" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.388434 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5506efd0-d2d7-4db7-a8ed-6b47313b59a2-service-ca-bundle\") pod \"authentication-operator-69f744f599-ppzw6\" (UID: \"5506efd0-d2d7-4db7-a8ed-6b47313b59a2\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ppzw6" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.388546 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/afcb62e7-0ff7-4af1-bebc-f32bfc50d94a-trusted-ca-bundle\") pod \"apiserver-76f77b778f-bj7dq\" (UID: \"afcb62e7-0ff7-4af1-bebc-f32bfc50d94a\") " pod="openshift-apiserver/apiserver-76f77b778f-bj7dq" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.388663 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/20530f49-28c3-4983-8301-cb4275d5a129-serving-cert\") pod \"apiserver-7bbb656c7d-zgl84\" (UID: \"20530f49-28c3-4983-8301-cb4275d5a129\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zgl84" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.388789 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bdc37a72-d709-408f-b636-dd62ad023b8d-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-kxhzb\" (UID: \"bdc37a72-d709-408f-b636-dd62ad023b8d\") " pod="openshift-controller-manager/controller-manager-879f6c89f-kxhzb" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.388899 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/20530f49-28c3-4983-8301-cb4275d5a129-etcd-client\") pod \"apiserver-7bbb656c7d-zgl84\" (UID: \"20530f49-28c3-4983-8301-cb4275d5a129\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zgl84" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.389011 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a7555c95-5534-45dc-a212-4262554a0c0b-config\") pod \"machine-api-operator-5694c8668f-qrbqf\" (UID: \"a7555c95-5534-45dc-a212-4262554a0c0b\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-qrbqf" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.389134 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/afcb62e7-0ff7-4af1-bebc-f32bfc50d94a-serving-cert\") pod \"apiserver-76f77b778f-bj7dq\" (UID: \"afcb62e7-0ff7-4af1-bebc-f32bfc50d94a\") " pod="openshift-apiserver/apiserver-76f77b778f-bj7dq" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.389254 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-csbmd\" (UniqueName: \"kubernetes.io/projected/bdc37a72-d709-408f-b636-dd62ad023b8d-kube-api-access-csbmd\") pod \"controller-manager-879f6c89f-kxhzb\" (UID: \"bdc37a72-d709-408f-b636-dd62ad023b8d\") " pod="openshift-controller-manager/controller-manager-879f6c89f-kxhzb" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.389376 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/a7555c95-5534-45dc-a212-4262554a0c0b-images\") pod \"machine-api-operator-5694c8668f-qrbqf\" (UID: \"a7555c95-5534-45dc-a212-4262554a0c0b\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-qrbqf" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.389490 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01f3df1a-e96a-4e9d-9af8-334a144d7cc4-config\") pod \"route-controller-manager-6576b87f9c-675r8\" (UID: \"01f3df1a-e96a-4e9d-9af8-334a144d7cc4\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-675r8" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.389590 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5506efd0-d2d7-4db7-a8ed-6b47313b59a2-config\") pod 
\"authentication-operator-69f744f599-ppzw6\" (UID: \"5506efd0-d2d7-4db7-a8ed-6b47313b59a2\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ppzw6" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.389852 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/afcb62e7-0ff7-4af1-bebc-f32bfc50d94a-etcd-serving-ca\") pod \"apiserver-76f77b778f-bj7dq\" (UID: \"afcb62e7-0ff7-4af1-bebc-f32bfc50d94a\") " pod="openshift-apiserver/apiserver-76f77b778f-bj7dq" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.389960 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bdc37a72-d709-408f-b636-dd62ad023b8d-client-ca\") pod \"controller-manager-879f6c89f-kxhzb\" (UID: \"bdc37a72-d709-408f-b636-dd62ad023b8d\") " pod="openshift-controller-manager/controller-manager-879f6c89f-kxhzb" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.390068 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j74z4\" (UniqueName: \"kubernetes.io/projected/20530f49-28c3-4983-8301-cb4275d5a129-kube-api-access-j74z4\") pod \"apiserver-7bbb656c7d-zgl84\" (UID: \"20530f49-28c3-4983-8301-cb4275d5a129\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zgl84" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.390173 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/20530f49-28c3-4983-8301-cb4275d5a129-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-zgl84\" (UID: \"20530f49-28c3-4983-8301-cb4275d5a129\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zgl84" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.390292 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/20530f49-28c3-4983-8301-cb4275d5a129-encryption-config\") pod \"apiserver-7bbb656c7d-zgl84\" (UID: \"20530f49-28c3-4983-8301-cb4275d5a129\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zgl84" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.390406 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/afcb62e7-0ff7-4af1-bebc-f32bfc50d94a-config\") pod \"apiserver-76f77b778f-bj7dq\" (UID: \"afcb62e7-0ff7-4af1-bebc-f32bfc50d94a\") " pod="openshift-apiserver/apiserver-76f77b778f-bj7dq" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.390506 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bdc37a72-d709-408f-b636-dd62ad023b8d-serving-cert\") pod \"controller-manager-879f6c89f-kxhzb\" (UID: \"bdc37a72-d709-408f-b636-dd62ad023b8d\") " pod="openshift-controller-manager/controller-manager-879f6c89f-kxhzb" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.390616 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sczd4\" (UniqueName: \"kubernetes.io/projected/01f3df1a-e96a-4e9d-9af8-334a144d7cc4-kube-api-access-sczd4\") pod \"route-controller-manager-6576b87f9c-675r8\" (UID: \"01f3df1a-e96a-4e9d-9af8-334a144d7cc4\") " 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-675r8" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.394031 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.394630 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-bjdbw"] Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.395169 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-bjdbw" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.396457 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-dlbv7"] Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.397177 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-dlbv7" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.397627 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.397761 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.397845 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.397878 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.398068 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.398180 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.398199 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.398251 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-x8stp"] Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.398445 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.398566 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.398622 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.398683 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.398753 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jdshz"] Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.398762 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 
12:24:10.398852 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.399026 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-x8stp" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.399088 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jdshz" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.399146 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.401161 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.401391 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.401707 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.402080 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.402818 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.407234 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.407379 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.407644 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.407659 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.407713 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.407242 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.407797 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.407246 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.407849 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.407727 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" 
Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.407803 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt"
Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.408022 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4"
Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.408516 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig"
Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.408645 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt"
Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.408722 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle"
Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.408979 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc"
Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.409128 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca"
Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.409249 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1"
Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.409356 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls"
Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.409617 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx"
Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.409774 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs"
Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.418426 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login"
Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.419400 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt"
Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.419724 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt"
Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.421867 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls"
Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.427122 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-mlrw8"]
Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.451282 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt"
Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.452135 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-675r8"]
Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.452235 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-mlrw8"
Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.452532 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy"
Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.453669 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-tlgqg"]
Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.453991 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-tlgqg"
Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.455222 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle"
Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.455756 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt"
Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.456031 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config"
Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.456348 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config"
Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.456571 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr"
Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.456772 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config"
Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.456924 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca"
Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.457032 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config"
Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.456930 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets"
Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.459053 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls"
Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.459151 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert"
Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.459328 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt"
Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.459424 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv"
Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.459486 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert"
Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.459436 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt"
Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.459543 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt"
Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.459608 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wfvzx"]
Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.459626 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw"
Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.459646 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert"
Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.459661 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z"
Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.459741 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert"
Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.459860 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt"
Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.459946 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt"
Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.460166 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wfvzx"
Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.460794 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-52jss"]
Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.463124 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-7l9kk"]
Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.463936 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-7l9kk"
Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.464042 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-bqdd5"]
Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.464666 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-52jss"
Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.469488 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-bqdd5"
Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.469737 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd"
Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.469872 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert"
Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.470096 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt"
Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.469881 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt"
Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.470210 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config"
Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.470398 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d"
Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.471023 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle"
Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.472120 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template"
Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.472519 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca"
Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.476421 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle"
Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.478511 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca"
Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.479195 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-m85j8"]
Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.479855 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-m85j8"
Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.481186 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt"
Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.482228 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-6t7qg"]
Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.483057 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6t7qg"
Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.484925 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-kxhzb"]
Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.486678 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-ppzw6"]
Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.489034 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-rj74q"]
Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.491369 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-bjdbw"]
Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.491422 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-vs8kw"]
Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.491740 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bdc37a72-d709-408f-b636-dd62ad023b8d-client-ca\") pod \"controller-manager-879f6c89f-kxhzb\" (UID: \"bdc37a72-d709-408f-b636-dd62ad023b8d\") " pod="openshift-controller-manager/controller-manager-879f6c89f-kxhzb"
Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.491770 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qls6m\" (UniqueName: \"kubernetes.io/projected/b11da5bc-b91d-4a8e-8839-da0f3989618e-kube-api-access-qls6m\") pod \"router-default-5444994796-52jss\" (UID: \"b11da5bc-b91d-4a8e-8839-da0f3989618e\") " pod="openshift-ingress/router-default-5444994796-52jss"
Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.491794 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j74z4\" (UniqueName: \"kubernetes.io/projected/20530f49-28c3-4983-8301-cb4275d5a129-kube-api-access-j74z4\") pod \"apiserver-7bbb656c7d-zgl84\" (UID: \"20530f49-28c3-4983-8301-cb4275d5a129\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zgl84"
Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.491905 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-62vsq\" (UniqueName: \"kubernetes.io/projected/4312b92a-c221-4cf3-948b-096a90dd7846-kube-api-access-62vsq\") pod \"console-operator-58897d9998-bjdbw\" (UID: \"4312b92a-c221-4cf3-948b-096a90dd7846\") " pod="openshift-console-operator/console-operator-58897d9998-bjdbw"
Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.491925 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4312b92a-c221-4cf3-948b-096a90dd7846-serving-cert\") pod \"console-operator-58897d9998-bjdbw\" (UID: \"4312b92a-c221-4cf3-948b-096a90dd7846\") " pod="openshift-console-operator/console-operator-58897d9998-bjdbw"
Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.491943 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/20530f49-28c3-4983-8301-cb4275d5a129-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-zgl84\" (UID: \"20530f49-28c3-4983-8301-cb4275d5a129\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zgl84"
Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.491961 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/20530f49-28c3-4983-8301-cb4275d5a129-encryption-config\") pod \"apiserver-7bbb656c7d-zgl84\" (UID: \"20530f49-28c3-4983-8301-cb4275d5a129\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zgl84"
Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.491977 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/afcb62e7-0ff7-4af1-bebc-f32bfc50d94a-config\") pod \"apiserver-76f77b778f-bj7dq\" (UID: \"afcb62e7-0ff7-4af1-bebc-f32bfc50d94a\") " pod="openshift-apiserver/apiserver-76f77b778f-bj7dq"
Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.491995 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b11da5bc-b91d-4a8e-8839-da0f3989618e-service-ca-bundle\") pod \"router-default-5444994796-52jss\" (UID: \"b11da5bc-b91d-4a8e-8839-da0f3989618e\") " pod="openshift-ingress/router-default-5444994796-52jss"
Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.492013 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sczd4\" (UniqueName: \"kubernetes.io/projected/01f3df1a-e96a-4e9d-9af8-334a144d7cc4-kube-api-access-sczd4\") pod \"route-controller-manager-6576b87f9c-675r8\" (UID: \"01f3df1a-e96a-4e9d-9af8-334a144d7cc4\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-675r8"
Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.492027 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bdc37a72-d709-408f-b636-dd62ad023b8d-serving-cert\") pod \"controller-manager-879f6c89f-kxhzb\" (UID: \"bdc37a72-d709-408f-b636-dd62ad023b8d\") " pod="openshift-controller-manager/controller-manager-879f6c89f-kxhzb"
Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.492045 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4hbtj\" (UniqueName: \"kubernetes.io/projected/41a42386-34f1-47ec-85bf-4c81bd9228be-kube-api-access-4hbtj\") pod \"dns-operator-744455d44c-7l9kk\" (UID: \"41a42386-34f1-47ec-85bf-4c81bd9228be\") " pod="openshift-dns-operator/dns-operator-744455d44c-7l9kk"
Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.492064 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mwq6l\" (UniqueName: \"kubernetes.io/projected/c4246346-4680-4d7d-a64f-262c987067fd-kube-api-access-mwq6l\") pod \"machine-approver-56656f9798-vr2sj\" (UID: \"c4246346-4680-4d7d-a64f-262c987067fd\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-vr2sj"
Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.492087 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pr9gl\" (UniqueName: \"kubernetes.io/projected/04847bd1-7d49-41f9-be74-08033dd1212e-kube-api-access-pr9gl\") pod \"cluster-samples-operator-665b6dd947-gncd7\" (UID: \"04847bd1-7d49-41f9-be74-08033dd1212e\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gncd7"
Sep 30 12:24:10 crc kubenswrapper[4672]: I0930
12:24:10.492110 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5506efd0-d2d7-4db7-a8ed-6b47313b59a2-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-ppzw6\" (UID: \"5506efd0-d2d7-4db7-a8ed-6b47313b59a2\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ppzw6" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.492138 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/a7555c95-5534-45dc-a212-4262554a0c0b-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-qrbqf\" (UID: \"a7555c95-5534-45dc-a212-4262554a0c0b\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-qrbqf" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.492158 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rgq5d\" (UniqueName: \"kubernetes.io/projected/a7555c95-5534-45dc-a212-4262554a0c0b-kube-api-access-rgq5d\") pod \"machine-api-operator-5694c8668f-qrbqf\" (UID: \"a7555c95-5534-45dc-a212-4262554a0c0b\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-qrbqf" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.492175 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/afcb62e7-0ff7-4af1-bebc-f32bfc50d94a-audit\") pod \"apiserver-76f77b778f-bj7dq\" (UID: \"afcb62e7-0ff7-4af1-bebc-f32bfc50d94a\") " pod="openshift-apiserver/apiserver-76f77b778f-bj7dq" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.492197 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/afcb62e7-0ff7-4af1-bebc-f32bfc50d94a-encryption-config\") pod \"apiserver-76f77b778f-bj7dq\" (UID: \"afcb62e7-0ff7-4af1-bebc-f32bfc50d94a\") " pod="openshift-apiserver/apiserver-76f77b778f-bj7dq" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.492218 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6jfq2\" (UniqueName: \"kubernetes.io/projected/87529aa1-f650-43ab-a3c2-a41f444c71d1-kube-api-access-6jfq2\") pod \"kube-storage-version-migrator-operator-b67b599dd-tlgqg\" (UID: \"87529aa1-f650-43ab-a3c2-a41f444c71d1\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-tlgqg" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.492239 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/20530f49-28c3-4983-8301-cb4275d5a129-audit-policies\") pod \"apiserver-7bbb656c7d-zgl84\" (UID: \"20530f49-28c3-4983-8301-cb4275d5a129\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zgl84" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.492257 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/afcb62e7-0ff7-4af1-bebc-f32bfc50d94a-image-import-ca\") pod \"apiserver-76f77b778f-bj7dq\" (UID: \"afcb62e7-0ff7-4af1-bebc-f32bfc50d94a\") " pod="openshift-apiserver/apiserver-76f77b778f-bj7dq" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.492288 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/20530f49-28c3-4983-8301-cb4275d5a129-audit-dir\") pod \"apiserver-7bbb656c7d-zgl84\" (UID: \"20530f49-28c3-4983-8301-cb4275d5a129\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zgl84" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.492308 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5506efd0-d2d7-4db7-a8ed-6b47313b59a2-serving-cert\") pod \"authentication-operator-69f744f599-ppzw6\" (UID: \"5506efd0-d2d7-4db7-a8ed-6b47313b59a2\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ppzw6" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.492327 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8ffw7\" (UniqueName: \"kubernetes.io/projected/5506efd0-d2d7-4db7-a8ed-6b47313b59a2-kube-api-access-8ffw7\") pod \"authentication-operator-69f744f599-ppzw6\" (UID: \"5506efd0-d2d7-4db7-a8ed-6b47313b59a2\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ppzw6" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.492346 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6a57383c-8b99-49a9-adb7-caccbd6b3c12-config\") pod \"openshift-apiserver-operator-796bbdcf4f-jdshz\" (UID: \"6a57383c-8b99-49a9-adb7-caccbd6b3c12\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jdshz" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.492374 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/afcb62e7-0ff7-4af1-bebc-f32bfc50d94a-etcd-client\") pod \"apiserver-76f77b778f-bj7dq\" (UID: \"afcb62e7-0ff7-4af1-bebc-f32bfc50d94a\") " pod="openshift-apiserver/apiserver-76f77b778f-bj7dq" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.492393 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mhzkd\" (UniqueName: \"kubernetes.io/projected/afcb62e7-0ff7-4af1-bebc-f32bfc50d94a-kube-api-access-mhzkd\") pod \"apiserver-76f77b778f-bj7dq\" (UID: \"afcb62e7-0ff7-4af1-bebc-f32bfc50d94a\") " pod="openshift-apiserver/apiserver-76f77b778f-bj7dq" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.492413 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b11da5bc-b91d-4a8e-8839-da0f3989618e-metrics-certs\") pod \"router-default-5444994796-52jss\" (UID: \"b11da5bc-b91d-4a8e-8839-da0f3989618e\") " pod="openshift-ingress/router-default-5444994796-52jss" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.492437 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4312b92a-c221-4cf3-948b-096a90dd7846-config\") pod \"console-operator-58897d9998-bjdbw\" (UID: \"4312b92a-c221-4cf3-948b-096a90dd7846\") " pod="openshift-console-operator/console-operator-58897d9998-bjdbw" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.492455 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c2xjz\" (UniqueName: \"kubernetes.io/projected/7fb8bbc0-c63a-4ab5-b454-13682563fe31-kube-api-access-c2xjz\") pod \"downloads-7954f5f757-dlbv7\" (UID: \"7fb8bbc0-c63a-4ab5-b454-13682563fe31\") " 
pod="openshift-console/downloads-7954f5f757-dlbv7" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.492475 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bdc37a72-d709-408f-b636-dd62ad023b8d-config\") pod \"controller-manager-879f6c89f-kxhzb\" (UID: \"bdc37a72-d709-408f-b636-dd62ad023b8d\") " pod="openshift-controller-manager/controller-manager-879f6c89f-kxhzb" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.492493 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/04847bd1-7d49-41f9-be74-08033dd1212e-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-gncd7\" (UID: \"04847bd1-7d49-41f9-be74-08033dd1212e\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gncd7" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.492530 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/20530f49-28c3-4983-8301-cb4275d5a129-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-zgl84\" (UID: \"20530f49-28c3-4983-8301-cb4275d5a129\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zgl84" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.492548 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/afcb62e7-0ff7-4af1-bebc-f32bfc50d94a-audit-dir\") pod \"apiserver-76f77b778f-bj7dq\" (UID: \"afcb62e7-0ff7-4af1-bebc-f32bfc50d94a\") " pod="openshift-apiserver/apiserver-76f77b778f-bj7dq" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.492590 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/01f3df1a-e96a-4e9d-9af8-334a144d7cc4-client-ca\") pod \"route-controller-manager-6576b87f9c-675r8\" (UID: \"01f3df1a-e96a-4e9d-9af8-334a144d7cc4\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-675r8" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.492607 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/41a42386-34f1-47ec-85bf-4c81bd9228be-metrics-tls\") pod \"dns-operator-744455d44c-7l9kk\" (UID: \"41a42386-34f1-47ec-85bf-4c81bd9228be\") " pod="openshift-dns-operator/dns-operator-744455d44c-7l9kk" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.492631 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/c4246346-4680-4d7d-a64f-262c987067fd-auth-proxy-config\") pod \"machine-approver-56656f9798-vr2sj\" (UID: \"c4246346-4680-4d7d-a64f-262c987067fd\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-vr2sj" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.492651 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/afcb62e7-0ff7-4af1-bebc-f32bfc50d94a-node-pullsecrets\") pod \"apiserver-76f77b778f-bj7dq\" (UID: \"afcb62e7-0ff7-4af1-bebc-f32bfc50d94a\") " pod="openshift-apiserver/apiserver-76f77b778f-bj7dq" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.492669 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5506efd0-d2d7-4db7-a8ed-6b47313b59a2-service-ca-bundle\") pod \"authentication-operator-69f744f599-ppzw6\" (UID: \"5506efd0-d2d7-4db7-a8ed-6b47313b59a2\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ppzw6" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.492687 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01f3df1a-e96a-4e9d-9af8-334a144d7cc4-serving-cert\") pod \"route-controller-manager-6576b87f9c-675r8\" (UID: \"01f3df1a-e96a-4e9d-9af8-334a144d7cc4\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-675r8" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.492703 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/afcb62e7-0ff7-4af1-bebc-f32bfc50d94a-trusted-ca-bundle\") pod \"apiserver-76f77b778f-bj7dq\" (UID: \"afcb62e7-0ff7-4af1-bebc-f32bfc50d94a\") " pod="openshift-apiserver/apiserver-76f77b778f-bj7dq" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.492721 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/b11da5bc-b91d-4a8e-8839-da0f3989618e-default-certificate\") pod \"router-default-5444994796-52jss\" (UID: \"b11da5bc-b91d-4a8e-8839-da0f3989618e\") " pod="openshift-ingress/router-default-5444994796-52jss" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.492736 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/c4246346-4680-4d7d-a64f-262c987067fd-machine-approver-tls\") pod \"machine-approver-56656f9798-vr2sj\" (UID: \"c4246346-4680-4d7d-a64f-262c987067fd\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-vr2sj" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.492753 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c4246346-4680-4d7d-a64f-262c987067fd-config\") pod \"machine-approver-56656f9798-vr2sj\" (UID: \"c4246346-4680-4d7d-a64f-262c987067fd\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-vr2sj" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.492774 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/20530f49-28c3-4983-8301-cb4275d5a129-serving-cert\") pod \"apiserver-7bbb656c7d-zgl84\" (UID: \"20530f49-28c3-4983-8301-cb4275d5a129\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zgl84" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.492773 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bdc37a72-d709-408f-b636-dd62ad023b8d-client-ca\") pod \"controller-manager-879f6c89f-kxhzb\" (UID: \"bdc37a72-d709-408f-b636-dd62ad023b8d\") " pod="openshift-controller-manager/controller-manager-879f6c89f-kxhzb" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.492794 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/87529aa1-f650-43ab-a3c2-a41f444c71d1-config\") pod 
\"kube-storage-version-migrator-operator-b67b599dd-tlgqg\" (UID: \"87529aa1-f650-43ab-a3c2-a41f444c71d1\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-tlgqg" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.492813 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/87529aa1-f650-43ab-a3c2-a41f444c71d1-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-tlgqg\" (UID: \"87529aa1-f650-43ab-a3c2-a41f444c71d1\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-tlgqg" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.492832 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bdc37a72-d709-408f-b636-dd62ad023b8d-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-kxhzb\" (UID: \"bdc37a72-d709-408f-b636-dd62ad023b8d\") " pod="openshift-controller-manager/controller-manager-879f6c89f-kxhzb" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.492850 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6a57383c-8b99-49a9-adb7-caccbd6b3c12-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-jdshz\" (UID: \"6a57383c-8b99-49a9-adb7-caccbd6b3c12\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jdshz" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.492868 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/20530f49-28c3-4983-8301-cb4275d5a129-etcd-client\") pod \"apiserver-7bbb656c7d-zgl84\" (UID: \"20530f49-28c3-4983-8301-cb4275d5a129\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zgl84" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.492887 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/afcb62e7-0ff7-4af1-bebc-f32bfc50d94a-serving-cert\") pod \"apiserver-76f77b778f-bj7dq\" (UID: \"afcb62e7-0ff7-4af1-bebc-f32bfc50d94a\") " pod="openshift-apiserver/apiserver-76f77b778f-bj7dq" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.492906 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a7555c95-5534-45dc-a212-4262554a0c0b-config\") pod \"machine-api-operator-5694c8668f-qrbqf\" (UID: \"a7555c95-5534-45dc-a212-4262554a0c0b\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-qrbqf" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.495770 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/a7555c95-5534-45dc-a212-4262554a0c0b-images\") pod \"machine-api-operator-5694c8668f-qrbqf\" (UID: \"a7555c95-5534-45dc-a212-4262554a0c0b\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-qrbqf" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.495814 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-csbmd\" (UniqueName: \"kubernetes.io/projected/bdc37a72-d709-408f-b636-dd62ad023b8d-kube-api-access-csbmd\") pod \"controller-manager-879f6c89f-kxhzb\" (UID: \"bdc37a72-d709-408f-b636-dd62ad023b8d\") " 
pod="openshift-controller-manager/controller-manager-879f6c89f-kxhzb" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.495838 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01f3df1a-e96a-4e9d-9af8-334a144d7cc4-config\") pod \"route-controller-manager-6576b87f9c-675r8\" (UID: \"01f3df1a-e96a-4e9d-9af8-334a144d7cc4\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-675r8" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.495862 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5506efd0-d2d7-4db7-a8ed-6b47313b59a2-config\") pod \"authentication-operator-69f744f599-ppzw6\" (UID: \"5506efd0-d2d7-4db7-a8ed-6b47313b59a2\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ppzw6" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.495881 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/b11da5bc-b91d-4a8e-8839-da0f3989618e-stats-auth\") pod \"router-default-5444994796-52jss\" (UID: \"b11da5bc-b91d-4a8e-8839-da0f3989618e\") " pod="openshift-ingress/router-default-5444994796-52jss" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.495905 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4312b92a-c221-4cf3-948b-096a90dd7846-trusted-ca\") pod \"console-operator-58897d9998-bjdbw\" (UID: \"4312b92a-c221-4cf3-948b-096a90dd7846\") " pod="openshift-console-operator/console-operator-58897d9998-bjdbw" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.495930 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/afcb62e7-0ff7-4af1-bebc-f32bfc50d94a-etcd-serving-ca\") pod \"apiserver-76f77b778f-bj7dq\" (UID: \"afcb62e7-0ff7-4af1-bebc-f32bfc50d94a\") " pod="openshift-apiserver/apiserver-76f77b778f-bj7dq" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.495953 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kkkcm\" (UniqueName: \"kubernetes.io/projected/6a57383c-8b99-49a9-adb7-caccbd6b3c12-kube-api-access-kkkcm\") pod \"openshift-apiserver-operator-796bbdcf4f-jdshz\" (UID: \"6a57383c-8b99-49a9-adb7-caccbd6b3c12\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jdshz" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.497964 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5506efd0-d2d7-4db7-a8ed-6b47313b59a2-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-ppzw6\" (UID: \"5506efd0-d2d7-4db7-a8ed-6b47313b59a2\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ppzw6" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.497991 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/afcb62e7-0ff7-4af1-bebc-f32bfc50d94a-config\") pod \"apiserver-76f77b778f-bj7dq\" (UID: \"afcb62e7-0ff7-4af1-bebc-f32bfc50d94a\") " pod="openshift-apiserver/apiserver-76f77b778f-bj7dq" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.498255 4672 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-b6x4v"] Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.502035 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-jwlqh"] Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.502433 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-b6x4v" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.503426 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/afcb62e7-0ff7-4af1-bebc-f32bfc50d94a-node-pullsecrets\") pod \"apiserver-76f77b778f-bj7dq\" (UID: \"afcb62e7-0ff7-4af1-bebc-f32bfc50d94a\") " pod="openshift-apiserver/apiserver-76f77b778f-bj7dq" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.503893 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8dqtc"] Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.508288 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/20530f49-28c3-4983-8301-cb4275d5a129-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-zgl84\" (UID: \"20530f49-28c3-4983-8301-cb4275d5a129\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zgl84" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.508655 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/20530f49-28c3-4983-8301-cb4275d5a129-encryption-config\") pod \"apiserver-7bbb656c7d-zgl84\" (UID: \"20530f49-28c3-4983-8301-cb4275d5a129\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zgl84" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.509069 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a7555c95-5534-45dc-a212-4262554a0c0b-config\") pod \"machine-api-operator-5694c8668f-qrbqf\" (UID: \"a7555c95-5534-45dc-a212-4262554a0c0b\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-qrbqf" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.509432 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-jwlqh" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.509777 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/a7555c95-5534-45dc-a212-4262554a0c0b-images\") pod \"machine-api-operator-5694c8668f-qrbqf\" (UID: \"a7555c95-5534-45dc-a212-4262554a0c0b\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-qrbqf" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.510318 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mp8wz"] Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.510443 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/20530f49-28c3-4983-8301-cb4275d5a129-audit-policies\") pod \"apiserver-7bbb656c7d-zgl84\" (UID: \"20530f49-28c3-4983-8301-cb4275d5a129\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zgl84" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.511485 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mp8wz" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.511693 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-cf7gg"] Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.511746 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5506efd0-d2d7-4db7-a8ed-6b47313b59a2-config\") pod \"authentication-operator-69f744f599-ppzw6\" (UID: \"5506efd0-d2d7-4db7-a8ed-6b47313b59a2\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ppzw6" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.512162 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bdc37a72-d709-408f-b636-dd62ad023b8d-config\") pod \"controller-manager-879f6c89f-kxhzb\" (UID: \"bdc37a72-d709-408f-b636-dd62ad023b8d\") " pod="openshift-controller-manager/controller-manager-879f6c89f-kxhzb" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.512197 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/afcb62e7-0ff7-4af1-bebc-f32bfc50d94a-audit\") pod \"apiserver-76f77b778f-bj7dq\" (UID: \"afcb62e7-0ff7-4af1-bebc-f32bfc50d94a\") " pod="openshift-apiserver/apiserver-76f77b778f-bj7dq" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.503902 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5506efd0-d2d7-4db7-a8ed-6b47313b59a2-service-ca-bundle\") pod \"authentication-operator-69f744f599-ppzw6\" (UID: \"5506efd0-d2d7-4db7-a8ed-6b47313b59a2\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ppzw6" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.512562 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8dqtc" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.513691 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/afcb62e7-0ff7-4af1-bebc-f32bfc50d94a-etcd-serving-ca\") pod \"apiserver-76f77b778f-bj7dq\" (UID: \"afcb62e7-0ff7-4af1-bebc-f32bfc50d94a\") " pod="openshift-apiserver/apiserver-76f77b778f-bj7dq" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.513950 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/afcb62e7-0ff7-4af1-bebc-f32bfc50d94a-audit-dir\") pod \"apiserver-76f77b778f-bj7dq\" (UID: \"afcb62e7-0ff7-4af1-bebc-f32bfc50d94a\") " pod="openshift-apiserver/apiserver-76f77b778f-bj7dq" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.514382 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/afcb62e7-0ff7-4af1-bebc-f32bfc50d94a-encryption-config\") pod \"apiserver-76f77b778f-bj7dq\" (UID: \"afcb62e7-0ff7-4af1-bebc-f32bfc50d94a\") " pod="openshift-apiserver/apiserver-76f77b778f-bj7dq" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.514937 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bdc37a72-d709-408f-b636-dd62ad023b8d-serving-cert\") pod \"controller-manager-879f6c89f-kxhzb\" (UID: \"bdc37a72-d709-408f-b636-dd62ad023b8d\") " pod="openshift-controller-manager/controller-manager-879f6c89f-kxhzb" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.515013 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/20530f49-28c3-4983-8301-cb4275d5a129-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-zgl84\" (UID: \"20530f49-28c3-4983-8301-cb4275d5a129\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zgl84" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.515165 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/20530f49-28c3-4983-8301-cb4275d5a129-audit-dir\") pod \"apiserver-7bbb656c7d-zgl84\" (UID: \"20530f49-28c3-4983-8301-cb4275d5a129\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zgl84" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.516131 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/afcb62e7-0ff7-4af1-bebc-f32bfc50d94a-serving-cert\") pod \"apiserver-76f77b778f-bj7dq\" (UID: \"afcb62e7-0ff7-4af1-bebc-f32bfc50d94a\") " pod="openshift-apiserver/apiserver-76f77b778f-bj7dq" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.517013 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01f3df1a-e96a-4e9d-9af8-334a144d7cc4-config\") pod \"route-controller-manager-6576b87f9c-675r8\" (UID: \"01f3df1a-e96a-4e9d-9af8-334a144d7cc4\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-675r8" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.517295 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5506efd0-d2d7-4db7-a8ed-6b47313b59a2-serving-cert\") pod \"authentication-operator-69f744f599-ppzw6\" (UID: 
\"5506efd0-d2d7-4db7-a8ed-6b47313b59a2\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ppzw6" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.517432 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bdc37a72-d709-408f-b636-dd62ad023b8d-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-kxhzb\" (UID: \"bdc37a72-d709-408f-b636-dd62ad023b8d\") " pod="openshift-controller-manager/controller-manager-879f6c89f-kxhzb" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.517052 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/01f3df1a-e96a-4e9d-9af8-334a144d7cc4-client-ca\") pod \"route-controller-manager-6576b87f9c-675r8\" (UID: \"01f3df1a-e96a-4e9d-9af8-334a144d7cc4\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-675r8" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.517954 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/20530f49-28c3-4983-8301-cb4275d5a129-serving-cert\") pod \"apiserver-7bbb656c7d-zgl84\" (UID: \"20530f49-28c3-4983-8301-cb4275d5a129\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zgl84" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.518040 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/afcb62e7-0ff7-4af1-bebc-f32bfc50d94a-trusted-ca-bundle\") pod \"apiserver-76f77b778f-bj7dq\" (UID: \"afcb62e7-0ff7-4af1-bebc-f32bfc50d94a\") " pod="openshift-apiserver/apiserver-76f77b778f-bj7dq" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.519353 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/afcb62e7-0ff7-4af1-bebc-f32bfc50d94a-image-import-ca\") pod \"apiserver-76f77b778f-bj7dq\" (UID: \"afcb62e7-0ff7-4af1-bebc-f32bfc50d94a\") " pod="openshift-apiserver/apiserver-76f77b778f-bj7dq" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.519650 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/a7555c95-5534-45dc-a212-4262554a0c0b-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-qrbqf\" (UID: \"a7555c95-5534-45dc-a212-4262554a0c0b\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-qrbqf" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.519835 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/20530f49-28c3-4983-8301-cb4275d5a129-etcd-client\") pod \"apiserver-7bbb656c7d-zgl84\" (UID: \"20530f49-28c3-4983-8301-cb4275d5a129\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zgl84" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.519973 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01f3df1a-e96a-4e9d-9af8-334a144d7cc4-serving-cert\") pod \"route-controller-manager-6576b87f9c-675r8\" (UID: \"01f3df1a-e96a-4e9d-9af8-334a144d7cc4\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-675r8" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.522430 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: 
\"kubernetes.io/secret/afcb62e7-0ff7-4af1-bebc-f32bfc50d94a-etcd-client\") pod \"apiserver-76f77b778f-bj7dq\" (UID: \"afcb62e7-0ff7-4af1-bebc-f32bfc50d94a\") " pod="openshift-apiserver/apiserver-76f77b778f-bj7dq" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.529083 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.529508 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.530904 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-cf7gg" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.531054 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5n4bs"] Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.534368 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5n4bs" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.539650 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-nq6b6"] Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.540612 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-nq6b6" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.542437 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-xzjdp"] Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.543346 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-lnjdx"] Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.543475 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-xzjdp" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.544158 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-84vwh"] Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.544242 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-lnjdx" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.544816 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-84vwh" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.545423 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.549344 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-svmxm"] Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.550289 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-svmxm" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.551663 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-dljhp"] Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.552907 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-dljhp" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.554428 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5wh44"] Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.555156 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5wh44" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.555709 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jdshz"] Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.556740 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-zgl84"] Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.557841 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-qrbqf"] Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.560349 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-x8stp"] Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.561419 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-nb7b9"] Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.562020 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.562589 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-nb7b9" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.564001 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-dwqnn"] Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.564564 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-dwqnn" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.564753 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-9zz4h"] Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.565440 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-9zz4h" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.568645 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320575-tsgk2"] Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.569340 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-6t7qg"] Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.569428 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320575-tsgk2" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.570604 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-mlrw8"] Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.571916 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-bj7dq"] Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.573399 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8dqtc"] Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.574662 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-tlgqg"] Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.575818 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-dlbv7"] Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.576874 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wfvzx"] Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.579096 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-xzjdp"] Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.580739 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-bqdd5"] Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.581895 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mp8wz"] Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.582892 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320575-tsgk2"] Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.583672 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-dwqnn"] Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.584748 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-b6x4v"] Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.585801 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gncd7"] Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.586742 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-dljhp"] Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.588830 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-nb7b9"] Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.589883 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-7l9kk"] Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.590861 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-wzf9f"] Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.591623 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-wzf9f" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.591871 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-z2tt9"] Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.592599 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-z2tt9" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.592999 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-nq6b6"] Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.594203 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-svmxm"] Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.595706 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5n4bs"] Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.596757 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6a57383c-8b99-49a9-adb7-caccbd6b3c12-config\") pod \"openshift-apiserver-operator-796bbdcf4f-jdshz\" (UID: \"6a57383c-8b99-49a9-adb7-caccbd6b3c12\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jdshz" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.596840 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b11da5bc-b91d-4a8e-8839-da0f3989618e-metrics-certs\") pod \"router-default-5444994796-52jss\" (UID: \"b11da5bc-b91d-4a8e-8839-da0f3989618e\") " pod="openshift-ingress/router-default-5444994796-52jss" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.596883 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c2xjz\" (UniqueName: \"kubernetes.io/projected/7fb8bbc0-c63a-4ab5-b454-13682563fe31-kube-api-access-c2xjz\") pod \"downloads-7954f5f757-dlbv7\" (UID: \"7fb8bbc0-c63a-4ab5-b454-13682563fe31\") " pod="openshift-console/downloads-7954f5f757-dlbv7" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.596953 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4312b92a-c221-4cf3-948b-096a90dd7846-config\") pod \"console-operator-58897d9998-bjdbw\" (UID: \"4312b92a-c221-4cf3-948b-096a90dd7846\") " pod="openshift-console-operator/console-operator-58897d9998-bjdbw" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.597581 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5wh44"] Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.597778 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6a57383c-8b99-49a9-adb7-caccbd6b3c12-config\") pod \"openshift-apiserver-operator-796bbdcf4f-jdshz\" (UID: \"6a57383c-8b99-49a9-adb7-caccbd6b3c12\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jdshz" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.598138 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4312b92a-c221-4cf3-948b-096a90dd7846-config\") pod \"console-operator-58897d9998-bjdbw\" (UID: \"4312b92a-c221-4cf3-948b-096a90dd7846\") 
" pod="openshift-console-operator/console-operator-58897d9998-bjdbw" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.598249 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/04847bd1-7d49-41f9-be74-08033dd1212e-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-gncd7\" (UID: \"04847bd1-7d49-41f9-be74-08033dd1212e\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gncd7" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.599841 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/41a42386-34f1-47ec-85bf-4c81bd9228be-metrics-tls\") pod \"dns-operator-744455d44c-7l9kk\" (UID: \"41a42386-34f1-47ec-85bf-4c81bd9228be\") " pod="openshift-dns-operator/dns-operator-744455d44c-7l9kk" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.599940 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/c4246346-4680-4d7d-a64f-262c987067fd-auth-proxy-config\") pod \"machine-approver-56656f9798-vr2sj\" (UID: \"c4246346-4680-4d7d-a64f-262c987067fd\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-vr2sj" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.599977 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/87529aa1-f650-43ab-a3c2-a41f444c71d1-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-tlgqg\" (UID: \"87529aa1-f650-43ab-a3c2-a41f444c71d1\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-tlgqg" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.600027 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/b11da5bc-b91d-4a8e-8839-da0f3989618e-default-certificate\") pod \"router-default-5444994796-52jss\" (UID: \"b11da5bc-b91d-4a8e-8839-da0f3989618e\") " pod="openshift-ingress/router-default-5444994796-52jss" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.600056 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/c4246346-4680-4d7d-a64f-262c987067fd-machine-approver-tls\") pod \"machine-approver-56656f9798-vr2sj\" (UID: \"c4246346-4680-4d7d-a64f-262c987067fd\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-vr2sj" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.600112 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c4246346-4680-4d7d-a64f-262c987067fd-config\") pod \"machine-approver-56656f9798-vr2sj\" (UID: \"c4246346-4680-4d7d-a64f-262c987067fd\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-vr2sj" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.600179 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/87529aa1-f650-43ab-a3c2-a41f444c71d1-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-tlgqg\" (UID: \"87529aa1-f650-43ab-a3c2-a41f444c71d1\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-tlgqg" Sep 30 12:24:10 crc 
kubenswrapper[4672]: I0930 12:24:10.600207 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6a57383c-8b99-49a9-adb7-caccbd6b3c12-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-jdshz\" (UID: \"6a57383c-8b99-49a9-adb7-caccbd6b3c12\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jdshz" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.600235 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/b11da5bc-b91d-4a8e-8839-da0f3989618e-stats-auth\") pod \"router-default-5444994796-52jss\" (UID: \"b11da5bc-b91d-4a8e-8839-da0f3989618e\") " pod="openshift-ingress/router-default-5444994796-52jss" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.600255 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4312b92a-c221-4cf3-948b-096a90dd7846-trusted-ca\") pod \"console-operator-58897d9998-bjdbw\" (UID: \"4312b92a-c221-4cf3-948b-096a90dd7846\") " pod="openshift-console-operator/console-operator-58897d9998-bjdbw" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.600328 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kkkcm\" (UniqueName: \"kubernetes.io/projected/6a57383c-8b99-49a9-adb7-caccbd6b3c12-kube-api-access-kkkcm\") pod \"openshift-apiserver-operator-796bbdcf4f-jdshz\" (UID: \"6a57383c-8b99-49a9-adb7-caccbd6b3c12\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jdshz" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.600364 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qls6m\" (UniqueName: \"kubernetes.io/projected/b11da5bc-b91d-4a8e-8839-da0f3989618e-kube-api-access-qls6m\") pod \"router-default-5444994796-52jss\" (UID: \"b11da5bc-b91d-4a8e-8839-da0f3989618e\") " pod="openshift-ingress/router-default-5444994796-52jss" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.600408 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4312b92a-c221-4cf3-948b-096a90dd7846-serving-cert\") pod \"console-operator-58897d9998-bjdbw\" (UID: \"4312b92a-c221-4cf3-948b-096a90dd7846\") " pod="openshift-console-operator/console-operator-58897d9998-bjdbw" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.600480 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-62vsq\" (UniqueName: \"kubernetes.io/projected/4312b92a-c221-4cf3-948b-096a90dd7846-kube-api-access-62vsq\") pod \"console-operator-58897d9998-bjdbw\" (UID: \"4312b92a-c221-4cf3-948b-096a90dd7846\") " pod="openshift-console-operator/console-operator-58897d9998-bjdbw" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.600512 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b11da5bc-b91d-4a8e-8839-da0f3989618e-service-ca-bundle\") pod \"router-default-5444994796-52jss\" (UID: \"b11da5bc-b91d-4a8e-8839-da0f3989618e\") " pod="openshift-ingress/router-default-5444994796-52jss" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.601212 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-jwlqh"] Sep 30 12:24:10 crc 
kubenswrapper[4672]: I0930 12:24:10.601235 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/c4246346-4680-4d7d-a64f-262c987067fd-auth-proxy-config\") pod \"machine-approver-56656f9798-vr2sj\" (UID: \"c4246346-4680-4d7d-a64f-262c987067fd\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-vr2sj" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.601659 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/87529aa1-f650-43ab-a3c2-a41f444c71d1-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-tlgqg\" (UID: \"87529aa1-f650-43ab-a3c2-a41f444c71d1\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-tlgqg" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.600813 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4hbtj\" (UniqueName: \"kubernetes.io/projected/41a42386-34f1-47ec-85bf-4c81bd9228be-kube-api-access-4hbtj\") pod \"dns-operator-744455d44c-7l9kk\" (UID: \"41a42386-34f1-47ec-85bf-4c81bd9228be\") " pod="openshift-dns-operator/dns-operator-744455d44c-7l9kk" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.602170 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mwq6l\" (UniqueName: \"kubernetes.io/projected/c4246346-4680-4d7d-a64f-262c987067fd-kube-api-access-mwq6l\") pod \"machine-approver-56656f9798-vr2sj\" (UID: \"c4246346-4680-4d7d-a64f-262c987067fd\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-vr2sj" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.602228 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pr9gl\" (UniqueName: \"kubernetes.io/projected/04847bd1-7d49-41f9-be74-08033dd1212e-kube-api-access-pr9gl\") pod \"cluster-samples-operator-665b6dd947-gncd7\" (UID: \"04847bd1-7d49-41f9-be74-08033dd1212e\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gncd7" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.602303 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6jfq2\" (UniqueName: \"kubernetes.io/projected/87529aa1-f650-43ab-a3c2-a41f444c71d1-kube-api-access-6jfq2\") pod \"kube-storage-version-migrator-operator-b67b599dd-tlgqg\" (UID: \"87529aa1-f650-43ab-a3c2-a41f444c71d1\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-tlgqg" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.602826 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c4246346-4680-4d7d-a64f-262c987067fd-config\") pod \"machine-approver-56656f9798-vr2sj\" (UID: \"c4246346-4680-4d7d-a64f-262c987067fd\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-vr2sj" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.605498 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/87529aa1-f650-43ab-a3c2-a41f444c71d1-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-tlgqg\" (UID: \"87529aa1-f650-43ab-a3c2-a41f444c71d1\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-tlgqg" Sep 30 12:24:10 crc 
kubenswrapper[4672]: I0930 12:24:10.605515 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4312b92a-c221-4cf3-948b-096a90dd7846-serving-cert\") pod \"console-operator-58897d9998-bjdbw\" (UID: \"4312b92a-c221-4cf3-948b-096a90dd7846\") " pod="openshift-console-operator/console-operator-58897d9998-bjdbw" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.606503 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.608719 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4312b92a-c221-4cf3-948b-096a90dd7846-trusted-ca\") pod \"console-operator-58897d9998-bjdbw\" (UID: \"4312b92a-c221-4cf3-948b-096a90dd7846\") " pod="openshift-console-operator/console-operator-58897d9998-bjdbw" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.612041 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-lnjdx"] Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.612253 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-cf7gg"] Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.613746 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/c4246346-4680-4d7d-a64f-262c987067fd-machine-approver-tls\") pod \"machine-approver-56656f9798-vr2sj\" (UID: \"c4246346-4680-4d7d-a64f-262c987067fd\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-vr2sj" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.614676 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-z2tt9"] Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.615633 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6a57383c-8b99-49a9-adb7-caccbd6b3c12-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-jdshz\" (UID: \"6a57383c-8b99-49a9-adb7-caccbd6b3c12\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jdshz" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.615704 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/04847bd1-7d49-41f9-be74-08033dd1212e-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-gncd7\" (UID: \"04847bd1-7d49-41f9-be74-08033dd1212e\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gncd7" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.618640 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-9zz4h"] Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.621257 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-m85j8"] Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.623760 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.625056 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-84vwh"] Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.626684 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-r4gx9"] Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.628227 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-r4gx9" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.629815 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-r4gx9"] Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.633405 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/41a42386-34f1-47ec-85bf-4c81bd9228be-metrics-tls\") pod \"dns-operator-744455d44c-7l9kk\" (UID: \"41a42386-34f1-47ec-85bf-4c81bd9228be\") " pod="openshift-dns-operator/dns-operator-744455d44c-7l9kk" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.641592 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.662146 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.682513 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.703478 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.714926 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/b11da5bc-b91d-4a8e-8839-da0f3989618e-default-certificate\") pod \"router-default-5444994796-52jss\" (UID: \"b11da5bc-b91d-4a8e-8839-da0f3989618e\") " pod="openshift-ingress/router-default-5444994796-52jss" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.722312 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.733687 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b11da5bc-b91d-4a8e-8839-da0f3989618e-metrics-certs\") pod \"router-default-5444994796-52jss\" (UID: \"b11da5bc-b91d-4a8e-8839-da0f3989618e\") " pod="openshift-ingress/router-default-5444994796-52jss" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.743610 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.759052 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/b11da5bc-b91d-4a8e-8839-da0f3989618e-stats-auth\") pod \"router-default-5444994796-52jss\" (UID: \"b11da5bc-b91d-4a8e-8839-da0f3989618e\") " pod="openshift-ingress/router-default-5444994796-52jss" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.764087 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.782649 4672 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-ingress"/"service-ca-bundle" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.784291 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b11da5bc-b91d-4a8e-8839-da0f3989618e-service-ca-bundle\") pod \"router-default-5444994796-52jss\" (UID: \"b11da5bc-b91d-4a8e-8839-da0f3989618e\") " pod="openshift-ingress/router-default-5444994796-52jss" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.803171 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.822474 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.842063 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.862507 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.881800 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.902821 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.922867 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.942044 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Sep 30 12:24:10 crc kubenswrapper[4672]: I0930 12:24:10.961622 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Sep 30 12:24:11 crc kubenswrapper[4672]: I0930 12:24:11.002689 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Sep 30 12:24:11 crc kubenswrapper[4672]: I0930 12:24:11.022235 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Sep 30 12:24:11 crc kubenswrapper[4672]: I0930 12:24:11.043521 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Sep 30 12:24:11 crc kubenswrapper[4672]: I0930 12:24:11.062992 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Sep 30 12:24:11 crc kubenswrapper[4672]: I0930 12:24:11.082667 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Sep 30 12:24:11 crc kubenswrapper[4672]: I0930 12:24:11.102781 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Sep 30 12:24:11 crc kubenswrapper[4672]: I0930 12:24:11.130302 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Sep 30 12:24:11 crc kubenswrapper[4672]: I0930 12:24:11.141442 4672 reflector.go:368] Caches populated for *v1.ConfigMap 
from object-"openshift-ingress-operator"/"kube-root-ca.crt" Sep 30 12:24:11 crc kubenswrapper[4672]: I0930 12:24:11.162380 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Sep 30 12:24:11 crc kubenswrapper[4672]: I0930 12:24:11.203791 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sczd4\" (UniqueName: \"kubernetes.io/projected/01f3df1a-e96a-4e9d-9af8-334a144d7cc4-kube-api-access-sczd4\") pod \"route-controller-manager-6576b87f9c-675r8\" (UID: \"01f3df1a-e96a-4e9d-9af8-334a144d7cc4\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-675r8" Sep 30 12:24:11 crc kubenswrapper[4672]: I0930 12:24:11.221715 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Sep 30 12:24:11 crc kubenswrapper[4672]: I0930 12:24:11.224854 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j74z4\" (UniqueName: \"kubernetes.io/projected/20530f49-28c3-4983-8301-cb4275d5a129-kube-api-access-j74z4\") pod \"apiserver-7bbb656c7d-zgl84\" (UID: \"20530f49-28c3-4983-8301-cb4275d5a129\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zgl84" Sep 30 12:24:11 crc kubenswrapper[4672]: I0930 12:24:11.243158 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Sep 30 12:24:11 crc kubenswrapper[4672]: I0930 12:24:11.263335 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Sep 30 12:24:11 crc kubenswrapper[4672]: I0930 12:24:11.283119 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-675r8" Sep 30 12:24:11 crc kubenswrapper[4672]: I0930 12:24:11.296930 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-csbmd\" (UniqueName: \"kubernetes.io/projected/bdc37a72-d709-408f-b636-dd62ad023b8d-kube-api-access-csbmd\") pod \"controller-manager-879f6c89f-kxhzb\" (UID: \"bdc37a72-d709-408f-b636-dd62ad023b8d\") " pod="openshift-controller-manager/controller-manager-879f6c89f-kxhzb" Sep 30 12:24:11 crc kubenswrapper[4672]: I0930 12:24:11.303612 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Sep 30 12:24:11 crc kubenswrapper[4672]: I0930 12:24:11.304886 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-kxhzb" Sep 30 12:24:11 crc kubenswrapper[4672]: I0930 12:24:11.322927 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Sep 30 12:24:11 crc kubenswrapper[4672]: I0930 12:24:11.373255 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rgq5d\" (UniqueName: \"kubernetes.io/projected/a7555c95-5534-45dc-a212-4262554a0c0b-kube-api-access-rgq5d\") pod \"machine-api-operator-5694c8668f-qrbqf\" (UID: \"a7555c95-5534-45dc-a212-4262554a0c0b\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-qrbqf" Sep 30 12:24:11 crc kubenswrapper[4672]: I0930 12:24:11.375999 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zgl84" Sep 30 12:24:11 crc kubenswrapper[4672]: I0930 12:24:11.383842 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Sep 30 12:24:11 crc kubenswrapper[4672]: I0930 12:24:11.386558 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mhzkd\" (UniqueName: \"kubernetes.io/projected/afcb62e7-0ff7-4af1-bebc-f32bfc50d94a-kube-api-access-mhzkd\") pod \"apiserver-76f77b778f-bj7dq\" (UID: \"afcb62e7-0ff7-4af1-bebc-f32bfc50d94a\") " pod="openshift-apiserver/apiserver-76f77b778f-bj7dq" Sep 30 12:24:11 crc kubenswrapper[4672]: I0930 12:24:11.393023 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-bj7dq" Sep 30 12:24:11 crc kubenswrapper[4672]: I0930 12:24:11.402779 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Sep 30 12:24:11 crc kubenswrapper[4672]: I0930 12:24:11.424938 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Sep 30 12:24:11 crc kubenswrapper[4672]: I0930 12:24:11.446472 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Sep 30 12:24:11 crc kubenswrapper[4672]: I0930 12:24:11.466984 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Sep 30 12:24:11 crc kubenswrapper[4672]: I0930 12:24:11.483873 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Sep 30 12:24:11 crc kubenswrapper[4672]: I0930 12:24:11.503436 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Sep 30 12:24:11 crc kubenswrapper[4672]: I0930 12:24:11.520213 4672 request.go:700] Waited for 1.006520006s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-operator-lifecycle-manager/secrets?fieldSelector=metadata.name%3Dcatalog-operator-serving-cert&limit=500&resourceVersion=0 Sep 30 12:24:11 crc kubenswrapper[4672]: I0930 12:24:11.521809 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Sep 30 12:24:11 crc kubenswrapper[4672]: I0930 12:24:11.538300 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-kxhzb"] Sep 30 12:24:11 crc kubenswrapper[4672]: I0930 12:24:11.541902 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Sep 30 12:24:11 crc kubenswrapper[4672]: I0930 12:24:11.570311 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-675r8"] Sep 30 12:24:11 crc kubenswrapper[4672]: I0930 12:24:11.582715 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8ffw7\" (UniqueName: \"kubernetes.io/projected/5506efd0-d2d7-4db7-a8ed-6b47313b59a2-kube-api-access-8ffw7\") pod \"authentication-operator-69f744f599-ppzw6\" 
(UID: \"5506efd0-d2d7-4db7-a8ed-6b47313b59a2\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ppzw6" Sep 30 12:24:11 crc kubenswrapper[4672]: I0930 12:24:11.582857 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Sep 30 12:24:11 crc kubenswrapper[4672]: I0930 12:24:11.603574 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Sep 30 12:24:11 crc kubenswrapper[4672]: I0930 12:24:11.604477 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-zgl84"] Sep 30 12:24:11 crc kubenswrapper[4672]: W0930 12:24:11.616656 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod20530f49_28c3_4983_8301_cb4275d5a129.slice/crio-2e89c17ac21708774537bf8e097a0adb4988c9432bffdabcd75ee4f71fd68dd7 WatchSource:0}: Error finding container 2e89c17ac21708774537bf8e097a0adb4988c9432bffdabcd75ee4f71fd68dd7: Status 404 returned error can't find the container with id 2e89c17ac21708774537bf8e097a0adb4988c9432bffdabcd75ee4f71fd68dd7 Sep 30 12:24:11 crc kubenswrapper[4672]: I0930 12:24:11.623166 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Sep 30 12:24:11 crc kubenswrapper[4672]: I0930 12:24:11.629254 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-qrbqf" Sep 30 12:24:11 crc kubenswrapper[4672]: I0930 12:24:11.632324 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-bj7dq"] Sep 30 12:24:11 crc kubenswrapper[4672]: I0930 12:24:11.643039 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Sep 30 12:24:11 crc kubenswrapper[4672]: I0930 12:24:11.663023 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Sep 30 12:24:11 crc kubenswrapper[4672]: I0930 12:24:11.664219 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-ppzw6" Sep 30 12:24:11 crc kubenswrapper[4672]: I0930 12:24:11.690501 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Sep 30 12:24:11 crc kubenswrapper[4672]: I0930 12:24:11.701878 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Sep 30 12:24:11 crc kubenswrapper[4672]: I0930 12:24:11.722987 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Sep 30 12:24:11 crc kubenswrapper[4672]: I0930 12:24:11.742461 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Sep 30 12:24:11 crc kubenswrapper[4672]: I0930 12:24:11.763650 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Sep 30 12:24:11 crc kubenswrapper[4672]: I0930 12:24:11.786130 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Sep 30 12:24:11 crc kubenswrapper[4672]: I0930 12:24:11.804807 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Sep 30 12:24:11 crc kubenswrapper[4672]: I0930 12:24:11.813240 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-qrbqf"] Sep 30 12:24:11 crc kubenswrapper[4672]: I0930 12:24:11.822788 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Sep 30 12:24:11 crc kubenswrapper[4672]: I0930 12:24:11.851883 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Sep 30 12:24:11 crc kubenswrapper[4672]: I0930 12:24:11.864065 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-ppzw6"] Sep 30 12:24:11 crc kubenswrapper[4672]: I0930 12:24:11.864454 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Sep 30 12:24:11 crc kubenswrapper[4672]: I0930 12:24:11.884388 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Sep 30 12:24:11 crc kubenswrapper[4672]: I0930 12:24:11.902704 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Sep 30 12:24:11 crc kubenswrapper[4672]: I0930 12:24:11.922151 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Sep 30 12:24:11 crc kubenswrapper[4672]: I0930 12:24:11.942526 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Sep 30 12:24:11 crc kubenswrapper[4672]: I0930 12:24:11.962489 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Sep 30 12:24:11 crc kubenswrapper[4672]: I0930 12:24:11.982852 4672 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.002734 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.022790 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.046861 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.061742 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.083330 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.101798 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.123812 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.141965 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.163131 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.182509 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.202084 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.205515 4672 generic.go:334] "Generic (PLEG): container finished" podID="20530f49-28c3-4983-8301-cb4275d5a129" containerID="cd4de26763711a6726397a83e10e4c50230fec29a54593dd36f5d7575e3e3028" exitCode=0 Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.205674 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zgl84" event={"ID":"20530f49-28c3-4983-8301-cb4275d5a129","Type":"ContainerDied","Data":"cd4de26763711a6726397a83e10e4c50230fec29a54593dd36f5d7575e3e3028"} Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.205722 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zgl84" event={"ID":"20530f49-28c3-4983-8301-cb4275d5a129","Type":"ContainerStarted","Data":"2e89c17ac21708774537bf8e097a0adb4988c9432bffdabcd75ee4f71fd68dd7"} Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.207554 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-kxhzb" event={"ID":"bdc37a72-d709-408f-b636-dd62ad023b8d","Type":"ContainerStarted","Data":"ffd2f6a06a2aff4928ee9630df15fa41ebc6ecbe6c9be1f4a284a88e2b84fdfd"} Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.207577 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-kxhzb" 
event={"ID":"bdc37a72-d709-408f-b636-dd62ad023b8d","Type":"ContainerStarted","Data":"4a8e35f787608e9704e4a2ca0691ff903df2d4dcb6702b4939733ec92e6c144d"} Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.208076 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-kxhzb" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.209341 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-675r8" event={"ID":"01f3df1a-e96a-4e9d-9af8-334a144d7cc4","Type":"ContainerStarted","Data":"d3f154d930db4d4361e08b6a569e4910b5149f810382c1f90e0fb4c5d999bcbb"} Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.209399 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-675r8" event={"ID":"01f3df1a-e96a-4e9d-9af8-334a144d7cc4","Type":"ContainerStarted","Data":"8bf35f955afd73191dc5458efc832caf0d36336c31b956c6a6b9799e4338c4dc"} Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.209577 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-675r8" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.210694 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-ppzw6" event={"ID":"5506efd0-d2d7-4db7-a8ed-6b47313b59a2","Type":"ContainerStarted","Data":"8e7e5fac9bae5493944e83225e6d8dbec0839c5990a63599e8c0dbd34f5bfcaa"} Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.210730 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-ppzw6" event={"ID":"5506efd0-d2d7-4db7-a8ed-6b47313b59a2","Type":"ContainerStarted","Data":"7c1f63d17bbcffc31b9a640241b28a15b8710630a5ff7d0d9a34ca5459dc7f3b"} Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.211530 4672 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-kxhzb container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" start-of-body= Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.211577 4672 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-kxhzb" podUID="bdc37a72-d709-408f-b636-dd62ad023b8d" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.211817 4672 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-675r8 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" start-of-body= Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.211844 4672 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-675r8" podUID="01f3df1a-e96a-4e9d-9af8-334a144d7cc4" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" Sep 30 12:24:12 crc 
Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.211948 4672 generic.go:334] "Generic (PLEG): container finished" podID="afcb62e7-0ff7-4af1-bebc-f32bfc50d94a" containerID="608b82697068079d4909ded82e535e221ed4371447b7049e8d16ff37f579c803" exitCode=0
Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.212105 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-bj7dq" event={"ID":"afcb62e7-0ff7-4af1-bebc-f32bfc50d94a","Type":"ContainerDied","Data":"608b82697068079d4909ded82e535e221ed4371447b7049e8d16ff37f579c803"}
Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.212126 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-bj7dq" event={"ID":"afcb62e7-0ff7-4af1-bebc-f32bfc50d94a","Type":"ContainerStarted","Data":"db7f98069438b24df55085cf83dde26a7cc91a95511c015af5dec350e2f18224"}
Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.214413 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-qrbqf" event={"ID":"a7555c95-5534-45dc-a212-4262554a0c0b","Type":"ContainerStarted","Data":"69e43398e53926f5bd42d2842b59beb0d4832eda7c839e25d4a07469fcfa32c1"}
Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.214454 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-qrbqf" event={"ID":"a7555c95-5534-45dc-a212-4262554a0c0b","Type":"ContainerStarted","Data":"cca97cdc137e9bb2055bdb44e8c4436abf8f12ad64cc1fccfec2362c51955eb6"}
Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.214467 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-qrbqf" event={"ID":"a7555c95-5534-45dc-a212-4262554a0c0b","Type":"ContainerStarted","Data":"d735e2bf03250fcdd314e0785bb820f86151ece69964bb387b0fece52bf9c37e"}
Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.223813 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl"
Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.246885 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert"
Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.261711 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt"
Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.282776 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.303752 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.342800 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token"
Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.362152 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd"
Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.382234 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls"
Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.402728 4672 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.421910 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.443038 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.485749 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c2xjz\" (UniqueName: \"kubernetes.io/projected/7fb8bbc0-c63a-4ab5-b454-13682563fe31-kube-api-access-c2xjz\") pod \"downloads-7954f5f757-dlbv7\" (UID: \"7fb8bbc0-c63a-4ab5-b454-13682563fe31\") " pod="openshift-console/downloads-7954f5f757-dlbv7" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.501455 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qls6m\" (UniqueName: \"kubernetes.io/projected/b11da5bc-b91d-4a8e-8839-da0f3989618e-kube-api-access-qls6m\") pod \"router-default-5444994796-52jss\" (UID: \"b11da5bc-b91d-4a8e-8839-da0f3989618e\") " pod="openshift-ingress/router-default-5444994796-52jss" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.520875 4672 request.go:700] Waited for 1.918028599s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-storage-version-migrator-operator/serviceaccounts/kube-storage-version-migrator-operator/token Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.523384 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-62vsq\" (UniqueName: \"kubernetes.io/projected/4312b92a-c221-4cf3-948b-096a90dd7846-kube-api-access-62vsq\") pod \"console-operator-58897d9998-bjdbw\" (UID: \"4312b92a-c221-4cf3-948b-096a90dd7846\") " pod="openshift-console-operator/console-operator-58897d9998-bjdbw" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.537347 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6jfq2\" (UniqueName: \"kubernetes.io/projected/87529aa1-f650-43ab-a3c2-a41f444c71d1-kube-api-access-6jfq2\") pod \"kube-storage-version-migrator-operator-b67b599dd-tlgqg\" (UID: \"87529aa1-f650-43ab-a3c2-a41f444c71d1\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-tlgqg" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.561028 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4hbtj\" (UniqueName: \"kubernetes.io/projected/41a42386-34f1-47ec-85bf-4c81bd9228be-kube-api-access-4hbtj\") pod \"dns-operator-744455d44c-7l9kk\" (UID: \"41a42386-34f1-47ec-85bf-4c81bd9228be\") " pod="openshift-dns-operator/dns-operator-744455d44c-7l9kk" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.578765 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kkkcm\" (UniqueName: \"kubernetes.io/projected/6a57383c-8b99-49a9-adb7-caccbd6b3c12-kube-api-access-kkkcm\") pod \"openshift-apiserver-operator-796bbdcf4f-jdshz\" (UID: \"6a57383c-8b99-49a9-adb7-caccbd6b3c12\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jdshz" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.585574 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-7l9kk" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.604606 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mwq6l\" (UniqueName: \"kubernetes.io/projected/c4246346-4680-4d7d-a64f-262c987067fd-kube-api-access-mwq6l\") pod \"machine-approver-56656f9798-vr2sj\" (UID: \"c4246346-4680-4d7d-a64f-262c987067fd\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-vr2sj" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.617786 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pr9gl\" (UniqueName: \"kubernetes.io/projected/04847bd1-7d49-41f9-be74-08033dd1212e-kube-api-access-pr9gl\") pod \"cluster-samples-operator-665b6dd947-gncd7\" (UID: \"04847bd1-7d49-41f9-be74-08033dd1212e\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gncd7" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.622493 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.642440 4672 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.662756 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.669539 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gncd7" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.673500 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-vr2sj" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.681798 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-bjdbw" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.688474 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-dlbv7" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.706591 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jdshz" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.734114 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-tlgqg" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.750932 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5444994796-52jss" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.754733 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/b9bd590e-1ef9-47f4-874d-c5324309ebfd-etcd-ca\") pod \"etcd-operator-b45778765-bqdd5\" (UID: \"b9bd590e-1ef9-47f4-874d-c5324309ebfd\") " pod="openshift-etcd-operator/etcd-operator-b45778765-bqdd5" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.754813 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6eecbbdb-82a8-4b0d-860d-f6c3f4152a04-console-oauth-config\") pod \"console-f9d7485db-x8stp\" (UID: \"6eecbbdb-82a8-4b0d-860d-f6c3f4152a04\") " pod="openshift-console/console-f9d7485db-x8stp" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.754988 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/551a978e-b9ac-46c6-ad40-b2ed5b6121da-audit-policies\") pod \"oauth-openshift-558db77b4-vs8kw\" (UID: \"551a978e-b9ac-46c6-ad40-b2ed5b6121da\") " pod="openshift-authentication/oauth-openshift-558db77b4-vs8kw" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.755049 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/551a978e-b9ac-46c6-ad40-b2ed5b6121da-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-vs8kw\" (UID: \"551a978e-b9ac-46c6-ad40-b2ed5b6121da\") " pod="openshift-authentication/oauth-openshift-558db77b4-vs8kw" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.755138 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/551a978e-b9ac-46c6-ad40-b2ed5b6121da-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-vs8kw\" (UID: \"551a978e-b9ac-46c6-ad40-b2ed5b6121da\") " pod="openshift-authentication/oauth-openshift-558db77b4-vs8kw" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.755180 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/551a978e-b9ac-46c6-ad40-b2ed5b6121da-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-vs8kw\" (UID: \"551a978e-b9ac-46c6-ad40-b2ed5b6121da\") " pod="openshift-authentication/oauth-openshift-558db77b4-vs8kw" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.755226 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6eecbbdb-82a8-4b0d-860d-f6c3f4152a04-console-serving-cert\") pod \"console-f9d7485db-x8stp\" (UID: \"6eecbbdb-82a8-4b0d-860d-f6c3f4152a04\") " pod="openshift-console/console-f9d7485db-x8stp" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.755297 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mlrw8\" (UID: \"59e30e9f-8395-4abd-8adf-b964ac1cbb0b\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-mlrw8" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.755327 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/551a978e-b9ac-46c6-ad40-b2ed5b6121da-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-vs8kw\" (UID: \"551a978e-b9ac-46c6-ad40-b2ed5b6121da\") " pod="openshift-authentication/oauth-openshift-558db77b4-vs8kw" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.755350 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6eecbbdb-82a8-4b0d-860d-f6c3f4152a04-oauth-serving-cert\") pod \"console-f9d7485db-x8stp\" (UID: \"6eecbbdb-82a8-4b0d-860d-f6c3f4152a04\") " pod="openshift-console/console-f9d7485db-x8stp" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.755472 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/b9bd590e-1ef9-47f4-874d-c5324309ebfd-etcd-service-ca\") pod \"etcd-operator-b45778765-bqdd5\" (UID: \"b9bd590e-1ef9-47f4-874d-c5324309ebfd\") " pod="openshift-etcd-operator/etcd-operator-b45778765-bqdd5" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.756685 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/59e30e9f-8395-4abd-8adf-b964ac1cbb0b-installation-pull-secrets\") pod \"image-registry-697d97f7c8-mlrw8\" (UID: \"59e30e9f-8395-4abd-8adf-b964ac1cbb0b\") " pod="openshift-image-registry/image-registry-697d97f7c8-mlrw8" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.756722 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rbpj8\" (UniqueName: \"kubernetes.io/projected/551a978e-b9ac-46c6-ad40-b2ed5b6121da-kube-api-access-rbpj8\") pod \"oauth-openshift-558db77b4-vs8kw\" (UID: \"551a978e-b9ac-46c6-ad40-b2ed5b6121da\") " pod="openshift-authentication/oauth-openshift-558db77b4-vs8kw" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.756771 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/b5d17b1a-138e-4832-a374-8cdd0c028821-available-featuregates\") pod \"openshift-config-operator-7777fb866f-rj74q\" (UID: \"b5d17b1a-138e-4832-a374-8cdd0c028821\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-rj74q" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.756798 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/551a978e-b9ac-46c6-ad40-b2ed5b6121da-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-vs8kw\" (UID: \"551a978e-b9ac-46c6-ad40-b2ed5b6121da\") " pod="openshift-authentication/oauth-openshift-558db77b4-vs8kw" Sep 30 12:24:12 crc kubenswrapper[4672]: E0930 12:24:12.756981 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-09-30 12:24:13.256940625 +0000 UTC m=+144.526178271 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mlrw8" (UID: "59e30e9f-8395-4abd-8adf-b964ac1cbb0b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.757058 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/59e30e9f-8395-4abd-8adf-b964ac1cbb0b-trusted-ca\") pod \"image-registry-697d97f7c8-mlrw8\" (UID: \"59e30e9f-8395-4abd-8adf-b964ac1cbb0b\") " pod="openshift-image-registry/image-registry-697d97f7c8-mlrw8"
Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.757123 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/59e30e9f-8395-4abd-8adf-b964ac1cbb0b-registry-tls\") pod \"image-registry-697d97f7c8-mlrw8\" (UID: \"59e30e9f-8395-4abd-8adf-b964ac1cbb0b\") " pod="openshift-image-registry/image-registry-697d97f7c8-mlrw8"
Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.757212 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b5d17b1a-138e-4832-a374-8cdd0c028821-serving-cert\") pod \"openshift-config-operator-7777fb866f-rj74q\" (UID: \"b5d17b1a-138e-4832-a374-8cdd0c028821\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-rj74q"
Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.757253 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6eecbbdb-82a8-4b0d-860d-f6c3f4152a04-trusted-ca-bundle\") pod \"console-f9d7485db-x8stp\" (UID: \"6eecbbdb-82a8-4b0d-860d-f6c3f4152a04\") " pod="openshift-console/console-f9d7485db-x8stp"
Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.757392 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/551a978e-b9ac-46c6-ad40-b2ed5b6121da-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-vs8kw\" (UID: \"551a978e-b9ac-46c6-ad40-b2ed5b6121da\") " pod="openshift-authentication/oauth-openshift-558db77b4-vs8kw"
Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.757431 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-drvb8\" (UniqueName: \"kubernetes.io/projected/59e30e9f-8395-4abd-8adf-b964ac1cbb0b-kube-api-access-drvb8\") pod \"image-registry-697d97f7c8-mlrw8\" (UID: \"59e30e9f-8395-4abd-8adf-b964ac1cbb0b\") " pod="openshift-image-registry/image-registry-697d97f7c8-mlrw8"
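The E0930 nestedpendingoperations entry above records the volume manager's exponential backoff: MountDevice for the image-registry PVC fails because the hostpath CSI driver has not yet registered with the kubelet (its csi-hostpathplugin pod was only ADDed moments earlier in this log), so the operation is parked for durationBeforeRetry 500ms and the delay grows on subsequent failures, which is why the log names an exact "No retries permitted until" timestamp. A sketch of that retry shape using the apimachinery wait helpers; the kubelet's real bookkeeping lives in nestedpendingoperations, and the registration check below is a hypothetical stand-in:

    // Sketch of the retry schedule behind "No retries permitted until ...
    // (durationBeforeRetry 500ms)": exponential backoff starting at 500ms.
    package main

    import (
        "errors"
        "fmt"
        "time"

        "k8s.io/apimachinery/pkg/util/wait"
    )

    // The error text mirrors the log; mountDevice is a hypothetical
    // stand-in for the CSI MountDevice call.
    var errNotRegistered = errors.New(
        "driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers")

    func main() {
        attempts := 0
        backoff := wait.Backoff{
            Duration: 500 * time.Millisecond, // durationBeforeRetry in the log
            Factor:   2.0,                    // delay grows per failure
            Steps:    5,                      // sketch gives up; the kubelet keeps retrying with a capped delay
        }
        err := wait.ExponentialBackoff(backoff, func() (bool, error) {
            attempts++
            if attempts < 3 { // pretend the driver registers while we back off
                fmt.Println("parked, will retry:", errNotRegistered)
                return false, nil // transient: back off, do not abort
            }
            return true, nil // MountDevice finally succeeded
        })
        fmt.Println("attempts:", attempts, "err:", err)
    }

Once the driver registers its plugin socket, the next retry can succeed and the mount proceeds; until then the pod waits in ContainerCreating.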
\"551a978e-b9ac-46c6-ad40-b2ed5b6121da\") " pod="openshift-authentication/oauth-openshift-558db77b4-vs8kw" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.757494 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/551a978e-b9ac-46c6-ad40-b2ed5b6121da-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-vs8kw\" (UID: \"551a978e-b9ac-46c6-ad40-b2ed5b6121da\") " pod="openshift-authentication/oauth-openshift-558db77b4-vs8kw" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.757545 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b9bd590e-1ef9-47f4-874d-c5324309ebfd-serving-cert\") pod \"etcd-operator-b45778765-bqdd5\" (UID: \"b9bd590e-1ef9-47f4-874d-c5324309ebfd\") " pod="openshift-etcd-operator/etcd-operator-b45778765-bqdd5" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.757568 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/b9bd590e-1ef9-47f4-874d-c5324309ebfd-etcd-client\") pod \"etcd-operator-b45778765-bqdd5\" (UID: \"b9bd590e-1ef9-47f4-874d-c5324309ebfd\") " pod="openshift-etcd-operator/etcd-operator-b45778765-bqdd5" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.757649 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/59e30e9f-8395-4abd-8adf-b964ac1cbb0b-registry-certificates\") pod \"image-registry-697d97f7c8-mlrw8\" (UID: \"59e30e9f-8395-4abd-8adf-b964ac1cbb0b\") " pod="openshift-image-registry/image-registry-697d97f7c8-mlrw8" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.757710 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/551a978e-b9ac-46c6-ad40-b2ed5b6121da-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-vs8kw\" (UID: \"551a978e-b9ac-46c6-ad40-b2ed5b6121da\") " pod="openshift-authentication/oauth-openshift-558db77b4-vs8kw" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.757734 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lwzrh\" (UniqueName: \"kubernetes.io/projected/b1bca969-46a3-462a-b129-508c3f6752a0-kube-api-access-lwzrh\") pod \"openshift-controller-manager-operator-756b6f6bc6-wfvzx\" (UID: \"b1bca969-46a3-462a-b129-508c3f6752a0\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wfvzx" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.757752 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-txjmp\" (UniqueName: \"kubernetes.io/projected/b9bd590e-1ef9-47f4-874d-c5324309ebfd-kube-api-access-txjmp\") pod \"etcd-operator-b45778765-bqdd5\" (UID: \"b9bd590e-1ef9-47f4-874d-c5324309ebfd\") " pod="openshift-etcd-operator/etcd-operator-b45778765-bqdd5" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.757787 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jcwv4\" (UniqueName: \"kubernetes.io/projected/b5d17b1a-138e-4832-a374-8cdd0c028821-kube-api-access-jcwv4\") pod 
\"openshift-config-operator-7777fb866f-rj74q\" (UID: \"b5d17b1a-138e-4832-a374-8cdd0c028821\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-rj74q" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.757871 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/551a978e-b9ac-46c6-ad40-b2ed5b6121da-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-vs8kw\" (UID: \"551a978e-b9ac-46c6-ad40-b2ed5b6121da\") " pod="openshift-authentication/oauth-openshift-558db77b4-vs8kw" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.757933 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k7stp\" (UniqueName: \"kubernetes.io/projected/6eecbbdb-82a8-4b0d-860d-f6c3f4152a04-kube-api-access-k7stp\") pod \"console-f9d7485db-x8stp\" (UID: \"6eecbbdb-82a8-4b0d-860d-f6c3f4152a04\") " pod="openshift-console/console-f9d7485db-x8stp" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.757978 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6eecbbdb-82a8-4b0d-860d-f6c3f4152a04-console-config\") pod \"console-f9d7485db-x8stp\" (UID: \"6eecbbdb-82a8-4b0d-860d-f6c3f4152a04\") " pod="openshift-console/console-f9d7485db-x8stp" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.758014 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/551a978e-b9ac-46c6-ad40-b2ed5b6121da-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-vs8kw\" (UID: \"551a978e-b9ac-46c6-ad40-b2ed5b6121da\") " pod="openshift-authentication/oauth-openshift-558db77b4-vs8kw" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.758038 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b1bca969-46a3-462a-b129-508c3f6752a0-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-wfvzx\" (UID: \"b1bca969-46a3-462a-b129-508c3f6752a0\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wfvzx" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.758085 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/59e30e9f-8395-4abd-8adf-b964ac1cbb0b-bound-sa-token\") pod \"image-registry-697d97f7c8-mlrw8\" (UID: \"59e30e9f-8395-4abd-8adf-b964ac1cbb0b\") " pod="openshift-image-registry/image-registry-697d97f7c8-mlrw8" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.758107 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b9bd590e-1ef9-47f4-874d-c5324309ebfd-config\") pod \"etcd-operator-b45778765-bqdd5\" (UID: \"b9bd590e-1ef9-47f4-874d-c5324309ebfd\") " pod="openshift-etcd-operator/etcd-operator-b45778765-bqdd5" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.758140 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/551a978e-b9ac-46c6-ad40-b2ed5b6121da-audit-dir\") pod \"oauth-openshift-558db77b4-vs8kw\" 
(UID: \"551a978e-b9ac-46c6-ad40-b2ed5b6121da\") " pod="openshift-authentication/oauth-openshift-558db77b4-vs8kw" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.758436 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b1bca969-46a3-462a-b129-508c3f6752a0-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-wfvzx\" (UID: \"b1bca969-46a3-462a-b129-508c3f6752a0\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wfvzx" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.758534 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/59e30e9f-8395-4abd-8adf-b964ac1cbb0b-ca-trust-extracted\") pod \"image-registry-697d97f7c8-mlrw8\" (UID: \"59e30e9f-8395-4abd-8adf-b964ac1cbb0b\") " pod="openshift-image-registry/image-registry-697d97f7c8-mlrw8" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.758632 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6eecbbdb-82a8-4b0d-860d-f6c3f4152a04-service-ca\") pod \"console-f9d7485db-x8stp\" (UID: \"6eecbbdb-82a8-4b0d-860d-f6c3f4152a04\") " pod="openshift-console/console-f9d7485db-x8stp" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.860843 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.861095 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6eecbbdb-82a8-4b0d-860d-f6c3f4152a04-oauth-serving-cert\") pod \"console-f9d7485db-x8stp\" (UID: \"6eecbbdb-82a8-4b0d-860d-f6c3f4152a04\") " pod="openshift-console/console-f9d7485db-x8stp" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.861127 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/b9bd590e-1ef9-47f4-874d-c5324309ebfd-etcd-service-ca\") pod \"etcd-operator-b45778765-bqdd5\" (UID: \"b9bd590e-1ef9-47f4-874d-c5324309ebfd\") " pod="openshift-etcd-operator/etcd-operator-b45778765-bqdd5" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.861156 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/82540b00-caaa-4453-876e-aa214f6f7d34-proxy-tls\") pod \"machine-config-operator-74547568cd-b6x4v\" (UID: \"82540b00-caaa-4453-876e-aa214f6f7d34\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-b6x4v" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.861222 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/59e30e9f-8395-4abd-8adf-b964ac1cbb0b-installation-pull-secrets\") pod \"image-registry-697d97f7c8-mlrw8\" (UID: \"59e30e9f-8395-4abd-8adf-b964ac1cbb0b\") " pod="openshift-image-registry/image-registry-697d97f7c8-mlrw8" Sep 30 12:24:12 crc kubenswrapper[4672]: E0930 12:24:12.861281 
4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 12:24:13.361221156 +0000 UTC m=+144.630458992 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.861433 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/b5d17b1a-138e-4832-a374-8cdd0c028821-available-featuregates\") pod \"openshift-config-operator-7777fb866f-rj74q\" (UID: \"b5d17b1a-138e-4832-a374-8cdd0c028821\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-rj74q" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.861466 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cdnn7\" (UniqueName: \"kubernetes.io/projected/82f58782-5e5b-41f9-be22-0d3f1ab4423d-kube-api-access-cdnn7\") pod \"ingress-operator-5b745b69d9-6t7qg\" (UID: \"82f58782-5e5b-41f9-be22-0d3f1ab4423d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6t7qg" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.861487 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3d92f349-809a-4fdc-9006-420e736d856d-webhook-cert\") pod \"packageserver-d55dfcdfc-5wh44\" (UID: \"3d92f349-809a-4fdc-9006-420e736d856d\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5wh44" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.861511 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kwjzj\" (UniqueName: \"kubernetes.io/projected/076cdbe0-669c-4655-b2d1-8967456a6e62-kube-api-access-kwjzj\") pod \"csi-hostpathplugin-r4gx9\" (UID: \"076cdbe0-669c-4655-b2d1-8967456a6e62\") " pod="hostpath-provisioner/csi-hostpathplugin-r4gx9" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.861563 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/82f58782-5e5b-41f9-be22-0d3f1ab4423d-metrics-tls\") pod \"ingress-operator-5b745b69d9-6t7qg\" (UID: \"82f58782-5e5b-41f9-be22-0d3f1ab4423d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6t7qg" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.861630 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b5d17b1a-138e-4832-a374-8cdd0c028821-serving-cert\") pod \"openshift-config-operator-7777fb866f-rj74q\" (UID: \"b5d17b1a-138e-4832-a374-8cdd0c028821\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-rj74q" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.861722 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/551a978e-b9ac-46c6-ad40-b2ed5b6121da-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-vs8kw\" (UID: \"551a978e-b9ac-46c6-ad40-b2ed5b6121da\") " pod="openshift-authentication/oauth-openshift-558db77b4-vs8kw" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.861752 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e3a5bd27-f377-4531-8c55-29e2f2c4ccc8-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-cf7gg\" (UID: \"e3a5bd27-f377-4531-8c55-29e2f2c4ccc8\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-cf7gg" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.861820 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2acf05e1-f152-4432-b0b2-a44b242d0308-config-volume\") pod \"collect-profiles-29320575-tsgk2\" (UID: \"2acf05e1-f152-4432-b0b2-a44b242d0308\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320575-tsgk2" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.861868 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lwzrh\" (UniqueName: \"kubernetes.io/projected/b1bca969-46a3-462a-b129-508c3f6752a0-kube-api-access-lwzrh\") pod \"openshift-controller-manager-operator-756b6f6bc6-wfvzx\" (UID: \"b1bca969-46a3-462a-b129-508c3f6752a0\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wfvzx" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.861897 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f65d4a91-5adb-4ab2-af99-757daad56bd4-config\") pod \"kube-apiserver-operator-766d6c64bb-m85j8\" (UID: \"f65d4a91-5adb-4ab2-af99-757daad56bd4\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-m85j8" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.861942 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jcwv4\" (UniqueName: \"kubernetes.io/projected/b5d17b1a-138e-4832-a374-8cdd0c028821-kube-api-access-jcwv4\") pod \"openshift-config-operator-7777fb866f-rj74q\" (UID: \"b5d17b1a-138e-4832-a374-8cdd0c028821\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-rj74q" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.861965 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/551a978e-b9ac-46c6-ad40-b2ed5b6121da-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-vs8kw\" (UID: \"551a978e-b9ac-46c6-ad40-b2ed5b6121da\") " pod="openshift-authentication/oauth-openshift-558db77b4-vs8kw" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.862067 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k7stp\" (UniqueName: \"kubernetes.io/projected/6eecbbdb-82a8-4b0d-860d-f6c3f4152a04-kube-api-access-k7stp\") pod \"console-f9d7485db-x8stp\" (UID: \"6eecbbdb-82a8-4b0d-860d-f6c3f4152a04\") " pod="openshift-console/console-f9d7485db-x8stp" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.862180 4672 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/551a978e-b9ac-46c6-ad40-b2ed5b6121da-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-vs8kw\" (UID: \"551a978e-b9ac-46c6-ad40-b2ed5b6121da\") " pod="openshift-authentication/oauth-openshift-558db77b4-vs8kw" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.862220 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sc65b\" (UniqueName: \"kubernetes.io/projected/4be28b87-7df0-4dc6-8e22-de8348686347-kube-api-access-sc65b\") pod \"machine-config-server-wzf9f\" (UID: \"4be28b87-7df0-4dc6-8e22-de8348686347\") " pod="openshift-machine-config-operator/machine-config-server-wzf9f" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.862247 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2acf05e1-f152-4432-b0b2-a44b242d0308-secret-volume\") pod \"collect-profiles-29320575-tsgk2\" (UID: \"2acf05e1-f152-4432-b0b2-a44b242d0308\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320575-tsgk2" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.862374 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/3c1a3995-a406-4562-b62e-682408dd877e-profile-collector-cert\") pod \"catalog-operator-68c6474976-mp8wz\" (UID: \"3c1a3995-a406-4562-b62e-682408dd877e\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mp8wz" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.862399 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6hqz8\" (UniqueName: \"kubernetes.io/projected/62422e21-39c6-4772-8f59-33be3d16c368-kube-api-access-6hqz8\") pod \"control-plane-machine-set-operator-78cbb6b69f-dljhp\" (UID: \"62422e21-39c6-4772-8f59-33be3d16c368\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-dljhp" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.862429 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qlms7\" (UniqueName: \"kubernetes.io/projected/2f1f2c64-e609-4c7a-adf5-ec317fe052c9-kube-api-access-qlms7\") pod \"machine-config-controller-84d6567774-lnjdx\" (UID: \"2f1f2c64-e609-4c7a-adf5-ec317fe052c9\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-lnjdx" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.862458 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b1bca969-46a3-462a-b129-508c3f6752a0-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-wfvzx\" (UID: \"b1bca969-46a3-462a-b129-508c3f6752a0\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wfvzx" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.862479 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f8a1fcab-b6bf-4f92-9c23-d8240afdc4a1-cert\") pod \"ingress-canary-dwqnn\" (UID: \"f8a1fcab-b6bf-4f92-9c23-d8240afdc4a1\") " pod="openshift-ingress-canary/ingress-canary-dwqnn" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.862563 4672 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/551a978e-b9ac-46c6-ad40-b2ed5b6121da-audit-dir\") pod \"oauth-openshift-558db77b4-vs8kw\" (UID: \"551a978e-b9ac-46c6-ad40-b2ed5b6121da\") " pod="openshift-authentication/oauth-openshift-558db77b4-vs8kw" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.862612 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b1bca969-46a3-462a-b129-508c3f6752a0-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-wfvzx\" (UID: \"b1bca969-46a3-462a-b129-508c3f6752a0\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wfvzx" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.862638 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6eecbbdb-82a8-4b0d-860d-f6c3f4152a04-service-ca\") pod \"console-f9d7485db-x8stp\" (UID: \"6eecbbdb-82a8-4b0d-860d-f6c3f4152a04\") " pod="openshift-console/console-f9d7485db-x8stp" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.862667 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/040f9564-cad2-48c5-b4d6-69b1add18a78-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-8dqtc\" (UID: \"040f9564-cad2-48c5-b4d6-69b1add18a78\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8dqtc" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.862700 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3e89761d-ea2a-4711-94fa-54bce852e7e3-serving-cert\") pod \"service-ca-operator-777779d784-9zz4h\" (UID: \"3e89761d-ea2a-4711-94fa-54bce852e7e3\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-9zz4h" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.862727 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/6592c35e-f6fa-4cc5-b099-4cf13dfaaf76-srv-cert\") pod \"olm-operator-6b444d44fb-5n4bs\" (UID: \"6592c35e-f6fa-4cc5-b099-4cf13dfaaf76\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5n4bs" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.862745 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/29a5441c-2a0c-443f-a7ab-47f2abc81a1c-config-volume\") pod \"dns-default-z2tt9\" (UID: \"29a5441c-2a0c-443f-a7ab-47f2abc81a1c\") " pod="openshift-dns/dns-default-z2tt9" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.862764 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vstdz\" (UniqueName: \"kubernetes.io/projected/29a5441c-2a0c-443f-a7ab-47f2abc81a1c-kube-api-access-vstdz\") pod \"dns-default-z2tt9\" (UID: \"29a5441c-2a0c-443f-a7ab-47f2abc81a1c\") " pod="openshift-dns/dns-default-z2tt9" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.862780 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/9869a0d1-03fd-4b72-9519-16c0ef0c4bd9-webhook-certs\") pod 
\"multus-admission-controller-857f4d67dd-xzjdp\" (UID: \"9869a0d1-03fd-4b72-9519-16c0ef0c4bd9\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-xzjdp" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.862804 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/551a978e-b9ac-46c6-ad40-b2ed5b6121da-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-vs8kw\" (UID: \"551a978e-b9ac-46c6-ad40-b2ed5b6121da\") " pod="openshift-authentication/oauth-openshift-558db77b4-vs8kw" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.862823 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f65d4a91-5adb-4ab2-af99-757daad56bd4-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-m85j8\" (UID: \"f65d4a91-5adb-4ab2-af99-757daad56bd4\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-m85j8" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.862858 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9c4mf\" (UniqueName: \"kubernetes.io/projected/f4d96698-0412-4923-9a10-03b174e0ca6c-kube-api-access-9c4mf\") pod \"marketplace-operator-79b997595-svmxm\" (UID: \"f4d96698-0412-4923-9a10-03b174e0ca6c\") " pod="openshift-marketplace/marketplace-operator-79b997595-svmxm" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.862888 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6ed45e7d-66e8-4755-a92e-59140d8b6ac8-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-jwlqh\" (UID: \"6ed45e7d-66e8-4755-a92e-59140d8b6ac8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-jwlqh" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.862964 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/3c1a3995-a406-4562-b62e-682408dd877e-srv-cert\") pod \"catalog-operator-68c6474976-mp8wz\" (UID: \"3c1a3995-a406-4562-b62e-682408dd877e\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mp8wz" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.863006 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/82f58782-5e5b-41f9-be22-0d3f1ab4423d-bound-sa-token\") pod \"ingress-operator-5b745b69d9-6t7qg\" (UID: \"82f58782-5e5b-41f9-be22-0d3f1ab4423d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6t7qg" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.863027 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/040f9564-cad2-48c5-b4d6-69b1add18a78-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-8dqtc\" (UID: \"040f9564-cad2-48c5-b4d6-69b1add18a78\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8dqtc" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.863053 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/e3a5bd27-f377-4531-8c55-29e2f2c4ccc8-config\") pod \"kube-controller-manager-operator-78b949d7b-cf7gg\" (UID: \"e3a5bd27-f377-4531-8c55-29e2f2c4ccc8\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-cf7gg" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.863074 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6eecbbdb-82a8-4b0d-860d-f6c3f4152a04-console-serving-cert\") pod \"console-f9d7485db-x8stp\" (UID: \"6eecbbdb-82a8-4b0d-860d-f6c3f4152a04\") " pod="openshift-console/console-f9d7485db-x8stp" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.863104 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/551a978e-b9ac-46c6-ad40-b2ed5b6121da-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-vs8kw\" (UID: \"551a978e-b9ac-46c6-ad40-b2ed5b6121da\") " pod="openshift-authentication/oauth-openshift-558db77b4-vs8kw" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.863124 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e3a5bd27-f377-4531-8c55-29e2f2c4ccc8-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-cf7gg\" (UID: \"e3a5bd27-f377-4531-8c55-29e2f2c4ccc8\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-cf7gg" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.863168 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9fdnf\" (UniqueName: \"kubernetes.io/projected/82540b00-caaa-4453-876e-aa214f6f7d34-kube-api-access-9fdnf\") pod \"machine-config-operator-74547568cd-b6x4v\" (UID: \"82540b00-caaa-4453-876e-aa214f6f7d34\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-b6x4v" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.863191 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rbpj8\" (UniqueName: \"kubernetes.io/projected/551a978e-b9ac-46c6-ad40-b2ed5b6121da-kube-api-access-rbpj8\") pod \"oauth-openshift-558db77b4-vs8kw\" (UID: \"551a978e-b9ac-46c6-ad40-b2ed5b6121da\") " pod="openshift-authentication/oauth-openshift-558db77b4-vs8kw" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.863209 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nsvkp\" (UniqueName: \"kubernetes.io/projected/f8a1fcab-b6bf-4f92-9c23-d8240afdc4a1-kube-api-access-nsvkp\") pod \"ingress-canary-dwqnn\" (UID: \"f8a1fcab-b6bf-4f92-9c23-d8240afdc4a1\") " pod="openshift-ingress-canary/ingress-canary-dwqnn" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.863231 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/551a978e-b9ac-46c6-ad40-b2ed5b6121da-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-vs8kw\" (UID: \"551a978e-b9ac-46c6-ad40-b2ed5b6121da\") " pod="openshift-authentication/oauth-openshift-558db77b4-vs8kw" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.863249 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" 
(UniqueName: \"kubernetes.io/host-path/076cdbe0-669c-4655-b2d1-8967456a6e62-socket-dir\") pod \"csi-hostpathplugin-r4gx9\" (UID: \"076cdbe0-669c-4655-b2d1-8967456a6e62\") " pod="hostpath-provisioner/csi-hostpathplugin-r4gx9" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.863360 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/59e30e9f-8395-4abd-8adf-b964ac1cbb0b-trusted-ca\") pod \"image-registry-697d97f7c8-mlrw8\" (UID: \"59e30e9f-8395-4abd-8adf-b964ac1cbb0b\") " pod="openshift-image-registry/image-registry-697d97f7c8-mlrw8" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.863380 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/6ed45e7d-66e8-4755-a92e-59140d8b6ac8-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-jwlqh\" (UID: \"6ed45e7d-66e8-4755-a92e-59140d8b6ac8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-jwlqh" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.863426 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/59e30e9f-8395-4abd-8adf-b964ac1cbb0b-registry-tls\") pod \"image-registry-697d97f7c8-mlrw8\" (UID: \"59e30e9f-8395-4abd-8adf-b964ac1cbb0b\") " pod="openshift-image-registry/image-registry-697d97f7c8-mlrw8" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.863453 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/82f58782-5e5b-41f9-be22-0d3f1ab4423d-trusted-ca\") pod \"ingress-operator-5b745b69d9-6t7qg\" (UID: \"82f58782-5e5b-41f9-be22-0d3f1ab4423d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6t7qg" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.863470 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/040f9564-cad2-48c5-b4d6-69b1add18a78-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-8dqtc\" (UID: \"040f9564-cad2-48c5-b4d6-69b1add18a78\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8dqtc" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.863502 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6eecbbdb-82a8-4b0d-860d-f6c3f4152a04-trusted-ca-bundle\") pod \"console-f9d7485db-x8stp\" (UID: \"6eecbbdb-82a8-4b0d-860d-f6c3f4152a04\") " pod="openshift-console/console-f9d7485db-x8stp" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.863523 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/551a978e-b9ac-46c6-ad40-b2ed5b6121da-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-vs8kw\" (UID: \"551a978e-b9ac-46c6-ad40-b2ed5b6121da\") " pod="openshift-authentication/oauth-openshift-558db77b4-vs8kw" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.863541 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/6592c35e-f6fa-4cc5-b099-4cf13dfaaf76-profile-collector-cert\") pod 
\"olm-operator-6b444d44fb-5n4bs\" (UID: \"6592c35e-f6fa-4cc5-b099-4cf13dfaaf76\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5n4bs" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.863571 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-drvb8\" (UniqueName: \"kubernetes.io/projected/59e30e9f-8395-4abd-8adf-b964ac1cbb0b-kube-api-access-drvb8\") pod \"image-registry-697d97f7c8-mlrw8\" (UID: \"59e30e9f-8395-4abd-8adf-b964ac1cbb0b\") " pod="openshift-image-registry/image-registry-697d97f7c8-mlrw8" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.863589 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/551a978e-b9ac-46c6-ad40-b2ed5b6121da-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-vs8kw\" (UID: \"551a978e-b9ac-46c6-ad40-b2ed5b6121da\") " pod="openshift-authentication/oauth-openshift-558db77b4-vs8kw" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.863609 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/f4d96698-0412-4923-9a10-03b174e0ca6c-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-svmxm\" (UID: \"f4d96698-0412-4923-9a10-03b174e0ca6c\") " pod="openshift-marketplace/marketplace-operator-79b997595-svmxm" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.863630 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/82540b00-caaa-4453-876e-aa214f6f7d34-images\") pod \"machine-config-operator-74547568cd-b6x4v\" (UID: \"82540b00-caaa-4453-876e-aa214f6f7d34\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-b6x4v" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.863647 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-df4d4\" (UniqueName: \"kubernetes.io/projected/9869a0d1-03fd-4b72-9519-16c0ef0c4bd9-kube-api-access-df4d4\") pod \"multus-admission-controller-857f4d67dd-xzjdp\" (UID: \"9869a0d1-03fd-4b72-9519-16c0ef0c4bd9\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-xzjdp" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.863890 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/551a978e-b9ac-46c6-ad40-b2ed5b6121da-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-vs8kw\" (UID: \"551a978e-b9ac-46c6-ad40-b2ed5b6121da\") " pod="openshift-authentication/oauth-openshift-558db77b4-vs8kw" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.864285 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f7gw6\" (UniqueName: \"kubernetes.io/projected/3c1a3995-a406-4562-b62e-682408dd877e-kube-api-access-f7gw6\") pod \"catalog-operator-68c6474976-mp8wz\" (UID: \"3c1a3995-a406-4562-b62e-682408dd877e\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mp8wz" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.864310 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b9bd590e-1ef9-47f4-874d-c5324309ebfd-serving-cert\") pod 
\"etcd-operator-b45778765-bqdd5\" (UID: \"b9bd590e-1ef9-47f4-874d-c5324309ebfd\") " pod="openshift-etcd-operator/etcd-operator-b45778765-bqdd5" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.864335 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/b9bd590e-1ef9-47f4-874d-c5324309ebfd-etcd-client\") pod \"etcd-operator-b45778765-bqdd5\" (UID: \"b9bd590e-1ef9-47f4-874d-c5324309ebfd\") " pod="openshift-etcd-operator/etcd-operator-b45778765-bqdd5" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.864394 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/59e30e9f-8395-4abd-8adf-b964ac1cbb0b-registry-certificates\") pod \"image-registry-697d97f7c8-mlrw8\" (UID: \"59e30e9f-8395-4abd-8adf-b964ac1cbb0b\") " pod="openshift-image-registry/image-registry-697d97f7c8-mlrw8" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.864435 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/551a978e-b9ac-46c6-ad40-b2ed5b6121da-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-vs8kw\" (UID: \"551a978e-b9ac-46c6-ad40-b2ed5b6121da\") " pod="openshift-authentication/oauth-openshift-558db77b4-vs8kw" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.864456 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/076cdbe0-669c-4655-b2d1-8967456a6e62-csi-data-dir\") pod \"csi-hostpathplugin-r4gx9\" (UID: \"076cdbe0-669c-4655-b2d1-8967456a6e62\") " pod="hostpath-provisioner/csi-hostpathplugin-r4gx9" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.864475 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-txjmp\" (UniqueName: \"kubernetes.io/projected/b9bd590e-1ef9-47f4-874d-c5324309ebfd-kube-api-access-txjmp\") pod \"etcd-operator-b45778765-bqdd5\" (UID: \"b9bd590e-1ef9-47f4-874d-c5324309ebfd\") " pod="openshift-etcd-operator/etcd-operator-b45778765-bqdd5" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.864493 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/82540b00-caaa-4453-876e-aa214f6f7d34-auth-proxy-config\") pod \"machine-config-operator-74547568cd-b6x4v\" (UID: \"82540b00-caaa-4453-876e-aa214f6f7d34\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-b6x4v" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.864511 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2f1f2c64-e609-4c7a-adf5-ec317fe052c9-proxy-tls\") pod \"machine-config-controller-84d6567774-lnjdx\" (UID: \"2f1f2c64-e609-4c7a-adf5-ec317fe052c9\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-lnjdx" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.864540 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-psb4b\" (UniqueName: \"kubernetes.io/projected/3d92f349-809a-4fdc-9006-420e736d856d-kube-api-access-psb4b\") pod \"packageserver-d55dfcdfc-5wh44\" (UID: \"3d92f349-809a-4fdc-9006-420e736d856d\") " 
pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5wh44" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.864558 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/6c11c529-6073-4d85-b790-8dc781625217-signing-cabundle\") pod \"service-ca-9c57cc56f-nb7b9\" (UID: \"6c11c529-6073-4d85-b790-8dc781625217\") " pod="openshift-service-ca/service-ca-9c57cc56f-nb7b9" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.864604 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3d92f349-809a-4fdc-9006-420e736d856d-apiservice-cert\") pod \"packageserver-d55dfcdfc-5wh44\" (UID: \"3d92f349-809a-4fdc-9006-420e736d856d\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5wh44" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.864625 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6eecbbdb-82a8-4b0d-860d-f6c3f4152a04-console-config\") pod \"console-f9d7485db-x8stp\" (UID: \"6eecbbdb-82a8-4b0d-860d-f6c3f4152a04\") " pod="openshift-console/console-f9d7485db-x8stp" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.864643 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/3d92f349-809a-4fdc-9006-420e736d856d-tmpfs\") pod \"packageserver-d55dfcdfc-5wh44\" (UID: \"3d92f349-809a-4fdc-9006-420e736d856d\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5wh44" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.864665 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/4be28b87-7df0-4dc6-8e22-de8348686347-certs\") pod \"machine-config-server-wzf9f\" (UID: \"4be28b87-7df0-4dc6-8e22-de8348686347\") " pod="openshift-machine-config-operator/machine-config-server-wzf9f" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.864683 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s77x7\" (UniqueName: \"kubernetes.io/projected/6e9532cf-efc9-413a-b8f0-a9186a4c74c8-kube-api-access-s77x7\") pod \"package-server-manager-789f6589d5-84vwh\" (UID: \"6e9532cf-efc9-413a-b8f0-a9186a4c74c8\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-84vwh" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.864703 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/59e30e9f-8395-4abd-8adf-b964ac1cbb0b-bound-sa-token\") pod \"image-registry-697d97f7c8-mlrw8\" (UID: \"59e30e9f-8395-4abd-8adf-b964ac1cbb0b\") " pod="openshift-image-registry/image-registry-697d97f7c8-mlrw8" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.864721 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b9bd590e-1ef9-47f4-874d-c5324309ebfd-config\") pod \"etcd-operator-b45778765-bqdd5\" (UID: \"b9bd590e-1ef9-47f4-874d-c5324309ebfd\") " pod="openshift-etcd-operator/etcd-operator-b45778765-bqdd5" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.864740 4672 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/076cdbe0-669c-4655-b2d1-8967456a6e62-registration-dir\") pod \"csi-hostpathplugin-r4gx9\" (UID: \"076cdbe0-669c-4655-b2d1-8967456a6e62\") " pod="hostpath-provisioner/csi-hostpathplugin-r4gx9" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.865012 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ljmdf\" (UniqueName: \"kubernetes.io/projected/6c11c529-6073-4d85-b790-8dc781625217-kube-api-access-ljmdf\") pod \"service-ca-9c57cc56f-nb7b9\" (UID: \"6c11c529-6073-4d85-b790-8dc781625217\") " pod="openshift-service-ca/service-ca-9c57cc56f-nb7b9" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.865034 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4h5w6\" (UniqueName: \"kubernetes.io/projected/2acf05e1-f152-4432-b0b2-a44b242d0308-kube-api-access-4h5w6\") pod \"collect-profiles-29320575-tsgk2\" (UID: \"2acf05e1-f152-4432-b0b2-a44b242d0308\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320575-tsgk2" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.865055 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f65d4a91-5adb-4ab2-af99-757daad56bd4-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-m85j8\" (UID: \"f65d4a91-5adb-4ab2-af99-757daad56bd4\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-m85j8" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.865072 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/6c11c529-6073-4d85-b790-8dc781625217-signing-key\") pod \"service-ca-9c57cc56f-nb7b9\" (UID: \"6c11c529-6073-4d85-b790-8dc781625217\") " pod="openshift-service-ca/service-ca-9c57cc56f-nb7b9" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.865089 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/6e9532cf-efc9-413a-b8f0-a9186a4c74c8-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-84vwh\" (UID: \"6e9532cf-efc9-413a-b8f0-a9186a4c74c8\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-84vwh" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.865132 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/62422e21-39c6-4772-8f59-33be3d16c368-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-dljhp\" (UID: \"62422e21-39c6-4772-8f59-33be3d16c368\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-dljhp" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.865163 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f4d96698-0412-4923-9a10-03b174e0ca6c-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-svmxm\" (UID: \"f4d96698-0412-4923-9a10-03b174e0ca6c\") " pod="openshift-marketplace/marketplace-operator-79b997595-svmxm" Sep 30 12:24:12 crc 
kubenswrapper[4672]: I0930 12:24:12.865181 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/076cdbe0-669c-4655-b2d1-8967456a6e62-mountpoint-dir\") pod \"csi-hostpathplugin-r4gx9\" (UID: \"076cdbe0-669c-4655-b2d1-8967456a6e62\") " pod="hostpath-provisioner/csi-hostpathplugin-r4gx9" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.865210 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/59e30e9f-8395-4abd-8adf-b964ac1cbb0b-ca-trust-extracted\") pod \"image-registry-697d97f7c8-mlrw8\" (UID: \"59e30e9f-8395-4abd-8adf-b964ac1cbb0b\") " pod="openshift-image-registry/image-registry-697d97f7c8-mlrw8" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.865232 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6eecbbdb-82a8-4b0d-860d-f6c3f4152a04-console-oauth-config\") pod \"console-f9d7485db-x8stp\" (UID: \"6eecbbdb-82a8-4b0d-860d-f6c3f4152a04\") " pod="openshift-console/console-f9d7485db-x8stp" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.865251 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/b9bd590e-1ef9-47f4-874d-c5324309ebfd-etcd-ca\") pod \"etcd-operator-b45778765-bqdd5\" (UID: \"b9bd590e-1ef9-47f4-874d-c5324309ebfd\") " pod="openshift-etcd-operator/etcd-operator-b45778765-bqdd5" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.865286 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6ed45e7d-66e8-4755-a92e-59140d8b6ac8-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-jwlqh\" (UID: \"6ed45e7d-66e8-4755-a92e-59140d8b6ac8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-jwlqh" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.865319 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5gbdl\" (UniqueName: \"kubernetes.io/projected/cba13ebb-13fa-4059-8d32-295a92eb0cdb-kube-api-access-5gbdl\") pod \"migrator-59844c95c7-nq6b6\" (UID: \"cba13ebb-13fa-4059-8d32-295a92eb0cdb\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-nq6b6" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.865355 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/551a978e-b9ac-46c6-ad40-b2ed5b6121da-audit-policies\") pod \"oauth-openshift-558db77b4-vs8kw\" (UID: \"551a978e-b9ac-46c6-ad40-b2ed5b6121da\") " pod="openshift-authentication/oauth-openshift-558db77b4-vs8kw" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.865406 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3e89761d-ea2a-4711-94fa-54bce852e7e3-config\") pod \"service-ca-operator-777779d784-9zz4h\" (UID: \"3e89761d-ea2a-4711-94fa-54bce852e7e3\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-9zz4h" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.865423 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: 
\"kubernetes.io/host-path/076cdbe0-669c-4655-b2d1-8967456a6e62-plugins-dir\") pod \"csi-hostpathplugin-r4gx9\" (UID: \"076cdbe0-669c-4655-b2d1-8967456a6e62\") " pod="hostpath-provisioner/csi-hostpathplugin-r4gx9" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.865456 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/551a978e-b9ac-46c6-ad40-b2ed5b6121da-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-vs8kw\" (UID: \"551a978e-b9ac-46c6-ad40-b2ed5b6121da\") " pod="openshift-authentication/oauth-openshift-558db77b4-vs8kw" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.865474 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/29a5441c-2a0c-443f-a7ab-47f2abc81a1c-metrics-tls\") pod \"dns-default-z2tt9\" (UID: \"29a5441c-2a0c-443f-a7ab-47f2abc81a1c\") " pod="openshift-dns/dns-default-z2tt9" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.865491 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n8vlb\" (UniqueName: \"kubernetes.io/projected/6ed45e7d-66e8-4755-a92e-59140d8b6ac8-kube-api-access-n8vlb\") pod \"cluster-image-registry-operator-dc59b4c8b-jwlqh\" (UID: \"6ed45e7d-66e8-4755-a92e-59140d8b6ac8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-jwlqh" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.865507 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/2f1f2c64-e609-4c7a-adf5-ec317fe052c9-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-lnjdx\" (UID: \"2f1f2c64-e609-4c7a-adf5-ec317fe052c9\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-lnjdx" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.865555 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/551a978e-b9ac-46c6-ad40-b2ed5b6121da-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-vs8kw\" (UID: \"551a978e-b9ac-46c6-ad40-b2ed5b6121da\") " pod="openshift-authentication/oauth-openshift-558db77b4-vs8kw" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.865572 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8zl2b\" (UniqueName: \"kubernetes.io/projected/6592c35e-f6fa-4cc5-b099-4cf13dfaaf76-kube-api-access-8zl2b\") pod \"olm-operator-6b444d44fb-5n4bs\" (UID: \"6592c35e-f6fa-4cc5-b099-4cf13dfaaf76\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5n4bs" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.865587 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lqddq\" (UniqueName: \"kubernetes.io/projected/3e89761d-ea2a-4711-94fa-54bce852e7e3-kube-api-access-lqddq\") pod \"service-ca-operator-777779d784-9zz4h\" (UID: \"3e89761d-ea2a-4711-94fa-54bce852e7e3\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-9zz4h" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.865624 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mlrw8\" (UID: \"59e30e9f-8395-4abd-8adf-b964ac1cbb0b\") " pod="openshift-image-registry/image-registry-697d97f7c8-mlrw8" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.865647 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/4be28b87-7df0-4dc6-8e22-de8348686347-node-bootstrap-token\") pod \"machine-config-server-wzf9f\" (UID: \"4be28b87-7df0-4dc6-8e22-de8348686347\") " pod="openshift-machine-config-operator/machine-config-server-wzf9f" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.867120 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/551a978e-b9ac-46c6-ad40-b2ed5b6121da-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-vs8kw\" (UID: \"551a978e-b9ac-46c6-ad40-b2ed5b6121da\") " pod="openshift-authentication/oauth-openshift-558db77b4-vs8kw" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.867138 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/59e30e9f-8395-4abd-8adf-b964ac1cbb0b-trusted-ca\") pod \"image-registry-697d97f7c8-mlrw8\" (UID: \"59e30e9f-8395-4abd-8adf-b964ac1cbb0b\") " pod="openshift-image-registry/image-registry-697d97f7c8-mlrw8" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.867475 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/551a978e-b9ac-46c6-ad40-b2ed5b6121da-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-vs8kw\" (UID: \"551a978e-b9ac-46c6-ad40-b2ed5b6121da\") " pod="openshift-authentication/oauth-openshift-558db77b4-vs8kw" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.868927 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/59e30e9f-8395-4abd-8adf-b964ac1cbb0b-ca-trust-extracted\") pod \"image-registry-697d97f7c8-mlrw8\" (UID: \"59e30e9f-8395-4abd-8adf-b964ac1cbb0b\") " pod="openshift-image-registry/image-registry-697d97f7c8-mlrw8" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.870762 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/59e30e9f-8395-4abd-8adf-b964ac1cbb0b-registry-tls\") pod \"image-registry-697d97f7c8-mlrw8\" (UID: \"59e30e9f-8395-4abd-8adf-b964ac1cbb0b\") " pod="openshift-image-registry/image-registry-697d97f7c8-mlrw8" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.871441 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/551a978e-b9ac-46c6-ad40-b2ed5b6121da-audit-dir\") pod \"oauth-openshift-558db77b4-vs8kw\" (UID: \"551a978e-b9ac-46c6-ad40-b2ed5b6121da\") " pod="openshift-authentication/oauth-openshift-558db77b4-vs8kw" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.871546 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/59e30e9f-8395-4abd-8adf-b964ac1cbb0b-registry-certificates\") pod \"image-registry-697d97f7c8-mlrw8\" (UID: \"59e30e9f-8395-4abd-8adf-b964ac1cbb0b\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-mlrw8" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.872417 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/b9bd590e-1ef9-47f4-874d-c5324309ebfd-etcd-ca\") pod \"etcd-operator-b45778765-bqdd5\" (UID: \"b9bd590e-1ef9-47f4-874d-c5324309ebfd\") " pod="openshift-etcd-operator/etcd-operator-b45778765-bqdd5" Sep 30 12:24:12 crc kubenswrapper[4672]: E0930 12:24:12.872678 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 12:24:13.372657936 +0000 UTC m=+144.641895572 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mlrw8" (UID: "59e30e9f-8395-4abd-8adf-b964ac1cbb0b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.873251 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/b9bd590e-1ef9-47f4-874d-c5324309ebfd-etcd-service-ca\") pod \"etcd-operator-b45778765-bqdd5\" (UID: \"b9bd590e-1ef9-47f4-874d-c5324309ebfd\") " pod="openshift-etcd-operator/etcd-operator-b45778765-bqdd5" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.874712 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b1bca969-46a3-462a-b129-508c3f6752a0-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-wfvzx\" (UID: \"b1bca969-46a3-462a-b129-508c3f6752a0\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wfvzx" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.874949 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b5d17b1a-138e-4832-a374-8cdd0c028821-serving-cert\") pod \"openshift-config-operator-7777fb866f-rj74q\" (UID: \"b5d17b1a-138e-4832-a374-8cdd0c028821\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-rj74q" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.875026 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b9bd590e-1ef9-47f4-874d-c5324309ebfd-config\") pod \"etcd-operator-b45778765-bqdd5\" (UID: \"b9bd590e-1ef9-47f4-874d-c5324309ebfd\") " pod="openshift-etcd-operator/etcd-operator-b45778765-bqdd5" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.876163 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b9bd590e-1ef9-47f4-874d-c5324309ebfd-serving-cert\") pod \"etcd-operator-b45778765-bqdd5\" (UID: \"b9bd590e-1ef9-47f4-874d-c5324309ebfd\") " pod="openshift-etcd-operator/etcd-operator-b45778765-bqdd5" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.876926 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b1bca969-46a3-462a-b129-508c3f6752a0-serving-cert\") pod 
\"openshift-controller-manager-operator-756b6f6bc6-wfvzx\" (UID: \"b1bca969-46a3-462a-b129-508c3f6752a0\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wfvzx" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.878016 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/551a978e-b9ac-46c6-ad40-b2ed5b6121da-audit-policies\") pod \"oauth-openshift-558db77b4-vs8kw\" (UID: \"551a978e-b9ac-46c6-ad40-b2ed5b6121da\") " pod="openshift-authentication/oauth-openshift-558db77b4-vs8kw" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.878020 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/551a978e-b9ac-46c6-ad40-b2ed5b6121da-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-vs8kw\" (UID: \"551a978e-b9ac-46c6-ad40-b2ed5b6121da\") " pod="openshift-authentication/oauth-openshift-558db77b4-vs8kw" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.878292 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/551a978e-b9ac-46c6-ad40-b2ed5b6121da-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-vs8kw\" (UID: \"551a978e-b9ac-46c6-ad40-b2ed5b6121da\") " pod="openshift-authentication/oauth-openshift-558db77b4-vs8kw" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.878389 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/551a978e-b9ac-46c6-ad40-b2ed5b6121da-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-vs8kw\" (UID: \"551a978e-b9ac-46c6-ad40-b2ed5b6121da\") " pod="openshift-authentication/oauth-openshift-558db77b4-vs8kw" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.878399 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/551a978e-b9ac-46c6-ad40-b2ed5b6121da-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-vs8kw\" (UID: \"551a978e-b9ac-46c6-ad40-b2ed5b6121da\") " pod="openshift-authentication/oauth-openshift-558db77b4-vs8kw" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.878653 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/b9bd590e-1ef9-47f4-874d-c5324309ebfd-etcd-client\") pod \"etcd-operator-b45778765-bqdd5\" (UID: \"b9bd590e-1ef9-47f4-874d-c5324309ebfd\") " pod="openshift-etcd-operator/etcd-operator-b45778765-bqdd5" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.878754 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/551a978e-b9ac-46c6-ad40-b2ed5b6121da-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-vs8kw\" (UID: \"551a978e-b9ac-46c6-ad40-b2ed5b6121da\") " pod="openshift-authentication/oauth-openshift-558db77b4-vs8kw" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.878784 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/59e30e9f-8395-4abd-8adf-b964ac1cbb0b-installation-pull-secrets\") pod \"image-registry-697d97f7c8-mlrw8\" (UID: \"59e30e9f-8395-4abd-8adf-b964ac1cbb0b\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-mlrw8" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.879126 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/b5d17b1a-138e-4832-a374-8cdd0c028821-available-featuregates\") pod \"openshift-config-operator-7777fb866f-rj74q\" (UID: \"b5d17b1a-138e-4832-a374-8cdd0c028821\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-rj74q" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.900673 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lwzrh\" (UniqueName: \"kubernetes.io/projected/b1bca969-46a3-462a-b129-508c3f6752a0-kube-api-access-lwzrh\") pod \"openshift-controller-manager-operator-756b6f6bc6-wfvzx\" (UID: \"b1bca969-46a3-462a-b129-508c3f6752a0\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wfvzx" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.907912 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6eecbbdb-82a8-4b0d-860d-f6c3f4152a04-oauth-serving-cert\") pod \"console-f9d7485db-x8stp\" (UID: \"6eecbbdb-82a8-4b0d-860d-f6c3f4152a04\") " pod="openshift-console/console-f9d7485db-x8stp" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.909619 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6eecbbdb-82a8-4b0d-860d-f6c3f4152a04-console-config\") pod \"console-f9d7485db-x8stp\" (UID: \"6eecbbdb-82a8-4b0d-860d-f6c3f4152a04\") " pod="openshift-console/console-f9d7485db-x8stp" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.914838 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/551a978e-b9ac-46c6-ad40-b2ed5b6121da-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-vs8kw\" (UID: \"551a978e-b9ac-46c6-ad40-b2ed5b6121da\") " pod="openshift-authentication/oauth-openshift-558db77b4-vs8kw" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.915050 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/551a978e-b9ac-46c6-ad40-b2ed5b6121da-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-vs8kw\" (UID: \"551a978e-b9ac-46c6-ad40-b2ed5b6121da\") " pod="openshift-authentication/oauth-openshift-558db77b4-vs8kw" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.915180 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6eecbbdb-82a8-4b0d-860d-f6c3f4152a04-console-serving-cert\") pod \"console-f9d7485db-x8stp\" (UID: \"6eecbbdb-82a8-4b0d-860d-f6c3f4152a04\") " pod="openshift-console/console-f9d7485db-x8stp" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.916478 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6eecbbdb-82a8-4b0d-860d-f6c3f4152a04-trusted-ca-bundle\") pod \"console-f9d7485db-x8stp\" (UID: \"6eecbbdb-82a8-4b0d-860d-f6c3f4152a04\") " pod="openshift-console/console-f9d7485db-x8stp" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.916870 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" 
(UniqueName: \"kubernetes.io/secret/6eecbbdb-82a8-4b0d-860d-f6c3f4152a04-console-oauth-config\") pod \"console-f9d7485db-x8stp\" (UID: \"6eecbbdb-82a8-4b0d-860d-f6c3f4152a04\") " pod="openshift-console/console-f9d7485db-x8stp" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.918906 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6eecbbdb-82a8-4b0d-860d-f6c3f4152a04-service-ca\") pod \"console-f9d7485db-x8stp\" (UID: \"6eecbbdb-82a8-4b0d-860d-f6c3f4152a04\") " pod="openshift-console/console-f9d7485db-x8stp" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.925383 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rbpj8\" (UniqueName: \"kubernetes.io/projected/551a978e-b9ac-46c6-ad40-b2ed5b6121da-kube-api-access-rbpj8\") pod \"oauth-openshift-558db77b4-vs8kw\" (UID: \"551a978e-b9ac-46c6-ad40-b2ed5b6121da\") " pod="openshift-authentication/oauth-openshift-558db77b4-vs8kw" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.936179 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/551a978e-b9ac-46c6-ad40-b2ed5b6121da-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-vs8kw\" (UID: \"551a978e-b9ac-46c6-ad40-b2ed5b6121da\") " pod="openshift-authentication/oauth-openshift-558db77b4-vs8kw" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.955647 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-7l9kk"] Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.960301 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-vs8kw" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.964751 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-drvb8\" (UniqueName: \"kubernetes.io/projected/59e30e9f-8395-4abd-8adf-b964ac1cbb0b-kube-api-access-drvb8\") pod \"image-registry-697d97f7c8-mlrw8\" (UID: \"59e30e9f-8395-4abd-8adf-b964ac1cbb0b\") " pod="openshift-image-registry/image-registry-697d97f7c8-mlrw8" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.969781 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 12:24:12 crc kubenswrapper[4672]: E0930 12:24:12.972737 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 12:24:13.472683364 +0000 UTC m=+144.741921020 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.974896 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8zl2b\" (UniqueName: \"kubernetes.io/projected/6592c35e-f6fa-4cc5-b099-4cf13dfaaf76-kube-api-access-8zl2b\") pod \"olm-operator-6b444d44fb-5n4bs\" (UID: \"6592c35e-f6fa-4cc5-b099-4cf13dfaaf76\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5n4bs" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.974942 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lqddq\" (UniqueName: \"kubernetes.io/projected/3e89761d-ea2a-4711-94fa-54bce852e7e3-kube-api-access-lqddq\") pod \"service-ca-operator-777779d784-9zz4h\" (UID: \"3e89761d-ea2a-4711-94fa-54bce852e7e3\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-9zz4h" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.974988 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mlrw8\" (UID: \"59e30e9f-8395-4abd-8adf-b964ac1cbb0b\") " pod="openshift-image-registry/image-registry-697d97f7c8-mlrw8" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.975032 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/4be28b87-7df0-4dc6-8e22-de8348686347-node-bootstrap-token\") pod \"machine-config-server-wzf9f\" (UID: \"4be28b87-7df0-4dc6-8e22-de8348686347\") " pod="openshift-machine-config-operator/machine-config-server-wzf9f" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.975061 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/82540b00-caaa-4453-876e-aa214f6f7d34-proxy-tls\") pod \"machine-config-operator-74547568cd-b6x4v\" (UID: \"82540b00-caaa-4453-876e-aa214f6f7d34\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-b6x4v" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.975094 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cdnn7\" (UniqueName: \"kubernetes.io/projected/82f58782-5e5b-41f9-be22-0d3f1ab4423d-kube-api-access-cdnn7\") pod \"ingress-operator-5b745b69d9-6t7qg\" (UID: \"82f58782-5e5b-41f9-be22-0d3f1ab4423d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6t7qg" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.975114 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3d92f349-809a-4fdc-9006-420e736d856d-webhook-cert\") pod \"packageserver-d55dfcdfc-5wh44\" (UID: \"3d92f349-809a-4fdc-9006-420e736d856d\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5wh44" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.975133 4672 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-kwjzj\" (UniqueName: \"kubernetes.io/projected/076cdbe0-669c-4655-b2d1-8967456a6e62-kube-api-access-kwjzj\") pod \"csi-hostpathplugin-r4gx9\" (UID: \"076cdbe0-669c-4655-b2d1-8967456a6e62\") " pod="hostpath-provisioner/csi-hostpathplugin-r4gx9" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.975158 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/82f58782-5e5b-41f9-be22-0d3f1ab4423d-metrics-tls\") pod \"ingress-operator-5b745b69d9-6t7qg\" (UID: \"82f58782-5e5b-41f9-be22-0d3f1ab4423d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6t7qg" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.975193 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e3a5bd27-f377-4531-8c55-29e2f2c4ccc8-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-cf7gg\" (UID: \"e3a5bd27-f377-4531-8c55-29e2f2c4ccc8\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-cf7gg" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.975218 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2acf05e1-f152-4432-b0b2-a44b242d0308-config-volume\") pod \"collect-profiles-29320575-tsgk2\" (UID: \"2acf05e1-f152-4432-b0b2-a44b242d0308\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320575-tsgk2" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.975249 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f65d4a91-5adb-4ab2-af99-757daad56bd4-config\") pod \"kube-apiserver-operator-766d6c64bb-m85j8\" (UID: \"f65d4a91-5adb-4ab2-af99-757daad56bd4\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-m85j8" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.975327 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sc65b\" (UniqueName: \"kubernetes.io/projected/4be28b87-7df0-4dc6-8e22-de8348686347-kube-api-access-sc65b\") pod \"machine-config-server-wzf9f\" (UID: \"4be28b87-7df0-4dc6-8e22-de8348686347\") " pod="openshift-machine-config-operator/machine-config-server-wzf9f" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.975348 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2acf05e1-f152-4432-b0b2-a44b242d0308-secret-volume\") pod \"collect-profiles-29320575-tsgk2\" (UID: \"2acf05e1-f152-4432-b0b2-a44b242d0308\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320575-tsgk2" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.975382 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/3c1a3995-a406-4562-b62e-682408dd877e-profile-collector-cert\") pod \"catalog-operator-68c6474976-mp8wz\" (UID: \"3c1a3995-a406-4562-b62e-682408dd877e\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mp8wz" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.975409 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6hqz8\" (UniqueName: 
\"kubernetes.io/projected/62422e21-39c6-4772-8f59-33be3d16c368-kube-api-access-6hqz8\") pod \"control-plane-machine-set-operator-78cbb6b69f-dljhp\" (UID: \"62422e21-39c6-4772-8f59-33be3d16c368\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-dljhp" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.975439 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qlms7\" (UniqueName: \"kubernetes.io/projected/2f1f2c64-e609-4c7a-adf5-ec317fe052c9-kube-api-access-qlms7\") pod \"machine-config-controller-84d6567774-lnjdx\" (UID: \"2f1f2c64-e609-4c7a-adf5-ec317fe052c9\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-lnjdx" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.975472 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f8a1fcab-b6bf-4f92-9c23-d8240afdc4a1-cert\") pod \"ingress-canary-dwqnn\" (UID: \"f8a1fcab-b6bf-4f92-9c23-d8240afdc4a1\") " pod="openshift-ingress-canary/ingress-canary-dwqnn" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.975509 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/040f9564-cad2-48c5-b4d6-69b1add18a78-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-8dqtc\" (UID: \"040f9564-cad2-48c5-b4d6-69b1add18a78\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8dqtc" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.975537 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3e89761d-ea2a-4711-94fa-54bce852e7e3-serving-cert\") pod \"service-ca-operator-777779d784-9zz4h\" (UID: \"3e89761d-ea2a-4711-94fa-54bce852e7e3\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-9zz4h" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.975570 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/6592c35e-f6fa-4cc5-b099-4cf13dfaaf76-srv-cert\") pod \"olm-operator-6b444d44fb-5n4bs\" (UID: \"6592c35e-f6fa-4cc5-b099-4cf13dfaaf76\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5n4bs" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.975595 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/29a5441c-2a0c-443f-a7ab-47f2abc81a1c-config-volume\") pod \"dns-default-z2tt9\" (UID: \"29a5441c-2a0c-443f-a7ab-47f2abc81a1c\") " pod="openshift-dns/dns-default-z2tt9" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.975616 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vstdz\" (UniqueName: \"kubernetes.io/projected/29a5441c-2a0c-443f-a7ab-47f2abc81a1c-kube-api-access-vstdz\") pod \"dns-default-z2tt9\" (UID: \"29a5441c-2a0c-443f-a7ab-47f2abc81a1c\") " pod="openshift-dns/dns-default-z2tt9" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.975636 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/9869a0d1-03fd-4b72-9519-16c0ef0c4bd9-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-xzjdp\" (UID: \"9869a0d1-03fd-4b72-9519-16c0ef0c4bd9\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-xzjdp" Sep 30 12:24:12 
crc kubenswrapper[4672]: I0930 12:24:12.975661 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f65d4a91-5adb-4ab2-af99-757daad56bd4-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-m85j8\" (UID: \"f65d4a91-5adb-4ab2-af99-757daad56bd4\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-m85j8" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.976439 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f65d4a91-5adb-4ab2-af99-757daad56bd4-config\") pod \"kube-apiserver-operator-766d6c64bb-m85j8\" (UID: \"f65d4a91-5adb-4ab2-af99-757daad56bd4\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-m85j8" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.977924 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9c4mf\" (UniqueName: \"kubernetes.io/projected/f4d96698-0412-4923-9a10-03b174e0ca6c-kube-api-access-9c4mf\") pod \"marketplace-operator-79b997595-svmxm\" (UID: \"f4d96698-0412-4923-9a10-03b174e0ca6c\") " pod="openshift-marketplace/marketplace-operator-79b997595-svmxm" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.977998 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2acf05e1-f152-4432-b0b2-a44b242d0308-config-volume\") pod \"collect-profiles-29320575-tsgk2\" (UID: \"2acf05e1-f152-4432-b0b2-a44b242d0308\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320575-tsgk2" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.980095 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/040f9564-cad2-48c5-b4d6-69b1add18a78-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-8dqtc\" (UID: \"040f9564-cad2-48c5-b4d6-69b1add18a78\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8dqtc" Sep 30 12:24:12 crc kubenswrapper[4672]: E0930 12:24:12.986965 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 12:24:13.486940955 +0000 UTC m=+144.756178591 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mlrw8" (UID: "59e30e9f-8395-4abd-8adf-b964ac1cbb0b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.988158 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jcwv4\" (UniqueName: \"kubernetes.io/projected/b5d17b1a-138e-4832-a374-8cdd0c028821-kube-api-access-jcwv4\") pod \"openshift-config-operator-7777fb866f-rj74q\" (UID: \"b5d17b1a-138e-4832-a374-8cdd0c028821\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-rj74q" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.988683 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3d92f349-809a-4fdc-9006-420e736d856d-webhook-cert\") pod \"packageserver-d55dfcdfc-5wh44\" (UID: \"3d92f349-809a-4fdc-9006-420e736d856d\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5wh44" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.988773 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6ed45e7d-66e8-4755-a92e-59140d8b6ac8-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-jwlqh\" (UID: \"6ed45e7d-66e8-4755-a92e-59140d8b6ac8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-jwlqh" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.989006 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/3c1a3995-a406-4562-b62e-682408dd877e-srv-cert\") pod \"catalog-operator-68c6474976-mp8wz\" (UID: \"3c1a3995-a406-4562-b62e-682408dd877e\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mp8wz" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.989053 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/82f58782-5e5b-41f9-be22-0d3f1ab4423d-bound-sa-token\") pod \"ingress-operator-5b745b69d9-6t7qg\" (UID: \"82f58782-5e5b-41f9-be22-0d3f1ab4423d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6t7qg" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.989182 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/040f9564-cad2-48c5-b4d6-69b1add18a78-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-8dqtc\" (UID: \"040f9564-cad2-48c5-b4d6-69b1add18a78\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8dqtc" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.989207 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e3a5bd27-f377-4531-8c55-29e2f2c4ccc8-config\") pod \"kube-controller-manager-operator-78b949d7b-cf7gg\" (UID: \"e3a5bd27-f377-4531-8c55-29e2f2c4ccc8\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-cf7gg" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.989251 4672 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e3a5bd27-f377-4531-8c55-29e2f2c4ccc8-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-cf7gg\" (UID: \"e3a5bd27-f377-4531-8c55-29e2f2c4ccc8\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-cf7gg" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.995696 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/3c1a3995-a406-4562-b62e-682408dd877e-srv-cert\") pod \"catalog-operator-68c6474976-mp8wz\" (UID: \"3c1a3995-a406-4562-b62e-682408dd877e\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mp8wz" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.992228 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/3c1a3995-a406-4562-b62e-682408dd877e-profile-collector-cert\") pod \"catalog-operator-68c6474976-mp8wz\" (UID: \"3c1a3995-a406-4562-b62e-682408dd877e\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mp8wz" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.993117 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/82f58782-5e5b-41f9-be22-0d3f1ab4423d-metrics-tls\") pod \"ingress-operator-5b745b69d9-6t7qg\" (UID: \"82f58782-5e5b-41f9-be22-0d3f1ab4423d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6t7qg" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.994892 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/82540b00-caaa-4453-876e-aa214f6f7d34-proxy-tls\") pod \"machine-config-operator-74547568cd-b6x4v\" (UID: \"82540b00-caaa-4453-876e-aa214f6f7d34\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-b6x4v" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.995080 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3e89761d-ea2a-4711-94fa-54bce852e7e3-serving-cert\") pod \"service-ca-operator-777779d784-9zz4h\" (UID: \"3e89761d-ea2a-4711-94fa-54bce852e7e3\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-9zz4h" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.995115 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/9869a0d1-03fd-4b72-9519-16c0ef0c4bd9-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-xzjdp\" (UID: \"9869a0d1-03fd-4b72-9519-16c0ef0c4bd9\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-xzjdp" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.995671 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e3a5bd27-f377-4531-8c55-29e2f2c4ccc8-config\") pod \"kube-controller-manager-operator-78b949d7b-cf7gg\" (UID: \"e3a5bd27-f377-4531-8c55-29e2f2c4ccc8\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-cf7gg" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.995750 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9fdnf\" (UniqueName: 
\"kubernetes.io/projected/82540b00-caaa-4453-876e-aa214f6f7d34-kube-api-access-9fdnf\") pod \"machine-config-operator-74547568cd-b6x4v\" (UID: \"82540b00-caaa-4453-876e-aa214f6f7d34\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-b6x4v" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.996367 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nsvkp\" (UniqueName: \"kubernetes.io/projected/f8a1fcab-b6bf-4f92-9c23-d8240afdc4a1-kube-api-access-nsvkp\") pod \"ingress-canary-dwqnn\" (UID: \"f8a1fcab-b6bf-4f92-9c23-d8240afdc4a1\") " pod="openshift-ingress-canary/ingress-canary-dwqnn" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.996400 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/076cdbe0-669c-4655-b2d1-8967456a6e62-socket-dir\") pod \"csi-hostpathplugin-r4gx9\" (UID: \"076cdbe0-669c-4655-b2d1-8967456a6e62\") " pod="hostpath-provisioner/csi-hostpathplugin-r4gx9" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.996425 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/6ed45e7d-66e8-4755-a92e-59140d8b6ac8-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-jwlqh\" (UID: \"6ed45e7d-66e8-4755-a92e-59140d8b6ac8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-jwlqh" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.996452 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/82f58782-5e5b-41f9-be22-0d3f1ab4423d-trusted-ca\") pod \"ingress-operator-5b745b69d9-6t7qg\" (UID: \"82f58782-5e5b-41f9-be22-0d3f1ab4423d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6t7qg" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.996473 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/040f9564-cad2-48c5-b4d6-69b1add18a78-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-8dqtc\" (UID: \"040f9564-cad2-48c5-b4d6-69b1add18a78\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8dqtc" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.996497 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/6592c35e-f6fa-4cc5-b099-4cf13dfaaf76-profile-collector-cert\") pod \"olm-operator-6b444d44fb-5n4bs\" (UID: \"6592c35e-f6fa-4cc5-b099-4cf13dfaaf76\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5n4bs" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.996523 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/f4d96698-0412-4923-9a10-03b174e0ca6c-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-svmxm\" (UID: \"f4d96698-0412-4923-9a10-03b174e0ca6c\") " pod="openshift-marketplace/marketplace-operator-79b997595-svmxm" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.996545 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/82540b00-caaa-4453-876e-aa214f6f7d34-images\") pod \"machine-config-operator-74547568cd-b6x4v\" (UID: 
\"82540b00-caaa-4453-876e-aa214f6f7d34\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-b6x4v" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.996568 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-df4d4\" (UniqueName: \"kubernetes.io/projected/9869a0d1-03fd-4b72-9519-16c0ef0c4bd9-kube-api-access-df4d4\") pod \"multus-admission-controller-857f4d67dd-xzjdp\" (UID: \"9869a0d1-03fd-4b72-9519-16c0ef0c4bd9\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-xzjdp" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.996593 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f7gw6\" (UniqueName: \"kubernetes.io/projected/3c1a3995-a406-4562-b62e-682408dd877e-kube-api-access-f7gw6\") pod \"catalog-operator-68c6474976-mp8wz\" (UID: \"3c1a3995-a406-4562-b62e-682408dd877e\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mp8wz" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.996651 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/076cdbe0-669c-4655-b2d1-8967456a6e62-csi-data-dir\") pod \"csi-hostpathplugin-r4gx9\" (UID: \"076cdbe0-669c-4655-b2d1-8967456a6e62\") " pod="hostpath-provisioner/csi-hostpathplugin-r4gx9" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.996688 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/82540b00-caaa-4453-876e-aa214f6f7d34-auth-proxy-config\") pod \"machine-config-operator-74547568cd-b6x4v\" (UID: \"82540b00-caaa-4453-876e-aa214f6f7d34\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-b6x4v" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.996713 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2f1f2c64-e609-4c7a-adf5-ec317fe052c9-proxy-tls\") pod \"machine-config-controller-84d6567774-lnjdx\" (UID: \"2f1f2c64-e609-4c7a-adf5-ec317fe052c9\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-lnjdx" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.996745 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-psb4b\" (UniqueName: \"kubernetes.io/projected/3d92f349-809a-4fdc-9006-420e736d856d-kube-api-access-psb4b\") pod \"packageserver-d55dfcdfc-5wh44\" (UID: \"3d92f349-809a-4fdc-9006-420e736d856d\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5wh44" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.996765 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/6c11c529-6073-4d85-b790-8dc781625217-signing-cabundle\") pod \"service-ca-9c57cc56f-nb7b9\" (UID: \"6c11c529-6073-4d85-b790-8dc781625217\") " pod="openshift-service-ca/service-ca-9c57cc56f-nb7b9" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.997417 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3d92f349-809a-4fdc-9006-420e736d856d-apiservice-cert\") pod \"packageserver-d55dfcdfc-5wh44\" (UID: \"3d92f349-809a-4fdc-9006-420e736d856d\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5wh44" Sep 30 12:24:12 crc 
kubenswrapper[4672]: I0930 12:24:12.997554 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/3d92f349-809a-4fdc-9006-420e736d856d-tmpfs\") pod \"packageserver-d55dfcdfc-5wh44\" (UID: \"3d92f349-809a-4fdc-9006-420e736d856d\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5wh44" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.997582 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/4be28b87-7df0-4dc6-8e22-de8348686347-certs\") pod \"machine-config-server-wzf9f\" (UID: \"4be28b87-7df0-4dc6-8e22-de8348686347\") " pod="openshift-machine-config-operator/machine-config-server-wzf9f" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.997604 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s77x7\" (UniqueName: \"kubernetes.io/projected/6e9532cf-efc9-413a-b8f0-a9186a4c74c8-kube-api-access-s77x7\") pod \"package-server-manager-789f6589d5-84vwh\" (UID: \"6e9532cf-efc9-413a-b8f0-a9186a4c74c8\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-84vwh" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.997889 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/076cdbe0-669c-4655-b2d1-8967456a6e62-registration-dir\") pod \"csi-hostpathplugin-r4gx9\" (UID: \"076cdbe0-669c-4655-b2d1-8967456a6e62\") " pod="hostpath-provisioner/csi-hostpathplugin-r4gx9" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.997922 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ljmdf\" (UniqueName: \"kubernetes.io/projected/6c11c529-6073-4d85-b790-8dc781625217-kube-api-access-ljmdf\") pod \"service-ca-9c57cc56f-nb7b9\" (UID: \"6c11c529-6073-4d85-b790-8dc781625217\") " pod="openshift-service-ca/service-ca-9c57cc56f-nb7b9" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.998058 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4h5w6\" (UniqueName: \"kubernetes.io/projected/2acf05e1-f152-4432-b0b2-a44b242d0308-kube-api-access-4h5w6\") pod \"collect-profiles-29320575-tsgk2\" (UID: \"2acf05e1-f152-4432-b0b2-a44b242d0308\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320575-tsgk2" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.998081 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f65d4a91-5adb-4ab2-af99-757daad56bd4-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-m85j8\" (UID: \"f65d4a91-5adb-4ab2-af99-757daad56bd4\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-m85j8" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.992078 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/29a5441c-2a0c-443f-a7ab-47f2abc81a1c-config-volume\") pod \"dns-default-z2tt9\" (UID: \"29a5441c-2a0c-443f-a7ab-47f2abc81a1c\") " pod="openshift-dns/dns-default-z2tt9" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.998206 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/6c11c529-6073-4d85-b790-8dc781625217-signing-key\") pod \"service-ca-9c57cc56f-nb7b9\" (UID: 
\"6c11c529-6073-4d85-b790-8dc781625217\") " pod="openshift-service-ca/service-ca-9c57cc56f-nb7b9" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.998227 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/6e9532cf-efc9-413a-b8f0-a9186a4c74c8-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-84vwh\" (UID: \"6e9532cf-efc9-413a-b8f0-a9186a4c74c8\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-84vwh" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.998377 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/62422e21-39c6-4772-8f59-33be3d16c368-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-dljhp\" (UID: \"62422e21-39c6-4772-8f59-33be3d16c368\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-dljhp" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.998408 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f4d96698-0412-4923-9a10-03b174e0ca6c-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-svmxm\" (UID: \"f4d96698-0412-4923-9a10-03b174e0ca6c\") " pod="openshift-marketplace/marketplace-operator-79b997595-svmxm" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.998736 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/076cdbe0-669c-4655-b2d1-8967456a6e62-mountpoint-dir\") pod \"csi-hostpathplugin-r4gx9\" (UID: \"076cdbe0-669c-4655-b2d1-8967456a6e62\") " pod="hostpath-provisioner/csi-hostpathplugin-r4gx9" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.998768 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6ed45e7d-66e8-4755-a92e-59140d8b6ac8-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-jwlqh\" (UID: \"6ed45e7d-66e8-4755-a92e-59140d8b6ac8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-jwlqh" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.998802 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5gbdl\" (UniqueName: \"kubernetes.io/projected/cba13ebb-13fa-4059-8d32-295a92eb0cdb-kube-api-access-5gbdl\") pod \"migrator-59844c95c7-nq6b6\" (UID: \"cba13ebb-13fa-4059-8d32-295a92eb0cdb\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-nq6b6" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.998826 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3e89761d-ea2a-4711-94fa-54bce852e7e3-config\") pod \"service-ca-operator-777779d784-9zz4h\" (UID: \"3e89761d-ea2a-4711-94fa-54bce852e7e3\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-9zz4h" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.998843 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/076cdbe0-669c-4655-b2d1-8967456a6e62-plugins-dir\") pod \"csi-hostpathplugin-r4gx9\" (UID: \"076cdbe0-669c-4655-b2d1-8967456a6e62\") " pod="hostpath-provisioner/csi-hostpathplugin-r4gx9" Sep 30 
12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.998867 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/29a5441c-2a0c-443f-a7ab-47f2abc81a1c-metrics-tls\") pod \"dns-default-z2tt9\" (UID: \"29a5441c-2a0c-443f-a7ab-47f2abc81a1c\") " pod="openshift-dns/dns-default-z2tt9" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.998884 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n8vlb\" (UniqueName: \"kubernetes.io/projected/6ed45e7d-66e8-4755-a92e-59140d8b6ac8-kube-api-access-n8vlb\") pod \"cluster-image-registry-operator-dc59b4c8b-jwlqh\" (UID: \"6ed45e7d-66e8-4755-a92e-59140d8b6ac8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-jwlqh" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.998903 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/2f1f2c64-e609-4c7a-adf5-ec317fe052c9-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-lnjdx\" (UID: \"2f1f2c64-e609-4c7a-adf5-ec317fe052c9\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-lnjdx" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.999778 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/2f1f2c64-e609-4c7a-adf5-ec317fe052c9-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-lnjdx\" (UID: \"2f1f2c64-e609-4c7a-adf5-ec317fe052c9\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-lnjdx" Sep 30 12:24:12 crc kubenswrapper[4672]: I0930 12:24:12.999904 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/076cdbe0-669c-4655-b2d1-8967456a6e62-csi-data-dir\") pod \"csi-hostpathplugin-r4gx9\" (UID: \"076cdbe0-669c-4655-b2d1-8967456a6e62\") " pod="hostpath-provisioner/csi-hostpathplugin-r4gx9" Sep 30 12:24:13 crc kubenswrapper[4672]: I0930 12:24:13.000456 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/82540b00-caaa-4453-876e-aa214f6f7d34-auth-proxy-config\") pod \"machine-config-operator-74547568cd-b6x4v\" (UID: \"82540b00-caaa-4453-876e-aa214f6f7d34\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-b6x4v" Sep 30 12:24:13 crc kubenswrapper[4672]: I0930 12:24:13.000620 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/076cdbe0-669c-4655-b2d1-8967456a6e62-socket-dir\") pod \"csi-hostpathplugin-r4gx9\" (UID: \"076cdbe0-669c-4655-b2d1-8967456a6e62\") " pod="hostpath-provisioner/csi-hostpathplugin-r4gx9" Sep 30 12:24:13 crc kubenswrapper[4672]: I0930 12:24:13.000923 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k7stp\" (UniqueName: \"kubernetes.io/projected/6eecbbdb-82a8-4b0d-860d-f6c3f4152a04-kube-api-access-k7stp\") pod \"console-f9d7485db-x8stp\" (UID: \"6eecbbdb-82a8-4b0d-860d-f6c3f4152a04\") " pod="openshift-console/console-f9d7485db-x8stp" Sep 30 12:24:13 crc kubenswrapper[4672]: I0930 12:24:13.001336 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/82540b00-caaa-4453-876e-aa214f6f7d34-images\") pod 
\"machine-config-operator-74547568cd-b6x4v\" (UID: \"82540b00-caaa-4453-876e-aa214f6f7d34\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-b6x4v" Sep 30 12:24:13 crc kubenswrapper[4672]: I0930 12:24:13.003093 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/4be28b87-7df0-4dc6-8e22-de8348686347-node-bootstrap-token\") pod \"machine-config-server-wzf9f\" (UID: \"4be28b87-7df0-4dc6-8e22-de8348686347\") " pod="openshift-machine-config-operator/machine-config-server-wzf9f" Sep 30 12:24:13 crc kubenswrapper[4672]: I0930 12:24:12.997641 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f8a1fcab-b6bf-4f92-9c23-d8240afdc4a1-cert\") pod \"ingress-canary-dwqnn\" (UID: \"f8a1fcab-b6bf-4f92-9c23-d8240afdc4a1\") " pod="openshift-ingress-canary/ingress-canary-dwqnn" Sep 30 12:24:13 crc kubenswrapper[4672]: I0930 12:24:13.003250 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/040f9564-cad2-48c5-b4d6-69b1add18a78-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-8dqtc\" (UID: \"040f9564-cad2-48c5-b4d6-69b1add18a78\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8dqtc" Sep 30 12:24:13 crc kubenswrapper[4672]: I0930 12:24:13.003302 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e3a5bd27-f377-4531-8c55-29e2f2c4ccc8-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-cf7gg\" (UID: \"e3a5bd27-f377-4531-8c55-29e2f2c4ccc8\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-cf7gg" Sep 30 12:24:13 crc kubenswrapper[4672]: I0930 12:24:13.003972 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/f4d96698-0412-4923-9a10-03b174e0ca6c-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-svmxm\" (UID: \"f4d96698-0412-4923-9a10-03b174e0ca6c\") " pod="openshift-marketplace/marketplace-operator-79b997595-svmxm" Sep 30 12:24:13 crc kubenswrapper[4672]: I0930 12:24:13.004052 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/076cdbe0-669c-4655-b2d1-8967456a6e62-registration-dir\") pod \"csi-hostpathplugin-r4gx9\" (UID: \"076cdbe0-669c-4655-b2d1-8967456a6e62\") " pod="hostpath-provisioner/csi-hostpathplugin-r4gx9" Sep 30 12:24:13 crc kubenswrapper[4672]: I0930 12:24:13.004557 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/82f58782-5e5b-41f9-be22-0d3f1ab4423d-trusted-ca\") pod \"ingress-operator-5b745b69d9-6t7qg\" (UID: \"82f58782-5e5b-41f9-be22-0d3f1ab4423d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6t7qg" Sep 30 12:24:13 crc kubenswrapper[4672]: I0930 12:24:13.004931 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/6c11c529-6073-4d85-b790-8dc781625217-signing-cabundle\") pod \"service-ca-9c57cc56f-nb7b9\" (UID: \"6c11c529-6073-4d85-b790-8dc781625217\") " pod="openshift-service-ca/service-ca-9c57cc56f-nb7b9" Sep 30 12:24:13 crc kubenswrapper[4672]: I0930 12:24:13.005735 4672 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6ed45e7d-66e8-4755-a92e-59140d8b6ac8-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-jwlqh\" (UID: \"6ed45e7d-66e8-4755-a92e-59140d8b6ac8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-jwlqh" Sep 30 12:24:13 crc kubenswrapper[4672]: I0930 12:24:13.005859 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/076cdbe0-669c-4655-b2d1-8967456a6e62-plugins-dir\") pod \"csi-hostpathplugin-r4gx9\" (UID: \"076cdbe0-669c-4655-b2d1-8967456a6e62\") " pod="hostpath-provisioner/csi-hostpathplugin-r4gx9" Sep 30 12:24:13 crc kubenswrapper[4672]: I0930 12:24:13.006149 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f65d4a91-5adb-4ab2-af99-757daad56bd4-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-m85j8\" (UID: \"f65d4a91-5adb-4ab2-af99-757daad56bd4\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-m85j8" Sep 30 12:24:13 crc kubenswrapper[4672]: I0930 12:24:13.006593 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/6ed45e7d-66e8-4755-a92e-59140d8b6ac8-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-jwlqh\" (UID: \"6ed45e7d-66e8-4755-a92e-59140d8b6ac8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-jwlqh" Sep 30 12:24:13 crc kubenswrapper[4672]: I0930 12:24:13.006787 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/6592c35e-f6fa-4cc5-b099-4cf13dfaaf76-profile-collector-cert\") pod \"olm-operator-6b444d44fb-5n4bs\" (UID: \"6592c35e-f6fa-4cc5-b099-4cf13dfaaf76\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5n4bs" Sep 30 12:24:13 crc kubenswrapper[4672]: I0930 12:24:13.007814 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f4d96698-0412-4923-9a10-03b174e0ca6c-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-svmxm\" (UID: \"f4d96698-0412-4923-9a10-03b174e0ca6c\") " pod="openshift-marketplace/marketplace-operator-79b997595-svmxm" Sep 30 12:24:13 crc kubenswrapper[4672]: I0930 12:24:13.007863 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/076cdbe0-669c-4655-b2d1-8967456a6e62-mountpoint-dir\") pod \"csi-hostpathplugin-r4gx9\" (UID: \"076cdbe0-669c-4655-b2d1-8967456a6e62\") " pod="hostpath-provisioner/csi-hostpathplugin-r4gx9" Sep 30 12:24:13 crc kubenswrapper[4672]: I0930 12:24:13.008547 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3e89761d-ea2a-4711-94fa-54bce852e7e3-config\") pod \"service-ca-operator-777779d784-9zz4h\" (UID: \"3e89761d-ea2a-4711-94fa-54bce852e7e3\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-9zz4h" Sep 30 12:24:13 crc kubenswrapper[4672]: I0930 12:24:13.011550 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3d92f349-809a-4fdc-9006-420e736d856d-apiservice-cert\") pod \"packageserver-d55dfcdfc-5wh44\" (UID: \"3d92f349-809a-4fdc-9006-420e736d856d\") 
" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5wh44" Sep 30 12:24:13 crc kubenswrapper[4672]: I0930 12:24:13.012844 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2f1f2c64-e609-4c7a-adf5-ec317fe052c9-proxy-tls\") pod \"machine-config-controller-84d6567774-lnjdx\" (UID: \"2f1f2c64-e609-4c7a-adf5-ec317fe052c9\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-lnjdx" Sep 30 12:24:13 crc kubenswrapper[4672]: I0930 12:24:13.013138 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/6e9532cf-efc9-413a-b8f0-a9186a4c74c8-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-84vwh\" (UID: \"6e9532cf-efc9-413a-b8f0-a9186a4c74c8\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-84vwh" Sep 30 12:24:13 crc kubenswrapper[4672]: I0930 12:24:13.015666 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/62422e21-39c6-4772-8f59-33be3d16c368-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-dljhp\" (UID: \"62422e21-39c6-4772-8f59-33be3d16c368\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-dljhp" Sep 30 12:24:13 crc kubenswrapper[4672]: I0930 12:24:13.016373 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/4be28b87-7df0-4dc6-8e22-de8348686347-certs\") pod \"machine-config-server-wzf9f\" (UID: \"4be28b87-7df0-4dc6-8e22-de8348686347\") " pod="openshift-machine-config-operator/machine-config-server-wzf9f" Sep 30 12:24:13 crc kubenswrapper[4672]: I0930 12:24:13.017010 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/6c11c529-6073-4d85-b790-8dc781625217-signing-key\") pod \"service-ca-9c57cc56f-nb7b9\" (UID: \"6c11c529-6073-4d85-b790-8dc781625217\") " pod="openshift-service-ca/service-ca-9c57cc56f-nb7b9" Sep 30 12:24:13 crc kubenswrapper[4672]: I0930 12:24:13.018904 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/6592c35e-f6fa-4cc5-b099-4cf13dfaaf76-srv-cert\") pod \"olm-operator-6b444d44fb-5n4bs\" (UID: \"6592c35e-f6fa-4cc5-b099-4cf13dfaaf76\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5n4bs" Sep 30 12:24:13 crc kubenswrapper[4672]: I0930 12:24:13.023836 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gncd7"] Sep 30 12:24:13 crc kubenswrapper[4672]: I0930 12:24:13.032971 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-txjmp\" (UniqueName: \"kubernetes.io/projected/b9bd590e-1ef9-47f4-874d-c5324309ebfd-kube-api-access-txjmp\") pod \"etcd-operator-b45778765-bqdd5\" (UID: \"b9bd590e-1ef9-47f4-874d-c5324309ebfd\") " pod="openshift-etcd-operator/etcd-operator-b45778765-bqdd5" Sep 30 12:24:13 crc kubenswrapper[4672]: I0930 12:24:13.041150 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/59e30e9f-8395-4abd-8adf-b964ac1cbb0b-bound-sa-token\") pod \"image-registry-697d97f7c8-mlrw8\" (UID: \"59e30e9f-8395-4abd-8adf-b964ac1cbb0b\") 
" pod="openshift-image-registry/image-registry-697d97f7c8-mlrw8" Sep 30 12:24:13 crc kubenswrapper[4672]: I0930 12:24:13.043377 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wfvzx" Sep 30 12:24:13 crc kubenswrapper[4672]: I0930 12:24:13.046144 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/3d92f349-809a-4fdc-9006-420e736d856d-tmpfs\") pod \"packageserver-d55dfcdfc-5wh44\" (UID: \"3d92f349-809a-4fdc-9006-420e736d856d\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5wh44" Sep 30 12:24:13 crc kubenswrapper[4672]: I0930 12:24:13.049959 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2acf05e1-f152-4432-b0b2-a44b242d0308-secret-volume\") pod \"collect-profiles-29320575-tsgk2\" (UID: \"2acf05e1-f152-4432-b0b2-a44b242d0308\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320575-tsgk2" Sep 30 12:24:13 crc kubenswrapper[4672]: I0930 12:24:13.056238 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/29a5441c-2a0c-443f-a7ab-47f2abc81a1c-metrics-tls\") pod \"dns-default-z2tt9\" (UID: \"29a5441c-2a0c-443f-a7ab-47f2abc81a1c\") " pod="openshift-dns/dns-default-z2tt9" Sep 30 12:24:13 crc kubenswrapper[4672]: I0930 12:24:13.059352 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-bqdd5" Sep 30 12:24:13 crc kubenswrapper[4672]: I0930 12:24:13.089123 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f65d4a91-5adb-4ab2-af99-757daad56bd4-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-m85j8\" (UID: \"f65d4a91-5adb-4ab2-af99-757daad56bd4\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-m85j8" Sep 30 12:24:13 crc kubenswrapper[4672]: I0930 12:24:13.101944 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 12:24:13 crc kubenswrapper[4672]: E0930 12:24:13.102438 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 12:24:13.602422569 +0000 UTC m=+144.871660215 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 12:24:13 crc kubenswrapper[4672]: I0930 12:24:13.102530 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9c4mf\" (UniqueName: \"kubernetes.io/projected/f4d96698-0412-4923-9a10-03b174e0ca6c-kube-api-access-9c4mf\") pod \"marketplace-operator-79b997595-svmxm\" (UID: \"f4d96698-0412-4923-9a10-03b174e0ca6c\") " pod="openshift-marketplace/marketplace-operator-79b997595-svmxm" Sep 30 12:24:13 crc kubenswrapper[4672]: I0930 12:24:13.120183 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-dlbv7"] Sep 30 12:24:13 crc kubenswrapper[4672]: I0930 12:24:13.133284 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sc65b\" (UniqueName: \"kubernetes.io/projected/4be28b87-7df0-4dc6-8e22-de8348686347-kube-api-access-sc65b\") pod \"machine-config-server-wzf9f\" (UID: \"4be28b87-7df0-4dc6-8e22-de8348686347\") " pod="openshift-machine-config-operator/machine-config-server-wzf9f" Sep 30 12:24:13 crc kubenswrapper[4672]: I0930 12:24:13.143791 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6hqz8\" (UniqueName: \"kubernetes.io/projected/62422e21-39c6-4772-8f59-33be3d16c368-kube-api-access-6hqz8\") pod \"control-plane-machine-set-operator-78cbb6b69f-dljhp\" (UID: \"62422e21-39c6-4772-8f59-33be3d16c368\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-dljhp" Sep 30 12:24:13 crc kubenswrapper[4672]: I0930 12:24:13.159495 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jdshz"] Sep 30 12:24:13 crc kubenswrapper[4672]: I0930 12:24:13.162590 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cdnn7\" (UniqueName: \"kubernetes.io/projected/82f58782-5e5b-41f9-be22-0d3f1ab4423d-kube-api-access-cdnn7\") pod \"ingress-operator-5b745b69d9-6t7qg\" (UID: \"82f58782-5e5b-41f9-be22-0d3f1ab4423d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6t7qg" Sep 30 12:24:13 crc kubenswrapper[4672]: I0930 12:24:13.164845 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-svmxm" Sep 30 12:24:13 crc kubenswrapper[4672]: I0930 12:24:13.181973 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-dljhp" Sep 30 12:24:13 crc kubenswrapper[4672]: I0930 12:24:13.186541 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kwjzj\" (UniqueName: \"kubernetes.io/projected/076cdbe0-669c-4655-b2d1-8967456a6e62-kube-api-access-kwjzj\") pod \"csi-hostpathplugin-r4gx9\" (UID: \"076cdbe0-669c-4655-b2d1-8967456a6e62\") " pod="hostpath-provisioner/csi-hostpathplugin-r4gx9" Sep 30 12:24:13 crc kubenswrapper[4672]: W0930 12:24:13.197386 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6a57383c_8b99_49a9_adb7_caccbd6b3c12.slice/crio-72b78db54e2883f8a8bf61504297bedd9fb706de541dcd9049fdf496878d4710 WatchSource:0}: Error finding container 72b78db54e2883f8a8bf61504297bedd9fb706de541dcd9049fdf496878d4710: Status 404 returned error can't find the container with id 72b78db54e2883f8a8bf61504297bedd9fb706de541dcd9049fdf496878d4710 Sep 30 12:24:13 crc kubenswrapper[4672]: I0930 12:24:13.206518 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mlrw8\" (UID: \"59e30e9f-8395-4abd-8adf-b964ac1cbb0b\") " pod="openshift-image-registry/image-registry-697d97f7c8-mlrw8" Sep 30 12:24:13 crc kubenswrapper[4672]: E0930 12:24:13.206936 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 12:24:13.706921116 +0000 UTC m=+144.976158762 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mlrw8" (UID: "59e30e9f-8395-4abd-8adf-b964ac1cbb0b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 12:24:13 crc kubenswrapper[4672]: I0930 12:24:13.207558 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8zl2b\" (UniqueName: \"kubernetes.io/projected/6592c35e-f6fa-4cc5-b099-4cf13dfaaf76-kube-api-access-8zl2b\") pod \"olm-operator-6b444d44fb-5n4bs\" (UID: \"6592c35e-f6fa-4cc5-b099-4cf13dfaaf76\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5n4bs" Sep 30 12:24:13 crc kubenswrapper[4672]: I0930 12:24:13.209840 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-rj74q" Sep 30 12:24:13 crc kubenswrapper[4672]: I0930 12:24:13.230088 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lqddq\" (UniqueName: \"kubernetes.io/projected/3e89761d-ea2a-4711-94fa-54bce852e7e3-kube-api-access-lqddq\") pod \"service-ca-operator-777779d784-9zz4h\" (UID: \"3e89761d-ea2a-4711-94fa-54bce852e7e3\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-9zz4h" Sep 30 12:24:13 crc kubenswrapper[4672]: I0930 12:24:13.236762 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-wzf9f" Sep 30 12:24:13 crc kubenswrapper[4672]: I0930 12:24:13.241742 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qlms7\" (UniqueName: \"kubernetes.io/projected/2f1f2c64-e609-4c7a-adf5-ec317fe052c9-kube-api-access-qlms7\") pod \"machine-config-controller-84d6567774-lnjdx\" (UID: \"2f1f2c64-e609-4c7a-adf5-ec317fe052c9\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-lnjdx" Sep 30 12:24:13 crc kubenswrapper[4672]: I0930 12:24:13.242425 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-bjdbw"] Sep 30 12:24:13 crc kubenswrapper[4672]: I0930 12:24:13.249059 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-7l9kk" event={"ID":"41a42386-34f1-47ec-85bf-4c81bd9228be","Type":"ContainerStarted","Data":"a3704d2fcd61bf11598dfb06803ef0c238ad90205b0ecc10c6dd135b8dbca143"} Sep 30 12:24:13 crc kubenswrapper[4672]: I0930 12:24:13.250661 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-vr2sj" event={"ID":"c4246346-4680-4d7d-a64f-262c987067fd","Type":"ContainerStarted","Data":"d8d99b8b49d97dd767d8832561143c95050798015b998800794c864a236c9610"} Sep 30 12:24:13 crc kubenswrapper[4672]: I0930 12:24:13.266174 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-r4gx9" Sep 30 12:24:13 crc kubenswrapper[4672]: I0930 12:24:13.266733 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vstdz\" (UniqueName: \"kubernetes.io/projected/29a5441c-2a0c-443f-a7ab-47f2abc81a1c-kube-api-access-vstdz\") pod \"dns-default-z2tt9\" (UID: \"29a5441c-2a0c-443f-a7ab-47f2abc81a1c\") " pod="openshift-dns/dns-default-z2tt9" Sep 30 12:24:13 crc kubenswrapper[4672]: I0930 12:24:13.276850 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-vs8kw"] Sep 30 12:24:13 crc kubenswrapper[4672]: I0930 12:24:13.279561 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-52jss" event={"ID":"b11da5bc-b91d-4a8e-8839-da0f3989618e","Type":"ContainerStarted","Data":"d71105f48951af21d6848547755379f3870a9c33516be1939aff701bc1ff8442"} Sep 30 12:24:13 crc kubenswrapper[4672]: I0930 12:24:13.291336 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e3a5bd27-f377-4531-8c55-29e2f2c4ccc8-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-cf7gg\" (UID: \"e3a5bd27-f377-4531-8c55-29e2f2c4ccc8\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-cf7gg" Sep 30 12:24:13 crc kubenswrapper[4672]: I0930 12:24:13.298963 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-x8stp" Sep 30 12:24:13 crc kubenswrapper[4672]: I0930 12:24:13.306758 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jdshz" event={"ID":"6a57383c-8b99-49a9-adb7-caccbd6b3c12","Type":"ContainerStarted","Data":"72b78db54e2883f8a8bf61504297bedd9fb706de541dcd9049fdf496878d4710"} Sep 30 12:24:13 crc kubenswrapper[4672]: I0930 12:24:13.307346 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 12:24:13 crc kubenswrapper[4672]: E0930 12:24:13.307584 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 12:24:13.807565181 +0000 UTC m=+145.076802827 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 12:24:13 crc kubenswrapper[4672]: I0930 12:24:13.308020 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mlrw8\" (UID: \"59e30e9f-8395-4abd-8adf-b964ac1cbb0b\") " pod="openshift-image-registry/image-registry-697d97f7c8-mlrw8" Sep 30 12:24:13 crc kubenswrapper[4672]: E0930 12:24:13.308457 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 12:24:13.808442997 +0000 UTC m=+145.077680643 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mlrw8" (UID: "59e30e9f-8395-4abd-8adf-b964ac1cbb0b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 12:24:13 crc kubenswrapper[4672]: I0930 12:24:13.317668 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6ed45e7d-66e8-4755-a92e-59140d8b6ac8-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-jwlqh\" (UID: \"6ed45e7d-66e8-4755-a92e-59140d8b6ac8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-jwlqh" Sep 30 12:24:13 crc kubenswrapper[4672]: I0930 12:24:13.325133 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-bj7dq" event={"ID":"afcb62e7-0ff7-4af1-bebc-f32bfc50d94a","Type":"ContainerStarted","Data":"825ccda2499fe911d2811644968861700c96eded5b4c6729a31a939e9b45d876"} Sep 30 12:24:13 crc kubenswrapper[4672]: I0930 12:24:13.325316 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/82f58782-5e5b-41f9-be22-0d3f1ab4423d-bound-sa-token\") pod \"ingress-operator-5b745b69d9-6t7qg\" (UID: \"82f58782-5e5b-41f9-be22-0d3f1ab4423d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6t7qg" Sep 30 12:24:13 crc kubenswrapper[4672]: I0930 12:24:13.329817 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-dlbv7" event={"ID":"7fb8bbc0-c63a-4ab5-b454-13682563fe31","Type":"ContainerStarted","Data":"f9daaebee909a8ff23c91a08dd47515b356023abea3a0484e78a0283272e7022"} Sep 30 12:24:13 crc kubenswrapper[4672]: I0930 12:24:13.331611 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gncd7" event={"ID":"04847bd1-7d49-41f9-be74-08033dd1212e","Type":"ContainerStarted","Data":"e377114d3ee8f6e41430e381d9f282364530295745ac9693585f54fb2ef618f9"} Sep 30 12:24:13 crc kubenswrapper[4672]: I0930 12:24:13.354139 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/040f9564-cad2-48c5-b4d6-69b1add18a78-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-8dqtc\" (UID: \"040f9564-cad2-48c5-b4d6-69b1add18a78\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8dqtc" Sep 30 12:24:13 crc kubenswrapper[4672]: W0930 12:24:13.354278 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod551a978e_b9ac_46c6_ad40_b2ed5b6121da.slice/crio-df3033a182f759f6b82d7597a5370ef0747ee54a237c7c12fdddacce281e2d33 WatchSource:0}: Error finding container df3033a182f759f6b82d7597a5370ef0747ee54a237c7c12fdddacce281e2d33: Status 404 returned error can't find the container with id df3033a182f759f6b82d7597a5370ef0747ee54a237c7c12fdddacce281e2d33 Sep 30 12:24:13 crc kubenswrapper[4672]: I0930 12:24:13.360428 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9fdnf\" (UniqueName: \"kubernetes.io/projected/82540b00-caaa-4453-876e-aa214f6f7d34-kube-api-access-9fdnf\") pod 
\"machine-config-operator-74547568cd-b6x4v\" (UID: \"82540b00-caaa-4453-876e-aa214f6f7d34\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-b6x4v" Sep 30 12:24:13 crc kubenswrapper[4672]: I0930 12:24:13.364585 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zgl84" event={"ID":"20530f49-28c3-4983-8301-cb4275d5a129","Type":"ContainerStarted","Data":"fb28ecfc96362e2bdd8f0bc9f5c6775b93f1094a1141daa52041338f23b9c961"} Sep 30 12:24:13 crc kubenswrapper[4672]: I0930 12:24:13.366515 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-m85j8" Sep 30 12:24:13 crc kubenswrapper[4672]: I0930 12:24:13.371032 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-675r8" Sep 30 12:24:13 crc kubenswrapper[4672]: I0930 12:24:13.374317 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6t7qg" Sep 30 12:24:13 crc kubenswrapper[4672]: I0930 12:24:13.375169 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-kxhzb" Sep 30 12:24:13 crc kubenswrapper[4672]: I0930 12:24:13.381036 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-b6x4v" Sep 30 12:24:13 crc kubenswrapper[4672]: I0930 12:24:13.396301 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-df4d4\" (UniqueName: \"kubernetes.io/projected/9869a0d1-03fd-4b72-9519-16c0ef0c4bd9-kube-api-access-df4d4\") pod \"multus-admission-controller-857f4d67dd-xzjdp\" (UID: \"9869a0d1-03fd-4b72-9519-16c0ef0c4bd9\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-xzjdp" Sep 30 12:24:13 crc kubenswrapper[4672]: I0930 12:24:13.406881 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8dqtc" Sep 30 12:24:13 crc kubenswrapper[4672]: I0930 12:24:13.408665 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 12:24:13 crc kubenswrapper[4672]: E0930 12:24:13.410810 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 12:24:13.910789731 +0000 UTC m=+145.180027377 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 12:24:13 crc kubenswrapper[4672]: I0930 12:24:13.414555 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-cf7gg" Sep 30 12:24:13 crc kubenswrapper[4672]: I0930 12:24:13.417052 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nsvkp\" (UniqueName: \"kubernetes.io/projected/f8a1fcab-b6bf-4f92-9c23-d8240afdc4a1-kube-api-access-nsvkp\") pod \"ingress-canary-dwqnn\" (UID: \"f8a1fcab-b6bf-4f92-9c23-d8240afdc4a1\") " pod="openshift-ingress-canary/ingress-canary-dwqnn" Sep 30 12:24:13 crc kubenswrapper[4672]: I0930 12:24:13.420846 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5n4bs" Sep 30 12:24:13 crc kubenswrapper[4672]: I0930 12:24:13.421467 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f7gw6\" (UniqueName: \"kubernetes.io/projected/3c1a3995-a406-4562-b62e-682408dd877e-kube-api-access-f7gw6\") pod \"catalog-operator-68c6474976-mp8wz\" (UID: \"3c1a3995-a406-4562-b62e-682408dd877e\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mp8wz" Sep 30 12:24:13 crc kubenswrapper[4672]: I0930 12:24:13.441336 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-xzjdp" Sep 30 12:24:13 crc kubenswrapper[4672]: I0930 12:24:13.449568 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ljmdf\" (UniqueName: \"kubernetes.io/projected/6c11c529-6073-4d85-b790-8dc781625217-kube-api-access-ljmdf\") pod \"service-ca-9c57cc56f-nb7b9\" (UID: \"6c11c529-6073-4d85-b790-8dc781625217\") " pod="openshift-service-ca/service-ca-9c57cc56f-nb7b9" Sep 30 12:24:13 crc kubenswrapper[4672]: I0930 12:24:13.457506 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-lnjdx" Sep 30 12:24:13 crc kubenswrapper[4672]: I0930 12:24:13.500500 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-nb7b9" Sep 30 12:24:13 crc kubenswrapper[4672]: I0930 12:24:13.509360 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-dwqnn" Sep 30 12:24:13 crc kubenswrapper[4672]: I0930 12:24:13.510550 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mlrw8\" (UID: \"59e30e9f-8395-4abd-8adf-b964ac1cbb0b\") " pod="openshift-image-registry/image-registry-697d97f7c8-mlrw8" Sep 30 12:24:13 crc kubenswrapper[4672]: E0930 12:24:13.510930 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 12:24:14.010914282 +0000 UTC m=+145.280151928 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mlrw8" (UID: "59e30e9f-8395-4abd-8adf-b964ac1cbb0b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 12:24:13 crc kubenswrapper[4672]: I0930 12:24:13.517531 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-9zz4h" Sep 30 12:24:13 crc kubenswrapper[4672]: I0930 12:24:13.519189 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s77x7\" (UniqueName: \"kubernetes.io/projected/6e9532cf-efc9-413a-b8f0-a9186a4c74c8-kube-api-access-s77x7\") pod \"package-server-manager-789f6589d5-84vwh\" (UID: \"6e9532cf-efc9-413a-b8f0-a9186a4c74c8\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-84vwh" Sep 30 12:24:13 crc kubenswrapper[4672]: I0930 12:24:13.541900 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-z2tt9" Sep 30 12:24:13 crc kubenswrapper[4672]: I0930 12:24:13.563631 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-psb4b\" (UniqueName: \"kubernetes.io/projected/3d92f349-809a-4fdc-9006-420e736d856d-kube-api-access-psb4b\") pod \"packageserver-d55dfcdfc-5wh44\" (UID: \"3d92f349-809a-4fdc-9006-420e736d856d\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5wh44" Sep 30 12:24:13 crc kubenswrapper[4672]: I0930 12:24:13.564410 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5gbdl\" (UniqueName: \"kubernetes.io/projected/cba13ebb-13fa-4059-8d32-295a92eb0cdb-kube-api-access-5gbdl\") pod \"migrator-59844c95c7-nq6b6\" (UID: \"cba13ebb-13fa-4059-8d32-295a92eb0cdb\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-nq6b6" Sep 30 12:24:13 crc kubenswrapper[4672]: I0930 12:24:13.586929 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4h5w6\" (UniqueName: \"kubernetes.io/projected/2acf05e1-f152-4432-b0b2-a44b242d0308-kube-api-access-4h5w6\") pod \"collect-profiles-29320575-tsgk2\" (UID: \"2acf05e1-f152-4432-b0b2-a44b242d0308\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320575-tsgk2" Sep 30 12:24:13 crc kubenswrapper[4672]: I0930 12:24:13.592980 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-tlgqg"] Sep 30 12:24:13 crc kubenswrapper[4672]: I0930 12:24:13.595138 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n8vlb\" (UniqueName: \"kubernetes.io/projected/6ed45e7d-66e8-4755-a92e-59140d8b6ac8-kube-api-access-n8vlb\") pod \"cluster-image-registry-operator-dc59b4c8b-jwlqh\" (UID: \"6ed45e7d-66e8-4755-a92e-59140d8b6ac8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-jwlqh" Sep 30 12:24:13 crc kubenswrapper[4672]: I0930 12:24:13.613329 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 12:24:13 crc kubenswrapper[4672]: E0930 12:24:13.613692 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 12:24:14.113678328 +0000 UTC m=+145.382915974 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 12:24:13 crc kubenswrapper[4672]: I0930 12:24:13.658609 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wfvzx"] Sep 30 12:24:13 crc kubenswrapper[4672]: I0930 12:24:13.685066 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-bqdd5"] Sep 30 12:24:13 crc kubenswrapper[4672]: I0930 12:24:13.689954 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-jwlqh" Sep 30 12:24:13 crc kubenswrapper[4672]: I0930 12:24:13.696814 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mp8wz" Sep 30 12:24:13 crc kubenswrapper[4672]: I0930 12:24:13.714800 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mlrw8\" (UID: \"59e30e9f-8395-4abd-8adf-b964ac1cbb0b\") " pod="openshift-image-registry/image-registry-697d97f7c8-mlrw8" Sep 30 12:24:13 crc kubenswrapper[4672]: E0930 12:24:13.715221 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 12:24:14.215202479 +0000 UTC m=+145.484440125 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mlrw8" (UID: "59e30e9f-8395-4abd-8adf-b964ac1cbb0b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 12:24:13 crc kubenswrapper[4672]: I0930 12:24:13.731925 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-nq6b6" Sep 30 12:24:13 crc kubenswrapper[4672]: I0930 12:24:13.755356 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-84vwh" Sep 30 12:24:13 crc kubenswrapper[4672]: I0930 12:24:13.786231 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5wh44" Sep 30 12:24:13 crc kubenswrapper[4672]: I0930 12:24:13.815672 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 12:24:13 crc kubenswrapper[4672]: E0930 12:24:13.816150 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 12:24:14.316135273 +0000 UTC m=+145.585372919 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 12:24:13 crc kubenswrapper[4672]: I0930 12:24:13.818727 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-dljhp"] Sep 30 12:24:13 crc kubenswrapper[4672]: I0930 12:24:13.826890 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320575-tsgk2" Sep 30 12:24:13 crc kubenswrapper[4672]: I0930 12:24:13.916929 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mlrw8\" (UID: \"59e30e9f-8395-4abd-8adf-b964ac1cbb0b\") " pod="openshift-image-registry/image-registry-697d97f7c8-mlrw8" Sep 30 12:24:13 crc kubenswrapper[4672]: E0930 12:24:13.917730 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 12:24:14.417717665 +0000 UTC m=+145.686955311 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mlrw8" (UID: "59e30e9f-8395-4abd-8adf-b964ac1cbb0b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 12:24:13 crc kubenswrapper[4672]: I0930 12:24:13.946955 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-svmxm"] Sep 30 12:24:13 crc kubenswrapper[4672]: I0930 12:24:13.964431 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-r4gx9"] Sep 30 12:24:14 crc kubenswrapper[4672]: I0930 12:24:14.016782 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-m85j8"] Sep 30 12:24:14 crc kubenswrapper[4672]: I0930 12:24:14.022011 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 12:24:14 crc kubenswrapper[4672]: E0930 12:24:14.022609 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 12:24:14.522591713 +0000 UTC m=+145.791829359 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 12:24:14 crc kubenswrapper[4672]: I0930 12:24:14.080568 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-675r8" podStartSLOduration=123.080549706 podStartE2EDuration="2m3.080549706s" podCreationTimestamp="2025-09-30 12:22:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 12:24:14.078397214 +0000 UTC m=+145.347634860" watchObservedRunningTime="2025-09-30 12:24:14.080549706 +0000 UTC m=+145.349787352" Sep 30 12:24:14 crc kubenswrapper[4672]: I0930 12:24:14.125018 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mlrw8\" (UID: \"59e30e9f-8395-4abd-8adf-b964ac1cbb0b\") " pod="openshift-image-registry/image-registry-697d97f7c8-mlrw8" Sep 30 12:24:14 crc kubenswrapper[4672]: E0930 12:24:14.125374 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 12:24:14.625345789 +0000 UTC m=+145.894583435 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mlrw8" (UID: "59e30e9f-8395-4abd-8adf-b964ac1cbb0b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 12:24:14 crc kubenswrapper[4672]: I0930 12:24:14.133117 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-ppzw6" podStartSLOduration=124.133099223 podStartE2EDuration="2m4.133099223s" podCreationTimestamp="2025-09-30 12:22:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 12:24:14.132221208 +0000 UTC m=+145.401458854" watchObservedRunningTime="2025-09-30 12:24:14.133099223 +0000 UTC m=+145.402336859" Sep 30 12:24:14 crc kubenswrapper[4672]: I0930 12:24:14.225681 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 12:24:14 crc kubenswrapper[4672]: E0930 12:24:14.226068 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 12:24:14.726052057 +0000 UTC m=+145.995289703 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 12:24:14 crc kubenswrapper[4672]: W0930 12:24:14.314952 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4d96698_0412_4923_9a10_03b174e0ca6c.slice/crio-4dafabaa7c11c84639fe41e7e6b9bb2b6e20a0ad04aad39f51af8f0390c9e2e3 WatchSource:0}: Error finding container 4dafabaa7c11c84639fe41e7e6b9bb2b6e20a0ad04aad39f51af8f0390c9e2e3: Status 404 returned error can't find the container with id 4dafabaa7c11c84639fe41e7e6b9bb2b6e20a0ad04aad39f51af8f0390c9e2e3 Sep 30 12:24:14 crc kubenswrapper[4672]: W0930 12:24:14.317043 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf65d4a91_5adb_4ab2_af99_757daad56bd4.slice/crio-928707d0bbf09c012f19fce25e3a3070b65bc6f49939980c35174094af263072 WatchSource:0}: Error finding container 928707d0bbf09c012f19fce25e3a3070b65bc6f49939980c35174094af263072: Status 404 returned error can't find the container with id 928707d0bbf09c012f19fce25e3a3070b65bc6f49939980c35174094af263072 Sep 30 12:24:14 crc kubenswrapper[4672]: I0930 12:24:14.326589 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mlrw8\" (UID: \"59e30e9f-8395-4abd-8adf-b964ac1cbb0b\") " pod="openshift-image-registry/image-registry-697d97f7c8-mlrw8" Sep 30 12:24:14 crc kubenswrapper[4672]: E0930 12:24:14.326968 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 12:24:14.82695544 +0000 UTC m=+146.096193086 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mlrw8" (UID: "59e30e9f-8395-4abd-8adf-b964ac1cbb0b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 12:24:14 crc kubenswrapper[4672]: I0930 12:24:14.405697 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-x8stp"] Sep 30 12:24:14 crc kubenswrapper[4672]: I0930 12:24:14.410981 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-rj74q"] Sep 30 12:24:14 crc kubenswrapper[4672]: I0930 12:24:14.429482 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 12:24:14 crc kubenswrapper[4672]: I0930 12:24:14.435732 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-wzf9f" event={"ID":"4be28b87-7df0-4dc6-8e22-de8348686347","Type":"ContainerStarted","Data":"3cba9fa12255abf9c6ce442d936eb430fa6828571d960547f3a43b3524092162"} Sep 30 12:24:14 crc kubenswrapper[4672]: I0930 12:24:14.435837 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-wzf9f" event={"ID":"4be28b87-7df0-4dc6-8e22-de8348686347","Type":"ContainerStarted","Data":"568ff2cb8f0844b129a436d098ec10a6fbe4feb27afa02f6d4ff9bc3359e8c8f"} Sep 30 12:24:14 crc kubenswrapper[4672]: E0930 12:24:14.442518 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 12:24:14.942488885 +0000 UTC m=+146.211726531 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 12:24:14 crc kubenswrapper[4672]: I0930 12:24:14.461343 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-dljhp" event={"ID":"62422e21-39c6-4772-8f59-33be3d16c368","Type":"ContainerStarted","Data":"22f36b922b9bec699040048355830d8026f44125e017e9f93c85155e1fd4d204"} Sep 30 12:24:14 crc kubenswrapper[4672]: I0930 12:24:14.471068 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-m85j8" event={"ID":"f65d4a91-5adb-4ab2-af99-757daad56bd4","Type":"ContainerStarted","Data":"928707d0bbf09c012f19fce25e3a3070b65bc6f49939980c35174094af263072"} Sep 30 12:24:14 crc kubenswrapper[4672]: I0930 12:24:14.472868 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-r4gx9" event={"ID":"076cdbe0-669c-4655-b2d1-8967456a6e62","Type":"ContainerStarted","Data":"0cb31c54f31826fd6b9827a58babd5e395630be499295903574da09c4493b1fd"} Sep 30 12:24:14 crc kubenswrapper[4672]: I0930 12:24:14.487077 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-cf7gg"] Sep 30 12:24:14 crc kubenswrapper[4672]: I0930 12:24:14.495075 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jdshz" event={"ID":"6a57383c-8b99-49a9-adb7-caccbd6b3c12","Type":"ContainerStarted","Data":"cd887776c967a5a662f2164d88847f1900913a325faf113f41afd8cd6f5fdd96"} Sep 30 12:24:14 crc kubenswrapper[4672]: I0930 12:24:14.537988 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mlrw8\" (UID: \"59e30e9f-8395-4abd-8adf-b964ac1cbb0b\") " pod="openshift-image-registry/image-registry-697d97f7c8-mlrw8" Sep 30 12:24:14 crc kubenswrapper[4672]: E0930 12:24:14.539381 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 12:24:15.039369142 +0000 UTC m=+146.308606788 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mlrw8" (UID: "59e30e9f-8395-4abd-8adf-b964ac1cbb0b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 12:24:14 crc kubenswrapper[4672]: I0930 12:24:14.554191 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-52jss" event={"ID":"b11da5bc-b91d-4a8e-8839-da0f3989618e","Type":"ContainerStarted","Data":"63757a0ac6983408b5e68132053a0e1fc5282e94a9d14cc202d1cab3955678a2"}
Sep 30 12:24:14 crc kubenswrapper[4672]: I0930 12:24:14.584280 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-vs8kw" event={"ID":"551a978e-b9ac-46c6-ad40-b2ed5b6121da","Type":"ContainerStarted","Data":"df3033a182f759f6b82d7597a5370ef0747ee54a237c7c12fdddacce281e2d33"}
Sep 30 12:24:14 crc kubenswrapper[4672]: I0930 12:24:14.585655 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wfvzx" event={"ID":"b1bca969-46a3-462a-b129-508c3f6752a0","Type":"ContainerStarted","Data":"b6832897c8a7af8236e08b0aa05fb69d14b70cdaef1f50d23e803e7d6acc150d"}
Sep 30 12:24:14 crc kubenswrapper[4672]: I0930 12:24:14.586531 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-7l9kk" event={"ID":"41a42386-34f1-47ec-85bf-4c81bd9228be","Type":"ContainerStarted","Data":"8e68146f1155a1036932703483ded2536f820c563a28b1fdb0f9fed4c55bf17e"}
Sep 30 12:24:14 crc kubenswrapper[4672]: I0930 12:24:14.592765 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-svmxm" event={"ID":"f4d96698-0412-4923-9a10-03b174e0ca6c","Type":"ContainerStarted","Data":"4dafabaa7c11c84639fe41e7e6b9bb2b6e20a0ad04aad39f51af8f0390c9e2e3"}
Sep 30 12:24:14 crc kubenswrapper[4672]: I0930 12:24:14.624116 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-bqdd5" event={"ID":"b9bd590e-1ef9-47f4-874d-c5324309ebfd","Type":"ContainerStarted","Data":"c0e1c3a99f79af91b123fd7e5f38d79b22f5e89d1916937e16b9d9a43ee704a1"}
Sep 30 12:24:14 crc kubenswrapper[4672]: I0930 12:24:14.652765 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Sep 30 12:24:14 crc kubenswrapper[4672]: E0930 12:24:14.653085 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 12:24:15.153070384 +0000 UTC m=+146.422308030 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 12:24:14 crc kubenswrapper[4672]: I0930 12:24:14.657631 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-dlbv7" event={"ID":"7fb8bbc0-c63a-4ab5-b454-13682563fe31","Type":"ContainerStarted","Data":"f6c2917e8ee905225ec0b1018e34391edbd4e63d8429f8d31acbecf0f693af79"}
Sep 30 12:24:14 crc kubenswrapper[4672]: I0930 12:24:14.657995 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-dlbv7"
Sep 30 12:24:14 crc kubenswrapper[4672]: I0930 12:24:14.714784 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gncd7" event={"ID":"04847bd1-7d49-41f9-be74-08033dd1212e","Type":"ContainerStarted","Data":"8dea8e6a353d73124b5b5019ab24277f2f667bdf81d793521cf294b2905a345a"}
Sep 30 12:24:14 crc kubenswrapper[4672]: I0930 12:24:14.718373 4672 patch_prober.go:28] interesting pod/downloads-7954f5f757-dlbv7 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.23:8080/\": dial tcp 10.217.0.23:8080: connect: connection refused" start-of-body=
Sep 30 12:24:14 crc kubenswrapper[4672]: I0930 12:24:14.718429 4672 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-dlbv7" podUID="7fb8bbc0-c63a-4ab5-b454-13682563fe31" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.23:8080/\": dial tcp 10.217.0.23:8080: connect: connection refused"
Sep 30 12:24:14 crc kubenswrapper[4672]: I0930 12:24:14.728655 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zgl84" podStartSLOduration=123.728641676 podStartE2EDuration="2m3.728641676s" podCreationTimestamp="2025-09-30 12:22:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 12:24:14.726738301 +0000 UTC m=+145.995975947" watchObservedRunningTime="2025-09-30 12:24:14.728641676 +0000 UTC m=+145.997879322"
Sep 30 12:24:14 crc kubenswrapper[4672]: I0930 12:24:14.735379 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-tlgqg" event={"ID":"87529aa1-f650-43ab-a3c2-a41f444c71d1","Type":"ContainerStarted","Data":"907cbba0af9e90344c9879a4e4635dfdfff82b54f2eb967e51d30aa1535ad0cc"}
Sep 30 12:24:14 crc kubenswrapper[4672]: I0930 12:24:14.742584 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-bjdbw" event={"ID":"4312b92a-c221-4cf3-948b-096a90dd7846","Type":"ContainerStarted","Data":"d2180508a22a3ed1a263acc779510e4146e8726fcc1bd29cda74d04ce1e8d4b6"}
Sep 30 12:24:14 crc kubenswrapper[4672]: I0930 12:24:14.742646 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-bjdbw" event={"ID":"4312b92a-c221-4cf3-948b-096a90dd7846","Type":"ContainerStarted","Data":"9235a046da8148ae4ef62dcdb30cc91e72c2f7ae9784237b2a7572fe87dd217e"}
Sep 30 12:24:14 crc kubenswrapper[4672]: I0930 12:24:14.743804 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-bjdbw"
Sep 30 12:24:14 crc kubenswrapper[4672]: I0930 12:24:14.755050 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mlrw8\" (UID: \"59e30e9f-8395-4abd-8adf-b964ac1cbb0b\") " pod="openshift-image-registry/image-registry-697d97f7c8-mlrw8"
Sep 30 12:24:14 crc kubenswrapper[4672]: E0930 12:24:14.756369 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 12:24:15.256357526 +0000 UTC m=+146.525595172 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mlrw8" (UID: "59e30e9f-8395-4abd-8adf-b964ac1cbb0b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 12:24:14 crc kubenswrapper[4672]: I0930 12:24:14.757755 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-52jss"
Sep 30 12:24:14 crc kubenswrapper[4672]: I0930 12:24:14.770579 4672 patch_prober.go:28] interesting pod/console-operator-58897d9998-bjdbw container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.13:8443/readyz\": dial tcp 10.217.0.13:8443: connect: connection refused" start-of-body=
Sep 30 12:24:14 crc kubenswrapper[4672]: I0930 12:24:14.770640 4672 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-bjdbw" podUID="4312b92a-c221-4cf3-948b-096a90dd7846" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.13:8443/readyz\": dial tcp 10.217.0.13:8443: connect: connection refused"
Sep 30 12:24:14 crc kubenswrapper[4672]: I0930 12:24:14.773104 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-kxhzb" podStartSLOduration=124.773095179 podStartE2EDuration="2m4.773095179s" podCreationTimestamp="2025-09-30 12:22:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 12:24:14.772355347 +0000 UTC m=+146.041592993" watchObservedRunningTime="2025-09-30 12:24:14.773095179 +0000 UTC m=+146.042332825"
Sep 30 12:24:14 crc kubenswrapper[4672]: I0930 12:24:14.778791 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-bj7dq" event={"ID":"afcb62e7-0ff7-4af1-bebc-f32bfc50d94a","Type":"ContainerStarted","Data":"4e9b2e2ad7dba7f4942823fe788ebaee373ab44d131ddb2a490ba4e379aa4650"}
Sep 30 12:24:14 crc kubenswrapper[4672]: I0930 12:24:14.795976 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-vr2sj" event={"ID":"c4246346-4680-4d7d-a64f-262c987067fd","Type":"ContainerStarted","Data":"3b5a9ec5c95dd1198c5d214c6c94ac0584b1d536bb01ca0380aa9063e9f6a3e0"}
Sep 30 12:24:14 crc kubenswrapper[4672]: I0930 12:24:14.827032 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-qrbqf" podStartSLOduration=123.827014215 podStartE2EDuration="2m3.827014215s" podCreationTimestamp="2025-09-30 12:22:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 12:24:14.825369288 +0000 UTC m=+146.094606944" watchObservedRunningTime="2025-09-30 12:24:14.827014215 +0000 UTC m=+146.096251861"
Sep 30 12:24:14 crc kubenswrapper[4672]: I0930 12:24:14.856530 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Sep 30 12:24:14 crc kubenswrapper[4672]: E0930 12:24:14.857041 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 12:24:15.357025612 +0000 UTC m=+146.626263258 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 12:24:14 crc kubenswrapper[4672]: I0930 12:24:14.857220 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mlrw8\" (UID: \"59e30e9f-8395-4abd-8adf-b964ac1cbb0b\") " pod="openshift-image-registry/image-registry-697d97f7c8-mlrw8"
Sep 30 12:24:14 crc kubenswrapper[4672]: E0930 12:24:14.858487 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 12:24:15.358478894 +0000 UTC m=+146.627716540 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mlrw8" (UID: "59e30e9f-8395-4abd-8adf-b964ac1cbb0b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 12:24:14 crc kubenswrapper[4672]: I0930 12:24:14.959378 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Sep 30 12:24:14 crc kubenswrapper[4672]: E0930 12:24:14.959831 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 12:24:15.459801579 +0000 UTC m=+146.729039225 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 12:24:15 crc kubenswrapper[4672]: I0930 12:24:15.062768 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mlrw8\" (UID: \"59e30e9f-8395-4abd-8adf-b964ac1cbb0b\") " pod="openshift-image-registry/image-registry-697d97f7c8-mlrw8"
Sep 30 12:24:15 crc kubenswrapper[4672]: E0930 12:24:15.063610 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 12:24:15.563590125 +0000 UTC m=+146.832827771 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mlrw8" (UID: "59e30e9f-8395-4abd-8adf-b964ac1cbb0b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 12:24:15 crc kubenswrapper[4672]: I0930 12:24:15.168469 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Sep 30 12:24:15 crc kubenswrapper[4672]: E0930 12:24:15.168881 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 12:24:15.668859744 +0000 UTC m=+146.938097390 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 12:24:15 crc kubenswrapper[4672]: I0930 12:24:15.169157 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mlrw8\" (UID: \"59e30e9f-8395-4abd-8adf-b964ac1cbb0b\") " pod="openshift-image-registry/image-registry-697d97f7c8-mlrw8"
Sep 30 12:24:15 crc kubenswrapper[4672]: E0930 12:24:15.169579 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 12:24:15.669566914 +0000 UTC m=+146.938804560 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mlrw8" (UID: "59e30e9f-8395-4abd-8adf-b964ac1cbb0b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 12:24:15 crc kubenswrapper[4672]: I0930 12:24:15.212422 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-bjdbw" podStartSLOduration=125.212407501 podStartE2EDuration="2m5.212407501s" podCreationTimestamp="2025-09-30 12:22:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 12:24:15.2102802 +0000 UTC m=+146.479517846" watchObservedRunningTime="2025-09-30 12:24:15.212407501 +0000 UTC m=+146.481645137"
Sep 30 12:24:15 crc kubenswrapper[4672]: I0930 12:24:15.272846 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Sep 30 12:24:15 crc kubenswrapper[4672]: E0930 12:24:15.273088 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 12:24:15.773074713 +0000 UTC m=+147.042312359 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 12:24:15 crc kubenswrapper[4672]: I0930 12:24:15.279996 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gncd7" podStartSLOduration=125.279979572 podStartE2EDuration="2m5.279979572s" podCreationTimestamp="2025-09-30 12:22:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 12:24:15.241349107 +0000 UTC m=+146.510586753" watchObservedRunningTime="2025-09-30 12:24:15.279979572 +0000 UTC m=+146.549217218"
Sep 30 12:24:15 crc kubenswrapper[4672]: I0930 12:24:15.319612 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-bj7dq" podStartSLOduration=125.319595945 podStartE2EDuration="2m5.319595945s" podCreationTimestamp="2025-09-30 12:22:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 12:24:15.283121823 +0000 UTC m=+146.552359469" watchObservedRunningTime="2025-09-30 12:24:15.319595945 +0000 UTC m=+146.588833591"
Sep 30 12:24:15 crc kubenswrapper[4672]: I0930 12:24:15.355371 4672 patch_prober.go:28] interesting pod/router-default-5444994796-52jss container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Sep 30 12:24:15 crc kubenswrapper[4672]: [-]has-synced failed: reason withheld
Sep 30 12:24:15 crc kubenswrapper[4672]: [+]process-running ok
Sep 30 12:24:15 crc kubenswrapper[4672]: healthz check failed
Sep 30 12:24:15 crc kubenswrapper[4672]: I0930 12:24:15.355487 4672 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-52jss" podUID="b11da5bc-b91d-4a8e-8839-da0f3989618e" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Sep 30 12:24:15 crc kubenswrapper[4672]: I0930 12:24:15.376884 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mlrw8\" (UID: \"59e30e9f-8395-4abd-8adf-b964ac1cbb0b\") " pod="openshift-image-registry/image-registry-697d97f7c8-mlrw8"
Sep 30 12:24:15 crc kubenswrapper[4672]: E0930 12:24:15.377335 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 12:24:15.877314032 +0000 UTC m=+147.146551678 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mlrw8" (UID: "59e30e9f-8395-4abd-8adf-b964ac1cbb0b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
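Editor's note (annotation, not journal output): the block above shows the kubelet's volume reconciler alternating between MountDevice attempts for the incoming image-registry pod and TearDown attempts for a departed pod (UID 8f668bae-...), both against the same PVC, roughly every 100ms in this excerpt, with each failure arming the fixed backoff the log itself reports ("durationBeforeRetry 500ms"). Both operations fail for the same root cause: no CSI plugin named kubevirt.io.hostpath-provisioner has registered with this kubelet yet. Below is a minimal Python sketch for summarizing this churn from a saved excerpt; the file name kubelet.log is hypothetical, and it assumes one journal record per line as reflowed here.

import re
from collections import Counter

# Matches the klog timestamp plus the reconciler's "operation started" records,
# e.g. "I0930 12:24:14.652765 ... operationExecutor.UnmountVolume started".
REC = re.compile(r'[IEW]\d{4} (\d{2}:\d{2}:\d{2}\.\d+) .*operationExecutor\.(MountVolume|UnmountVolume) started')
VOL = re.compile(r'pvc-[0-9a-f]{8}-[0-9a-f-]+')

counts, first, last = Counter(), {}, {}
with open("kubelet.log") as fh:  # hypothetical path to this journal excerpt
    for line in fh:
        rec, vol = REC.search(line), VOL.search(line)
        if not (rec and vol):
            continue
        ts, op = rec.groups()
        key = (op, vol.group())
        counts[key] += 1            # one reconciler attempt
        first.setdefault(key, ts)   # first time this (op, volume) appeared
        last[key] = ts              # most recent attempt

for (op, vol), n in counts.most_common():
    print(f"{op:<13} {vol}: {n} attempts, {first[op, vol]} .. {last[op, vol]}")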
Sep 30 12:24:15 crc kubenswrapper[4672]: I0930 12:24:15.376841 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jdshz" podStartSLOduration=125.376812867 podStartE2EDuration="2m5.376812867s" podCreationTimestamp="2025-09-30 12:22:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 12:24:15.320545453 +0000 UTC m=+146.589783099" watchObservedRunningTime="2025-09-30 12:24:15.376812867 +0000 UTC m=+146.646050513"
Sep 30 12:24:15 crc kubenswrapper[4672]: I0930 12:24:15.377900 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-wzf9f" podStartSLOduration=5.377884518 podStartE2EDuration="5.377884518s" podCreationTimestamp="2025-09-30 12:24:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 12:24:15.375446568 +0000 UTC m=+146.644684234" watchObservedRunningTime="2025-09-30 12:24:15.377884518 +0000 UTC m=+146.647122164"
Sep 30 12:24:15 crc kubenswrapper[4672]: I0930 12:24:15.452490 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-dlbv7" podStartSLOduration=125.452473841 podStartE2EDuration="2m5.452473841s" podCreationTimestamp="2025-09-30 12:22:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 12:24:15.443743439 +0000 UTC m=+146.712981095" watchObservedRunningTime="2025-09-30 12:24:15.452473841 +0000 UTC m=+146.721711487"
Sep 30 12:24:15 crc kubenswrapper[4672]: I0930 12:24:15.478621 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Sep 30 12:24:15 crc kubenswrapper[4672]: E0930 12:24:15.479034 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 12:24:15.979008978 +0000 UTC m=+147.248246624 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 12:24:15 crc kubenswrapper[4672]: I0930 12:24:15.503705 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-52jss" podStartSLOduration=125.50368821 podStartE2EDuration="2m5.50368821s" podCreationTimestamp="2025-09-30 12:22:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 12:24:15.502499346 +0000 UTC m=+146.771736992" watchObservedRunningTime="2025-09-30 12:24:15.50368821 +0000 UTC m=+146.772925856"
Sep 30 12:24:15 crc kubenswrapper[4672]: I0930 12:24:15.582170 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mlrw8\" (UID: \"59e30e9f-8395-4abd-8adf-b964ac1cbb0b\") " pod="openshift-image-registry/image-registry-697d97f7c8-mlrw8"
Sep 30 12:24:15 crc kubenswrapper[4672]: E0930 12:24:15.582548 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 12:24:16.082535356 +0000 UTC m=+147.351773002 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mlrw8" (UID: "59e30e9f-8395-4abd-8adf-b964ac1cbb0b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 12:24:15 crc kubenswrapper[4672]: I0930 12:24:15.686146 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Sep 30 12:24:15 crc kubenswrapper[4672]: E0930 12:24:15.686584 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 12:24:16.186569139 +0000 UTC m=+147.455806785 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 12:24:15 crc kubenswrapper[4672]: I0930 12:24:15.788990 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mlrw8\" (UID: \"59e30e9f-8395-4abd-8adf-b964ac1cbb0b\") " pod="openshift-image-registry/image-registry-697d97f7c8-mlrw8"
Sep 30 12:24:15 crc kubenswrapper[4672]: E0930 12:24:15.789768 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 12:24:16.289755707 +0000 UTC m=+147.558993353 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mlrw8" (UID: "59e30e9f-8395-4abd-8adf-b964ac1cbb0b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 12:24:15 crc kubenswrapper[4672]: I0930 12:24:15.800092 4672 patch_prober.go:28] interesting pod/router-default-5444994796-52jss container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Sep 30 12:24:15 crc kubenswrapper[4672]: [-]has-synced failed: reason withheld
Sep 30 12:24:15 crc kubenswrapper[4672]: [+]process-running ok
Sep 30 12:24:15 crc kubenswrapper[4672]: healthz check failed
Sep 30 12:24:15 crc kubenswrapper[4672]: I0930 12:24:15.800373 4672 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-52jss" podUID="b11da5bc-b91d-4a8e-8839-da0f3989618e" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Sep 30 12:24:15 crc kubenswrapper[4672]: I0930 12:24:15.825468 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gncd7" event={"ID":"04847bd1-7d49-41f9-be74-08033dd1212e","Type":"ContainerStarted","Data":"be0bfa54bc8e509acb70d1e45cc1257057fb15a74e1183e008e883aa00a16d88"}
Sep 30 12:24:15 crc kubenswrapper[4672]: I0930 12:24:15.828225 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-tlgqg" event={"ID":"87529aa1-f650-43ab-a3c2-a41f444c71d1","Type":"ContainerStarted","Data":"a54ae17717b24492764ea9f74e97a2f2e4b0f3e5480235fd3e46cbeceea738be"}
Sep 30 12:24:15 crc kubenswrapper[4672]: I0930 12:24:15.841184 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-x8stp" event={"ID":"6eecbbdb-82a8-4b0d-860d-f6c3f4152a04","Type":"ContainerStarted","Data":"989fb02e2f74f13b0c46e80355643bca415644562e39e30c9bf47f4554bfeb45"}
Sep 30 12:24:15 crc kubenswrapper[4672]: I0930 12:24:15.857456 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-vr2sj" event={"ID":"c4246346-4680-4d7d-a64f-262c987067fd","Type":"ContainerStarted","Data":"f25f03a5f06a6c354f4e2e67bc68bdaf699f5d6a7553b49d876b1bad00621bee"}
Sep 30 12:24:15 crc kubenswrapper[4672]: I0930 12:24:15.867723 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-vs8kw" event={"ID":"551a978e-b9ac-46c6-ad40-b2ed5b6121da","Type":"ContainerStarted","Data":"53613a217d27e173b41b9b6f8cabf79cce9bfafd2cc0edc75a59c34679ab1350"}
Sep 30 12:24:15 crc kubenswrapper[4672]: I0930 12:24:15.868329 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-vs8kw"
Sep 30 12:24:15 crc kubenswrapper[4672]: I0930 12:24:15.876327 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-tlgqg" podStartSLOduration=124.876310106 podStartE2EDuration="2m4.876310106s" podCreationTimestamp="2025-09-30 12:22:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 12:24:15.873873836 +0000 UTC m=+147.143111472" watchObservedRunningTime="2025-09-30 12:24:15.876310106 +0000 UTC m=+147.145547752"
Sep 30 12:24:15 crc kubenswrapper[4672]: I0930 12:24:15.884431 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-rj74q" event={"ID":"b5d17b1a-138e-4832-a374-8cdd0c028821","Type":"ContainerStarted","Data":"69e78873f69059ac715f1c48660b0302ebbb722823a6337ad6b937e53268945a"}
Sep 30 12:24:15 crc kubenswrapper[4672]: I0930 12:24:15.898147 4672 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-vs8kw container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.11:6443/healthz\": dial tcp 10.217.0.11:6443: connect: connection refused" start-of-body=
Sep 30 12:24:15 crc kubenswrapper[4672]: I0930 12:24:15.898199 4672 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-vs8kw" podUID="551a978e-b9ac-46c6-ad40-b2ed5b6121da" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.11:6443/healthz\": dial tcp 10.217.0.11:6443: connect: connection refused"
Sep 30 12:24:15 crc kubenswrapper[4672]: I0930 12:24:15.898641 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Sep 30 12:24:15 crc kubenswrapper[4672]: E0930 12:24:15.900361 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 12:24:16.40034546 +0000 UTC m=+147.669583106 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 12:24:15 crc kubenswrapper[4672]: I0930 12:24:15.904969 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-cf7gg" event={"ID":"e3a5bd27-f377-4531-8c55-29e2f2c4ccc8","Type":"ContainerStarted","Data":"23afb6b5fe63fe2774a1ae5650634973ba5d8278981224c9082b3c217ce83cc9"}
Sep 30 12:24:15 crc kubenswrapper[4672]: I0930 12:24:15.911361 4672 patch_prober.go:28] interesting pod/downloads-7954f5f757-dlbv7 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.23:8080/\": dial tcp 10.217.0.23:8080: connect: connection refused" start-of-body=
Sep 30 12:24:15 crc kubenswrapper[4672]: I0930 12:24:15.911407 4672 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-dlbv7" podUID="7fb8bbc0-c63a-4ab5-b454-13682563fe31" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.23:8080/\": dial tcp 10.217.0.23:8080: connect: connection refused"
Sep 30 12:24:15 crc kubenswrapper[4672]: I0930 12:24:15.914377 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-vs8kw" podStartSLOduration=125.914363015 podStartE2EDuration="2m5.914363015s" podCreationTimestamp="2025-09-30 12:22:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 12:24:15.912353827 +0000 UTC m=+147.181591483" watchObservedRunningTime="2025-09-30 12:24:15.914363015 +0000 UTC m=+147.183600661"
Sep 30 12:24:15 crc kubenswrapper[4672]: I0930 12:24:15.973386 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-vr2sj" podStartSLOduration=125.973358318 podStartE2EDuration="2m5.973358318s" podCreationTimestamp="2025-09-30 12:22:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 12:24:15.970623769 +0000 UTC m=+147.239861415" watchObservedRunningTime="2025-09-30 12:24:15.973358318 +0000 UTC m=+147.242595964"
Sep 30 12:24:16 crc kubenswrapper[4672]: I0930 12:24:16.001084 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mlrw8\" (UID: \"59e30e9f-8395-4abd-8adf-b964ac1cbb0b\") " pod="openshift-image-registry/image-registry-697d97f7c8-mlrw8"
Sep 30 12:24:16 crc kubenswrapper[4672]: E0930 12:24:16.003955 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 12:24:16.50393574 +0000 UTC m=+147.773173386 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mlrw8" (UID: "59e30e9f-8395-4abd-8adf-b964ac1cbb0b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 12:24:16 crc kubenswrapper[4672]: I0930 12:24:16.046438 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-xzjdp"]
Sep 30 12:24:16 crc kubenswrapper[4672]: I0930 12:24:16.106385 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Sep 30 12:24:16 crc kubenswrapper[4672]: E0930 12:24:16.106548 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 12:24:16.606514302 +0000 UTC m=+147.875751948 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 12:24:16 crc kubenswrapper[4672]: I0930 12:24:16.106868 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mlrw8\" (UID: \"59e30e9f-8395-4abd-8adf-b964ac1cbb0b\") " pod="openshift-image-registry/image-registry-697d97f7c8-mlrw8"
Sep 30 12:24:16 crc kubenswrapper[4672]: E0930 12:24:16.121876 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 12:24:16.621827424 +0000 UTC m=+147.891065070 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mlrw8" (UID: "59e30e9f-8395-4abd-8adf-b964ac1cbb0b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 12:24:16 crc kubenswrapper[4672]: I0930 12:24:16.182938 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-b6x4v"]
Sep 30 12:24:16 crc kubenswrapper[4672]: I0930 12:24:16.209116 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Sep 30 12:24:16 crc kubenswrapper[4672]: E0930 12:24:16.209489 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 12:24:16.709473714 +0000 UTC m=+147.978711360 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 12:24:16 crc kubenswrapper[4672]: I0930 12:24:16.267378 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-6t7qg"]
Sep 30 12:24:16 crc kubenswrapper[4672]: I0930 12:24:16.288739 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mp8wz"]
Sep 30 12:24:16 crc kubenswrapper[4672]: I0930 12:24:16.310757 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mlrw8\" (UID: \"59e30e9f-8395-4abd-8adf-b964ac1cbb0b\") " pod="openshift-image-registry/image-registry-697d97f7c8-mlrw8"
Sep 30 12:24:16 crc kubenswrapper[4672]: E0930 12:24:16.311085 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 12:24:16.811073867 +0000 UTC m=+148.080311513 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mlrw8" (UID: "59e30e9f-8395-4abd-8adf-b964ac1cbb0b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 12:24:16 crc kubenswrapper[4672]: I0930 12:24:16.324116 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5n4bs"]
Sep 30 12:24:16 crc kubenswrapper[4672]: I0930 12:24:16.355437 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-dwqnn"]
Sep 30 12:24:16 crc kubenswrapper[4672]: I0930 12:24:16.365580 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-9zz4h"]
Sep 30 12:24:16 crc kubenswrapper[4672]: I0930 12:24:16.376325 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zgl84"
Sep 30 12:24:16 crc kubenswrapper[4672]: I0930 12:24:16.376364 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zgl84"
Sep 30 12:24:16 crc kubenswrapper[4672]: I0930 12:24:16.391879 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-lnjdx"]
Sep 30 12:24:16 crc kubenswrapper[4672]: I0930 12:24:16.395323 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-bj7dq"
Sep 30 12:24:16 crc kubenswrapper[4672]: I0930 12:24:16.395402 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-bj7dq"
Sep 30 12:24:16 crc kubenswrapper[4672]: I0930 12:24:16.409704 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zgl84"
Sep 30 12:24:16 crc kubenswrapper[4672]: I0930 12:24:16.412083 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Sep 30 12:24:16 crc kubenswrapper[4672]: E0930 12:24:16.412480 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 12:24:16.912463614 +0000 UTC m=+148.181701260 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
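Editor's note (annotation, not journal output): interleaved with the volume churn, this stretch records probe traffic: prober.go:107 "Probe failed" records for the router (startup) and for the downloads, console-operator and oauth-openshift pods (readiness), plus kubelet.go:2542 "SyncLoop (probe)" status transitions. A small sketch, under the same assumptions as above, tallying failures per pod and probe type:

import re
from collections import Counter

# Matches prober.go:107 records, e.g.
# '"Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-dlbv7"'.
PROBE = re.compile(r'"Probe failed" probeType="(\w+)" pod="([^"]+)"')

fails = Counter()
with open("kubelet.log") as fh:  # hypothetical path to this journal excerpt
    for line in fh:
        m = PROBE.search(line)
        if m:
            fails[m.groups()] += 1  # key is (probe type, namespace/pod)

for (ptype, pod), n in fails.most_common():
    print(f"{ptype:<9} {pod}: {n} failure(s)")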
Sep 30 12:24:16 crc kubenswrapper[4672]: I0930 12:24:16.427929 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-bjdbw"
Sep 30 12:24:16 crc kubenswrapper[4672]: I0930 12:24:16.514324 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mlrw8\" (UID: \"59e30e9f-8395-4abd-8adf-b964ac1cbb0b\") " pod="openshift-image-registry/image-registry-697d97f7c8-mlrw8"
Sep 30 12:24:16 crc kubenswrapper[4672]: E0930 12:24:16.515833 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 12:24:17.015820538 +0000 UTC m=+148.285058184 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mlrw8" (UID: "59e30e9f-8395-4abd-8adf-b964ac1cbb0b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 12:24:16 crc kubenswrapper[4672]: W0930 12:24:16.553680 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2f1f2c64_e609_4c7a_adf5_ec317fe052c9.slice/crio-6808a227006d0cbbb5f3df0a9e47f6b8457d1e4dcc1dc67c8f0ddfc4e0fafebb WatchSource:0}: Error finding container 6808a227006d0cbbb5f3df0a9e47f6b8457d1e4dcc1dc67c8f0ddfc4e0fafebb: Status 404 returned error can't find the container with id 6808a227006d0cbbb5f3df0a9e47f6b8457d1e4dcc1dc67c8f0ddfc4e0fafebb
Sep 30 12:24:16 crc kubenswrapper[4672]: I0930 12:24:16.600301 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8dqtc"]
Sep 30 12:24:16 crc kubenswrapper[4672]: I0930 12:24:16.615230 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Sep 30 12:24:16 crc kubenswrapper[4672]: E0930 12:24:16.615414 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 12:24:17.115385812 +0000 UTC m=+148.384623468 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 12:24:16 crc kubenswrapper[4672]: I0930 12:24:16.615576 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mlrw8\" (UID: \"59e30e9f-8395-4abd-8adf-b964ac1cbb0b\") " pod="openshift-image-registry/image-registry-697d97f7c8-mlrw8"
Sep 30 12:24:16 crc kubenswrapper[4672]: E0930 12:24:16.615973 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 12:24:17.115961669 +0000 UTC m=+148.385199315 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mlrw8" (UID: "59e30e9f-8395-4abd-8adf-b964ac1cbb0b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 12:24:16 crc kubenswrapper[4672]: I0930 12:24:16.629344 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-nb7b9"]
Sep 30 12:24:16 crc kubenswrapper[4672]: I0930 12:24:16.723438 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Sep 30 12:24:16 crc kubenswrapper[4672]: E0930 12:24:16.724056 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 12:24:17.224031748 +0000 UTC m=+148.493269394 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 12:24:16 crc kubenswrapper[4672]: I0930 12:24:16.725425 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-z2tt9"]
Sep 30 12:24:16 crc kubenswrapper[4672]: W0930 12:24:16.742959 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod040f9564_cad2_48c5_b4d6_69b1add18a78.slice/crio-80699aef4d33abdde317a1a17f76324be74eae2c479f1527b45f731d6139c7ae WatchSource:0}: Error finding container 80699aef4d33abdde317a1a17f76324be74eae2c479f1527b45f731d6139c7ae: Status 404 returned error can't find the container with id 80699aef4d33abdde317a1a17f76324be74eae2c479f1527b45f731d6139c7ae
Sep 30 12:24:16 crc kubenswrapper[4672]: I0930 12:24:16.762513 4672 patch_prober.go:28] interesting pod/router-default-5444994796-52jss container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Sep 30 12:24:16 crc kubenswrapper[4672]: [-]has-synced failed: reason withheld
Sep 30 12:24:16 crc kubenswrapper[4672]: [+]process-running ok
Sep 30 12:24:16 crc kubenswrapper[4672]: healthz check failed
Sep 30 12:24:16 crc kubenswrapper[4672]: I0930 12:24:16.762573 4672 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-52jss" podUID="b11da5bc-b91d-4a8e-8839-da0f3989618e" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Sep 30 12:24:16 crc kubenswrapper[4672]: I0930 12:24:16.783741 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320575-tsgk2"]
Sep 30 12:24:16 crc kubenswrapper[4672]: I0930 12:24:16.791106 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5wh44"]
Sep 30 12:24:16 crc kubenswrapper[4672]: I0930 12:24:16.825091 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mlrw8\" (UID: \"59e30e9f-8395-4abd-8adf-b964ac1cbb0b\") " pod="openshift-image-registry/image-registry-697d97f7c8-mlrw8"
Sep 30 12:24:16 crc kubenswrapper[4672]: E0930 12:24:16.825453 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 12:24:17.325441446 +0000 UTC m=+148.594679092 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mlrw8" (UID: "59e30e9f-8395-4abd-8adf-b964ac1cbb0b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 12:24:16 crc kubenswrapper[4672]: I0930 12:24:16.829029 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-84vwh"]
Sep 30 12:24:16 crc kubenswrapper[4672]: I0930 12:24:16.846774 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-nq6b6"]
Sep 30 12:24:16 crc kubenswrapper[4672]: I0930 12:24:16.854373 4672 patch_prober.go:28] interesting pod/apiserver-76f77b778f-bj7dq container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok
Sep 30 12:24:16 crc kubenswrapper[4672]: [+]log ok
Sep 30 12:24:16 crc kubenswrapper[4672]: [+]etcd ok
Sep 30 12:24:16 crc kubenswrapper[4672]: [+]poststarthook/start-apiserver-admission-initializer ok
Sep 30 12:24:16 crc kubenswrapper[4672]: [+]poststarthook/generic-apiserver-start-informers ok
Sep 30 12:24:16 crc kubenswrapper[4672]: [+]poststarthook/max-in-flight-filter ok
Sep 30 12:24:16 crc kubenswrapper[4672]: [+]poststarthook/storage-object-count-tracker-hook ok
Sep 30 12:24:16 crc kubenswrapper[4672]: [+]poststarthook/image.openshift.io-apiserver-caches ok
Sep 30 12:24:16 crc kubenswrapper[4672]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld
Sep 30 12:24:16 crc kubenswrapper[4672]: [-]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa failed: reason withheld
Sep 30 12:24:16 crc kubenswrapper[4672]: [+]poststarthook/project.openshift.io-projectcache ok
Sep 30 12:24:16 crc kubenswrapper[4672]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok
Sep 30 12:24:16 crc kubenswrapper[4672]: [-]poststarthook/openshift.io-startinformers failed: reason withheld
Sep 30 12:24:16 crc kubenswrapper[4672]: [+]poststarthook/openshift.io-restmapperupdater ok
Sep 30 12:24:16 crc kubenswrapper[4672]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok
Sep 30 12:24:16 crc kubenswrapper[4672]: livez check failed
Sep 30 12:24:16 crc kubenswrapper[4672]: I0930 12:24:16.854734 4672 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-bj7dq" podUID="afcb62e7-0ff7-4af1-bebc-f32bfc50d94a" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Sep 30 12:24:16 crc kubenswrapper[4672]: I0930 12:24:16.874793 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-jwlqh"]
Sep 30 12:24:16 crc kubenswrapper[4672]: I0930 12:24:16.926905 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Sep 30 12:24:16 crc kubenswrapper[4672]: E0930 12:24:16.927396 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 12:24:17.427380799 +0000 UTC m=+148.696618435 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 12:24:17 crc kubenswrapper[4672]: I0930 12:24:17.027448 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-m85j8" event={"ID":"f65d4a91-5adb-4ab2-af99-757daad56bd4","Type":"ContainerStarted","Data":"1712b06c3abb248f9337f741e9c88c59f9bd1717f2cbe59557c48374a74cb440"}
Sep 30 12:24:17 crc kubenswrapper[4672]: I0930 12:24:17.030888 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mlrw8\" (UID: \"59e30e9f-8395-4abd-8adf-b964ac1cbb0b\") " pod="openshift-image-registry/image-registry-697d97f7c8-mlrw8"
Sep 30 12:24:17 crc kubenswrapper[4672]: E0930 12:24:17.031918 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 12:24:17.531869185 +0000 UTC m=+148.801106831 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mlrw8" (UID: "59e30e9f-8395-4abd-8adf-b964ac1cbb0b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 12:24:17 crc kubenswrapper[4672]: I0930 12:24:17.040130 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-dljhp" event={"ID":"62422e21-39c6-4772-8f59-33be3d16c368","Type":"ContainerStarted","Data":"73943e95b27cfa522ffd91a57d461c374713e58f5a48f2440fd0c25f612ba195"}
Sep 30 12:24:17 crc kubenswrapper[4672]: I0930 12:24:17.050645 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-r4gx9" event={"ID":"076cdbe0-669c-4655-b2d1-8967456a6e62","Type":"ContainerStarted","Data":"fd1f1918993194a63ba3c4124fd85f548e7ff137f58adcc9492ea53fc47ef74b"}
Sep 30 12:24:17 crc kubenswrapper[4672]: I0930 12:24:17.084843 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-lnjdx" event={"ID":"2f1f2c64-e609-4c7a-adf5-ec317fe052c9","Type":"ContainerStarted","Data":"6808a227006d0cbbb5f3df0a9e47f6b8457d1e4dcc1dc67c8f0ddfc4e0fafebb"}
Sep 30 12:24:17 crc kubenswrapper[4672]: I0930 12:24:17.100313 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-m85j8" podStartSLOduration=127.10028716 podStartE2EDuration="2m7.10028716s" podCreationTimestamp="2025-09-30 12:22:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 12:24:17.096548662 +0000 UTC m=+148.365786308" watchObservedRunningTime="2025-09-30 12:24:17.10028716 +0000 UTC m=+148.369524806"
Sep 30 12:24:17 crc kubenswrapper[4672]: I0930 12:24:17.104601 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-dwqnn" event={"ID":"f8a1fcab-b6bf-4f92-9c23-d8240afdc4a1","Type":"ContainerStarted","Data":"ce50f5cef86b95bb6f8d71e8bcf21b0d8e5f3d56db7533a484efd1ba1ccd7bf1"}
Sep 30 12:24:17 crc kubenswrapper[4672]: I0930 12:24:17.116036 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-x8stp" event={"ID":"6eecbbdb-82a8-4b0d-860d-f6c3f4152a04","Type":"ContainerStarted","Data":"219877c4491df0e9e329f42dbd27544b69842ff59b113f8d4a027bf770f9ff42"}
Sep 30 12:24:17 crc kubenswrapper[4672]: I0930 12:24:17.132603 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Sep 30 12:24:17 crc kubenswrapper[4672]: E0930 12:24:17.140023 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 12:24:17.639992807 +0000 UTC m=+148.909230453 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 12:24:17 crc kubenswrapper[4672]: I0930 12:24:17.144986 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6t7qg" event={"ID":"82f58782-5e5b-41f9-be22-0d3f1ab4423d","Type":"ContainerStarted","Data":"263538d0a5ce7be6c804b2de9f40fa1edef403546d4ccd2283d56d39db7bf6a6"}
Sep 30 12:24:17 crc kubenswrapper[4672]: I0930 12:24:17.145052 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6t7qg" event={"ID":"82f58782-5e5b-41f9-be22-0d3f1ab4423d","Type":"ContainerStarted","Data":"6419775bcca7977b1347b6a9c38903669f78a3288a8073d6f0f1194941f67923"}
Sep 30 12:24:17 crc kubenswrapper[4672]: I0930 12:24:17.151210 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8dqtc" event={"ID":"040f9564-cad2-48c5-b4d6-69b1add18a78","Type":"ContainerStarted","Data":"80699aef4d33abdde317a1a17f76324be74eae2c479f1527b45f731d6139c7ae"}
Sep 30 12:24:17 crc kubenswrapper[4672]: I0930 12:24:17.159015 4672 generic.go:334] "Generic (PLEG): container finished" podID="b5d17b1a-138e-4832-a374-8cdd0c028821" containerID="e1c8d1c8d72b66a523c1937d6636996560ba9cc3b0cabfbabf1e5a77e872674c" exitCode=0
Sep 30 12:24:17 crc kubenswrapper[4672]: I0930 12:24:17.159095 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-rj74q" event={"ID":"b5d17b1a-138e-4832-a374-8cdd0c028821","Type":"ContainerDied","Data":"e1c8d1c8d72b66a523c1937d6636996560ba9cc3b0cabfbabf1e5a77e872674c"}
Sep 30 12:24:17 crc kubenswrapper[4672]: I0930 12:24:17.163970 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-cf7gg" event={"ID":"e3a5bd27-f377-4531-8c55-29e2f2c4ccc8","Type":"ContainerStarted","Data":"f875008e9112ba6995eccc5c16947142fd686e27cb5542511480ef77f5c2a5f5"}
Sep 30 12:24:17 crc kubenswrapper[4672]: I0930 12:24:17.183901 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-x8stp" podStartSLOduration=127.183887304 podStartE2EDuration="2m7.183887304s" podCreationTimestamp="2025-09-30 12:22:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 12:24:17.176453919 +0000 UTC m=+148.445691575" watchObservedRunningTime="2025-09-30 12:24:17.183887304 +0000 UTC m=+148.453124950"
Sep 30 12:24:17 crc kubenswrapper[4672]: I0930 12:24:17.189257 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-dljhp" podStartSLOduration=126.189226308 podStartE2EDuration="2m6.189226308s" podCreationTimestamp="2025-09-30 12:22:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 12:24:17.139464041 +0000 UTC m=+148.408701687" watchObservedRunningTime="2025-09-30 12:24:17.189226308 +0000 UTC m=+148.458463954"
Sep 30 12:24:17 crc kubenswrapper[4672]: I0930 12:24:17.201628 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-nb7b9" event={"ID":"6c11c529-6073-4d85-b790-8dc781625217","Type":"ContainerStarted","Data":"66b8e08bd15e035931bfe1f7d595bb48a1aa9fd718cb778087d0fdfad3357e2e"}
Sep 30 12:24:17 crc kubenswrapper[4672]: I0930 12:24:17.237518 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mlrw8\" (UID: \"59e30e9f-8395-4abd-8adf-b964ac1cbb0b\") " pod="openshift-image-registry/image-registry-697d97f7c8-mlrw8"
Sep 30 12:24:17 crc kubenswrapper[4672]: I0930 12:24:17.244368 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mp8wz" event={"ID":"3c1a3995-a406-4562-b62e-682408dd877e","Type":"ContainerStarted","Data":"cf5bc7861fdd5ebb94af2ac5274c483c9fccb8c96e1552be936d71c4f4d06eb9"}
Sep 30 12:24:17 crc kubenswrapper[4672]: E0930 12:24:17.247964 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 12:24:17.747931683 +0000 UTC m=+149.017169329 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mlrw8" (UID: "59e30e9f-8395-4abd-8adf-b964ac1cbb0b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 12:24:17 crc kubenswrapper[4672]: I0930 12:24:17.262668 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-7l9kk" event={"ID":"41a42386-34f1-47ec-85bf-4c81bd9228be","Type":"ContainerStarted","Data":"f04eb58f0809b6ec0191b66894ed74305f300ddcc3cefbc8c7768f4965b5ff81"}
Sep 30 12:24:17 crc kubenswrapper[4672]: I0930 12:24:17.283207 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wfvzx" event={"ID":"b1bca969-46a3-462a-b129-508c3f6752a0","Type":"ContainerStarted","Data":"0cbc12fdf12c10d8fa2804925673edbce71a215570115d59354abb5713fa1793"}
Sep 30 12:24:17 crc kubenswrapper[4672]: I0930 12:24:17.285329 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320575-tsgk2" event={"ID":"2acf05e1-f152-4432-b0b2-a44b242d0308","Type":"ContainerStarted","Data":"162380361df785c6cdd425bd9caca58689c46083f9626f7fb29a7dcce520c37e"}
Sep 30 12:24:17 crc kubenswrapper[4672]: I0930 12:24:17.290136 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-7l9kk" podStartSLOduration=127.290122151 podStartE2EDuration="2m7.290122151s" podCreationTimestamp="2025-09-30 12:22:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 12:24:17.289809632 +0000 UTC
m=+148.559047278" watchObservedRunningTime="2025-09-30 12:24:17.290122151 +0000 UTC m=+148.559359797" Sep 30 12:24:17 crc kubenswrapper[4672]: I0930 12:24:17.290565 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-bqdd5" event={"ID":"b9bd590e-1ef9-47f4-874d-c5324309ebfd","Type":"ContainerStarted","Data":"f2f9aa195b147d93f24a850ece7ebf7b90b9d77ba56d0115e7d3cd912da9a7f0"} Sep 30 12:24:17 crc kubenswrapper[4672]: I0930 12:24:17.290983 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-cf7gg" podStartSLOduration=126.290977325 podStartE2EDuration="2m6.290977325s" podCreationTimestamp="2025-09-30 12:22:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 12:24:17.243281238 +0000 UTC m=+148.512518884" watchObservedRunningTime="2025-09-30 12:24:17.290977325 +0000 UTC m=+148.560214971" Sep 30 12:24:17 crc kubenswrapper[4672]: I0930 12:24:17.295053 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-xzjdp" event={"ID":"9869a0d1-03fd-4b72-9519-16c0ef0c4bd9","Type":"ContainerStarted","Data":"ecd6390c249baaa84e0d708a4e25fe95085de9613b34d3872578d21e91092cf6"} Sep 30 12:24:17 crc kubenswrapper[4672]: I0930 12:24:17.295084 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-xzjdp" event={"ID":"9869a0d1-03fd-4b72-9519-16c0ef0c4bd9","Type":"ContainerStarted","Data":"1c3b457b0d5ef0fd5f135da70e11858b9cc267ef3a273e0aeda88c4047372191"} Sep 30 12:24:17 crc kubenswrapper[4672]: I0930 12:24:17.304457 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-b6x4v" event={"ID":"82540b00-caaa-4453-876e-aa214f6f7d34","Type":"ContainerStarted","Data":"a90d3d353fe582b1f7795209c109de89d49f4c631aeda39b03118a510aac16aa"} Sep 30 12:24:17 crc kubenswrapper[4672]: I0930 12:24:17.306443 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-b6x4v" event={"ID":"82540b00-caaa-4453-876e-aa214f6f7d34","Type":"ContainerStarted","Data":"ec41c3e3303774078f469e1499211319d68a1741c338a9b53f757540d53b6b73"} Sep 30 12:24:17 crc kubenswrapper[4672]: I0930 12:24:17.309507 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-9zz4h" event={"ID":"3e89761d-ea2a-4711-94fa-54bce852e7e3","Type":"ContainerStarted","Data":"9002cc1ab4b83a05d5d92388e2c6a9e86ca5fa03a2a5ec6a01644da7a61939b8"} Sep 30 12:24:17 crc kubenswrapper[4672]: I0930 12:24:17.324493 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-svmxm" event={"ID":"f4d96698-0412-4923-9a10-03b174e0ca6c","Type":"ContainerStarted","Data":"23207ee4c031a8305a9e1413000eccf71dc3b87dec33539fdc9fb79c3b1d95f1"} Sep 30 12:24:17 crc kubenswrapper[4672]: I0930 12:24:17.327741 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-svmxm" Sep 30 12:24:17 crc kubenswrapper[4672]: I0930 12:24:17.338527 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 12:24:17 crc kubenswrapper[4672]: E0930 12:24:17.338687 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 12:24:17.838664702 +0000 UTC m=+149.107902348 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 12:24:17 crc kubenswrapper[4672]: I0930 12:24:17.339879 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mlrw8\" (UID: \"59e30e9f-8395-4abd-8adf-b964ac1cbb0b\") " pod="openshift-image-registry/image-registry-697d97f7c8-mlrw8" Sep 30 12:24:17 crc kubenswrapper[4672]: E0930 12:24:17.340252 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 12:24:17.840241817 +0000 UTC m=+149.109479463 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mlrw8" (UID: "59e30e9f-8395-4abd-8adf-b964ac1cbb0b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 12:24:17 crc kubenswrapper[4672]: I0930 12:24:17.338725 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wfvzx" podStartSLOduration=127.338711533 podStartE2EDuration="2m7.338711533s" podCreationTimestamp="2025-09-30 12:22:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 12:24:17.308757239 +0000 UTC m=+148.577994885" watchObservedRunningTime="2025-09-30 12:24:17.338711533 +0000 UTC m=+148.607949169" Sep 30 12:24:17 crc kubenswrapper[4672]: I0930 12:24:17.343836 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-bqdd5" podStartSLOduration=127.343817491 podStartE2EDuration="2m7.343817491s" podCreationTimestamp="2025-09-30 12:22:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 12:24:17.334469981 +0000 UTC m=+148.603707627" watchObservedRunningTime="2025-09-30 12:24:17.343817491 +0000 UTC m=+148.613055137" Sep 30 12:24:17 crc kubenswrapper[4672]: I0930 12:24:17.357949 4672 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-svmxm container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.32:8080/healthz\": dial tcp 10.217.0.32:8080: connect: connection refused" start-of-body= Sep 30 12:24:17 crc kubenswrapper[4672]: I0930 12:24:17.358011 4672 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-svmxm" podUID="f4d96698-0412-4923-9a10-03b174e0ca6c" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.32:8080/healthz\": dial tcp 10.217.0.32:8080: connect: connection refused" Sep 30 12:24:17 crc kubenswrapper[4672]: I0930 12:24:17.362874 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-svmxm" podStartSLOduration=126.36284913 podStartE2EDuration="2m6.36284913s" podCreationTimestamp="2025-09-30 12:22:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 12:24:17.358482414 +0000 UTC m=+148.627720050" watchObservedRunningTime="2025-09-30 12:24:17.36284913 +0000 UTC m=+148.632086776" Sep 30 12:24:17 crc kubenswrapper[4672]: I0930 12:24:17.365064 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5n4bs" event={"ID":"6592c35e-f6fa-4cc5-b099-4cf13dfaaf76","Type":"ContainerStarted","Data":"393f8683eb8f500fa9508052d5e0bc7fd3311031f21a2c25f28bfd71cfe6aad7"} Sep 30 12:24:17 crc kubenswrapper[4672]: I0930 12:24:17.367460 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5n4bs" Sep 30 12:24:17 crc kubenswrapper[4672]: I0930 12:24:17.380660 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5wh44" event={"ID":"3d92f349-809a-4fdc-9006-420e736d856d","Type":"ContainerStarted","Data":"8bfc33cb3dac697793ae12d2c9502376e9a9f489a18845d1084ed017aa1bcf0e"} Sep 30 12:24:17 crc kubenswrapper[4672]: I0930 12:24:17.395001 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zgl84" Sep 30 12:24:17 crc kubenswrapper[4672]: I0930 12:24:17.403796 4672 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-5n4bs container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.33:8443/healthz\": dial tcp 10.217.0.33:8443: connect: connection refused" start-of-body= Sep 30 12:24:17 crc kubenswrapper[4672]: I0930 12:24:17.403853 4672 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5n4bs" podUID="6592c35e-f6fa-4cc5-b099-4cf13dfaaf76" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.33:8443/healthz\": dial tcp 10.217.0.33:8443: connect: connection refused" Sep 30 12:24:17 crc kubenswrapper[4672]: I0930 12:24:17.431674 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5n4bs" podStartSLOduration=126.431464441 podStartE2EDuration="2m6.431464441s" podCreationTimestamp="2025-09-30 12:22:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 12:24:17.391458216 +0000 UTC m=+148.660695862" watchObservedRunningTime="2025-09-30 12:24:17.431464441 +0000 UTC m=+148.700702077" Sep 30 12:24:17 crc kubenswrapper[4672]: I0930 12:24:17.439434 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-vs8kw" Sep 30 12:24:17 crc kubenswrapper[4672]: I0930 12:24:17.440876 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 12:24:17 crc kubenswrapper[4672]: E0930 12:24:17.441011 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 12:24:17.940993686 +0000 UTC m=+149.210231332 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 12:24:17 crc kubenswrapper[4672]: I0930 12:24:17.441635 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mlrw8\" (UID: \"59e30e9f-8395-4abd-8adf-b964ac1cbb0b\") " pod="openshift-image-registry/image-registry-697d97f7c8-mlrw8" Sep 30 12:24:17 crc kubenswrapper[4672]: E0930 12:24:17.447947 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 12:24:17.947932366 +0000 UTC m=+149.217170012 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mlrw8" (UID: "59e30e9f-8395-4abd-8adf-b964ac1cbb0b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 12:24:17 crc kubenswrapper[4672]: I0930 12:24:17.542947 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 12:24:17 crc kubenswrapper[4672]: E0930 12:24:17.543156 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 12:24:18.043134305 +0000 UTC m=+149.312371951 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 12:24:17 crc kubenswrapper[4672]: I0930 12:24:17.546729 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mlrw8\" (UID: \"59e30e9f-8395-4abd-8adf-b964ac1cbb0b\") " pod="openshift-image-registry/image-registry-697d97f7c8-mlrw8" Sep 30 12:24:17 crc kubenswrapper[4672]: E0930 12:24:17.549773 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 12:24:18.049757246 +0000 UTC m=+149.318994892 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mlrw8" (UID: "59e30e9f-8395-4abd-8adf-b964ac1cbb0b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 12:24:17 crc kubenswrapper[4672]: I0930 12:24:17.692567 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 12:24:17 crc kubenswrapper[4672]: E0930 12:24:17.693119 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 12:24:18.193040802 +0000 UTC m=+149.462278448 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 12:24:17 crc kubenswrapper[4672]: I0930 12:24:17.693222 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mlrw8\" (UID: \"59e30e9f-8395-4abd-8adf-b964ac1cbb0b\") " pod="openshift-image-registry/image-registry-697d97f7c8-mlrw8" Sep 30 12:24:17 crc kubenswrapper[4672]: E0930 12:24:17.693709 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 12:24:18.193692651 +0000 UTC m=+149.462930297 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mlrw8" (UID: "59e30e9f-8395-4abd-8adf-b964ac1cbb0b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 12:24:17 crc kubenswrapper[4672]: I0930 12:24:17.757023 4672 patch_prober.go:28] interesting pod/router-default-5444994796-52jss container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Sep 30 12:24:17 crc kubenswrapper[4672]: [-]has-synced failed: reason withheld Sep 30 12:24:17 crc kubenswrapper[4672]: [+]process-running ok Sep 30 12:24:17 crc kubenswrapper[4672]: healthz check failed Sep 30 12:24:17 crc kubenswrapper[4672]: I0930 12:24:17.757128 4672 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-52jss" podUID="b11da5bc-b91d-4a8e-8839-da0f3989618e" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Sep 30 12:24:17 crc kubenswrapper[4672]: I0930 12:24:17.794522 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 12:24:17 crc kubenswrapper[4672]: E0930 12:24:17.795388 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 12:24:18.295370216 +0000 UTC m=+149.564607862 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 12:24:17 crc kubenswrapper[4672]: I0930 12:24:17.897417 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mlrw8\" (UID: \"59e30e9f-8395-4abd-8adf-b964ac1cbb0b\") " pod="openshift-image-registry/image-registry-697d97f7c8-mlrw8" Sep 30 12:24:17 crc kubenswrapper[4672]: E0930 12:24:17.897861 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 12:24:18.397842945 +0000 UTC m=+149.667080591 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mlrw8" (UID: "59e30e9f-8395-4abd-8adf-b964ac1cbb0b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 12:24:17 crc kubenswrapper[4672]: I0930 12:24:17.999383 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 12:24:17 crc kubenswrapper[4672]: E0930 12:24:17.999939 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 12:24:18.499908701 +0000 UTC m=+149.769146347 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 12:24:18 crc kubenswrapper[4672]: I0930 12:24:18.103937 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mlrw8\" (UID: \"59e30e9f-8395-4abd-8adf-b964ac1cbb0b\") " pod="openshift-image-registry/image-registry-697d97f7c8-mlrw8" Sep 30 12:24:18 crc kubenswrapper[4672]: E0930 12:24:18.105700 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 12:24:18.605661694 +0000 UTC m=+149.874899340 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mlrw8" (UID: "59e30e9f-8395-4abd-8adf-b964ac1cbb0b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 12:24:18 crc kubenswrapper[4672]: I0930 12:24:18.208718 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 12:24:18 crc kubenswrapper[4672]: E0930 12:24:18.209499 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 12:24:18.709481701 +0000 UTC m=+149.978719347 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 12:24:18 crc kubenswrapper[4672]: I0930 12:24:18.310926 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mlrw8\" (UID: \"59e30e9f-8395-4abd-8adf-b964ac1cbb0b\") " pod="openshift-image-registry/image-registry-697d97f7c8-mlrw8" Sep 30 12:24:18 crc kubenswrapper[4672]: E0930 12:24:18.311365 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 12:24:18.811345642 +0000 UTC m=+150.080583288 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mlrw8" (UID: "59e30e9f-8395-4abd-8adf-b964ac1cbb0b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 12:24:18 crc kubenswrapper[4672]: I0930 12:24:18.388753 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-dwqnn" event={"ID":"f8a1fcab-b6bf-4f92-9c23-d8240afdc4a1","Type":"ContainerStarted","Data":"73d7b7a985c417b14af877335d6b9e0e338973d61a42df8869f4e8714deaf3be"} Sep 30 12:24:18 crc kubenswrapper[4672]: I0930 12:24:18.405620 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-nb7b9" event={"ID":"6c11c529-6073-4d85-b790-8dc781625217","Type":"ContainerStarted","Data":"b614e6585188d765491051a4cd7eb7f3a7e55720618f364abdaab6f94d7d38b0"} Sep 30 12:24:18 crc kubenswrapper[4672]: I0930 12:24:18.411467 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 12:24:18 crc kubenswrapper[4672]: E0930 12:24:18.411625 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 12:24:18.911600016 +0000 UTC m=+150.180837662 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 12:24:18 crc kubenswrapper[4672]: I0930 12:24:18.411779 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mlrw8\" (UID: \"59e30e9f-8395-4abd-8adf-b964ac1cbb0b\") " pod="openshift-image-registry/image-registry-697d97f7c8-mlrw8" Sep 30 12:24:18 crc kubenswrapper[4672]: E0930 12:24:18.413035 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 12:24:18.913024387 +0000 UTC m=+150.182262033 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mlrw8" (UID: "59e30e9f-8395-4abd-8adf-b964ac1cbb0b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 12:24:18 crc kubenswrapper[4672]: I0930 12:24:18.415837 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mp8wz" event={"ID":"3c1a3995-a406-4562-b62e-682408dd877e","Type":"ContainerStarted","Data":"9e4b4d291dc2c5429f70ea548ffae98eb34a37d476e8abfbf098ef107dd0b875"} Sep 30 12:24:18 crc kubenswrapper[4672]: I0930 12:24:18.416718 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mp8wz" Sep 30 12:24:18 crc kubenswrapper[4672]: I0930 12:24:18.418299 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-nq6b6" event={"ID":"cba13ebb-13fa-4059-8d32-295a92eb0cdb","Type":"ContainerStarted","Data":"9b507acaacd806ec9fbecb66cce4541966f6244d49558e35de7304cfd8b21568"} Sep 30 12:24:18 crc kubenswrapper[4672]: I0930 12:24:18.418322 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-nq6b6" event={"ID":"cba13ebb-13fa-4059-8d32-295a92eb0cdb","Type":"ContainerStarted","Data":"b692ee9bce83456aa44d03abae9d97e3ef82c5941f77c7b0a40b57c85329787b"} Sep 30 12:24:18 crc kubenswrapper[4672]: I0930 12:24:18.418335 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-nq6b6" event={"ID":"cba13ebb-13fa-4059-8d32-295a92eb0cdb","Type":"ContainerStarted","Data":"634fe1c4b7877adf585974bc4db93dc60dc5084dec69ea16197b088b622b4057"} Sep 30 12:24:18 crc kubenswrapper[4672]: I0930 12:24:18.419988 4672 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-mp8wz container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure 
output="Get \"https://10.217.0.21:8443/healthz\": dial tcp 10.217.0.21:8443: connect: connection refused" start-of-body= Sep 30 12:24:18 crc kubenswrapper[4672]: I0930 12:24:18.420025 4672 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mp8wz" podUID="3c1a3995-a406-4562-b62e-682408dd877e" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.21:8443/healthz\": dial tcp 10.217.0.21:8443: connect: connection refused" Sep 30 12:24:18 crc kubenswrapper[4672]: I0930 12:24:18.421014 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-b6x4v" event={"ID":"82540b00-caaa-4453-876e-aa214f6f7d34","Type":"ContainerStarted","Data":"63922b1d2f9ac411b137c7645bac8536dc7d0ac40f2fdc0566c9f4bf094689b0"} Sep 30 12:24:18 crc kubenswrapper[4672]: I0930 12:24:18.425224 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5n4bs" event={"ID":"6592c35e-f6fa-4cc5-b099-4cf13dfaaf76","Type":"ContainerStarted","Data":"abc33765a08beaff5460d538a0e072682136dd4de78ed319206ac1525268dfe9"} Sep 30 12:24:18 crc kubenswrapper[4672]: I0930 12:24:18.427023 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-dwqnn" podStartSLOduration=8.427011281 podStartE2EDuration="8.427011281s" podCreationTimestamp="2025-09-30 12:24:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 12:24:18.409361791 +0000 UTC m=+149.678599447" watchObservedRunningTime="2025-09-30 12:24:18.427011281 +0000 UTC m=+149.696248927" Sep 30 12:24:18 crc kubenswrapper[4672]: I0930 12:24:18.427168 4672 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-5n4bs container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.33:8443/healthz\": dial tcp 10.217.0.33:8443: connect: connection refused" start-of-body= Sep 30 12:24:18 crc kubenswrapper[4672]: I0930 12:24:18.427211 4672 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5n4bs" podUID="6592c35e-f6fa-4cc5-b099-4cf13dfaaf76" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.33:8443/healthz\": dial tcp 10.217.0.33:8443: connect: connection refused" Sep 30 12:24:18 crc kubenswrapper[4672]: I0930 12:24:18.428585 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-nb7b9" podStartSLOduration=127.428580166 podStartE2EDuration="2m7.428580166s" podCreationTimestamp="2025-09-30 12:22:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 12:24:18.426869257 +0000 UTC m=+149.696106903" watchObservedRunningTime="2025-09-30 12:24:18.428580166 +0000 UTC m=+149.697817812" Sep 30 12:24:18 crc kubenswrapper[4672]: I0930 12:24:18.454349 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-84vwh" event={"ID":"6e9532cf-efc9-413a-b8f0-a9186a4c74c8","Type":"ContainerStarted","Data":"8aa2decd9f8e05a79db177046b6909c765649f2b2f41679901bad988d673b246"} Sep 30 12:24:18 crc kubenswrapper[4672]: I0930 12:24:18.454632 4672 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-84vwh" event={"ID":"6e9532cf-efc9-413a-b8f0-a9186a4c74c8","Type":"ContainerStarted","Data":"f87b561bc190b9aec5641e892403f026e13c4fef35597e78a050bff526feacfa"} Sep 30 12:24:18 crc kubenswrapper[4672]: I0930 12:24:18.454643 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-84vwh" event={"ID":"6e9532cf-efc9-413a-b8f0-a9186a4c74c8","Type":"ContainerStarted","Data":"f8f4c84d0d6596e8cf22a9d35d16b8e4fbf7c5f6d27370221265a9f52c3882f4"} Sep 30 12:24:18 crc kubenswrapper[4672]: I0930 12:24:18.455056 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-84vwh" Sep 30 12:24:18 crc kubenswrapper[4672]: I0930 12:24:18.455586 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-b6x4v" podStartSLOduration=127.455568705 podStartE2EDuration="2m7.455568705s" podCreationTimestamp="2025-09-30 12:22:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 12:24:18.454796033 +0000 UTC m=+149.724033689" watchObservedRunningTime="2025-09-30 12:24:18.455568705 +0000 UTC m=+149.724806351" Sep 30 12:24:18 crc kubenswrapper[4672]: I0930 12:24:18.463703 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320575-tsgk2" event={"ID":"2acf05e1-f152-4432-b0b2-a44b242d0308","Type":"ContainerStarted","Data":"2b5948e4f052ce096dd3fca3d3b8f8343dc8d91ed7163efd1bc1fbd512cd3647"} Sep 30 12:24:18 crc kubenswrapper[4672]: I0930 12:24:18.470342 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-lnjdx" event={"ID":"2f1f2c64-e609-4c7a-adf5-ec317fe052c9","Type":"ContainerStarted","Data":"0fa0b25870dc25d82aa02adc46d367e9d9dfe22249603706ad8e3f3ea525e051"} Sep 30 12:24:18 crc kubenswrapper[4672]: I0930 12:24:18.470378 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-lnjdx" event={"ID":"2f1f2c64-e609-4c7a-adf5-ec317fe052c9","Type":"ContainerStarted","Data":"19fde056870175f7b82932e242c85b82de1c9a9b4c58c8ebc6c791e2a0d521c7"} Sep 30 12:24:18 crc kubenswrapper[4672]: I0930 12:24:18.473510 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5wh44" event={"ID":"3d92f349-809a-4fdc-9006-420e736d856d","Type":"ContainerStarted","Data":"57077c46be16e768c5283c9798481bac535cfd0dd92ed30960e4c2d15e92d154"} Sep 30 12:24:18 crc kubenswrapper[4672]: I0930 12:24:18.474325 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5wh44" Sep 30 12:24:18 crc kubenswrapper[4672]: I0930 12:24:18.485758 4672 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-5wh44 container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.40:5443/healthz\": dial tcp 10.217.0.40:5443: connect: connection refused" start-of-body= Sep 30 12:24:18 crc kubenswrapper[4672]: I0930 12:24:18.485806 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-service-ca-operator/service-ca-operator-777779d784-9zz4h" event={"ID":"3e89761d-ea2a-4711-94fa-54bce852e7e3","Type":"ContainerStarted","Data":"2cef54542698ecd23b0e7e7d0c0690eb8f84d04fed8ec7e53682e96bf934a522"} Sep 30 12:24:18 crc kubenswrapper[4672]: I0930 12:24:18.485817 4672 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5wh44" podUID="3d92f349-809a-4fdc-9006-420e736d856d" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.40:5443/healthz\": dial tcp 10.217.0.40:5443: connect: connection refused" Sep 30 12:24:18 crc kubenswrapper[4672]: I0930 12:24:18.492817 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-rj74q" event={"ID":"b5d17b1a-138e-4832-a374-8cdd0c028821","Type":"ContainerStarted","Data":"dca92ea951ed523da8827856ebb71803093cd2e5b4860532ee84499593d868c6"} Sep 30 12:24:18 crc kubenswrapper[4672]: I0930 12:24:18.493438 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-rj74q" Sep 30 12:24:18 crc kubenswrapper[4672]: I0930 12:24:18.504411 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-z2tt9" event={"ID":"29a5441c-2a0c-443f-a7ab-47f2abc81a1c","Type":"ContainerStarted","Data":"02b9fabb054294702c217248380026fab4a99c6fbea26c2f5f7e4360c60c2a9f"} Sep 30 12:24:18 crc kubenswrapper[4672]: I0930 12:24:18.504459 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-z2tt9" event={"ID":"29a5441c-2a0c-443f-a7ab-47f2abc81a1c","Type":"ContainerStarted","Data":"0911cc8ac4a7c240b6269141f031559b2e2d2c3f14b76e79b3e0df101b0448e6"} Sep 30 12:24:18 crc kubenswrapper[4672]: I0930 12:24:18.505580 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-xzjdp" event={"ID":"9869a0d1-03fd-4b72-9519-16c0ef0c4bd9","Type":"ContainerStarted","Data":"afe0ed7850befa0fff157f52c64cb517329ebb446e5fba6e489cd5f1d2b7851f"} Sep 30 12:24:18 crc kubenswrapper[4672]: I0930 12:24:18.507300 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6t7qg" event={"ID":"82f58782-5e5b-41f9-be22-0d3f1ab4423d","Type":"ContainerStarted","Data":"a23a4a1ec72e5001c64317d041b1a77bcd14542726ec9a007c48ab04081b66f1"} Sep 30 12:24:18 crc kubenswrapper[4672]: I0930 12:24:18.512860 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 12:24:18 crc kubenswrapper[4672]: I0930 12:24:18.512960 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mp8wz" podStartSLOduration=127.512942482 podStartE2EDuration="2m7.512942482s" podCreationTimestamp="2025-09-30 12:22:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 12:24:18.511584672 +0000 UTC m=+149.780822318" watchObservedRunningTime="2025-09-30 12:24:18.512942482 +0000 UTC m=+149.782180128" Sep 30 12:24:18 crc kubenswrapper[4672]: I0930 12:24:18.513141 4672 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-nq6b6" podStartSLOduration=127.513135427 podStartE2EDuration="2m7.513135427s" podCreationTimestamp="2025-09-30 12:22:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 12:24:18.48034046 +0000 UTC m=+149.749578116" watchObservedRunningTime="2025-09-30 12:24:18.513135427 +0000 UTC m=+149.782373073" Sep 30 12:24:18 crc kubenswrapper[4672]: E0930 12:24:18.513314 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 12:24:19.013298102 +0000 UTC m=+150.282535748 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 12:24:18 crc kubenswrapper[4672]: I0930 12:24:18.521411 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8dqtc" event={"ID":"040f9564-cad2-48c5-b4d6-69b1add18a78","Type":"ContainerStarted","Data":"41d996856f30fd6dd73478302a65ed8841f527da6dfaaf412a231be3d8d1ab4f"} Sep 30 12:24:18 crc kubenswrapper[4672]: I0930 12:24:18.543358 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-jwlqh" event={"ID":"6ed45e7d-66e8-4755-a92e-59140d8b6ac8","Type":"ContainerStarted","Data":"139e150aa9d787ab8ed56a24f662b02265e12f1e3368316902fc8b1cb1e09ea7"} Sep 30 12:24:18 crc kubenswrapper[4672]: I0930 12:24:18.543396 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-jwlqh" event={"ID":"6ed45e7d-66e8-4755-a92e-59140d8b6ac8","Type":"ContainerStarted","Data":"d00aacf333c181bb672b40935cbc39b2d0fc17a1f6ed16b2728607732eea1c67"} Sep 30 12:24:18 crc kubenswrapper[4672]: I0930 12:24:18.545483 4672 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-svmxm container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.32:8080/healthz\": dial tcp 10.217.0.32:8080: connect: connection refused" start-of-body= Sep 30 12:24:18 crc kubenswrapper[4672]: I0930 12:24:18.545574 4672 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-svmxm" podUID="f4d96698-0412-4923-9a10-03b174e0ca6c" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.32:8080/healthz\": dial tcp 10.217.0.32:8080: connect: connection refused" Sep 30 12:24:18 crc kubenswrapper[4672]: I0930 12:24:18.603763 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-9zz4h" podStartSLOduration=127.603744733 podStartE2EDuration="2m7.603744733s" podCreationTimestamp="2025-09-30 12:22:11 +0000 UTC" firstStartedPulling="0001-01-01 
00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 12:24:18.551544046 +0000 UTC m=+149.820781692" watchObservedRunningTime="2025-09-30 12:24:18.603744733 +0000 UTC m=+149.872982379"
Sep 30 12:24:18 crc kubenswrapper[4672]: I0930 12:24:18.605613 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-84vwh" podStartSLOduration=127.605605767 podStartE2EDuration="2m7.605605767s" podCreationTimestamp="2025-09-30 12:22:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 12:24:18.602854917 +0000 UTC m=+149.872092563" watchObservedRunningTime="2025-09-30 12:24:18.605605767 +0000 UTC m=+149.874843413"
Sep 30 12:24:18 crc kubenswrapper[4672]: I0930 12:24:18.618630 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mlrw8\" (UID: \"59e30e9f-8395-4abd-8adf-b964ac1cbb0b\") " pod="openshift-image-registry/image-registry-697d97f7c8-mlrw8"
Sep 30 12:24:18 crc kubenswrapper[4672]: E0930 12:24:18.624370 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 12:24:19.124350978 +0000 UTC m=+150.393588624 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mlrw8" (UID: "59e30e9f-8395-4abd-8adf-b964ac1cbb0b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 12:24:18 crc kubenswrapper[4672]: I0930 12:24:18.707561 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-xzjdp" podStartSLOduration=127.707531159 podStartE2EDuration="2m7.707531159s" podCreationTimestamp="2025-09-30 12:22:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 12:24:18.657965508 +0000 UTC m=+149.927203164" watchObservedRunningTime="2025-09-30 12:24:18.707531159 +0000 UTC m=+149.976768805"
Sep 30 12:24:18 crc kubenswrapper[4672]: I0930 12:24:18.721035 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Sep 30 12:24:18 crc kubenswrapper[4672]: E0930 12:24:18.721714 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 12:24:19.221691328 +0000 UTC m=+150.490928974 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 12:24:18 crc kubenswrapper[4672]: I0930 12:24:18.746427 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-rj74q" podStartSLOduration=128.746404991 podStartE2EDuration="2m8.746404991s" podCreationTimestamp="2025-09-30 12:22:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 12:24:18.710646639 +0000 UTC m=+149.979884285" watchObservedRunningTime="2025-09-30 12:24:18.746404991 +0000 UTC m=+150.015642637"
Sep 30 12:24:18 crc kubenswrapper[4672]: I0930 12:24:18.746660 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-lnjdx" podStartSLOduration=127.746655378 podStartE2EDuration="2m7.746655378s" podCreationTimestamp="2025-09-30 12:22:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 12:24:18.745506975 +0000 UTC m=+150.014744611" watchObservedRunningTime="2025-09-30 12:24:18.746655378 +0000 UTC m=+150.015893024"
Sep 30 12:24:18 crc kubenswrapper[4672]: I0930 12:24:18.756470 4672 patch_prober.go:28] interesting pod/router-default-5444994796-52jss container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Sep 30 12:24:18 crc kubenswrapper[4672]: [-]has-synced failed: reason withheld
Sep 30 12:24:18 crc kubenswrapper[4672]: [+]process-running ok
Sep 30 12:24:18 crc kubenswrapper[4672]: healthz check failed
Sep 30 12:24:18 crc kubenswrapper[4672]: I0930 12:24:18.756549 4672 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-52jss" podUID="b11da5bc-b91d-4a8e-8839-da0f3989618e" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Sep 30 12:24:18 crc kubenswrapper[4672]: I0930 12:24:18.788896 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29320575-tsgk2" podStartSLOduration=128.788857887 podStartE2EDuration="2m8.788857887s" podCreationTimestamp="2025-09-30 12:22:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 12:24:18.773620617 +0000 UTC m=+150.042858283" watchObservedRunningTime="2025-09-30 12:24:18.788857887 +0000 UTC m=+150.058095533"
Sep 30 12:24:18 crc kubenswrapper[4672]: I0930 12:24:18.843897 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6t7qg" podStartSLOduration=128.843876345 podStartE2EDuration="2m8.843876345s" podCreationTimestamp="2025-09-30 12:22:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 12:24:18.843119713 +0000 UTC m=+150.112357359" watchObservedRunningTime="2025-09-30 12:24:18.843876345 +0000 UTC m=+150.113113991"
Sep 30 12:24:18 crc kubenswrapper[4672]: I0930 12:24:18.866855 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mlrw8\" (UID: \"59e30e9f-8395-4abd-8adf-b964ac1cbb0b\") " pod="openshift-image-registry/image-registry-697d97f7c8-mlrw8"
Sep 30 12:24:18 crc kubenswrapper[4672]: E0930 12:24:18.870701 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 12:24:19.370676569 +0000 UTC m=+150.639914215 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mlrw8" (UID: "59e30e9f-8395-4abd-8adf-b964ac1cbb0b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 12:24:18 crc kubenswrapper[4672]: I0930 12:24:18.902089 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5wh44" podStartSLOduration=127.902065525 podStartE2EDuration="2m7.902065525s" podCreationTimestamp="2025-09-30 12:22:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 12:24:18.901516709 +0000 UTC m=+150.170754355" watchObservedRunningTime="2025-09-30 12:24:18.902065525 +0000 UTC m=+150.171303161"
Sep 30 12:24:18 crc kubenswrapper[4672]: I0930 12:24:18.955718 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-jwlqh" podStartSLOduration=128.955694543 podStartE2EDuration="2m8.955694543s" podCreationTimestamp="2025-09-30 12:22:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 12:24:18.928189149 +0000 UTC m=+150.197426805" watchObservedRunningTime="2025-09-30 12:24:18.955694543 +0000 UTC m=+150.224932189"
Sep 30 12:24:18 crc kubenswrapper[4672]: I0930 12:24:18.958987 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8dqtc" podStartSLOduration=127.958973698 podStartE2EDuration="2m7.958973698s" podCreationTimestamp="2025-09-30 12:22:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 12:24:18.955420665 +0000 UTC m=+150.224658331" watchObservedRunningTime="2025-09-30 12:24:18.958973698 +0000 UTC m=+150.228211344"
Sep 30 12:24:18 crc kubenswrapper[4672]: I0930 12:24:18.971164 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Sep 30 12:24:18 crc kubenswrapper[4672]: E0930 12:24:18.971765 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 12:24:19.471732406 +0000 UTC m=+150.740970052 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 12:24:19 crc kubenswrapper[4672]: I0930 12:24:19.073309 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mlrw8\" (UID: \"59e30e9f-8395-4abd-8adf-b964ac1cbb0b\") " pod="openshift-image-registry/image-registry-697d97f7c8-mlrw8"
Sep 30 12:24:19 crc kubenswrapper[4672]: E0930 12:24:19.073828 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 12:24:19.573809983 +0000 UTC m=+150.843047629 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mlrw8" (UID: "59e30e9f-8395-4abd-8adf-b964ac1cbb0b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 12:24:19 crc kubenswrapper[4672]: I0930 12:24:19.176019 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Sep 30 12:24:19 crc kubenswrapper[4672]: E0930 12:24:19.176289 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 12:24:19.67623278 +0000 UTC m=+150.945470426 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 12:24:19 crc kubenswrapper[4672]: I0930 12:24:19.176453 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mlrw8\" (UID: \"59e30e9f-8395-4abd-8adf-b964ac1cbb0b\") " pod="openshift-image-registry/image-registry-697d97f7c8-mlrw8"
Sep 30 12:24:19 crc kubenswrapper[4672]: E0930 12:24:19.176884 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 12:24:19.676875878 +0000 UTC m=+150.946113524 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mlrw8" (UID: "59e30e9f-8395-4abd-8adf-b964ac1cbb0b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 12:24:19 crc kubenswrapper[4672]: I0930 12:24:19.277600 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Sep 30 12:24:19 crc kubenswrapper[4672]: E0930 12:24:19.277838 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 12:24:19.777795352 +0000 UTC m=+151.047033008 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 12:24:19 crc kubenswrapper[4672]: I0930 12:24:19.278055 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mlrw8\" (UID: \"59e30e9f-8395-4abd-8adf-b964ac1cbb0b\") " pod="openshift-image-registry/image-registry-697d97f7c8-mlrw8"
Sep 30 12:24:19 crc kubenswrapper[4672]: E0930 12:24:19.278540 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 12:24:19.778531553 +0000 UTC m=+151.047769189 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mlrw8" (UID: "59e30e9f-8395-4abd-8adf-b964ac1cbb0b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 12:24:19 crc kubenswrapper[4672]: I0930 12:24:19.378445 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Sep 30 12:24:19 crc kubenswrapper[4672]: I0930 12:24:19.378594 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Sep 30 12:24:19 crc kubenswrapper[4672]: I0930 12:24:19.378625 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Sep 30 12:24:19 crc kubenswrapper[4672]: E0930 12:24:19.378725 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 12:24:19.878666823 +0000 UTC m=+151.147904469 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 12:24:19 crc kubenswrapper[4672]: I0930 12:24:19.378995 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mlrw8\" (UID: \"59e30e9f-8395-4abd-8adf-b964ac1cbb0b\") " pod="openshift-image-registry/image-registry-697d97f7c8-mlrw8"
Sep 30 12:24:19 crc kubenswrapper[4672]: E0930 12:24:19.379485 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 12:24:19.879474656 +0000 UTC m=+151.148712292 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mlrw8" (UID: "59e30e9f-8395-4abd-8adf-b964ac1cbb0b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 12:24:19 crc kubenswrapper[4672]: I0930 12:24:19.379585 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Sep 30 12:24:19 crc kubenswrapper[4672]: I0930 12:24:19.379667 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Sep 30 12:24:19 crc kubenswrapper[4672]: I0930 12:24:19.384514 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Sep 30 12:24:19 crc kubenswrapper[4672]: I0930 12:24:19.384529 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Sep 30 12:24:19 crc kubenswrapper[4672]: I0930 12:24:19.385697 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Sep 30 12:24:19 crc kubenswrapper[4672]: I0930 12:24:19.397053 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Sep 30 12:24:19 crc kubenswrapper[4672]: I0930 12:24:19.481558 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Sep 30 12:24:19 crc kubenswrapper[4672]: E0930 12:24:19.481759 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 12:24:19.981732128 +0000 UTC m=+151.250969764 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 12:24:19 crc kubenswrapper[4672]: I0930 12:24:19.481796 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mlrw8\" (UID: \"59e30e9f-8395-4abd-8adf-b964ac1cbb0b\") " pod="openshift-image-registry/image-registry-697d97f7c8-mlrw8"
Sep 30 12:24:19 crc kubenswrapper[4672]: E0930 12:24:19.482169 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 12:24:19.98216025 +0000 UTC m=+151.251397896 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mlrw8" (UID: "59e30e9f-8395-4abd-8adf-b964ac1cbb0b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 12:24:19 crc kubenswrapper[4672]: I0930 12:24:19.546601 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Sep 30 12:24:19 crc kubenswrapper[4672]: I0930 12:24:19.553834 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Sep 30 12:24:19 crc kubenswrapper[4672]: I0930 12:24:19.556168 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-z2tt9" event={"ID":"29a5441c-2a0c-443f-a7ab-47f2abc81a1c","Type":"ContainerStarted","Data":"611bf0414849a8715eb5296eb5ca11e008471e04d84b0a8e9c7a21bb5b3f3279"}
Sep 30 12:24:19 crc kubenswrapper[4672]: I0930 12:24:19.556975 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-z2tt9"
Sep 30 12:24:19 crc kubenswrapper[4672]: I0930 12:24:19.572568 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-r4gx9" event={"ID":"076cdbe0-669c-4655-b2d1-8967456a6e62","Type":"ContainerStarted","Data":"1b3a7e183e380edcd3a420382b5763f4fe3014e8590354eb2c89c6e8a3fc6f1b"}
Sep 30 12:24:19 crc kubenswrapper[4672]: I0930 12:24:19.572604 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-r4gx9" event={"ID":"076cdbe0-669c-4655-b2d1-8967456a6e62","Type":"ContainerStarted","Data":"686397bc3923cf8515c427b2994d17f55845567af188784e93ea764a53276f72"}
Sep 30 12:24:19 crc kubenswrapper[4672]: I0930 12:24:19.575568 4672 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-5wh44 container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.40:5443/healthz\": dial tcp 10.217.0.40:5443: connect: connection refused" start-of-body=
Sep 30 12:24:19 crc kubenswrapper[4672]: I0930 12:24:19.575606 4672 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5wh44" podUID="3d92f349-809a-4fdc-9006-420e736d856d" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.40:5443/healthz\": dial tcp 10.217.0.40:5443: connect: connection refused"
Sep 30 12:24:19 crc kubenswrapper[4672]: I0930 12:24:19.582443 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Sep 30 12:24:19 crc kubenswrapper[4672]: E0930 12:24:19.582517 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 12:24:20.082503697 +0000 UTC m=+151.351741333 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 12:24:19 crc kubenswrapper[4672]: I0930 12:24:19.584180 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mlrw8\" (UID: \"59e30e9f-8395-4abd-8adf-b964ac1cbb0b\") " pod="openshift-image-registry/image-registry-697d97f7c8-mlrw8"
Sep 30 12:24:19 crc kubenswrapper[4672]: E0930 12:24:19.585638 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 12:24:20.085618737 +0000 UTC m=+151.354856383 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mlrw8" (UID: "59e30e9f-8395-4abd-8adf-b964ac1cbb0b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 12:24:19 crc kubenswrapper[4672]: I0930 12:24:19.586617 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mp8wz"
Sep 30 12:24:19 crc kubenswrapper[4672]: I0930 12:24:19.587134 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5n4bs"
Sep 30 12:24:19 crc kubenswrapper[4672]: I0930 12:24:19.587299 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-svmxm"
Sep 30 12:24:19 crc kubenswrapper[4672]: I0930 12:24:19.600574 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-z2tt9" podStartSLOduration=9.600558188 podStartE2EDuration="9.600558188s" podCreationTimestamp="2025-09-30 12:24:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 12:24:19.591639721 +0000 UTC m=+150.860877357" watchObservedRunningTime="2025-09-30 12:24:19.600558188 +0000 UTC m=+150.869795834"
Sep 30 12:24:19 crc kubenswrapper[4672]: I0930 12:24:19.642367 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Sep 30 12:24:19 crc kubenswrapper[4672]: I0930 12:24:19.698836 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Sep 30 12:24:19 crc kubenswrapper[4672]: E0930 12:24:19.700559 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 12:24:20.200527974 +0000 UTC m=+151.469765620 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 12:24:19 crc kubenswrapper[4672]: I0930 12:24:19.776670 4672 patch_prober.go:28] interesting pod/router-default-5444994796-52jss container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Sep 30 12:24:19 crc kubenswrapper[4672]: [-]has-synced failed: reason withheld
Sep 30 12:24:19 crc kubenswrapper[4672]: [+]process-running ok
Sep 30 12:24:19 crc kubenswrapper[4672]: healthz check failed
Sep 30 12:24:19 crc kubenswrapper[4672]: I0930 12:24:19.776734 4672 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-52jss" podUID="b11da5bc-b91d-4a8e-8839-da0f3989618e" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Sep 30 12:24:19 crc kubenswrapper[4672]: I0930 12:24:19.804597 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mlrw8\" (UID: \"59e30e9f-8395-4abd-8adf-b964ac1cbb0b\") " pod="openshift-image-registry/image-registry-697d97f7c8-mlrw8"
Sep 30 12:24:19 crc kubenswrapper[4672]: E0930 12:24:19.805229 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 12:24:20.305184306 +0000 UTC m=+151.574421952 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mlrw8" (UID: "59e30e9f-8395-4abd-8adf-b964ac1cbb0b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 12:24:19 crc kubenswrapper[4672]: I0930 12:24:19.908057 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Sep 30 12:24:19 crc kubenswrapper[4672]: E0930 12:24:19.908898 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 12:24:20.408878759 +0000 UTC m=+151.678116405 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 12:24:19 crc kubenswrapper[4672]: I0930 12:24:19.986176 4672 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock"
Sep 30 12:24:20 crc kubenswrapper[4672]: I0930 12:24:20.010192 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mlrw8\" (UID: \"59e30e9f-8395-4abd-8adf-b964ac1cbb0b\") " pod="openshift-image-registry/image-registry-697d97f7c8-mlrw8"
Sep 30 12:24:20 crc kubenswrapper[4672]: E0930 12:24:20.010607 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 12:24:20.510591395 +0000 UTC m=+151.779829041 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mlrw8" (UID: "59e30e9f-8395-4abd-8adf-b964ac1cbb0b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 12:24:20 crc kubenswrapper[4672]: I0930 12:24:20.111051 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Sep 30 12:24:20 crc kubenswrapper[4672]: E0930 12:24:20.111347 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 12:24:20.611330274 +0000 UTC m=+151.880567920 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 12:24:20 crc kubenswrapper[4672]: I0930 12:24:20.212186 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mlrw8\" (UID: \"59e30e9f-8395-4abd-8adf-b964ac1cbb0b\") " pod="openshift-image-registry/image-registry-697d97f7c8-mlrw8"
Sep 30 12:24:20 crc kubenswrapper[4672]: E0930 12:24:20.212512 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 12:24:20.712499194 +0000 UTC m=+151.981736840 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mlrw8" (UID: "59e30e9f-8395-4abd-8adf-b964ac1cbb0b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 12:24:20 crc kubenswrapper[4672]: I0930 12:24:20.314056 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Sep 30 12:24:20 crc kubenswrapper[4672]: E0930 12:24:20.315699 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 12:24:20.815675513 +0000 UTC m=+152.084913159 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 12:24:20 crc kubenswrapper[4672]: I0930 12:24:20.417633 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mlrw8\" (UID: \"59e30e9f-8395-4abd-8adf-b964ac1cbb0b\") " pod="openshift-image-registry/image-registry-697d97f7c8-mlrw8"
Sep 30 12:24:20 crc kubenswrapper[4672]: E0930 12:24:20.418105 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 12:24:20.918086469 +0000 UTC m=+152.187324115 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mlrw8" (UID: "59e30e9f-8395-4abd-8adf-b964ac1cbb0b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 12:24:20 crc kubenswrapper[4672]: I0930 12:24:20.517466 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-52587"]
Sep 30 12:24:20 crc kubenswrapper[4672]: I0930 12:24:20.518668 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-52587"
Sep 30 12:24:20 crc kubenswrapper[4672]: I0930 12:24:20.519169 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Sep 30 12:24:20 crc kubenswrapper[4672]: E0930 12:24:20.519938 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 12:24:21.019919389 +0000 UTC m=+152.289157035 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 12:24:20 crc kubenswrapper[4672]: I0930 12:24:20.526777 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-52587"]
Sep 30 12:24:20 crc kubenswrapper[4672]: I0930 12:24:20.526794 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl"
Sep 30 12:24:20 crc kubenswrapper[4672]: W0930 12:24:20.588186 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b6479f0_333b_4a96_9adf_2099afdc2447.slice/crio-cafdfa73f4c2f707e0427b96ceb3912546652f4502d8bb5c142ca63622ab6ddc WatchSource:0}: Error finding container cafdfa73f4c2f707e0427b96ceb3912546652f4502d8bb5c142ca63622ab6ddc: Status 404 returned error can't find the container with id cafdfa73f4c2f707e0427b96ceb3912546652f4502d8bb5c142ca63622ab6ddc
Sep 30 12:24:20 crc kubenswrapper[4672]: I0930 12:24:20.593620 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"2685dcf330d0f0df8a193118661ccec425e6220b254f7b260ecdd9ecdd158bdf"}
Sep 30 12:24:20 crc kubenswrapper[4672]: I0930 12:24:20.593751 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"17ff372fdc1efac230bcf548b304dc3bb773812ea6e5b6fe78cc03db2b99e07e"}
Sep 30 12:24:20 crc kubenswrapper[4672]: I0930 12:24:20.609679 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-r4gx9" event={"ID":"076cdbe0-669c-4655-b2d1-8967456a6e62","Type":"ContainerStarted","Data":"6657f8264954a0b9fb194ca76f93434edc43cdb4c6b801970066ee43c532bc66"}
Sep 30 12:24:20 crc kubenswrapper[4672]: I0930 12:24:20.610855 4672 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2025-09-30T12:24:19.986213632Z","Handler":null,"Name":""}
Sep 30 12:24:20 crc kubenswrapper[4672]: I0930 12:24:20.618536 4672 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0
Sep 30 12:24:20 crc kubenswrapper[4672]: I0930 12:24:20.619196 4672 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock
Sep 30 12:24:20 crc kubenswrapper[4672]: I0930 12:24:20.619791 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"a2ad880f5653de113f9c361b439564547cf795abb18d4f894c3112150ef34aaf"}
Sep 30 12:24:20 crc kubenswrapper[4672]: I0930 12:24:20.619840 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"df462ff5d437c7a7e25138ab134d82644a1fc2586cc0bd63b6b47d253834347f"}
Sep 30 12:24:20 crc kubenswrapper[4672]: I0930 12:24:20.620554 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mlrw8\" (UID: \"59e30e9f-8395-4abd-8adf-b964ac1cbb0b\") " pod="openshift-image-registry/image-registry-697d97f7c8-mlrw8"
Sep 30 12:24:20 crc kubenswrapper[4672]: I0930 12:24:20.620637 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/42146b78-abcf-44bd-8a8d-6048d9fa0a01-catalog-content\") pod \"community-operators-52587\" (UID: \"42146b78-abcf-44bd-8a8d-6048d9fa0a01\") " pod="openshift-marketplace/community-operators-52587"
Sep 30 12:24:20 crc kubenswrapper[4672]: I0930 12:24:20.620699 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2ccjh\" (UniqueName: \"kubernetes.io/projected/42146b78-abcf-44bd-8a8d-6048d9fa0a01-kube-api-access-2ccjh\") pod \"community-operators-52587\" (UID: \"42146b78-abcf-44bd-8a8d-6048d9fa0a01\") " pod="openshift-marketplace/community-operators-52587"
Sep 30 12:24:20 crc kubenswrapper[4672]: I0930 12:24:20.620734 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/42146b78-abcf-44bd-8a8d-6048d9fa0a01-utilities\") pod \"community-operators-52587\" (UID: \"42146b78-abcf-44bd-8a8d-6048d9fa0a01\") " pod="openshift-marketplace/community-operators-52587"
Sep 30 12:24:20 crc kubenswrapper[4672]: I0930 12:24:20.627882 4672 generic.go:334] "Generic (PLEG): container finished" podID="2acf05e1-f152-4432-b0b2-a44b242d0308" containerID="2b5948e4f052ce096dd3fca3d3b8f8343dc8d91ed7163efd1bc1fbd512cd3647" exitCode=0
Sep 30 12:24:20 crc kubenswrapper[4672]: I0930 12:24:20.627953 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320575-tsgk2" event={"ID":"2acf05e1-f152-4432-b0b2-a44b242d0308","Type":"ContainerDied","Data":"2b5948e4f052ce096dd3fca3d3b8f8343dc8d91ed7163efd1bc1fbd512cd3647"}
Sep 30 12:24:20 crc kubenswrapper[4672]: I0930 12:24:20.631133 4672 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Sep 30 12:24:20 crc kubenswrapper[4672]: I0930 12:24:20.631172 4672 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mlrw8\" (UID: \"59e30e9f-8395-4abd-8adf-b964ac1cbb0b\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-mlrw8"
Sep 30 12:24:20 crc kubenswrapper[4672]: I0930 12:24:20.664614 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-r4gx9" podStartSLOduration=10.664590154999999 podStartE2EDuration="10.664590155s" podCreationTimestamp="2025-09-30 12:24:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 12:24:20.659611242 +0000 UTC m=+151.928848888" watchObservedRunningTime="2025-09-30 12:24:20.664590155 +0000 UTC m=+151.933827801"
Sep 30 12:24:20 crc kubenswrapper[4672]: I0930 12:24:20.729347 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mlrw8\" (UID: \"59e30e9f-8395-4abd-8adf-b964ac1cbb0b\") " pod="openshift-image-registry/image-registry-697d97f7c8-mlrw8"
Sep 30 12:24:20 crc kubenswrapper[4672]: I0930 12:24:20.731047 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/42146b78-abcf-44bd-8a8d-6048d9fa0a01-catalog-content\") pod \"community-operators-52587\" (UID: \"42146b78-abcf-44bd-8a8d-6048d9fa0a01\") " pod="openshift-marketplace/community-operators-52587"
Sep 30 12:24:20 crc kubenswrapper[4672]: I0930 12:24:20.731153 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2ccjh\" (UniqueName: \"kubernetes.io/projected/42146b78-abcf-44bd-8a8d-6048d9fa0a01-kube-api-access-2ccjh\") pod \"community-operators-52587\" (UID: \"42146b78-abcf-44bd-8a8d-6048d9fa0a01\") " pod="openshift-marketplace/community-operators-52587"
Sep 30 12:24:20 crc kubenswrapper[4672]: I0930 12:24:20.731184 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/42146b78-abcf-44bd-8a8d-6048d9fa0a01-utilities\") pod \"community-operators-52587\" (UID: \"42146b78-abcf-44bd-8a8d-6048d9fa0a01\") " pod="openshift-marketplace/community-operators-52587"
Sep 30 12:24:20 crc kubenswrapper[4672]: I0930 12:24:20.731672 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/42146b78-abcf-44bd-8a8d-6048d9fa0a01-utilities\") pod \"community-operators-52587\" (UID: \"42146b78-abcf-44bd-8a8d-6048d9fa0a01\") " pod="openshift-marketplace/community-operators-52587"
Sep 30 12:24:20 crc kubenswrapper[4672]: I0930 12:24:20.738187 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/42146b78-abcf-44bd-8a8d-6048d9fa0a01-catalog-content\") pod \"community-operators-52587\" (UID: \"42146b78-abcf-44bd-8a8d-6048d9fa0a01\") " pod="openshift-marketplace/community-operators-52587"
Sep 30 12:24:20 crc kubenswrapper[4672]: I0930 12:24:20.751988 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-hflds"]
Sep 30 12:24:20 crc kubenswrapper[4672]: I0930 12:24:20.754977 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hflds"
Sep 30 12:24:20 crc kubenswrapper[4672]: I0930 12:24:20.759612 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g"
Sep 30 12:24:20 crc kubenswrapper[4672]: I0930 12:24:20.761295 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2ccjh\" (UniqueName: \"kubernetes.io/projected/42146b78-abcf-44bd-8a8d-6048d9fa0a01-kube-api-access-2ccjh\") pod \"community-operators-52587\" (UID: \"42146b78-abcf-44bd-8a8d-6048d9fa0a01\") " pod="openshift-marketplace/community-operators-52587"
Sep 30 12:24:20 crc kubenswrapper[4672]: I0930 12:24:20.766101 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hflds"]
Sep 30 12:24:20 crc kubenswrapper[4672]: I0930 12:24:20.785531 4672 patch_prober.go:28] interesting pod/router-default-5444994796-52jss container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Sep 30 12:24:20 crc kubenswrapper[4672]: [-]has-synced failed: reason withheld
Sep 30 12:24:20 crc kubenswrapper[4672]: [+]process-running ok
Sep 30 12:24:20 crc kubenswrapper[4672]: healthz check failed
Sep 30 12:24:20 crc kubenswrapper[4672]: I0930 12:24:20.785631 4672 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-52jss" podUID="b11da5bc-b91d-4a8e-8839-da0f3989618e" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Sep 30 12:24:20 crc kubenswrapper[4672]: I0930 12:24:20.828620 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-mlrw8"
Sep 30 12:24:20 crc kubenswrapper[4672]: I0930 12:24:20.832956 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Sep 30 12:24:20 crc kubenswrapper[4672]: I0930 12:24:20.844429 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-52587"
Sep 30 12:24:20 crc kubenswrapper[4672]: I0930 12:24:20.921944 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-gfws4"]
Sep 30 12:24:20 crc kubenswrapper[4672]: I0930 12:24:20.923181 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-gfws4"
Sep 30 12:24:20 crc kubenswrapper[4672]: I0930 12:24:20.935405 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-96cvg\" (UniqueName: \"kubernetes.io/projected/fc47647e-d94f-4cca-8e3f-50cab43126ab-kube-api-access-96cvg\") pod \"certified-operators-hflds\" (UID: \"fc47647e-d94f-4cca-8e3f-50cab43126ab\") " pod="openshift-marketplace/certified-operators-hflds"
Sep 30 12:24:20 crc kubenswrapper[4672]: I0930 12:24:20.935485 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc47647e-d94f-4cca-8e3f-50cab43126ab-catalog-content\") pod \"certified-operators-hflds\" (UID: \"fc47647e-d94f-4cca-8e3f-50cab43126ab\") " pod="openshift-marketplace/certified-operators-hflds"
Sep 30 12:24:20 crc kubenswrapper[4672]: I0930 12:24:20.935542 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc47647e-d94f-4cca-8e3f-50cab43126ab-utilities\") pod \"certified-operators-hflds\" (UID: \"fc47647e-d94f-4cca-8e3f-50cab43126ab\") " pod="openshift-marketplace/certified-operators-hflds"
Sep 30 12:24:20 crc kubenswrapper[4672]: I0930 12:24:20.935692 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-gfws4"]
Sep 30 12:24:20 crc kubenswrapper[4672]: I0930 12:24:20.962715 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue ""
Sep 30 12:24:21 crc kubenswrapper[4672]: I0930 12:24:21.013735 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5wh44"
Sep 30 12:24:21 crc kubenswrapper[4672]: I0930 12:24:21.037362 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc47647e-d94f-4cca-8e3f-50cab43126ab-utilities\") pod \"certified-operators-hflds\" (UID: \"fc47647e-d94f-4cca-8e3f-50cab43126ab\") " pod="openshift-marketplace/certified-operators-hflds"
Sep 30 12:24:21 crc kubenswrapper[4672]: I0930 12:24:21.037453 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-96cvg\" (UniqueName: \"kubernetes.io/projected/fc47647e-d94f-4cca-8e3f-50cab43126ab-kube-api-access-96cvg\") pod \"certified-operators-hflds\" (UID: \"fc47647e-d94f-4cca-8e3f-50cab43126ab\") " pod="openshift-marketplace/certified-operators-hflds"
Sep 30 12:24:21 crc kubenswrapper[4672]: I0930 12:24:21.037557 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc47647e-d94f-4cca-8e3f-50cab43126ab-catalog-content\") pod \"certified-operators-hflds\" (UID: \"fc47647e-d94f-4cca-8e3f-50cab43126ab\") " pod="openshift-marketplace/certified-operators-hflds"
Sep 30 12:24:21 crc kubenswrapper[4672]: I0930 12:24:21.037588 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3a02c3e3-f673-41d6-950c-0fef1950613e-utilities\") pod \"community-operators-gfws4\" (UID: \"3a02c3e3-f673-41d6-950c-0fef1950613e\") " pod="openshift-marketplace/community-operators-gfws4"
Sep 30 12:24:21 crc kubenswrapper[4672]: I0930 12:24:21.037616 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jlv44\" (UniqueName: \"kubernetes.io/projected/3a02c3e3-f673-41d6-950c-0fef1950613e-kube-api-access-jlv44\") pod \"community-operators-gfws4\" (UID: \"3a02c3e3-f673-41d6-950c-0fef1950613e\") " pod="openshift-marketplace/community-operators-gfws4"
Sep 30 12:24:21 crc kubenswrapper[4672]: I0930 12:24:21.037653 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3a02c3e3-f673-41d6-950c-0fef1950613e-catalog-content\") pod \"community-operators-gfws4\" (UID: \"3a02c3e3-f673-41d6-950c-0fef1950613e\") " pod="openshift-marketplace/community-operators-gfws4"
Sep 30 12:24:21 crc kubenswrapper[4672]: I0930 12:24:21.039029 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc47647e-d94f-4cca-8e3f-50cab43126ab-utilities\") pod \"certified-operators-hflds\" (UID: \"fc47647e-d94f-4cca-8e3f-50cab43126ab\") " pod="openshift-marketplace/certified-operators-hflds"
Sep 30 12:24:21 crc kubenswrapper[4672]: I0930 12:24:21.039536 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc47647e-d94f-4cca-8e3f-50cab43126ab-catalog-content\") pod \"certified-operators-hflds\" (UID: \"fc47647e-d94f-4cca-8e3f-50cab43126ab\") " pod="openshift-marketplace/certified-operators-hflds"
Sep 30 12:24:21 crc kubenswrapper[4672]: I0930 12:24:21.062011 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-96cvg\" (UniqueName: \"kubernetes.io/projected/fc47647e-d94f-4cca-8e3f-50cab43126ab-kube-api-access-96cvg\") pod \"certified-operators-hflds\" (UID: \"fc47647e-d94f-4cca-8e3f-50cab43126ab\") " pod="openshift-marketplace/certified-operators-hflds"
Sep 30 12:24:21 crc kubenswrapper[4672]: I0930 12:24:21.103127 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hflds"
Sep 30 12:24:21 crc kubenswrapper[4672]: I0930 12:24:21.112859 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-7nwrz"]
Sep 30 12:24:21 crc kubenswrapper[4672]: I0930 12:24:21.115055 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7nwrz"
Sep 30 12:24:21 crc kubenswrapper[4672]: I0930 12:24:21.125674 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7nwrz"]
Sep 30 12:24:21 crc kubenswrapper[4672]: I0930 12:24:21.142443 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3a02c3e3-f673-41d6-950c-0fef1950613e-utilities\") pod \"community-operators-gfws4\" (UID: \"3a02c3e3-f673-41d6-950c-0fef1950613e\") " pod="openshift-marketplace/community-operators-gfws4"
Sep 30 12:24:21 crc kubenswrapper[4672]: I0930 12:24:21.142488 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jlv44\" (UniqueName: \"kubernetes.io/projected/3a02c3e3-f673-41d6-950c-0fef1950613e-kube-api-access-jlv44\") pod \"community-operators-gfws4\" (UID: \"3a02c3e3-f673-41d6-950c-0fef1950613e\") " pod="openshift-marketplace/community-operators-gfws4"
Sep 30 12:24:21 crc kubenswrapper[4672]: I0930 12:24:21.142531 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3a02c3e3-f673-41d6-950c-0fef1950613e-catalog-content\") pod \"community-operators-gfws4\" (UID: \"3a02c3e3-f673-41d6-950c-0fef1950613e\") " pod="openshift-marketplace/community-operators-gfws4"
Sep 30 12:24:21 crc kubenswrapper[4672]: I0930 12:24:21.143610 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3a02c3e3-f673-41d6-950c-0fef1950613e-catalog-content\") pod \"community-operators-gfws4\" (UID: \"3a02c3e3-f673-41d6-950c-0fef1950613e\") " pod="openshift-marketplace/community-operators-gfws4"
Sep 30 12:24:21 crc kubenswrapper[4672]: I0930 12:24:21.143998 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3a02c3e3-f673-41d6-950c-0fef1950613e-utilities\") pod \"community-operators-gfws4\" (UID: \"3a02c3e3-f673-41d6-950c-0fef1950613e\") " pod="openshift-marketplace/community-operators-gfws4"
Sep 30 12:24:21 crc kubenswrapper[4672]: I0930 12:24:21.154987 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-mlrw8"]
Sep 30 12:24:21 crc kubenswrapper[4672]: I0930 12:24:21.168608 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jlv44\" (UniqueName: \"kubernetes.io/projected/3a02c3e3-f673-41d6-950c-0fef1950613e-kube-api-access-jlv44\") pod \"community-operators-gfws4\" (UID: \"3a02c3e3-f673-41d6-950c-0fef1950613e\") " pod="openshift-marketplace/community-operators-gfws4"
Sep 30 12:24:21 crc kubenswrapper[4672]: I0930 12:24:21.244084 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6sn5r\" (UniqueName: \"kubernetes.io/projected/89089e3f-369b-4e92-b28b-369c2f3b3017-kube-api-access-6sn5r\") pod \"certified-operators-7nwrz\" (UID: \"89089e3f-369b-4e92-b28b-369c2f3b3017\") " pod="openshift-marketplace/certified-operators-7nwrz"
Sep 30 12:24:21 crc kubenswrapper[4672]: I0930 12:24:21.244478 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/89089e3f-369b-4e92-b28b-369c2f3b3017-catalog-content\") pod \"certified-operators-7nwrz\" (UID: \"89089e3f-369b-4e92-b28b-369c2f3b3017\") " pod="openshift-marketplace/certified-operators-7nwrz"
Sep 30 12:24:21 crc kubenswrapper[4672]: I0930 12:24:21.244502 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/89089e3f-369b-4e92-b28b-369c2f3b3017-utilities\") pod \"certified-operators-7nwrz\" (UID: \"89089e3f-369b-4e92-b28b-369c2f3b3017\") " pod="openshift-marketplace/certified-operators-7nwrz"
Sep 30 12:24:21 crc kubenswrapper[4672]: I0930 12:24:21.249470 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-gfws4"
Sep 30 12:24:21 crc kubenswrapper[4672]: I0930 12:24:21.270424 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-52587"]
Sep 30 12:24:21 crc kubenswrapper[4672]: I0930 12:24:21.348134 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6sn5r\" (UniqueName: \"kubernetes.io/projected/89089e3f-369b-4e92-b28b-369c2f3b3017-kube-api-access-6sn5r\") pod \"certified-operators-7nwrz\" (UID: \"89089e3f-369b-4e92-b28b-369c2f3b3017\") " pod="openshift-marketplace/certified-operators-7nwrz"
Sep 30 12:24:21 crc kubenswrapper[4672]: I0930 12:24:21.348236 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/89089e3f-369b-4e92-b28b-369c2f3b3017-catalog-content\") pod \"certified-operators-7nwrz\" (UID: \"89089e3f-369b-4e92-b28b-369c2f3b3017\") " pod="openshift-marketplace/certified-operators-7nwrz"
Sep 30 12:24:21 crc kubenswrapper[4672]: I0930 12:24:21.348287 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/89089e3f-369b-4e92-b28b-369c2f3b3017-utilities\") pod \"certified-operators-7nwrz\" (UID: \"89089e3f-369b-4e92-b28b-369c2f3b3017\") " pod="openshift-marketplace/certified-operators-7nwrz"
Sep 30 12:24:21 crc kubenswrapper[4672]: I0930 12:24:21.348836 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/89089e3f-369b-4e92-b28b-369c2f3b3017-utilities\") pod \"certified-operators-7nwrz\" (UID: \"89089e3f-369b-4e92-b28b-369c2f3b3017\") " pod="openshift-marketplace/certified-operators-7nwrz"
Sep 30 12:24:21 crc kubenswrapper[4672]: I0930 12:24:21.348907 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/89089e3f-369b-4e92-b28b-369c2f3b3017-catalog-content\") pod \"certified-operators-7nwrz\" (UID: \"89089e3f-369b-4e92-b28b-369c2f3b3017\") " pod="openshift-marketplace/certified-operators-7nwrz"
Sep 30 12:24:21 crc kubenswrapper[4672]: I0930 12:24:21.378100 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6sn5r\" (UniqueName: \"kubernetes.io/projected/89089e3f-369b-4e92-b28b-369c2f3b3017-kube-api-access-6sn5r\") pod \"certified-operators-7nwrz\" (UID: \"89089e3f-369b-4e92-b28b-369c2f3b3017\") " pod="openshift-marketplace/certified-operators-7nwrz"
Sep 30 12:24:21 crc kubenswrapper[4672]: I0930 12:24:21.401834 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-bj7dq"
Sep 30 12:24:21 crc kubenswrapper[4672]: I0930 12:24:21.408657 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-bj7dq"
Sep 30 12:24:21 crc kubenswrapper[4672]: I0930 12:24:21.438297 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes"
Sep 30 12:24:21 crc kubenswrapper[4672]: I0930 12:24:21.452014 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7nwrz"
Sep 30 12:24:21 crc kubenswrapper[4672]: I0930 12:24:21.478426 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hflds"]
Sep 30 12:24:21 crc kubenswrapper[4672]: W0930 12:24:21.485634 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfc47647e_d94f_4cca_8e3f_50cab43126ab.slice/crio-3fb96796f81a7efde267ccdec4b8040ca014854970537805925f7c4a8d9dab7e WatchSource:0}: Error finding container 3fb96796f81a7efde267ccdec4b8040ca014854970537805925f7c4a8d9dab7e: Status 404 returned error can't find the container with id 3fb96796f81a7efde267ccdec4b8040ca014854970537805925f7c4a8d9dab7e
Sep 30 12:24:21 crc kubenswrapper[4672]: I0930 12:24:21.646743 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-gfws4"]
Sep 30 12:24:21 crc kubenswrapper[4672]: I0930 12:24:21.650284 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hflds" event={"ID":"fc47647e-d94f-4cca-8e3f-50cab43126ab","Type":"ContainerStarted","Data":"3fb96796f81a7efde267ccdec4b8040ca014854970537805925f7c4a8d9dab7e"}
Sep 30 12:24:21 crc kubenswrapper[4672]: I0930 12:24:21.652246 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"344f11f0c2299e13138b6da1e19ab653325f69a1ae38401cc288d31b3bec4820"}
Sep 30 12:24:21 crc kubenswrapper[4672]: I0930 12:24:21.652325 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"cafdfa73f4c2f707e0427b96ceb3912546652f4502d8bb5c142ca63622ab6ddc"}
Sep 30 12:24:21 crc kubenswrapper[4672]: I0930 12:24:21.654408 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c"
Sep 30 12:24:21 crc kubenswrapper[4672]: I0930 12:24:21.669360 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod"
pod="openshift-image-registry/image-registry-697d97f7c8-mlrw8" event={"ID":"59e30e9f-8395-4abd-8adf-b964ac1cbb0b","Type":"ContainerStarted","Data":"95c658cf4db73f2f4f16675700967a9012c555267cd52910ee18d6c8d3b0d17f"} Sep 30 12:24:21 crc kubenswrapper[4672]: I0930 12:24:21.669426 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-mlrw8" event={"ID":"59e30e9f-8395-4abd-8adf-b964ac1cbb0b","Type":"ContainerStarted","Data":"046633e0e383dde4b9d62aeb6a13873e816c6bdb74bde3cb06316a537544f5ba"} Sep 30 12:24:21 crc kubenswrapper[4672]: I0930 12:24:21.669801 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-mlrw8" Sep 30 12:24:21 crc kubenswrapper[4672]: I0930 12:24:21.700040 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Sep 30 12:24:21 crc kubenswrapper[4672]: I0930 12:24:21.700165 4672 generic.go:334] "Generic (PLEG): container finished" podID="42146b78-abcf-44bd-8a8d-6048d9fa0a01" containerID="ce5ef87226c77462c432873c138486a80f3c3a7a0878b22325ccb202712bf8ef" exitCode=0 Sep 30 12:24:21 crc kubenswrapper[4672]: I0930 12:24:21.709340 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-52587" event={"ID":"42146b78-abcf-44bd-8a8d-6048d9fa0a01","Type":"ContainerDied","Data":"ce5ef87226c77462c432873c138486a80f3c3a7a0878b22325ccb202712bf8ef"} Sep 30 12:24:21 crc kubenswrapper[4672]: I0930 12:24:21.709386 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-52587" event={"ID":"42146b78-abcf-44bd-8a8d-6048d9fa0a01","Type":"ContainerStarted","Data":"aa5ccf906ad5517fd4c28e14cd8a4d521c234a25f84d109a6c2e500164cb5a51"} Sep 30 12:24:21 crc kubenswrapper[4672]: I0930 12:24:21.719768 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Sep 30 12:24:21 crc kubenswrapper[4672]: I0930 12:24:21.724015 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Sep 30 12:24:21 crc kubenswrapper[4672]: I0930 12:24:21.724241 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Sep 30 12:24:21 crc kubenswrapper[4672]: I0930 12:24:21.725991 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Sep 30 12:24:21 crc kubenswrapper[4672]: I0930 12:24:21.726642 4672 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Sep 30 12:24:21 crc kubenswrapper[4672]: I0930 12:24:21.735066 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-mlrw8" podStartSLOduration=131.734993096 podStartE2EDuration="2m11.734993096s" podCreationTimestamp="2025-09-30 12:22:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 12:24:21.72056972 +0000 UTC m=+152.989807386" watchObservedRunningTime="2025-09-30 12:24:21.734993096 +0000 UTC m=+153.004230752" Sep 30 12:24:21 crc kubenswrapper[4672]: I0930 12:24:21.757601 4672 patch_prober.go:28] interesting pod/router-default-5444994796-52jss container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Sep 30 12:24:21 crc kubenswrapper[4672]: [-]has-synced failed: reason withheld Sep 30 12:24:21 crc kubenswrapper[4672]: [+]process-running ok Sep 30 12:24:21 crc kubenswrapper[4672]: healthz check failed Sep 30 12:24:21 crc kubenswrapper[4672]: I0930 12:24:21.757661 4672 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-52jss" podUID="b11da5bc-b91d-4a8e-8839-da0f3989618e" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Sep 30 12:24:21 crc kubenswrapper[4672]: I0930 12:24:21.858898 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bc2fa305-9e65-4aee-9842-fe3274aa306f-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"bc2fa305-9e65-4aee-9842-fe3274aa306f\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Sep 30 12:24:21 crc kubenswrapper[4672]: I0930 12:24:21.859064 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bc2fa305-9e65-4aee-9842-fe3274aa306f-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"bc2fa305-9e65-4aee-9842-fe3274aa306f\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Sep 30 12:24:21 crc kubenswrapper[4672]: I0930 12:24:21.872595 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7nwrz"] Sep 30 12:24:21 crc kubenswrapper[4672]: I0930 12:24:21.967054 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bc2fa305-9e65-4aee-9842-fe3274aa306f-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: 
\"bc2fa305-9e65-4aee-9842-fe3274aa306f\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Sep 30 12:24:21 crc kubenswrapper[4672]: I0930 12:24:21.967152 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bc2fa305-9e65-4aee-9842-fe3274aa306f-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"bc2fa305-9e65-4aee-9842-fe3274aa306f\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Sep 30 12:24:21 crc kubenswrapper[4672]: I0930 12:24:21.967239 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bc2fa305-9e65-4aee-9842-fe3274aa306f-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"bc2fa305-9e65-4aee-9842-fe3274aa306f\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Sep 30 12:24:21 crc kubenswrapper[4672]: I0930 12:24:21.992505 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bc2fa305-9e65-4aee-9842-fe3274aa306f-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"bc2fa305-9e65-4aee-9842-fe3274aa306f\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Sep 30 12:24:22 crc kubenswrapper[4672]: I0930 12:24:22.065390 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320575-tsgk2" Sep 30 12:24:22 crc kubenswrapper[4672]: I0930 12:24:22.092671 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Sep 30 12:24:22 crc kubenswrapper[4672]: I0930 12:24:22.170547 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2acf05e1-f152-4432-b0b2-a44b242d0308-secret-volume\") pod \"2acf05e1-f152-4432-b0b2-a44b242d0308\" (UID: \"2acf05e1-f152-4432-b0b2-a44b242d0308\") " Sep 30 12:24:22 crc kubenswrapper[4672]: I0930 12:24:22.170646 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4h5w6\" (UniqueName: \"kubernetes.io/projected/2acf05e1-f152-4432-b0b2-a44b242d0308-kube-api-access-4h5w6\") pod \"2acf05e1-f152-4432-b0b2-a44b242d0308\" (UID: \"2acf05e1-f152-4432-b0b2-a44b242d0308\") " Sep 30 12:24:22 crc kubenswrapper[4672]: I0930 12:24:22.170769 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2acf05e1-f152-4432-b0b2-a44b242d0308-config-volume\") pod \"2acf05e1-f152-4432-b0b2-a44b242d0308\" (UID: \"2acf05e1-f152-4432-b0b2-a44b242d0308\") " Sep 30 12:24:22 crc kubenswrapper[4672]: I0930 12:24:22.172142 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2acf05e1-f152-4432-b0b2-a44b242d0308-config-volume" (OuterVolumeSpecName: "config-volume") pod "2acf05e1-f152-4432-b0b2-a44b242d0308" (UID: "2acf05e1-f152-4432-b0b2-a44b242d0308"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 12:24:22 crc kubenswrapper[4672]: I0930 12:24:22.175316 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2acf05e1-f152-4432-b0b2-a44b242d0308-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "2acf05e1-f152-4432-b0b2-a44b242d0308" (UID: "2acf05e1-f152-4432-b0b2-a44b242d0308"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:24:22 crc kubenswrapper[4672]: I0930 12:24:22.175732 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2acf05e1-f152-4432-b0b2-a44b242d0308-kube-api-access-4h5w6" (OuterVolumeSpecName: "kube-api-access-4h5w6") pod "2acf05e1-f152-4432-b0b2-a44b242d0308" (UID: "2acf05e1-f152-4432-b0b2-a44b242d0308"). InnerVolumeSpecName "kube-api-access-4h5w6". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 12:24:22 crc kubenswrapper[4672]: I0930 12:24:22.216718 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-rj74q" Sep 30 12:24:22 crc kubenswrapper[4672]: I0930 12:24:22.272443 4672 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2acf05e1-f152-4432-b0b2-a44b242d0308-secret-volume\") on node \"crc\" DevicePath \"\"" Sep 30 12:24:22 crc kubenswrapper[4672]: I0930 12:24:22.272485 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4h5w6\" (UniqueName: \"kubernetes.io/projected/2acf05e1-f152-4432-b0b2-a44b242d0308-kube-api-access-4h5w6\") on node \"crc\" DevicePath \"\"" Sep 30 12:24:22 crc kubenswrapper[4672]: I0930 12:24:22.272501 4672 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2acf05e1-f152-4432-b0b2-a44b242d0308-config-volume\") on node \"crc\" DevicePath \"\"" Sep 30 12:24:22 crc kubenswrapper[4672]: I0930 12:24:22.305078 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Sep 30 12:24:22 crc kubenswrapper[4672]: W0930 12:24:22.438949 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podbc2fa305_9e65_4aee_9842_fe3274aa306f.slice/crio-ca15568a56424f8a0b5d441024bec670173bc918b3a777cde5a279adcdbf768b WatchSource:0}: Error finding container ca15568a56424f8a0b5d441024bec670173bc918b3a777cde5a279adcdbf768b: Status 404 returned error can't find the container with id ca15568a56424f8a0b5d441024bec670173bc918b3a777cde5a279adcdbf768b Sep 30 12:24:22 crc kubenswrapper[4672]: I0930 12:24:22.689852 4672 patch_prober.go:28] interesting pod/downloads-7954f5f757-dlbv7 container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.23:8080/\": dial tcp 10.217.0.23:8080: connect: connection refused" start-of-body= Sep 30 12:24:22 crc kubenswrapper[4672]: I0930 12:24:22.689869 4672 patch_prober.go:28] interesting pod/downloads-7954f5f757-dlbv7 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.23:8080/\": dial tcp 10.217.0.23:8080: connect: connection refused" start-of-body= Sep 30 12:24:22 crc kubenswrapper[4672]: I0930 12:24:22.689950 4672 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-dlbv7" podUID="7fb8bbc0-c63a-4ab5-b454-13682563fe31" 
containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.23:8080/\": dial tcp 10.217.0.23:8080: connect: connection refused" Sep 30 12:24:22 crc kubenswrapper[4672]: I0930 12:24:22.690133 4672 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-dlbv7" podUID="7fb8bbc0-c63a-4ab5-b454-13682563fe31" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.23:8080/\": dial tcp 10.217.0.23:8080: connect: connection refused" Sep 30 12:24:22 crc kubenswrapper[4672]: I0930 12:24:22.705059 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-gvbdw"] Sep 30 12:24:22 crc kubenswrapper[4672]: E0930 12:24:22.705448 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2acf05e1-f152-4432-b0b2-a44b242d0308" containerName="collect-profiles" Sep 30 12:24:22 crc kubenswrapper[4672]: I0930 12:24:22.705472 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="2acf05e1-f152-4432-b0b2-a44b242d0308" containerName="collect-profiles" Sep 30 12:24:22 crc kubenswrapper[4672]: I0930 12:24:22.705679 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="2acf05e1-f152-4432-b0b2-a44b242d0308" containerName="collect-profiles" Sep 30 12:24:22 crc kubenswrapper[4672]: I0930 12:24:22.706612 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gvbdw" Sep 30 12:24:22 crc kubenswrapper[4672]: I0930 12:24:22.708962 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Sep 30 12:24:22 crc kubenswrapper[4672]: I0930 12:24:22.713219 4672 generic.go:334] "Generic (PLEG): container finished" podID="fc47647e-d94f-4cca-8e3f-50cab43126ab" containerID="e353d583cff2f10befd47bf90c35fe72c2b61bcf5fac8d60711ac0cad0fa7479" exitCode=0 Sep 30 12:24:22 crc kubenswrapper[4672]: I0930 12:24:22.713298 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hflds" event={"ID":"fc47647e-d94f-4cca-8e3f-50cab43126ab","Type":"ContainerDied","Data":"e353d583cff2f10befd47bf90c35fe72c2b61bcf5fac8d60711ac0cad0fa7479"} Sep 30 12:24:22 crc kubenswrapper[4672]: I0930 12:24:22.717193 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320575-tsgk2" Sep 30 12:24:22 crc kubenswrapper[4672]: I0930 12:24:22.717182 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320575-tsgk2" event={"ID":"2acf05e1-f152-4432-b0b2-a44b242d0308","Type":"ContainerDied","Data":"162380361df785c6cdd425bd9caca58689c46083f9626f7fb29a7dcce520c37e"} Sep 30 12:24:22 crc kubenswrapper[4672]: I0930 12:24:22.717323 4672 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="162380361df785c6cdd425bd9caca58689c46083f9626f7fb29a7dcce520c37e" Sep 30 12:24:22 crc kubenswrapper[4672]: I0930 12:24:22.718313 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-gvbdw"] Sep 30 12:24:22 crc kubenswrapper[4672]: I0930 12:24:22.719847 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"bc2fa305-9e65-4aee-9842-fe3274aa306f","Type":"ContainerStarted","Data":"ca15568a56424f8a0b5d441024bec670173bc918b3a777cde5a279adcdbf768b"} Sep 30 12:24:22 crc kubenswrapper[4672]: I0930 12:24:22.721795 4672 generic.go:334] "Generic (PLEG): container finished" podID="3a02c3e3-f673-41d6-950c-0fef1950613e" containerID="096f4764936cc71b20326f0150979ac3930ab4a53983e219f5532fa56793baf9" exitCode=0 Sep 30 12:24:22 crc kubenswrapper[4672]: I0930 12:24:22.721869 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gfws4" event={"ID":"3a02c3e3-f673-41d6-950c-0fef1950613e","Type":"ContainerDied","Data":"096f4764936cc71b20326f0150979ac3930ab4a53983e219f5532fa56793baf9"} Sep 30 12:24:22 crc kubenswrapper[4672]: I0930 12:24:22.721905 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gfws4" event={"ID":"3a02c3e3-f673-41d6-950c-0fef1950613e","Type":"ContainerStarted","Data":"ab4d79f9013ece56dcdabb3aa576a78ec1708fee33c800f1bbb9423b0d8a37a8"} Sep 30 12:24:22 crc kubenswrapper[4672]: I0930 12:24:22.724022 4672 generic.go:334] "Generic (PLEG): container finished" podID="89089e3f-369b-4e92-b28b-369c2f3b3017" containerID="4652b7b0e7e489a38e939d4fe0891a0cd714eae23ec04b3a0a01a87290b6ba22" exitCode=0 Sep 30 12:24:22 crc kubenswrapper[4672]: I0930 12:24:22.724777 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7nwrz" event={"ID":"89089e3f-369b-4e92-b28b-369c2f3b3017","Type":"ContainerDied","Data":"4652b7b0e7e489a38e939d4fe0891a0cd714eae23ec04b3a0a01a87290b6ba22"} Sep 30 12:24:22 crc kubenswrapper[4672]: I0930 12:24:22.724800 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7nwrz" event={"ID":"89089e3f-369b-4e92-b28b-369c2f3b3017","Type":"ContainerStarted","Data":"fe8c1560e4ddc803dd4045c417cdc733ee66700c9aeab238a136c9b58af1095f"} Sep 30 12:24:22 crc kubenswrapper[4672]: I0930 12:24:22.761808 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-52jss" Sep 30 12:24:22 crc kubenswrapper[4672]: I0930 12:24:22.768563 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-52jss" Sep 30 12:24:22 crc kubenswrapper[4672]: I0930 12:24:22.883242 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/ac79497a-7f46-4ffd-9a7f-40153ab89a6d-catalog-content\") pod \"redhat-marketplace-gvbdw\" (UID: \"ac79497a-7f46-4ffd-9a7f-40153ab89a6d\") " pod="openshift-marketplace/redhat-marketplace-gvbdw" Sep 30 12:24:22 crc kubenswrapper[4672]: I0930 12:24:22.883765 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-72pwh\" (UniqueName: \"kubernetes.io/projected/ac79497a-7f46-4ffd-9a7f-40153ab89a6d-kube-api-access-72pwh\") pod \"redhat-marketplace-gvbdw\" (UID: \"ac79497a-7f46-4ffd-9a7f-40153ab89a6d\") " pod="openshift-marketplace/redhat-marketplace-gvbdw" Sep 30 12:24:22 crc kubenswrapper[4672]: I0930 12:24:22.883811 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac79497a-7f46-4ffd-9a7f-40153ab89a6d-utilities\") pod \"redhat-marketplace-gvbdw\" (UID: \"ac79497a-7f46-4ffd-9a7f-40153ab89a6d\") " pod="openshift-marketplace/redhat-marketplace-gvbdw" Sep 30 12:24:22 crc kubenswrapper[4672]: I0930 12:24:22.985857 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac79497a-7f46-4ffd-9a7f-40153ab89a6d-catalog-content\") pod \"redhat-marketplace-gvbdw\" (UID: \"ac79497a-7f46-4ffd-9a7f-40153ab89a6d\") " pod="openshift-marketplace/redhat-marketplace-gvbdw" Sep 30 12:24:22 crc kubenswrapper[4672]: I0930 12:24:22.985961 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-72pwh\" (UniqueName: \"kubernetes.io/projected/ac79497a-7f46-4ffd-9a7f-40153ab89a6d-kube-api-access-72pwh\") pod \"redhat-marketplace-gvbdw\" (UID: \"ac79497a-7f46-4ffd-9a7f-40153ab89a6d\") " pod="openshift-marketplace/redhat-marketplace-gvbdw" Sep 30 12:24:22 crc kubenswrapper[4672]: I0930 12:24:22.985997 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac79497a-7f46-4ffd-9a7f-40153ab89a6d-utilities\") pod \"redhat-marketplace-gvbdw\" (UID: \"ac79497a-7f46-4ffd-9a7f-40153ab89a6d\") " pod="openshift-marketplace/redhat-marketplace-gvbdw" Sep 30 12:24:22 crc kubenswrapper[4672]: I0930 12:24:22.986484 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac79497a-7f46-4ffd-9a7f-40153ab89a6d-catalog-content\") pod \"redhat-marketplace-gvbdw\" (UID: \"ac79497a-7f46-4ffd-9a7f-40153ab89a6d\") " pod="openshift-marketplace/redhat-marketplace-gvbdw" Sep 30 12:24:22 crc kubenswrapper[4672]: I0930 12:24:22.986626 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac79497a-7f46-4ffd-9a7f-40153ab89a6d-utilities\") pod \"redhat-marketplace-gvbdw\" (UID: \"ac79497a-7f46-4ffd-9a7f-40153ab89a6d\") " pod="openshift-marketplace/redhat-marketplace-gvbdw" Sep 30 12:24:23 crc kubenswrapper[4672]: I0930 12:24:23.030285 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-72pwh\" (UniqueName: \"kubernetes.io/projected/ac79497a-7f46-4ffd-9a7f-40153ab89a6d-kube-api-access-72pwh\") pod \"redhat-marketplace-gvbdw\" (UID: \"ac79497a-7f46-4ffd-9a7f-40153ab89a6d\") " pod="openshift-marketplace/redhat-marketplace-gvbdw" Sep 30 12:24:23 crc kubenswrapper[4672]: I0930 12:24:23.077971 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gvbdw" Sep 30 12:24:23 crc kubenswrapper[4672]: I0930 12:24:23.121415 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-k9p4c"] Sep 30 12:24:23 crc kubenswrapper[4672]: I0930 12:24:23.122835 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-k9p4c" Sep 30 12:24:23 crc kubenswrapper[4672]: I0930 12:24:23.130693 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-k9p4c"] Sep 30 12:24:23 crc kubenswrapper[4672]: I0930 12:24:23.189289 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wh8q5\" (UniqueName: \"kubernetes.io/projected/c0d02c8d-95c7-4bd2-9955-f633cfea5e4e-kube-api-access-wh8q5\") pod \"redhat-marketplace-k9p4c\" (UID: \"c0d02c8d-95c7-4bd2-9955-f633cfea5e4e\") " pod="openshift-marketplace/redhat-marketplace-k9p4c" Sep 30 12:24:23 crc kubenswrapper[4672]: I0930 12:24:23.189379 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c0d02c8d-95c7-4bd2-9955-f633cfea5e4e-catalog-content\") pod \"redhat-marketplace-k9p4c\" (UID: \"c0d02c8d-95c7-4bd2-9955-f633cfea5e4e\") " pod="openshift-marketplace/redhat-marketplace-k9p4c" Sep 30 12:24:23 crc kubenswrapper[4672]: I0930 12:24:23.189460 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c0d02c8d-95c7-4bd2-9955-f633cfea5e4e-utilities\") pod \"redhat-marketplace-k9p4c\" (UID: \"c0d02c8d-95c7-4bd2-9955-f633cfea5e4e\") " pod="openshift-marketplace/redhat-marketplace-k9p4c" Sep 30 12:24:23 crc kubenswrapper[4672]: I0930 12:24:23.291919 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wh8q5\" (UniqueName: \"kubernetes.io/projected/c0d02c8d-95c7-4bd2-9955-f633cfea5e4e-kube-api-access-wh8q5\") pod \"redhat-marketplace-k9p4c\" (UID: \"c0d02c8d-95c7-4bd2-9955-f633cfea5e4e\") " pod="openshift-marketplace/redhat-marketplace-k9p4c" Sep 30 12:24:23 crc kubenswrapper[4672]: I0930 12:24:23.292384 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c0d02c8d-95c7-4bd2-9955-f633cfea5e4e-catalog-content\") pod \"redhat-marketplace-k9p4c\" (UID: \"c0d02c8d-95c7-4bd2-9955-f633cfea5e4e\") " pod="openshift-marketplace/redhat-marketplace-k9p4c" Sep 30 12:24:23 crc kubenswrapper[4672]: I0930 12:24:23.292593 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c0d02c8d-95c7-4bd2-9955-f633cfea5e4e-utilities\") pod \"redhat-marketplace-k9p4c\" (UID: \"c0d02c8d-95c7-4bd2-9955-f633cfea5e4e\") " pod="openshift-marketplace/redhat-marketplace-k9p4c" Sep 30 12:24:23 crc kubenswrapper[4672]: I0930 12:24:23.293214 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c0d02c8d-95c7-4bd2-9955-f633cfea5e4e-utilities\") pod \"redhat-marketplace-k9p4c\" (UID: \"c0d02c8d-95c7-4bd2-9955-f633cfea5e4e\") " pod="openshift-marketplace/redhat-marketplace-k9p4c" Sep 30 12:24:23 crc kubenswrapper[4672]: I0930 12:24:23.293703 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c0d02c8d-95c7-4bd2-9955-f633cfea5e4e-catalog-content\") pod \"redhat-marketplace-k9p4c\" (UID: \"c0d02c8d-95c7-4bd2-9955-f633cfea5e4e\") " pod="openshift-marketplace/redhat-marketplace-k9p4c" Sep 30 12:24:23 crc kubenswrapper[4672]: I0930 12:24:23.301611 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-x8stp" Sep 30 12:24:23 crc kubenswrapper[4672]: I0930 12:24:23.301648 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-x8stp" Sep 30 12:24:23 crc kubenswrapper[4672]: I0930 12:24:23.312911 4672 patch_prober.go:28] interesting pod/console-f9d7485db-x8stp container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.14:8443/health\": dial tcp 10.217.0.14:8443: connect: connection refused" start-of-body= Sep 30 12:24:23 crc kubenswrapper[4672]: I0930 12:24:23.312976 4672 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-x8stp" podUID="6eecbbdb-82a8-4b0d-860d-f6c3f4152a04" containerName="console" probeResult="failure" output="Get \"https://10.217.0.14:8443/health\": dial tcp 10.217.0.14:8443: connect: connection refused" Sep 30 12:24:23 crc kubenswrapper[4672]: I0930 12:24:23.330966 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wh8q5\" (UniqueName: \"kubernetes.io/projected/c0d02c8d-95c7-4bd2-9955-f633cfea5e4e-kube-api-access-wh8q5\") pod \"redhat-marketplace-k9p4c\" (UID: \"c0d02c8d-95c7-4bd2-9955-f633cfea5e4e\") " pod="openshift-marketplace/redhat-marketplace-k9p4c" Sep 30 12:24:23 crc kubenswrapper[4672]: I0930 12:24:23.459789 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-gvbdw"] Sep 30 12:24:23 crc kubenswrapper[4672]: I0930 12:24:23.468122 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-k9p4c" Sep 30 12:24:23 crc kubenswrapper[4672]: W0930 12:24:23.472051 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podac79497a_7f46_4ffd_9a7f_40153ab89a6d.slice/crio-c49a97d41fd5a70773794191be2d352bc23f3f359ab84b9c54e0b1de94813bc6 WatchSource:0}: Error finding container c49a97d41fd5a70773794191be2d352bc23f3f359ab84b9c54e0b1de94813bc6: Status 404 returned error can't find the container with id c49a97d41fd5a70773794191be2d352bc23f3f359ab84b9c54e0b1de94813bc6 Sep 30 12:24:23 crc kubenswrapper[4672]: I0930 12:24:23.711694 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-5h54f"] Sep 30 12:24:23 crc kubenswrapper[4672]: I0930 12:24:23.714686 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-5h54f" Sep 30 12:24:23 crc kubenswrapper[4672]: I0930 12:24:23.717763 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Sep 30 12:24:23 crc kubenswrapper[4672]: I0930 12:24:23.733048 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5h54f"] Sep 30 12:24:23 crc kubenswrapper[4672]: I0930 12:24:23.745484 4672 generic.go:334] "Generic (PLEG): container finished" podID="ac79497a-7f46-4ffd-9a7f-40153ab89a6d" containerID="78a1732e943ded0724e4ec527e7aacf3df280d91967e2a4ffb63030c1d8002ca" exitCode=0 Sep 30 12:24:23 crc kubenswrapper[4672]: I0930 12:24:23.745597 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gvbdw" event={"ID":"ac79497a-7f46-4ffd-9a7f-40153ab89a6d","Type":"ContainerDied","Data":"78a1732e943ded0724e4ec527e7aacf3df280d91967e2a4ffb63030c1d8002ca"} Sep 30 12:24:23 crc kubenswrapper[4672]: I0930 12:24:23.745661 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gvbdw" event={"ID":"ac79497a-7f46-4ffd-9a7f-40153ab89a6d","Type":"ContainerStarted","Data":"c49a97d41fd5a70773794191be2d352bc23f3f359ab84b9c54e0b1de94813bc6"} Sep 30 12:24:23 crc kubenswrapper[4672]: I0930 12:24:23.748667 4672 generic.go:334] "Generic (PLEG): container finished" podID="bc2fa305-9e65-4aee-9842-fe3274aa306f" containerID="04c9813bbb55819d6825a6790ae755525f04544c23f77a501cf1299e251ef8ea" exitCode=0 Sep 30 12:24:23 crc kubenswrapper[4672]: I0930 12:24:23.749922 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"bc2fa305-9e65-4aee-9842-fe3274aa306f","Type":"ContainerDied","Data":"04c9813bbb55819d6825a6790ae755525f04544c23f77a501cf1299e251ef8ea"} Sep 30 12:24:23 crc kubenswrapper[4672]: I0930 12:24:23.757394 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-52jss" Sep 30 12:24:23 crc kubenswrapper[4672]: I0930 12:24:23.808659 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3588660c-1eb4-4173-87c4-7bf4f6200851-catalog-content\") pod \"redhat-operators-5h54f\" (UID: \"3588660c-1eb4-4173-87c4-7bf4f6200851\") " pod="openshift-marketplace/redhat-operators-5h54f" Sep 30 12:24:23 crc kubenswrapper[4672]: I0930 12:24:23.808739 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8mst6\" (UniqueName: \"kubernetes.io/projected/3588660c-1eb4-4173-87c4-7bf4f6200851-kube-api-access-8mst6\") pod \"redhat-operators-5h54f\" (UID: \"3588660c-1eb4-4173-87c4-7bf4f6200851\") " pod="openshift-marketplace/redhat-operators-5h54f" Sep 30 12:24:23 crc kubenswrapper[4672]: I0930 12:24:23.808777 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3588660c-1eb4-4173-87c4-7bf4f6200851-utilities\") pod \"redhat-operators-5h54f\" (UID: \"3588660c-1eb4-4173-87c4-7bf4f6200851\") " pod="openshift-marketplace/redhat-operators-5h54f" Sep 30 12:24:23 crc kubenswrapper[4672]: I0930 12:24:23.910747 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/3588660c-1eb4-4173-87c4-7bf4f6200851-catalog-content\") pod \"redhat-operators-5h54f\" (UID: \"3588660c-1eb4-4173-87c4-7bf4f6200851\") " pod="openshift-marketplace/redhat-operators-5h54f" Sep 30 12:24:23 crc kubenswrapper[4672]: I0930 12:24:23.910877 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8mst6\" (UniqueName: \"kubernetes.io/projected/3588660c-1eb4-4173-87c4-7bf4f6200851-kube-api-access-8mst6\") pod \"redhat-operators-5h54f\" (UID: \"3588660c-1eb4-4173-87c4-7bf4f6200851\") " pod="openshift-marketplace/redhat-operators-5h54f" Sep 30 12:24:23 crc kubenswrapper[4672]: I0930 12:24:23.910974 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3588660c-1eb4-4173-87c4-7bf4f6200851-utilities\") pod \"redhat-operators-5h54f\" (UID: \"3588660c-1eb4-4173-87c4-7bf4f6200851\") " pod="openshift-marketplace/redhat-operators-5h54f" Sep 30 12:24:23 crc kubenswrapper[4672]: I0930 12:24:23.912180 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3588660c-1eb4-4173-87c4-7bf4f6200851-catalog-content\") pod \"redhat-operators-5h54f\" (UID: \"3588660c-1eb4-4173-87c4-7bf4f6200851\") " pod="openshift-marketplace/redhat-operators-5h54f" Sep 30 12:24:23 crc kubenswrapper[4672]: I0930 12:24:23.912196 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3588660c-1eb4-4173-87c4-7bf4f6200851-utilities\") pod \"redhat-operators-5h54f\" (UID: \"3588660c-1eb4-4173-87c4-7bf4f6200851\") " pod="openshift-marketplace/redhat-operators-5h54f" Sep 30 12:24:23 crc kubenswrapper[4672]: I0930 12:24:23.922914 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-k9p4c"] Sep 30 12:24:23 crc kubenswrapper[4672]: I0930 12:24:23.943871 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8mst6\" (UniqueName: \"kubernetes.io/projected/3588660c-1eb4-4173-87c4-7bf4f6200851-kube-api-access-8mst6\") pod \"redhat-operators-5h54f\" (UID: \"3588660c-1eb4-4173-87c4-7bf4f6200851\") " pod="openshift-marketplace/redhat-operators-5h54f" Sep 30 12:24:23 crc kubenswrapper[4672]: W0930 12:24:23.954802 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc0d02c8d_95c7_4bd2_9955_f633cfea5e4e.slice/crio-b8a070215a388ccac83c2fc26a0ea1a6a23747cc807ee8c47857f57cbe72e03d WatchSource:0}: Error finding container b8a070215a388ccac83c2fc26a0ea1a6a23747cc807ee8c47857f57cbe72e03d: Status 404 returned error can't find the container with id b8a070215a388ccac83c2fc26a0ea1a6a23747cc807ee8c47857f57cbe72e03d Sep 30 12:24:24 crc kubenswrapper[4672]: I0930 12:24:24.047032 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5h54f" Sep 30 12:24:24 crc kubenswrapper[4672]: I0930 12:24:24.106995 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-vkwtk"] Sep 30 12:24:24 crc kubenswrapper[4672]: I0930 12:24:24.108067 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-vkwtk" Sep 30 12:24:24 crc kubenswrapper[4672]: I0930 12:24:24.118709 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vkwtk"] Sep 30 12:24:24 crc kubenswrapper[4672]: I0930 12:24:24.221601 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qghl8\" (UniqueName: \"kubernetes.io/projected/6cff532e-6e66-4d8f-9e9a-44a47aabc381-kube-api-access-qghl8\") pod \"redhat-operators-vkwtk\" (UID: \"6cff532e-6e66-4d8f-9e9a-44a47aabc381\") " pod="openshift-marketplace/redhat-operators-vkwtk" Sep 30 12:24:24 crc kubenswrapper[4672]: I0930 12:24:24.222067 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6cff532e-6e66-4d8f-9e9a-44a47aabc381-catalog-content\") pod \"redhat-operators-vkwtk\" (UID: \"6cff532e-6e66-4d8f-9e9a-44a47aabc381\") " pod="openshift-marketplace/redhat-operators-vkwtk" Sep 30 12:24:24 crc kubenswrapper[4672]: I0930 12:24:24.222225 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6cff532e-6e66-4d8f-9e9a-44a47aabc381-utilities\") pod \"redhat-operators-vkwtk\" (UID: \"6cff532e-6e66-4d8f-9e9a-44a47aabc381\") " pod="openshift-marketplace/redhat-operators-vkwtk" Sep 30 12:24:24 crc kubenswrapper[4672]: I0930 12:24:24.323757 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qghl8\" (UniqueName: \"kubernetes.io/projected/6cff532e-6e66-4d8f-9e9a-44a47aabc381-kube-api-access-qghl8\") pod \"redhat-operators-vkwtk\" (UID: \"6cff532e-6e66-4d8f-9e9a-44a47aabc381\") " pod="openshift-marketplace/redhat-operators-vkwtk" Sep 30 12:24:24 crc kubenswrapper[4672]: I0930 12:24:24.323864 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6cff532e-6e66-4d8f-9e9a-44a47aabc381-catalog-content\") pod \"redhat-operators-vkwtk\" (UID: \"6cff532e-6e66-4d8f-9e9a-44a47aabc381\") " pod="openshift-marketplace/redhat-operators-vkwtk" Sep 30 12:24:24 crc kubenswrapper[4672]: I0930 12:24:24.323905 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6cff532e-6e66-4d8f-9e9a-44a47aabc381-utilities\") pod \"redhat-operators-vkwtk\" (UID: \"6cff532e-6e66-4d8f-9e9a-44a47aabc381\") " pod="openshift-marketplace/redhat-operators-vkwtk" Sep 30 12:24:24 crc kubenswrapper[4672]: I0930 12:24:24.324738 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6cff532e-6e66-4d8f-9e9a-44a47aabc381-utilities\") pod \"redhat-operators-vkwtk\" (UID: \"6cff532e-6e66-4d8f-9e9a-44a47aabc381\") " pod="openshift-marketplace/redhat-operators-vkwtk" Sep 30 12:24:24 crc kubenswrapper[4672]: I0930 12:24:24.325413 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6cff532e-6e66-4d8f-9e9a-44a47aabc381-catalog-content\") pod \"redhat-operators-vkwtk\" (UID: \"6cff532e-6e66-4d8f-9e9a-44a47aabc381\") " pod="openshift-marketplace/redhat-operators-vkwtk" Sep 30 12:24:24 crc kubenswrapper[4672]: I0930 12:24:24.345422 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-qghl8\" (UniqueName: \"kubernetes.io/projected/6cff532e-6e66-4d8f-9e9a-44a47aabc381-kube-api-access-qghl8\") pod \"redhat-operators-vkwtk\" (UID: \"6cff532e-6e66-4d8f-9e9a-44a47aabc381\") " pod="openshift-marketplace/redhat-operators-vkwtk" Sep 30 12:24:24 crc kubenswrapper[4672]: I0930 12:24:24.376827 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5h54f"] Sep 30 12:24:24 crc kubenswrapper[4672]: W0930 12:24:24.388024 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3588660c_1eb4_4173_87c4_7bf4f6200851.slice/crio-8fa66ada2f9ef5befafd18480ec3a6754a119fcf8696c27e7038c7182669e9be WatchSource:0}: Error finding container 8fa66ada2f9ef5befafd18480ec3a6754a119fcf8696c27e7038c7182669e9be: Status 404 returned error can't find the container with id 8fa66ada2f9ef5befafd18480ec3a6754a119fcf8696c27e7038c7182669e9be Sep 30 12:24:24 crc kubenswrapper[4672]: I0930 12:24:24.450011 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vkwtk" Sep 30 12:24:24 crc kubenswrapper[4672]: I0930 12:24:24.742159 4672 patch_prober.go:28] interesting pod/machine-config-daemon-dpqrd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 12:24:24 crc kubenswrapper[4672]: I0930 12:24:24.742223 4672 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 12:24:24 crc kubenswrapper[4672]: I0930 12:24:24.758081 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vkwtk"] Sep 30 12:24:24 crc kubenswrapper[4672]: I0930 12:24:24.775711 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5h54f" event={"ID":"3588660c-1eb4-4173-87c4-7bf4f6200851","Type":"ContainerStarted","Data":"8fa66ada2f9ef5befafd18480ec3a6754a119fcf8696c27e7038c7182669e9be"} Sep 30 12:24:24 crc kubenswrapper[4672]: I0930 12:24:24.807016 4672 generic.go:334] "Generic (PLEG): container finished" podID="c0d02c8d-95c7-4bd2-9955-f633cfea5e4e" containerID="f4814398c6767c0b883350f9e9eb226142f248c0763b4c6579ef8f4c327350fe" exitCode=0 Sep 30 12:24:24 crc kubenswrapper[4672]: I0930 12:24:24.807304 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k9p4c" event={"ID":"c0d02c8d-95c7-4bd2-9955-f633cfea5e4e","Type":"ContainerDied","Data":"f4814398c6767c0b883350f9e9eb226142f248c0763b4c6579ef8f4c327350fe"} Sep 30 12:24:24 crc kubenswrapper[4672]: I0930 12:24:24.807382 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k9p4c" event={"ID":"c0d02c8d-95c7-4bd2-9955-f633cfea5e4e","Type":"ContainerStarted","Data":"b8a070215a388ccac83c2fc26a0ea1a6a23747cc807ee8c47857f57cbe72e03d"} Sep 30 12:24:24 crc kubenswrapper[4672]: I0930 12:24:24.979940 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Sep 30 12:24:24 crc kubenswrapper[4672]: I0930 12:24:24.981051 4672 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Sep 30 12:24:24 crc kubenswrapper[4672]: I0930 12:24:24.984239 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Sep 30 12:24:24 crc kubenswrapper[4672]: I0930 12:24:24.984462 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Sep 30 12:24:24 crc kubenswrapper[4672]: I0930 12:24:24.992352 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Sep 30 12:24:25 crc kubenswrapper[4672]: I0930 12:24:25.147511 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8eaae3fd-43c6-47fe-9374-7b42028645f9-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"8eaae3fd-43c6-47fe-9374-7b42028645f9\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Sep 30 12:24:25 crc kubenswrapper[4672]: I0930 12:24:25.147556 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8eaae3fd-43c6-47fe-9374-7b42028645f9-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"8eaae3fd-43c6-47fe-9374-7b42028645f9\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Sep 30 12:24:25 crc kubenswrapper[4672]: I0930 12:24:25.249007 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8eaae3fd-43c6-47fe-9374-7b42028645f9-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"8eaae3fd-43c6-47fe-9374-7b42028645f9\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Sep 30 12:24:25 crc kubenswrapper[4672]: I0930 12:24:25.249065 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8eaae3fd-43c6-47fe-9374-7b42028645f9-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"8eaae3fd-43c6-47fe-9374-7b42028645f9\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Sep 30 12:24:25 crc kubenswrapper[4672]: I0930 12:24:25.249193 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8eaae3fd-43c6-47fe-9374-7b42028645f9-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"8eaae3fd-43c6-47fe-9374-7b42028645f9\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Sep 30 12:24:25 crc kubenswrapper[4672]: I0930 12:24:25.318799 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8eaae3fd-43c6-47fe-9374-7b42028645f9-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"8eaae3fd-43c6-47fe-9374-7b42028645f9\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Sep 30 12:24:25 crc kubenswrapper[4672]: I0930 12:24:25.398151 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Sep 30 12:24:25 crc kubenswrapper[4672]: I0930 12:24:25.418172 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Sep 30 12:24:25 crc kubenswrapper[4672]: I0930 12:24:25.560850 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bc2fa305-9e65-4aee-9842-fe3274aa306f-kube-api-access\") pod \"bc2fa305-9e65-4aee-9842-fe3274aa306f\" (UID: \"bc2fa305-9e65-4aee-9842-fe3274aa306f\") " Sep 30 12:24:25 crc kubenswrapper[4672]: I0930 12:24:25.561018 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bc2fa305-9e65-4aee-9842-fe3274aa306f-kubelet-dir\") pod \"bc2fa305-9e65-4aee-9842-fe3274aa306f\" (UID: \"bc2fa305-9e65-4aee-9842-fe3274aa306f\") " Sep 30 12:24:25 crc kubenswrapper[4672]: I0930 12:24:25.561076 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bc2fa305-9e65-4aee-9842-fe3274aa306f-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "bc2fa305-9e65-4aee-9842-fe3274aa306f" (UID: "bc2fa305-9e65-4aee-9842-fe3274aa306f"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 12:24:25 crc kubenswrapper[4672]: I0930 12:24:25.561373 4672 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bc2fa305-9e65-4aee-9842-fe3274aa306f-kubelet-dir\") on node \"crc\" DevicePath \"\"" Sep 30 12:24:25 crc kubenswrapper[4672]: I0930 12:24:25.575085 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc2fa305-9e65-4aee-9842-fe3274aa306f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "bc2fa305-9e65-4aee-9842-fe3274aa306f" (UID: "bc2fa305-9e65-4aee-9842-fe3274aa306f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 12:24:25 crc kubenswrapper[4672]: I0930 12:24:25.662523 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bc2fa305-9e65-4aee-9842-fe3274aa306f-kube-api-access\") on node \"crc\" DevicePath \"\"" Sep 30 12:24:25 crc kubenswrapper[4672]: I0930 12:24:25.826463 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Sep 30 12:24:25 crc kubenswrapper[4672]: I0930 12:24:25.843048 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"bc2fa305-9e65-4aee-9842-fe3274aa306f","Type":"ContainerDied","Data":"ca15568a56424f8a0b5d441024bec670173bc918b3a777cde5a279adcdbf768b"} Sep 30 12:24:25 crc kubenswrapper[4672]: I0930 12:24:25.843091 4672 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ca15568a56424f8a0b5d441024bec670173bc918b3a777cde5a279adcdbf768b" Sep 30 12:24:25 crc kubenswrapper[4672]: I0930 12:24:25.843155 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Sep 30 12:24:25 crc kubenswrapper[4672]: I0930 12:24:25.869460 4672 generic.go:334] "Generic (PLEG): container finished" podID="3588660c-1eb4-4173-87c4-7bf4f6200851" containerID="7dbb3b313e6167a7f3bbe52643ddc3b40e3b4b0980dc0518b4f8b168781b6078" exitCode=0 Sep 30 12:24:25 crc kubenswrapper[4672]: I0930 12:24:25.869524 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5h54f" event={"ID":"3588660c-1eb4-4173-87c4-7bf4f6200851","Type":"ContainerDied","Data":"7dbb3b313e6167a7f3bbe52643ddc3b40e3b4b0980dc0518b4f8b168781b6078"} Sep 30 12:24:25 crc kubenswrapper[4672]: I0930 12:24:25.922859 4672 generic.go:334] "Generic (PLEG): container finished" podID="6cff532e-6e66-4d8f-9e9a-44a47aabc381" containerID="ec0b5c8069179ded77ab10cf8ff2a5441f327df1cd6d68733fa203d24268e403" exitCode=0 Sep 30 12:24:25 crc kubenswrapper[4672]: I0930 12:24:25.922995 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vkwtk" event={"ID":"6cff532e-6e66-4d8f-9e9a-44a47aabc381","Type":"ContainerDied","Data":"ec0b5c8069179ded77ab10cf8ff2a5441f327df1cd6d68733fa203d24268e403"} Sep 30 12:24:25 crc kubenswrapper[4672]: I0930 12:24:25.923209 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vkwtk" event={"ID":"6cff532e-6e66-4d8f-9e9a-44a47aabc381","Type":"ContainerStarted","Data":"9663019dbf5957b590254956eae638165384f2b420358ea2ec0a4a2ad7c320e8"} Sep 30 12:24:26 crc kubenswrapper[4672]: I0930 12:24:26.272925 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-nznsk" Sep 30 12:24:26 crc kubenswrapper[4672]: I0930 12:24:26.956411 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"8eaae3fd-43c6-47fe-9374-7b42028645f9","Type":"ContainerStarted","Data":"a157a55f2fdbb9e9fd30b85154bc1d4fc330451fa4a884ec015e1594452f51a2"} Sep 30 12:24:26 crc kubenswrapper[4672]: I0930 12:24:26.956751 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"8eaae3fd-43c6-47fe-9374-7b42028645f9","Type":"ContainerStarted","Data":"55b77b4eed76a54bb03e708d3ccc8f0a223b1a893835ec8da72c3fa8658bdf30"} Sep 30 12:24:26 crc kubenswrapper[4672]: I0930 12:24:26.977492 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=2.977473938 podStartE2EDuration="2.977473938s" podCreationTimestamp="2025-09-30 12:24:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 12:24:26.976078177 +0000 UTC m=+158.245315823" watchObservedRunningTime="2025-09-30 12:24:26.977473938 +0000 UTC m=+158.246711584" Sep 30 12:24:27 crc kubenswrapper[4672]: I0930 12:24:27.979906 4672 generic.go:334] "Generic (PLEG): container finished" podID="8eaae3fd-43c6-47fe-9374-7b42028645f9" containerID="a157a55f2fdbb9e9fd30b85154bc1d4fc330451fa4a884ec015e1594452f51a2" exitCode=0 Sep 30 12:24:27 crc kubenswrapper[4672]: I0930 12:24:27.980177 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"8eaae3fd-43c6-47fe-9374-7b42028645f9","Type":"ContainerDied","Data":"a157a55f2fdbb9e9fd30b85154bc1d4fc330451fa4a884ec015e1594452f51a2"} Sep 30 
12:24:28 crc kubenswrapper[4672]: I0930 12:24:28.549946 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-z2tt9" Sep 30 12:24:29 crc kubenswrapper[4672]: I0930 12:24:29.315178 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Sep 30 12:24:29 crc kubenswrapper[4672]: I0930 12:24:29.437331 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8eaae3fd-43c6-47fe-9374-7b42028645f9-kubelet-dir\") pod \"8eaae3fd-43c6-47fe-9374-7b42028645f9\" (UID: \"8eaae3fd-43c6-47fe-9374-7b42028645f9\") " Sep 30 12:24:29 crc kubenswrapper[4672]: I0930 12:24:29.437388 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8eaae3fd-43c6-47fe-9374-7b42028645f9-kube-api-access\") pod \"8eaae3fd-43c6-47fe-9374-7b42028645f9\" (UID: \"8eaae3fd-43c6-47fe-9374-7b42028645f9\") " Sep 30 12:24:29 crc kubenswrapper[4672]: I0930 12:24:29.437832 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8eaae3fd-43c6-47fe-9374-7b42028645f9-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "8eaae3fd-43c6-47fe-9374-7b42028645f9" (UID: "8eaae3fd-43c6-47fe-9374-7b42028645f9"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 12:24:29 crc kubenswrapper[4672]: I0930 12:24:29.443799 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8eaae3fd-43c6-47fe-9374-7b42028645f9-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "8eaae3fd-43c6-47fe-9374-7b42028645f9" (UID: "8eaae3fd-43c6-47fe-9374-7b42028645f9"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 12:24:29 crc kubenswrapper[4672]: I0930 12:24:29.539002 4672 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8eaae3fd-43c6-47fe-9374-7b42028645f9-kubelet-dir\") on node \"crc\" DevicePath \"\"" Sep 30 12:24:29 crc kubenswrapper[4672]: I0930 12:24:29.539030 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8eaae3fd-43c6-47fe-9374-7b42028645f9-kube-api-access\") on node \"crc\" DevicePath \"\"" Sep 30 12:24:30 crc kubenswrapper[4672]: I0930 12:24:30.026368 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"8eaae3fd-43c6-47fe-9374-7b42028645f9","Type":"ContainerDied","Data":"55b77b4eed76a54bb03e708d3ccc8f0a223b1a893835ec8da72c3fa8658bdf30"} Sep 30 12:24:30 crc kubenswrapper[4672]: I0930 12:24:30.026691 4672 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="55b77b4eed76a54bb03e708d3ccc8f0a223b1a893835ec8da72c3fa8658bdf30" Sep 30 12:24:30 crc kubenswrapper[4672]: I0930 12:24:30.026477 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Sep 30 12:24:32 crc kubenswrapper[4672]: I0930 12:24:32.696229 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-dlbv7" Sep 30 12:24:33 crc kubenswrapper[4672]: I0930 12:24:33.300588 4672 patch_prober.go:28] interesting pod/console-f9d7485db-x8stp container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.14:8443/health\": dial tcp 10.217.0.14:8443: connect: connection refused" start-of-body= Sep 30 12:24:33 crc kubenswrapper[4672]: I0930 12:24:33.300796 4672 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-x8stp" podUID="6eecbbdb-82a8-4b0d-860d-f6c3f4152a04" containerName="console" probeResult="failure" output="Get \"https://10.217.0.14:8443/health\": dial tcp 10.217.0.14:8443: connect: connection refused" Sep 30 12:24:34 crc kubenswrapper[4672]: I0930 12:24:34.016845 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/42618cd5-d9f9-45ba-8081-660ca47bebf4-metrics-certs\") pod \"network-metrics-daemon-n7wwp\" (UID: \"42618cd5-d9f9-45ba-8081-660ca47bebf4\") " pod="openshift-multus/network-metrics-daemon-n7wwp" Sep 30 12:24:34 crc kubenswrapper[4672]: I0930 12:24:34.022372 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/42618cd5-d9f9-45ba-8081-660ca47bebf4-metrics-certs\") pod \"network-metrics-daemon-n7wwp\" (UID: \"42618cd5-d9f9-45ba-8081-660ca47bebf4\") " pod="openshift-multus/network-metrics-daemon-n7wwp" Sep 30 12:24:34 crc kubenswrapper[4672]: I0930 12:24:34.237669 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-n7wwp" Sep 30 12:24:40 crc kubenswrapper[4672]: I0930 12:24:40.837248 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-mlrw8" Sep 30 12:24:43 crc kubenswrapper[4672]: I0930 12:24:43.305419 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-x8stp" Sep 30 12:24:43 crc kubenswrapper[4672]: I0930 12:24:43.310015 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-x8stp" Sep 30 12:24:53 crc kubenswrapper[4672]: E0930 12:24:53.486834 4672 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Sep 30 12:24:53 crc kubenswrapper[4672]: E0930 12:24:53.487677 4672 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2ccjh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-52587_openshift-marketplace(42146b78-abcf-44bd-8a8d-6048d9fa0a01): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Sep 30 12:24:53 crc kubenswrapper[4672]: E0930 12:24:53.488844 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-52587" podUID="42146b78-abcf-44bd-8a8d-6048d9fa0a01" Sep 30 12:24:53 crc kubenswrapper[4672]: I0930 12:24:53.763033 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-84vwh" Sep 30 12:24:54 crc kubenswrapper[4672]: E0930 
12:24:54.219592 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-52587" podUID="42146b78-abcf-44bd-8a8d-6048d9fa0a01" Sep 30 12:24:54 crc kubenswrapper[4672]: I0930 12:24:54.677869 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-n7wwp"] Sep 30 12:24:54 crc kubenswrapper[4672]: I0930 12:24:54.740062 4672 patch_prober.go:28] interesting pod/machine-config-daemon-dpqrd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 12:24:54 crc kubenswrapper[4672]: I0930 12:24:54.740128 4672 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 12:24:54 crc kubenswrapper[4672]: W0930 12:24:54.897507 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod42618cd5_d9f9_45ba_8081_660ca47bebf4.slice/crio-7c38e93b90a5ce24f820d644829846ae158c20eb52cfe0a6907343dff20ca3bc WatchSource:0}: Error finding container 7c38e93b90a5ce24f820d644829846ae158c20eb52cfe0a6907343dff20ca3bc: Status 404 returned error can't find the container with id 7c38e93b90a5ce24f820d644829846ae158c20eb52cfe0a6907343dff20ca3bc Sep 30 12:24:55 crc kubenswrapper[4672]: I0930 12:24:55.191867 4672 generic.go:334] "Generic (PLEG): container finished" podID="89089e3f-369b-4e92-b28b-369c2f3b3017" containerID="e9ee9fefc1a32502e8c2796837715948ac2fbb5a5098456337d2d5b0e2767109" exitCode=0 Sep 30 12:24:55 crc kubenswrapper[4672]: I0930 12:24:55.191930 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7nwrz" event={"ID":"89089e3f-369b-4e92-b28b-369c2f3b3017","Type":"ContainerDied","Data":"e9ee9fefc1a32502e8c2796837715948ac2fbb5a5098456337d2d5b0e2767109"} Sep 30 12:24:55 crc kubenswrapper[4672]: I0930 12:24:55.194351 4672 generic.go:334] "Generic (PLEG): container finished" podID="fc47647e-d94f-4cca-8e3f-50cab43126ab" containerID="56d6f0db667bbd7a1303af5437346b22f735aec4c2a2e35e5a575fd931f1a058" exitCode=0 Sep 30 12:24:55 crc kubenswrapper[4672]: I0930 12:24:55.194458 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hflds" event={"ID":"fc47647e-d94f-4cca-8e3f-50cab43126ab","Type":"ContainerDied","Data":"56d6f0db667bbd7a1303af5437346b22f735aec4c2a2e35e5a575fd931f1a058"} Sep 30 12:24:55 crc kubenswrapper[4672]: I0930 12:24:55.196858 4672 generic.go:334] "Generic (PLEG): container finished" podID="3588660c-1eb4-4173-87c4-7bf4f6200851" containerID="d9653bcf18b4986302f9f2d7e14d0363da97d1fa5506f34222da20be0b61d60c" exitCode=0 Sep 30 12:24:55 crc kubenswrapper[4672]: I0930 12:24:55.196955 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5h54f" event={"ID":"3588660c-1eb4-4173-87c4-7bf4f6200851","Type":"ContainerDied","Data":"d9653bcf18b4986302f9f2d7e14d0363da97d1fa5506f34222da20be0b61d60c"} Sep 30 12:24:55 crc 
kubenswrapper[4672]: I0930 12:24:55.199430 4672 generic.go:334] "Generic (PLEG): container finished" podID="c0d02c8d-95c7-4bd2-9955-f633cfea5e4e" containerID="a1eac219c05a04a3b511b71abb77b2ab3d18f9ca832debbea421bdc52b7bc4b4" exitCode=0 Sep 30 12:24:55 crc kubenswrapper[4672]: I0930 12:24:55.199480 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k9p4c" event={"ID":"c0d02c8d-95c7-4bd2-9955-f633cfea5e4e","Type":"ContainerDied","Data":"a1eac219c05a04a3b511b71abb77b2ab3d18f9ca832debbea421bdc52b7bc4b4"} Sep 30 12:24:55 crc kubenswrapper[4672]: I0930 12:24:55.203217 4672 generic.go:334] "Generic (PLEG): container finished" podID="ac79497a-7f46-4ffd-9a7f-40153ab89a6d" containerID="b4f28a2eade21afee05ad8fa191bf4f94c4fb46b684792f56e24a4380783156b" exitCode=0 Sep 30 12:24:55 crc kubenswrapper[4672]: I0930 12:24:55.203275 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gvbdw" event={"ID":"ac79497a-7f46-4ffd-9a7f-40153ab89a6d","Type":"ContainerDied","Data":"b4f28a2eade21afee05ad8fa191bf4f94c4fb46b684792f56e24a4380783156b"} Sep 30 12:24:55 crc kubenswrapper[4672]: I0930 12:24:55.206401 4672 generic.go:334] "Generic (PLEG): container finished" podID="6cff532e-6e66-4d8f-9e9a-44a47aabc381" containerID="c2f0b1e2941b2335a0a600e187dd147b16acab0a9db8873695cef8d6c5768aa5" exitCode=0 Sep 30 12:24:55 crc kubenswrapper[4672]: I0930 12:24:55.206479 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vkwtk" event={"ID":"6cff532e-6e66-4d8f-9e9a-44a47aabc381","Type":"ContainerDied","Data":"c2f0b1e2941b2335a0a600e187dd147b16acab0a9db8873695cef8d6c5768aa5"} Sep 30 12:24:55 crc kubenswrapper[4672]: I0930 12:24:55.219379 4672 generic.go:334] "Generic (PLEG): container finished" podID="3a02c3e3-f673-41d6-950c-0fef1950613e" containerID="d2d6bc1d083ccd3ac5c5813e4cca925bffe269517ce33ab4b6e6f8dd88f1585f" exitCode=0 Sep 30 12:24:55 crc kubenswrapper[4672]: I0930 12:24:55.219452 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gfws4" event={"ID":"3a02c3e3-f673-41d6-950c-0fef1950613e","Type":"ContainerDied","Data":"d2d6bc1d083ccd3ac5c5813e4cca925bffe269517ce33ab4b6e6f8dd88f1585f"} Sep 30 12:24:55 crc kubenswrapper[4672]: I0930 12:24:55.224029 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-n7wwp" event={"ID":"42618cd5-d9f9-45ba-8081-660ca47bebf4","Type":"ContainerStarted","Data":"7c38e93b90a5ce24f820d644829846ae158c20eb52cfe0a6907343dff20ca3bc"} Sep 30 12:24:56 crc kubenswrapper[4672]: I0930 12:24:56.233744 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-n7wwp" event={"ID":"42618cd5-d9f9-45ba-8081-660ca47bebf4","Type":"ContainerStarted","Data":"0f08eabbbc4d9978921b751b6d6c36fecd96ca92128b06c8ef41449cd2ae1992"} Sep 30 12:24:56 crc kubenswrapper[4672]: I0930 12:24:56.234165 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-n7wwp" event={"ID":"42618cd5-d9f9-45ba-8081-660ca47bebf4","Type":"ContainerStarted","Data":"0b817b3b5078ab60564f80708bdd27b28f00a5fecb1c5a09f0e083a181fd7d5e"} Sep 30 12:24:56 crc kubenswrapper[4672]: I0930 12:24:56.274225 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-n7wwp" podStartSLOduration=166.274203534 podStartE2EDuration="2m46.274203534s" podCreationTimestamp="2025-09-30 12:22:10 
+0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 12:24:56.266475651 +0000 UTC m=+187.535713297" watchObservedRunningTime="2025-09-30 12:24:56.274203534 +0000 UTC m=+187.543441190" Sep 30 12:24:57 crc kubenswrapper[4672]: I0930 12:24:57.253450 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gfws4" event={"ID":"3a02c3e3-f673-41d6-950c-0fef1950613e","Type":"ContainerStarted","Data":"09a896775300755cbffabfc2209395f0044cafaeb273f1740943e716ead14e34"} Sep 30 12:24:57 crc kubenswrapper[4672]: I0930 12:24:57.255943 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7nwrz" event={"ID":"89089e3f-369b-4e92-b28b-369c2f3b3017","Type":"ContainerStarted","Data":"a2a1a39f9ddca75cb82ba3c0f5dc242cf2634cdab0c48b7b6944e077ace1c8c7"} Sep 30 12:24:57 crc kubenswrapper[4672]: I0930 12:24:57.259068 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hflds" event={"ID":"fc47647e-d94f-4cca-8e3f-50cab43126ab","Type":"ContainerStarted","Data":"acaf7ec96bfc15124aea7271441724d63e065e183d4557d1ca8ba5e828e1eb55"} Sep 30 12:24:57 crc kubenswrapper[4672]: I0930 12:24:57.262455 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5h54f" event={"ID":"3588660c-1eb4-4173-87c4-7bf4f6200851","Type":"ContainerStarted","Data":"a159f542b6c6230950997e0f55d13f459b7455e5b98f78cbc2d5551ea9b5d50d"} Sep 30 12:24:57 crc kubenswrapper[4672]: I0930 12:24:57.265011 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k9p4c" event={"ID":"c0d02c8d-95c7-4bd2-9955-f633cfea5e4e","Type":"ContainerStarted","Data":"a3730022e647d5383f38df0f3c2b009534667be8ec815b36ea6a5a011c3f1348"} Sep 30 12:24:57 crc kubenswrapper[4672]: I0930 12:24:57.269081 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gvbdw" event={"ID":"ac79497a-7f46-4ffd-9a7f-40153ab89a6d","Type":"ContainerStarted","Data":"2d8b4b2113cd5069ca3ea4d269f1197590ca7af0bcf33cdb3e43724094bac854"} Sep 30 12:24:57 crc kubenswrapper[4672]: I0930 12:24:57.278750 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-gfws4" podStartSLOduration=3.66351475 podStartE2EDuration="37.278715343s" podCreationTimestamp="2025-09-30 12:24:20 +0000 UTC" firstStartedPulling="2025-09-30 12:24:22.723251146 +0000 UTC m=+153.992488792" lastFinishedPulling="2025-09-30 12:24:56.338451729 +0000 UTC m=+187.607689385" observedRunningTime="2025-09-30 12:24:57.274550323 +0000 UTC m=+188.543787979" watchObservedRunningTime="2025-09-30 12:24:57.278715343 +0000 UTC m=+188.547953029" Sep 30 12:24:57 crc kubenswrapper[4672]: I0930 12:24:57.279778 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vkwtk" event={"ID":"6cff532e-6e66-4d8f-9e9a-44a47aabc381","Type":"ContainerStarted","Data":"b66ffb3aabd52c0ae4f6209508b4494d36cf8c60fcc9b0f158010eda3f69d220"} Sep 30 12:24:57 crc kubenswrapper[4672]: I0930 12:24:57.293825 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-hflds" podStartSLOduration=3.247131689 podStartE2EDuration="37.293801088s" podCreationTimestamp="2025-09-30 12:24:20 +0000 UTC" firstStartedPulling="2025-09-30 12:24:22.71544553 +0000 UTC 
m=+153.984683176" lastFinishedPulling="2025-09-30 12:24:56.762114929 +0000 UTC m=+188.031352575" observedRunningTime="2025-09-30 12:24:57.292059298 +0000 UTC m=+188.561297014" watchObservedRunningTime="2025-09-30 12:24:57.293801088 +0000 UTC m=+188.563038744" Sep 30 12:24:57 crc kubenswrapper[4672]: I0930 12:24:57.309452 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-gvbdw" podStartSLOduration=2.591038095 podStartE2EDuration="35.309418109s" podCreationTimestamp="2025-09-30 12:24:22 +0000 UTC" firstStartedPulling="2025-09-30 12:24:23.748966626 +0000 UTC m=+155.018204272" lastFinishedPulling="2025-09-30 12:24:56.46734664 +0000 UTC m=+187.736584286" observedRunningTime="2025-09-30 12:24:57.308559325 +0000 UTC m=+188.577797021" watchObservedRunningTime="2025-09-30 12:24:57.309418109 +0000 UTC m=+188.578655765" Sep 30 12:24:57 crc kubenswrapper[4672]: I0930 12:24:57.359976 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-7nwrz" podStartSLOduration=2.8481757009999997 podStartE2EDuration="36.359958768s" podCreationTimestamp="2025-09-30 12:24:21 +0000 UTC" firstStartedPulling="2025-09-30 12:24:22.726046907 +0000 UTC m=+153.995284553" lastFinishedPulling="2025-09-30 12:24:56.237829974 +0000 UTC m=+187.507067620" observedRunningTime="2025-09-30 12:24:57.357473367 +0000 UTC m=+188.626711033" watchObservedRunningTime="2025-09-30 12:24:57.359958768 +0000 UTC m=+188.629196414" Sep 30 12:24:57 crc kubenswrapper[4672]: I0930 12:24:57.361947 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-k9p4c" podStartSLOduration=2.722583772 podStartE2EDuration="34.361940966s" podCreationTimestamp="2025-09-30 12:24:23 +0000 UTC" firstStartedPulling="2025-09-30 12:24:24.853745969 +0000 UTC m=+156.122983615" lastFinishedPulling="2025-09-30 12:24:56.493103163 +0000 UTC m=+187.762340809" observedRunningTime="2025-09-30 12:24:57.330055555 +0000 UTC m=+188.599293251" watchObservedRunningTime="2025-09-30 12:24:57.361940966 +0000 UTC m=+188.631178602" Sep 30 12:24:57 crc kubenswrapper[4672]: I0930 12:24:57.383859 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-5h54f" podStartSLOduration=3.9966180319999998 podStartE2EDuration="34.383830488s" podCreationTimestamp="2025-09-30 12:24:23 +0000 UTC" firstStartedPulling="2025-09-30 12:24:25.871029677 +0000 UTC m=+157.140267323" lastFinishedPulling="2025-09-30 12:24:56.258242113 +0000 UTC m=+187.527479779" observedRunningTime="2025-09-30 12:24:57.381885181 +0000 UTC m=+188.651122867" watchObservedRunningTime="2025-09-30 12:24:57.383830488 +0000 UTC m=+188.653068144" Sep 30 12:24:57 crc kubenswrapper[4672]: I0930 12:24:57.418467 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-vkwtk" podStartSLOduration=2.4109539939999998 podStartE2EDuration="33.418444527s" podCreationTimestamp="2025-09-30 12:24:24 +0000 UTC" firstStartedPulling="2025-09-30 12:24:25.944467347 +0000 UTC m=+157.213704993" lastFinishedPulling="2025-09-30 12:24:56.95195788 +0000 UTC m=+188.221195526" observedRunningTime="2025-09-30 12:24:57.411147886 +0000 UTC m=+188.680385532" watchObservedRunningTime="2025-09-30 12:24:57.418444527 +0000 UTC m=+188.687682173" Sep 30 12:24:59 crc kubenswrapper[4672]: I0930 12:24:59.660642 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 12:25:01 crc kubenswrapper[4672]: I0930 12:25:01.105203 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-hflds" Sep 30 12:25:01 crc kubenswrapper[4672]: I0930 12:25:01.105291 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-hflds" Sep 30 12:25:01 crc kubenswrapper[4672]: I0930 12:25:01.250704 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-gfws4" Sep 30 12:25:01 crc kubenswrapper[4672]: I0930 12:25:01.251081 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-gfws4" Sep 30 12:25:01 crc kubenswrapper[4672]: I0930 12:25:01.260710 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-hflds" Sep 30 12:25:01 crc kubenswrapper[4672]: I0930 12:25:01.309246 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-gfws4" Sep 30 12:25:01 crc kubenswrapper[4672]: I0930 12:25:01.347334 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-hflds" Sep 30 12:25:01 crc kubenswrapper[4672]: I0930 12:25:01.353035 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-gfws4" Sep 30 12:25:01 crc kubenswrapper[4672]: I0930 12:25:01.453156 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-7nwrz" Sep 30 12:25:01 crc kubenswrapper[4672]: I0930 12:25:01.453205 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-7nwrz" Sep 30 12:25:01 crc kubenswrapper[4672]: I0930 12:25:01.495112 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-7nwrz" Sep 30 12:25:02 crc kubenswrapper[4672]: I0930 12:25:02.371724 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-7nwrz" Sep 30 12:25:03 crc kubenswrapper[4672]: I0930 12:25:03.078918 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-gvbdw" Sep 30 12:25:03 crc kubenswrapper[4672]: I0930 12:25:03.079006 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-gvbdw" Sep 30 12:25:03 crc kubenswrapper[4672]: I0930 12:25:03.125756 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-gvbdw" Sep 30 12:25:03 crc kubenswrapper[4672]: I0930 12:25:03.318452 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7nwrz"] Sep 30 12:25:03 crc kubenswrapper[4672]: I0930 12:25:03.384635 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-gvbdw" Sep 30 12:25:03 crc kubenswrapper[4672]: I0930 12:25:03.468847 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-k9p4c" Sep 30 12:25:03 crc kubenswrapper[4672]: I0930 12:25:03.468916 4672 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-k9p4c" Sep 30 12:25:03 crc kubenswrapper[4672]: I0930 12:25:03.513939 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-gfws4"] Sep 30 12:25:03 crc kubenswrapper[4672]: I0930 12:25:03.514302 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-gfws4" podUID="3a02c3e3-f673-41d6-950c-0fef1950613e" containerName="registry-server" containerID="cri-o://09a896775300755cbffabfc2209395f0044cafaeb273f1740943e716ead14e34" gracePeriod=2 Sep 30 12:25:03 crc kubenswrapper[4672]: I0930 12:25:03.517981 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-k9p4c" Sep 30 12:25:04 crc kubenswrapper[4672]: I0930 12:25:04.048132 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-5h54f" Sep 30 12:25:04 crc kubenswrapper[4672]: I0930 12:25:04.048213 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-5h54f" Sep 30 12:25:04 crc kubenswrapper[4672]: I0930 12:25:04.113462 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-5h54f" Sep 30 12:25:04 crc kubenswrapper[4672]: I0930 12:25:04.322325 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-7nwrz" podUID="89089e3f-369b-4e92-b28b-369c2f3b3017" containerName="registry-server" containerID="cri-o://a2a1a39f9ddca75cb82ba3c0f5dc242cf2634cdab0c48b7b6944e077ace1c8c7" gracePeriod=2 Sep 30 12:25:04 crc kubenswrapper[4672]: I0930 12:25:04.379573 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-k9p4c" Sep 30 12:25:04 crc kubenswrapper[4672]: I0930 12:25:04.388684 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-5h54f" Sep 30 12:25:04 crc kubenswrapper[4672]: I0930 12:25:04.451561 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-vkwtk" Sep 30 12:25:04 crc kubenswrapper[4672]: I0930 12:25:04.451625 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-vkwtk" Sep 30 12:25:04 crc kubenswrapper[4672]: I0930 12:25:04.511915 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-vkwtk" Sep 30 12:25:05 crc kubenswrapper[4672]: I0930 12:25:05.338404 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-gfws4" Sep 30 12:25:05 crc kubenswrapper[4672]: I0930 12:25:05.366239 4672 generic.go:334] "Generic (PLEG): container finished" podID="3a02c3e3-f673-41d6-950c-0fef1950613e" containerID="09a896775300755cbffabfc2209395f0044cafaeb273f1740943e716ead14e34" exitCode=0 Sep 30 12:25:05 crc kubenswrapper[4672]: I0930 12:25:05.366312 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gfws4" event={"ID":"3a02c3e3-f673-41d6-950c-0fef1950613e","Type":"ContainerDied","Data":"09a896775300755cbffabfc2209395f0044cafaeb273f1740943e716ead14e34"} Sep 30 12:25:05 crc kubenswrapper[4672]: I0930 12:25:05.366372 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gfws4" event={"ID":"3a02c3e3-f673-41d6-950c-0fef1950613e","Type":"ContainerDied","Data":"ab4d79f9013ece56dcdabb3aa576a78ec1708fee33c800f1bbb9423b0d8a37a8"} Sep 30 12:25:05 crc kubenswrapper[4672]: I0930 12:25:05.366397 4672 scope.go:117] "RemoveContainer" containerID="09a896775300755cbffabfc2209395f0044cafaeb273f1740943e716ead14e34" Sep 30 12:25:05 crc kubenswrapper[4672]: I0930 12:25:05.371847 4672 generic.go:334] "Generic (PLEG): container finished" podID="89089e3f-369b-4e92-b28b-369c2f3b3017" containerID="a2a1a39f9ddca75cb82ba3c0f5dc242cf2634cdab0c48b7b6944e077ace1c8c7" exitCode=0 Sep 30 12:25:05 crc kubenswrapper[4672]: I0930 12:25:05.372414 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7nwrz" event={"ID":"89089e3f-369b-4e92-b28b-369c2f3b3017","Type":"ContainerDied","Data":"a2a1a39f9ddca75cb82ba3c0f5dc242cf2634cdab0c48b7b6944e077ace1c8c7"} Sep 30 12:25:05 crc kubenswrapper[4672]: I0930 12:25:05.390963 4672 scope.go:117] "RemoveContainer" containerID="d2d6bc1d083ccd3ac5c5813e4cca925bffe269517ce33ab4b6e6f8dd88f1585f" Sep 30 12:25:05 crc kubenswrapper[4672]: I0930 12:25:05.430742 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jlv44\" (UniqueName: \"kubernetes.io/projected/3a02c3e3-f673-41d6-950c-0fef1950613e-kube-api-access-jlv44\") pod \"3a02c3e3-f673-41d6-950c-0fef1950613e\" (UID: \"3a02c3e3-f673-41d6-950c-0fef1950613e\") " Sep 30 12:25:05 crc kubenswrapper[4672]: I0930 12:25:05.430849 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3a02c3e3-f673-41d6-950c-0fef1950613e-utilities\") pod \"3a02c3e3-f673-41d6-950c-0fef1950613e\" (UID: \"3a02c3e3-f673-41d6-950c-0fef1950613e\") " Sep 30 12:25:05 crc kubenswrapper[4672]: I0930 12:25:05.430882 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3a02c3e3-f673-41d6-950c-0fef1950613e-catalog-content\") pod \"3a02c3e3-f673-41d6-950c-0fef1950613e\" (UID: \"3a02c3e3-f673-41d6-950c-0fef1950613e\") " Sep 30 12:25:05 crc kubenswrapper[4672]: I0930 12:25:05.434106 4672 scope.go:117] "RemoveContainer" containerID="096f4764936cc71b20326f0150979ac3930ab4a53983e219f5532fa56793baf9" Sep 30 12:25:05 crc kubenswrapper[4672]: I0930 12:25:05.435658 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3a02c3e3-f673-41d6-950c-0fef1950613e-utilities" (OuterVolumeSpecName: "utilities") pod "3a02c3e3-f673-41d6-950c-0fef1950613e" (UID: "3a02c3e3-f673-41d6-950c-0fef1950613e"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 12:25:05 crc kubenswrapper[4672]: I0930 12:25:05.455642 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3a02c3e3-f673-41d6-950c-0fef1950613e-kube-api-access-jlv44" (OuterVolumeSpecName: "kube-api-access-jlv44") pod "3a02c3e3-f673-41d6-950c-0fef1950613e" (UID: "3a02c3e3-f673-41d6-950c-0fef1950613e"). InnerVolumeSpecName "kube-api-access-jlv44". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 12:25:05 crc kubenswrapper[4672]: I0930 12:25:05.480163 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-vkwtk" Sep 30 12:25:05 crc kubenswrapper[4672]: I0930 12:25:05.489063 4672 scope.go:117] "RemoveContainer" containerID="09a896775300755cbffabfc2209395f0044cafaeb273f1740943e716ead14e34" Sep 30 12:25:05 crc kubenswrapper[4672]: E0930 12:25:05.489483 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"09a896775300755cbffabfc2209395f0044cafaeb273f1740943e716ead14e34\": container with ID starting with 09a896775300755cbffabfc2209395f0044cafaeb273f1740943e716ead14e34 not found: ID does not exist" containerID="09a896775300755cbffabfc2209395f0044cafaeb273f1740943e716ead14e34" Sep 30 12:25:05 crc kubenswrapper[4672]: I0930 12:25:05.489546 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"09a896775300755cbffabfc2209395f0044cafaeb273f1740943e716ead14e34"} err="failed to get container status \"09a896775300755cbffabfc2209395f0044cafaeb273f1740943e716ead14e34\": rpc error: code = NotFound desc = could not find container \"09a896775300755cbffabfc2209395f0044cafaeb273f1740943e716ead14e34\": container with ID starting with 09a896775300755cbffabfc2209395f0044cafaeb273f1740943e716ead14e34 not found: ID does not exist" Sep 30 12:25:05 crc kubenswrapper[4672]: I0930 12:25:05.489595 4672 scope.go:117] "RemoveContainer" containerID="d2d6bc1d083ccd3ac5c5813e4cca925bffe269517ce33ab4b6e6f8dd88f1585f" Sep 30 12:25:05 crc kubenswrapper[4672]: E0930 12:25:05.490143 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d2d6bc1d083ccd3ac5c5813e4cca925bffe269517ce33ab4b6e6f8dd88f1585f\": container with ID starting with d2d6bc1d083ccd3ac5c5813e4cca925bffe269517ce33ab4b6e6f8dd88f1585f not found: ID does not exist" containerID="d2d6bc1d083ccd3ac5c5813e4cca925bffe269517ce33ab4b6e6f8dd88f1585f" Sep 30 12:25:05 crc kubenswrapper[4672]: I0930 12:25:05.490166 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d2d6bc1d083ccd3ac5c5813e4cca925bffe269517ce33ab4b6e6f8dd88f1585f"} err="failed to get container status \"d2d6bc1d083ccd3ac5c5813e4cca925bffe269517ce33ab4b6e6f8dd88f1585f\": rpc error: code = NotFound desc = could not find container \"d2d6bc1d083ccd3ac5c5813e4cca925bffe269517ce33ab4b6e6f8dd88f1585f\": container with ID starting with d2d6bc1d083ccd3ac5c5813e4cca925bffe269517ce33ab4b6e6f8dd88f1585f not found: ID does not exist" Sep 30 12:25:05 crc kubenswrapper[4672]: I0930 12:25:05.490180 4672 scope.go:117] "RemoveContainer" containerID="096f4764936cc71b20326f0150979ac3930ab4a53983e219f5532fa56793baf9" Sep 30 12:25:05 crc kubenswrapper[4672]: E0930 12:25:05.490750 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"096f4764936cc71b20326f0150979ac3930ab4a53983e219f5532fa56793baf9\": container with ID starting with 096f4764936cc71b20326f0150979ac3930ab4a53983e219f5532fa56793baf9 not found: ID does not exist" containerID="096f4764936cc71b20326f0150979ac3930ab4a53983e219f5532fa56793baf9" Sep 30 12:25:05 crc kubenswrapper[4672]: I0930 12:25:05.490776 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"096f4764936cc71b20326f0150979ac3930ab4a53983e219f5532fa56793baf9"} err="failed to get container status \"096f4764936cc71b20326f0150979ac3930ab4a53983e219f5532fa56793baf9\": rpc error: code = NotFound desc = could not find container \"096f4764936cc71b20326f0150979ac3930ab4a53983e219f5532fa56793baf9\": container with ID starting with 096f4764936cc71b20326f0150979ac3930ab4a53983e219f5532fa56793baf9 not found: ID does not exist" Sep 30 12:25:05 crc kubenswrapper[4672]: I0930 12:25:05.490783 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7nwrz" Sep 30 12:25:05 crc kubenswrapper[4672]: I0930 12:25:05.506968 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3a02c3e3-f673-41d6-950c-0fef1950613e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3a02c3e3-f673-41d6-950c-0fef1950613e" (UID: "3a02c3e3-f673-41d6-950c-0fef1950613e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 12:25:05 crc kubenswrapper[4672]: I0930 12:25:05.533744 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/89089e3f-369b-4e92-b28b-369c2f3b3017-catalog-content\") pod \"89089e3f-369b-4e92-b28b-369c2f3b3017\" (UID: \"89089e3f-369b-4e92-b28b-369c2f3b3017\") " Sep 30 12:25:05 crc kubenswrapper[4672]: I0930 12:25:05.533803 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6sn5r\" (UniqueName: \"kubernetes.io/projected/89089e3f-369b-4e92-b28b-369c2f3b3017-kube-api-access-6sn5r\") pod \"89089e3f-369b-4e92-b28b-369c2f3b3017\" (UID: \"89089e3f-369b-4e92-b28b-369c2f3b3017\") " Sep 30 12:25:05 crc kubenswrapper[4672]: I0930 12:25:05.533846 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/89089e3f-369b-4e92-b28b-369c2f3b3017-utilities\") pod \"89089e3f-369b-4e92-b28b-369c2f3b3017\" (UID: \"89089e3f-369b-4e92-b28b-369c2f3b3017\") " Sep 30 12:25:05 crc kubenswrapper[4672]: I0930 12:25:05.535021 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/89089e3f-369b-4e92-b28b-369c2f3b3017-utilities" (OuterVolumeSpecName: "utilities") pod "89089e3f-369b-4e92-b28b-369c2f3b3017" (UID: "89089e3f-369b-4e92-b28b-369c2f3b3017"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 12:25:05 crc kubenswrapper[4672]: I0930 12:25:05.535602 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jlv44\" (UniqueName: \"kubernetes.io/projected/3a02c3e3-f673-41d6-950c-0fef1950613e-kube-api-access-jlv44\") on node \"crc\" DevicePath \"\"" Sep 30 12:25:05 crc kubenswrapper[4672]: I0930 12:25:05.535628 4672 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3a02c3e3-f673-41d6-950c-0fef1950613e-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 12:25:05 crc kubenswrapper[4672]: I0930 12:25:05.535660 4672 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/89089e3f-369b-4e92-b28b-369c2f3b3017-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 12:25:05 crc kubenswrapper[4672]: I0930 12:25:05.535675 4672 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3a02c3e3-f673-41d6-950c-0fef1950613e-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 12:25:05 crc kubenswrapper[4672]: I0930 12:25:05.538117 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/89089e3f-369b-4e92-b28b-369c2f3b3017-kube-api-access-6sn5r" (OuterVolumeSpecName: "kube-api-access-6sn5r") pod "89089e3f-369b-4e92-b28b-369c2f3b3017" (UID: "89089e3f-369b-4e92-b28b-369c2f3b3017"). InnerVolumeSpecName "kube-api-access-6sn5r". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 12:25:05 crc kubenswrapper[4672]: I0930 12:25:05.587681 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/89089e3f-369b-4e92-b28b-369c2f3b3017-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "89089e3f-369b-4e92-b28b-369c2f3b3017" (UID: "89089e3f-369b-4e92-b28b-369c2f3b3017"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 12:25:05 crc kubenswrapper[4672]: I0930 12:25:05.637013 4672 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/89089e3f-369b-4e92-b28b-369c2f3b3017-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 12:25:05 crc kubenswrapper[4672]: I0930 12:25:05.637066 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6sn5r\" (UniqueName: \"kubernetes.io/projected/89089e3f-369b-4e92-b28b-369c2f3b3017-kube-api-access-6sn5r\") on node \"crc\" DevicePath \"\"" Sep 30 12:25:05 crc kubenswrapper[4672]: I0930 12:25:05.717078 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-k9p4c"] Sep 30 12:25:06 crc kubenswrapper[4672]: I0930 12:25:06.386634 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7nwrz" event={"ID":"89089e3f-369b-4e92-b28b-369c2f3b3017","Type":"ContainerDied","Data":"fe8c1560e4ddc803dd4045c417cdc733ee66700c9aeab238a136c9b58af1095f"} Sep 30 12:25:06 crc kubenswrapper[4672]: I0930 12:25:06.386750 4672 scope.go:117] "RemoveContainer" containerID="a2a1a39f9ddca75cb82ba3c0f5dc242cf2634cdab0c48b7b6944e077ace1c8c7" Sep 30 12:25:06 crc kubenswrapper[4672]: I0930 12:25:06.386755 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-7nwrz" Sep 30 12:25:06 crc kubenswrapper[4672]: I0930 12:25:06.387955 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-gfws4" Sep 30 12:25:06 crc kubenswrapper[4672]: I0930 12:25:06.388119 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-k9p4c" podUID="c0d02c8d-95c7-4bd2-9955-f633cfea5e4e" containerName="registry-server" containerID="cri-o://a3730022e647d5383f38df0f3c2b009534667be8ec815b36ea6a5a011c3f1348" gracePeriod=2 Sep 30 12:25:06 crc kubenswrapper[4672]: I0930 12:25:06.409880 4672 scope.go:117] "RemoveContainer" containerID="e9ee9fefc1a32502e8c2796837715948ac2fbb5a5098456337d2d5b0e2767109" Sep 30 12:25:06 crc kubenswrapper[4672]: I0930 12:25:06.434648 4672 scope.go:117] "RemoveContainer" containerID="4652b7b0e7e489a38e939d4fe0891a0cd714eae23ec04b3a0a01a87290b6ba22" Sep 30 12:25:06 crc kubenswrapper[4672]: I0930 12:25:06.453074 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-gfws4"] Sep 30 12:25:06 crc kubenswrapper[4672]: I0930 12:25:06.458309 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-gfws4"] Sep 30 12:25:06 crc kubenswrapper[4672]: I0930 12:25:06.461511 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7nwrz"] Sep 30 12:25:06 crc kubenswrapper[4672]: I0930 12:25:06.464829 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-7nwrz"] Sep 30 12:25:07 crc kubenswrapper[4672]: I0930 12:25:07.400509 4672 generic.go:334] "Generic (PLEG): container finished" podID="c0d02c8d-95c7-4bd2-9955-f633cfea5e4e" containerID="a3730022e647d5383f38df0f3c2b009534667be8ec815b36ea6a5a011c3f1348" exitCode=0 Sep 30 12:25:07 crc kubenswrapper[4672]: I0930 12:25:07.400556 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k9p4c" event={"ID":"c0d02c8d-95c7-4bd2-9955-f633cfea5e4e","Type":"ContainerDied","Data":"a3730022e647d5383f38df0f3c2b009534667be8ec815b36ea6a5a011c3f1348"} Sep 30 12:25:07 crc kubenswrapper[4672]: I0930 12:25:07.424096 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3a02c3e3-f673-41d6-950c-0fef1950613e" path="/var/lib/kubelet/pods/3a02c3e3-f673-41d6-950c-0fef1950613e/volumes" Sep 30 12:25:07 crc kubenswrapper[4672]: I0930 12:25:07.425221 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="89089e3f-369b-4e92-b28b-369c2f3b3017" path="/var/lib/kubelet/pods/89089e3f-369b-4e92-b28b-369c2f3b3017/volumes" Sep 30 12:25:07 crc kubenswrapper[4672]: I0930 12:25:07.800811 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-k9p4c" Sep 30 12:25:07 crc kubenswrapper[4672]: I0930 12:25:07.868795 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c0d02c8d-95c7-4bd2-9955-f633cfea5e4e-utilities\") pod \"c0d02c8d-95c7-4bd2-9955-f633cfea5e4e\" (UID: \"c0d02c8d-95c7-4bd2-9955-f633cfea5e4e\") " Sep 30 12:25:07 crc kubenswrapper[4672]: I0930 12:25:07.868886 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c0d02c8d-95c7-4bd2-9955-f633cfea5e4e-catalog-content\") pod \"c0d02c8d-95c7-4bd2-9955-f633cfea5e4e\" (UID: \"c0d02c8d-95c7-4bd2-9955-f633cfea5e4e\") " Sep 30 12:25:07 crc kubenswrapper[4672]: I0930 12:25:07.869051 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wh8q5\" (UniqueName: \"kubernetes.io/projected/c0d02c8d-95c7-4bd2-9955-f633cfea5e4e-kube-api-access-wh8q5\") pod \"c0d02c8d-95c7-4bd2-9955-f633cfea5e4e\" (UID: \"c0d02c8d-95c7-4bd2-9955-f633cfea5e4e\") " Sep 30 12:25:07 crc kubenswrapper[4672]: I0930 12:25:07.869736 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c0d02c8d-95c7-4bd2-9955-f633cfea5e4e-utilities" (OuterVolumeSpecName: "utilities") pod "c0d02c8d-95c7-4bd2-9955-f633cfea5e4e" (UID: "c0d02c8d-95c7-4bd2-9955-f633cfea5e4e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 12:25:07 crc kubenswrapper[4672]: I0930 12:25:07.874890 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c0d02c8d-95c7-4bd2-9955-f633cfea5e4e-kube-api-access-wh8q5" (OuterVolumeSpecName: "kube-api-access-wh8q5") pod "c0d02c8d-95c7-4bd2-9955-f633cfea5e4e" (UID: "c0d02c8d-95c7-4bd2-9955-f633cfea5e4e"). InnerVolumeSpecName "kube-api-access-wh8q5". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 12:25:07 crc kubenswrapper[4672]: I0930 12:25:07.883493 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c0d02c8d-95c7-4bd2-9955-f633cfea5e4e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c0d02c8d-95c7-4bd2-9955-f633cfea5e4e" (UID: "c0d02c8d-95c7-4bd2-9955-f633cfea5e4e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 12:25:07 crc kubenswrapper[4672]: I0930 12:25:07.970720 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wh8q5\" (UniqueName: \"kubernetes.io/projected/c0d02c8d-95c7-4bd2-9955-f633cfea5e4e-kube-api-access-wh8q5\") on node \"crc\" DevicePath \"\"" Sep 30 12:25:07 crc kubenswrapper[4672]: I0930 12:25:07.970763 4672 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c0d02c8d-95c7-4bd2-9955-f633cfea5e4e-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 12:25:07 crc kubenswrapper[4672]: I0930 12:25:07.970777 4672 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c0d02c8d-95c7-4bd2-9955-f633cfea5e4e-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 12:25:08 crc kubenswrapper[4672]: I0930 12:25:08.311467 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-vkwtk"] Sep 30 12:25:08 crc kubenswrapper[4672]: I0930 12:25:08.311695 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-vkwtk" podUID="6cff532e-6e66-4d8f-9e9a-44a47aabc381" containerName="registry-server" containerID="cri-o://b66ffb3aabd52c0ae4f6209508b4494d36cf8c60fcc9b0f158010eda3f69d220" gracePeriod=2 Sep 30 12:25:08 crc kubenswrapper[4672]: I0930 12:25:08.409237 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k9p4c" event={"ID":"c0d02c8d-95c7-4bd2-9955-f633cfea5e4e","Type":"ContainerDied","Data":"b8a070215a388ccac83c2fc26a0ea1a6a23747cc807ee8c47857f57cbe72e03d"} Sep 30 12:25:08 crc kubenswrapper[4672]: I0930 12:25:08.409318 4672 scope.go:117] "RemoveContainer" containerID="a3730022e647d5383f38df0f3c2b009534667be8ec815b36ea6a5a011c3f1348" Sep 30 12:25:08 crc kubenswrapper[4672]: I0930 12:25:08.409376 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-k9p4c" Sep 30 12:25:08 crc kubenswrapper[4672]: I0930 12:25:08.426103 4672 scope.go:117] "RemoveContainer" containerID="a1eac219c05a04a3b511b71abb77b2ab3d18f9ca832debbea421bdc52b7bc4b4" Sep 30 12:25:08 crc kubenswrapper[4672]: I0930 12:25:08.446203 4672 scope.go:117] "RemoveContainer" containerID="f4814398c6767c0b883350f9e9eb226142f248c0763b4c6579ef8f4c327350fe" Sep 30 12:25:08 crc kubenswrapper[4672]: I0930 12:25:08.449662 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-k9p4c"] Sep 30 12:25:08 crc kubenswrapper[4672]: I0930 12:25:08.456550 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-k9p4c"] Sep 30 12:25:08 crc kubenswrapper[4672]: I0930 12:25:08.943308 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-vkwtk" Sep 30 12:25:08 crc kubenswrapper[4672]: I0930 12:25:08.988862 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qghl8\" (UniqueName: \"kubernetes.io/projected/6cff532e-6e66-4d8f-9e9a-44a47aabc381-kube-api-access-qghl8\") pod \"6cff532e-6e66-4d8f-9e9a-44a47aabc381\" (UID: \"6cff532e-6e66-4d8f-9e9a-44a47aabc381\") " Sep 30 12:25:08 crc kubenswrapper[4672]: I0930 12:25:08.988924 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6cff532e-6e66-4d8f-9e9a-44a47aabc381-catalog-content\") pod \"6cff532e-6e66-4d8f-9e9a-44a47aabc381\" (UID: \"6cff532e-6e66-4d8f-9e9a-44a47aabc381\") " Sep 30 12:25:08 crc kubenswrapper[4672]: I0930 12:25:08.988955 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6cff532e-6e66-4d8f-9e9a-44a47aabc381-utilities\") pod \"6cff532e-6e66-4d8f-9e9a-44a47aabc381\" (UID: \"6cff532e-6e66-4d8f-9e9a-44a47aabc381\") " Sep 30 12:25:08 crc kubenswrapper[4672]: I0930 12:25:08.989864 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6cff532e-6e66-4d8f-9e9a-44a47aabc381-utilities" (OuterVolumeSpecName: "utilities") pod "6cff532e-6e66-4d8f-9e9a-44a47aabc381" (UID: "6cff532e-6e66-4d8f-9e9a-44a47aabc381"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 12:25:08 crc kubenswrapper[4672]: I0930 12:25:08.990113 4672 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6cff532e-6e66-4d8f-9e9a-44a47aabc381-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 12:25:08 crc kubenswrapper[4672]: I0930 12:25:08.994337 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6cff532e-6e66-4d8f-9e9a-44a47aabc381-kube-api-access-qghl8" (OuterVolumeSpecName: "kube-api-access-qghl8") pod "6cff532e-6e66-4d8f-9e9a-44a47aabc381" (UID: "6cff532e-6e66-4d8f-9e9a-44a47aabc381"). InnerVolumeSpecName "kube-api-access-qghl8". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 12:25:09 crc kubenswrapper[4672]: I0930 12:25:09.070573 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6cff532e-6e66-4d8f-9e9a-44a47aabc381-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6cff532e-6e66-4d8f-9e9a-44a47aabc381" (UID: "6cff532e-6e66-4d8f-9e9a-44a47aabc381"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 12:25:09 crc kubenswrapper[4672]: I0930 12:25:09.091533 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qghl8\" (UniqueName: \"kubernetes.io/projected/6cff532e-6e66-4d8f-9e9a-44a47aabc381-kube-api-access-qghl8\") on node \"crc\" DevicePath \"\"" Sep 30 12:25:09 crc kubenswrapper[4672]: I0930 12:25:09.091558 4672 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6cff532e-6e66-4d8f-9e9a-44a47aabc381-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 12:25:09 crc kubenswrapper[4672]: I0930 12:25:09.418690 4672 generic.go:334] "Generic (PLEG): container finished" podID="6cff532e-6e66-4d8f-9e9a-44a47aabc381" containerID="b66ffb3aabd52c0ae4f6209508b4494d36cf8c60fcc9b0f158010eda3f69d220" exitCode=0 Sep 30 12:25:09 crc kubenswrapper[4672]: I0930 12:25:09.419208 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vkwtk" Sep 30 12:25:09 crc kubenswrapper[4672]: I0930 12:25:09.425185 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c0d02c8d-95c7-4bd2-9955-f633cfea5e4e" path="/var/lib/kubelet/pods/c0d02c8d-95c7-4bd2-9955-f633cfea5e4e/volumes" Sep 30 12:25:09 crc kubenswrapper[4672]: I0930 12:25:09.426227 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vkwtk" event={"ID":"6cff532e-6e66-4d8f-9e9a-44a47aabc381","Type":"ContainerDied","Data":"b66ffb3aabd52c0ae4f6209508b4494d36cf8c60fcc9b0f158010eda3f69d220"} Sep 30 12:25:09 crc kubenswrapper[4672]: I0930 12:25:09.426278 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vkwtk" event={"ID":"6cff532e-6e66-4d8f-9e9a-44a47aabc381","Type":"ContainerDied","Data":"9663019dbf5957b590254956eae638165384f2b420358ea2ec0a4a2ad7c320e8"} Sep 30 12:25:09 crc kubenswrapper[4672]: I0930 12:25:09.426307 4672 scope.go:117] "RemoveContainer" containerID="b66ffb3aabd52c0ae4f6209508b4494d36cf8c60fcc9b0f158010eda3f69d220" Sep 30 12:25:09 crc kubenswrapper[4672]: I0930 12:25:09.456650 4672 scope.go:117] "RemoveContainer" containerID="c2f0b1e2941b2335a0a600e187dd147b16acab0a9db8873695cef8d6c5768aa5" Sep 30 12:25:09 crc kubenswrapper[4672]: I0930 12:25:09.472382 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-vkwtk"] Sep 30 12:25:09 crc kubenswrapper[4672]: I0930 12:25:09.476158 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-vkwtk"] Sep 30 12:25:09 crc kubenswrapper[4672]: I0930 12:25:09.477194 4672 scope.go:117] "RemoveContainer" containerID="ec0b5c8069179ded77ab10cf8ff2a5441f327df1cd6d68733fa203d24268e403" Sep 30 12:25:09 crc kubenswrapper[4672]: I0930 12:25:09.500127 4672 scope.go:117] "RemoveContainer" containerID="b66ffb3aabd52c0ae4f6209508b4494d36cf8c60fcc9b0f158010eda3f69d220" Sep 30 12:25:09 crc kubenswrapper[4672]: E0930 12:25:09.501001 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b66ffb3aabd52c0ae4f6209508b4494d36cf8c60fcc9b0f158010eda3f69d220\": container with ID starting with b66ffb3aabd52c0ae4f6209508b4494d36cf8c60fcc9b0f158010eda3f69d220 not found: ID does not exist" containerID="b66ffb3aabd52c0ae4f6209508b4494d36cf8c60fcc9b0f158010eda3f69d220" Sep 30 12:25:09 crc kubenswrapper[4672]: I0930 12:25:09.501400 4672 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b66ffb3aabd52c0ae4f6209508b4494d36cf8c60fcc9b0f158010eda3f69d220"} err="failed to get container status \"b66ffb3aabd52c0ae4f6209508b4494d36cf8c60fcc9b0f158010eda3f69d220\": rpc error: code = NotFound desc = could not find container \"b66ffb3aabd52c0ae4f6209508b4494d36cf8c60fcc9b0f158010eda3f69d220\": container with ID starting with b66ffb3aabd52c0ae4f6209508b4494d36cf8c60fcc9b0f158010eda3f69d220 not found: ID does not exist" Sep 30 12:25:09 crc kubenswrapper[4672]: I0930 12:25:09.501438 4672 scope.go:117] "RemoveContainer" containerID="c2f0b1e2941b2335a0a600e187dd147b16acab0a9db8873695cef8d6c5768aa5" Sep 30 12:25:09 crc kubenswrapper[4672]: E0930 12:25:09.501915 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c2f0b1e2941b2335a0a600e187dd147b16acab0a9db8873695cef8d6c5768aa5\": container with ID starting with c2f0b1e2941b2335a0a600e187dd147b16acab0a9db8873695cef8d6c5768aa5 not found: ID does not exist" containerID="c2f0b1e2941b2335a0a600e187dd147b16acab0a9db8873695cef8d6c5768aa5" Sep 30 12:25:09 crc kubenswrapper[4672]: I0930 12:25:09.501959 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c2f0b1e2941b2335a0a600e187dd147b16acab0a9db8873695cef8d6c5768aa5"} err="failed to get container status \"c2f0b1e2941b2335a0a600e187dd147b16acab0a9db8873695cef8d6c5768aa5\": rpc error: code = NotFound desc = could not find container \"c2f0b1e2941b2335a0a600e187dd147b16acab0a9db8873695cef8d6c5768aa5\": container with ID starting with c2f0b1e2941b2335a0a600e187dd147b16acab0a9db8873695cef8d6c5768aa5 not found: ID does not exist" Sep 30 12:25:09 crc kubenswrapper[4672]: I0930 12:25:09.501991 4672 scope.go:117] "RemoveContainer" containerID="ec0b5c8069179ded77ab10cf8ff2a5441f327df1cd6d68733fa203d24268e403" Sep 30 12:25:09 crc kubenswrapper[4672]: E0930 12:25:09.502376 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ec0b5c8069179ded77ab10cf8ff2a5441f327df1cd6d68733fa203d24268e403\": container with ID starting with ec0b5c8069179ded77ab10cf8ff2a5441f327df1cd6d68733fa203d24268e403 not found: ID does not exist" containerID="ec0b5c8069179ded77ab10cf8ff2a5441f327df1cd6d68733fa203d24268e403" Sep 30 12:25:09 crc kubenswrapper[4672]: I0930 12:25:09.502409 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec0b5c8069179ded77ab10cf8ff2a5441f327df1cd6d68733fa203d24268e403"} err="failed to get container status \"ec0b5c8069179ded77ab10cf8ff2a5441f327df1cd6d68733fa203d24268e403\": rpc error: code = NotFound desc = could not find container \"ec0b5c8069179ded77ab10cf8ff2a5441f327df1cd6d68733fa203d24268e403\": container with ID starting with ec0b5c8069179ded77ab10cf8ff2a5441f327df1cd6d68733fa203d24268e403 not found: ID does not exist" Sep 30 12:25:11 crc kubenswrapper[4672]: I0930 12:25:11.422940 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6cff532e-6e66-4d8f-9e9a-44a47aabc381" path="/var/lib/kubelet/pods/6cff532e-6e66-4d8f-9e9a-44a47aabc381/volumes" Sep 30 12:25:11 crc kubenswrapper[4672]: I0930 12:25:11.431701 4672 generic.go:334] "Generic (PLEG): container finished" podID="42146b78-abcf-44bd-8a8d-6048d9fa0a01" containerID="23db0ec014fcb558600841de2f8a7f8851589908811568d7f92ac33a3364129c" exitCode=0 Sep 30 12:25:11 crc kubenswrapper[4672]: I0930 
12:25:11.431728 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-52587" event={"ID":"42146b78-abcf-44bd-8a8d-6048d9fa0a01","Type":"ContainerDied","Data":"23db0ec014fcb558600841de2f8a7f8851589908811568d7f92ac33a3364129c"} Sep 30 12:25:12 crc kubenswrapper[4672]: I0930 12:25:12.440452 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-52587" event={"ID":"42146b78-abcf-44bd-8a8d-6048d9fa0a01","Type":"ContainerStarted","Data":"c7a0d3e7ea8ef4c0a9d823f655f516685cacec8b3344892b4fb898760d48f6db"} Sep 30 12:25:12 crc kubenswrapper[4672]: I0930 12:25:12.463755 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-52587" podStartSLOduration=2.283477227 podStartE2EDuration="52.463736885s" podCreationTimestamp="2025-09-30 12:24:20 +0000 UTC" firstStartedPulling="2025-09-30 12:24:21.726194642 +0000 UTC m=+152.995432298" lastFinishedPulling="2025-09-30 12:25:11.90645431 +0000 UTC m=+203.175691956" observedRunningTime="2025-09-30 12:25:12.459687055 +0000 UTC m=+203.728924701" watchObservedRunningTime="2025-09-30 12:25:12.463736885 +0000 UTC m=+203.732974541" Sep 30 12:25:20 crc kubenswrapper[4672]: I0930 12:25:20.845471 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-52587" Sep 30 12:25:20 crc kubenswrapper[4672]: I0930 12:25:20.846024 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-52587" Sep 30 12:25:20 crc kubenswrapper[4672]: I0930 12:25:20.889235 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-52587" Sep 30 12:25:21 crc kubenswrapper[4672]: I0930 12:25:21.530038 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-52587" Sep 30 12:25:24 crc kubenswrapper[4672]: I0930 12:25:24.739823 4672 patch_prober.go:28] interesting pod/machine-config-daemon-dpqrd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 12:25:24 crc kubenswrapper[4672]: I0930 12:25:24.740204 4672 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 12:25:24 crc kubenswrapper[4672]: I0930 12:25:24.740256 4672 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" Sep 30 12:25:24 crc kubenswrapper[4672]: I0930 12:25:24.741070 4672 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ba615a53d8f50a3891080fd06384bb6e00eca07025bb60fdf1764d9c1d4a5f76"} pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 30 12:25:24 crc kubenswrapper[4672]: I0930 12:25:24.741132 4672 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" containerName="machine-config-daemon" containerID="cri-o://ba615a53d8f50a3891080fd06384bb6e00eca07025bb60fdf1764d9c1d4a5f76" gracePeriod=600 Sep 30 12:25:25 crc kubenswrapper[4672]: I0930 12:25:25.517673 4672 generic.go:334] "Generic (PLEG): container finished" podID="95794952-d817-48f2-8956-f7a310f8d1d9" containerID="ba615a53d8f50a3891080fd06384bb6e00eca07025bb60fdf1764d9c1d4a5f76" exitCode=0 Sep 30 12:25:25 crc kubenswrapper[4672]: I0930 12:25:25.517813 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" event={"ID":"95794952-d817-48f2-8956-f7a310f8d1d9","Type":"ContainerDied","Data":"ba615a53d8f50a3891080fd06384bb6e00eca07025bb60fdf1764d9c1d4a5f76"} Sep 30 12:25:25 crc kubenswrapper[4672]: I0930 12:25:25.518438 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" event={"ID":"95794952-d817-48f2-8956-f7a310f8d1d9","Type":"ContainerStarted","Data":"99d96054a65bb2e9be687c5412fe9d8ef3b73a9fb641b821f2a35007e1e4415e"} Sep 30 12:25:32 crc kubenswrapper[4672]: I0930 12:25:32.278403 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-vs8kw"] Sep 30 12:25:57 crc kubenswrapper[4672]: I0930 12:25:57.325970 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-vs8kw" podUID="551a978e-b9ac-46c6-ad40-b2ed5b6121da" containerName="oauth-openshift" containerID="cri-o://53613a217d27e173b41b9b6f8cabf79cce9bfafd2cc0edc75a59c34679ab1350" gracePeriod=15 Sep 30 12:25:57 crc kubenswrapper[4672]: I0930 12:25:57.721914 4672 generic.go:334] "Generic (PLEG): container finished" podID="551a978e-b9ac-46c6-ad40-b2ed5b6121da" containerID="53613a217d27e173b41b9b6f8cabf79cce9bfafd2cc0edc75a59c34679ab1350" exitCode=0 Sep 30 12:25:57 crc kubenswrapper[4672]: I0930 12:25:57.722034 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-vs8kw" event={"ID":"551a978e-b9ac-46c6-ad40-b2ed5b6121da","Type":"ContainerDied","Data":"53613a217d27e173b41b9b6f8cabf79cce9bfafd2cc0edc75a59c34679ab1350"} Sep 30 12:25:57 crc kubenswrapper[4672]: I0930 12:25:57.722305 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-vs8kw" event={"ID":"551a978e-b9ac-46c6-ad40-b2ed5b6121da","Type":"ContainerDied","Data":"df3033a182f759f6b82d7597a5370ef0747ee54a237c7c12fdddacce281e2d33"} Sep 30 12:25:57 crc kubenswrapper[4672]: I0930 12:25:57.722329 4672 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="df3033a182f759f6b82d7597a5370ef0747ee54a237c7c12fdddacce281e2d33" Sep 30 12:25:57 crc kubenswrapper[4672]: I0930 12:25:57.752223 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-vs8kw" Sep 30 12:25:57 crc kubenswrapper[4672]: I0930 12:25:57.788634 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-5795c8b5fb-fb68t"] Sep 30 12:25:57 crc kubenswrapper[4672]: E0930 12:25:57.789093 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a02c3e3-f673-41d6-950c-0fef1950613e" containerName="extract-content" Sep 30 12:25:57 crc kubenswrapper[4672]: I0930 12:25:57.789199 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a02c3e3-f673-41d6-950c-0fef1950613e" containerName="extract-content" Sep 30 12:25:57 crc kubenswrapper[4672]: E0930 12:25:57.789318 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8eaae3fd-43c6-47fe-9374-7b42028645f9" containerName="pruner" Sep 30 12:25:57 crc kubenswrapper[4672]: I0930 12:25:57.789399 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="8eaae3fd-43c6-47fe-9374-7b42028645f9" containerName="pruner" Sep 30 12:25:57 crc kubenswrapper[4672]: E0930 12:25:57.789465 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6cff532e-6e66-4d8f-9e9a-44a47aabc381" containerName="extract-content" Sep 30 12:25:57 crc kubenswrapper[4672]: I0930 12:25:57.789542 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="6cff532e-6e66-4d8f-9e9a-44a47aabc381" containerName="extract-content" Sep 30 12:25:57 crc kubenswrapper[4672]: E0930 12:25:57.789612 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89089e3f-369b-4e92-b28b-369c2f3b3017" containerName="extract-content" Sep 30 12:25:57 crc kubenswrapper[4672]: I0930 12:25:57.789690 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="89089e3f-369b-4e92-b28b-369c2f3b3017" containerName="extract-content" Sep 30 12:25:57 crc kubenswrapper[4672]: E0930 12:25:57.789769 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc2fa305-9e65-4aee-9842-fe3274aa306f" containerName="pruner" Sep 30 12:25:57 crc kubenswrapper[4672]: I0930 12:25:57.789840 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc2fa305-9e65-4aee-9842-fe3274aa306f" containerName="pruner" Sep 30 12:25:57 crc kubenswrapper[4672]: E0930 12:25:57.789935 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a02c3e3-f673-41d6-950c-0fef1950613e" containerName="registry-server" Sep 30 12:25:57 crc kubenswrapper[4672]: I0930 12:25:57.790016 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a02c3e3-f673-41d6-950c-0fef1950613e" containerName="registry-server" Sep 30 12:25:57 crc kubenswrapper[4672]: E0930 12:25:57.790088 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="551a978e-b9ac-46c6-ad40-b2ed5b6121da" containerName="oauth-openshift" Sep 30 12:25:57 crc kubenswrapper[4672]: I0930 12:25:57.790161 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="551a978e-b9ac-46c6-ad40-b2ed5b6121da" containerName="oauth-openshift" Sep 30 12:25:57 crc kubenswrapper[4672]: E0930 12:25:57.790247 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0d02c8d-95c7-4bd2-9955-f633cfea5e4e" containerName="extract-utilities" Sep 30 12:25:57 crc kubenswrapper[4672]: I0930 12:25:57.790349 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0d02c8d-95c7-4bd2-9955-f633cfea5e4e" containerName="extract-utilities" Sep 30 12:25:57 crc kubenswrapper[4672]: E0930 12:25:57.790421 4672 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="c0d02c8d-95c7-4bd2-9955-f633cfea5e4e" containerName="registry-server" Sep 30 12:25:57 crc kubenswrapper[4672]: I0930 12:25:57.790484 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0d02c8d-95c7-4bd2-9955-f633cfea5e4e" containerName="registry-server" Sep 30 12:25:57 crc kubenswrapper[4672]: E0930 12:25:57.790558 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6cff532e-6e66-4d8f-9e9a-44a47aabc381" containerName="registry-server" Sep 30 12:25:57 crc kubenswrapper[4672]: I0930 12:25:57.790649 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="6cff532e-6e66-4d8f-9e9a-44a47aabc381" containerName="registry-server" Sep 30 12:25:57 crc kubenswrapper[4672]: E0930 12:25:57.790742 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89089e3f-369b-4e92-b28b-369c2f3b3017" containerName="extract-utilities" Sep 30 12:25:57 crc kubenswrapper[4672]: I0930 12:25:57.790816 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="89089e3f-369b-4e92-b28b-369c2f3b3017" containerName="extract-utilities" Sep 30 12:25:57 crc kubenswrapper[4672]: E0930 12:25:57.790884 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a02c3e3-f673-41d6-950c-0fef1950613e" containerName="extract-utilities" Sep 30 12:25:57 crc kubenswrapper[4672]: I0930 12:25:57.790957 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a02c3e3-f673-41d6-950c-0fef1950613e" containerName="extract-utilities" Sep 30 12:25:57 crc kubenswrapper[4672]: E0930 12:25:57.791022 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0d02c8d-95c7-4bd2-9955-f633cfea5e4e" containerName="extract-content" Sep 30 12:25:57 crc kubenswrapper[4672]: I0930 12:25:57.791082 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0d02c8d-95c7-4bd2-9955-f633cfea5e4e" containerName="extract-content" Sep 30 12:25:57 crc kubenswrapper[4672]: E0930 12:25:57.791218 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6cff532e-6e66-4d8f-9e9a-44a47aabc381" containerName="extract-utilities" Sep 30 12:25:57 crc kubenswrapper[4672]: I0930 12:25:57.791368 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="6cff532e-6e66-4d8f-9e9a-44a47aabc381" containerName="extract-utilities" Sep 30 12:25:57 crc kubenswrapper[4672]: E0930 12:25:57.791497 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89089e3f-369b-4e92-b28b-369c2f3b3017" containerName="registry-server" Sep 30 12:25:57 crc kubenswrapper[4672]: I0930 12:25:57.791642 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="89089e3f-369b-4e92-b28b-369c2f3b3017" containerName="registry-server" Sep 30 12:25:57 crc kubenswrapper[4672]: I0930 12:25:57.791950 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="8eaae3fd-43c6-47fe-9374-7b42028645f9" containerName="pruner" Sep 30 12:25:57 crc kubenswrapper[4672]: I0930 12:25:57.792093 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="551a978e-b9ac-46c6-ad40-b2ed5b6121da" containerName="oauth-openshift" Sep 30 12:25:57 crc kubenswrapper[4672]: I0930 12:25:57.792201 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="6cff532e-6e66-4d8f-9e9a-44a47aabc381" containerName="registry-server" Sep 30 12:25:57 crc kubenswrapper[4672]: I0930 12:25:57.792411 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a02c3e3-f673-41d6-950c-0fef1950613e" containerName="registry-server" Sep 30 12:25:57 crc kubenswrapper[4672]: I0930 12:25:57.792562 4672 
memory_manager.go:354] "RemoveStaleState removing state" podUID="bc2fa305-9e65-4aee-9842-fe3274aa306f" containerName="pruner" Sep 30 12:25:57 crc kubenswrapper[4672]: I0930 12:25:57.792683 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="c0d02c8d-95c7-4bd2-9955-f633cfea5e4e" containerName="registry-server" Sep 30 12:25:57 crc kubenswrapper[4672]: I0930 12:25:57.792809 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="89089e3f-369b-4e92-b28b-369c2f3b3017" containerName="registry-server" Sep 30 12:25:57 crc kubenswrapper[4672]: I0930 12:25:57.793745 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-5795c8b5fb-fb68t" Sep 30 12:25:57 crc kubenswrapper[4672]: I0930 12:25:57.802054 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-5795c8b5fb-fb68t"] Sep 30 12:25:57 crc kubenswrapper[4672]: I0930 12:25:57.901623 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rbpj8\" (UniqueName: \"kubernetes.io/projected/551a978e-b9ac-46c6-ad40-b2ed5b6121da-kube-api-access-rbpj8\") pod \"551a978e-b9ac-46c6-ad40-b2ed5b6121da\" (UID: \"551a978e-b9ac-46c6-ad40-b2ed5b6121da\") " Sep 30 12:25:57 crc kubenswrapper[4672]: I0930 12:25:57.901690 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/551a978e-b9ac-46c6-ad40-b2ed5b6121da-v4-0-config-system-cliconfig\") pod \"551a978e-b9ac-46c6-ad40-b2ed5b6121da\" (UID: \"551a978e-b9ac-46c6-ad40-b2ed5b6121da\") " Sep 30 12:25:57 crc kubenswrapper[4672]: I0930 12:25:57.902392 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/551a978e-b9ac-46c6-ad40-b2ed5b6121da-v4-0-config-system-trusted-ca-bundle\") pod \"551a978e-b9ac-46c6-ad40-b2ed5b6121da\" (UID: \"551a978e-b9ac-46c6-ad40-b2ed5b6121da\") " Sep 30 12:25:57 crc kubenswrapper[4672]: I0930 12:25:57.902488 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/551a978e-b9ac-46c6-ad40-b2ed5b6121da-v4-0-config-system-router-certs\") pod \"551a978e-b9ac-46c6-ad40-b2ed5b6121da\" (UID: \"551a978e-b9ac-46c6-ad40-b2ed5b6121da\") " Sep 30 12:25:57 crc kubenswrapper[4672]: I0930 12:25:57.902528 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/551a978e-b9ac-46c6-ad40-b2ed5b6121da-audit-policies\") pod \"551a978e-b9ac-46c6-ad40-b2ed5b6121da\" (UID: \"551a978e-b9ac-46c6-ad40-b2ed5b6121da\") " Sep 30 12:25:57 crc kubenswrapper[4672]: I0930 12:25:57.902554 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/551a978e-b9ac-46c6-ad40-b2ed5b6121da-v4-0-config-system-ocp-branding-template\") pod \"551a978e-b9ac-46c6-ad40-b2ed5b6121da\" (UID: \"551a978e-b9ac-46c6-ad40-b2ed5b6121da\") " Sep 30 12:25:57 crc kubenswrapper[4672]: I0930 12:25:57.902603 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/551a978e-b9ac-46c6-ad40-b2ed5b6121da-v4-0-config-user-idp-0-file-data\") pod \"551a978e-b9ac-46c6-ad40-b2ed5b6121da\" (UID: 
\"551a978e-b9ac-46c6-ad40-b2ed5b6121da\") " Sep 30 12:25:57 crc kubenswrapper[4672]: I0930 12:25:57.902656 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/551a978e-b9ac-46c6-ad40-b2ed5b6121da-v4-0-config-system-service-ca\") pod \"551a978e-b9ac-46c6-ad40-b2ed5b6121da\" (UID: \"551a978e-b9ac-46c6-ad40-b2ed5b6121da\") " Sep 30 12:25:57 crc kubenswrapper[4672]: I0930 12:25:57.902710 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/551a978e-b9ac-46c6-ad40-b2ed5b6121da-v4-0-config-system-session\") pod \"551a978e-b9ac-46c6-ad40-b2ed5b6121da\" (UID: \"551a978e-b9ac-46c6-ad40-b2ed5b6121da\") " Sep 30 12:25:57 crc kubenswrapper[4672]: I0930 12:25:57.902810 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/551a978e-b9ac-46c6-ad40-b2ed5b6121da-v4-0-config-user-template-login\") pod \"551a978e-b9ac-46c6-ad40-b2ed5b6121da\" (UID: \"551a978e-b9ac-46c6-ad40-b2ed5b6121da\") " Sep 30 12:25:57 crc kubenswrapper[4672]: I0930 12:25:57.902860 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/551a978e-b9ac-46c6-ad40-b2ed5b6121da-v4-0-config-user-template-provider-selection\") pod \"551a978e-b9ac-46c6-ad40-b2ed5b6121da\" (UID: \"551a978e-b9ac-46c6-ad40-b2ed5b6121da\") " Sep 30 12:25:57 crc kubenswrapper[4672]: I0930 12:25:57.902930 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/551a978e-b9ac-46c6-ad40-b2ed5b6121da-v4-0-config-user-template-error\") pod \"551a978e-b9ac-46c6-ad40-b2ed5b6121da\" (UID: \"551a978e-b9ac-46c6-ad40-b2ed5b6121da\") " Sep 30 12:25:57 crc kubenswrapper[4672]: I0930 12:25:57.902974 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/551a978e-b9ac-46c6-ad40-b2ed5b6121da-audit-dir\") pod \"551a978e-b9ac-46c6-ad40-b2ed5b6121da\" (UID: \"551a978e-b9ac-46c6-ad40-b2ed5b6121da\") " Sep 30 12:25:57 crc kubenswrapper[4672]: I0930 12:25:57.903011 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/551a978e-b9ac-46c6-ad40-b2ed5b6121da-v4-0-config-system-serving-cert\") pod \"551a978e-b9ac-46c6-ad40-b2ed5b6121da\" (UID: \"551a978e-b9ac-46c6-ad40-b2ed5b6121da\") " Sep 30 12:25:57 crc kubenswrapper[4672]: I0930 12:25:57.903117 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/551a978e-b9ac-46c6-ad40-b2ed5b6121da-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "551a978e-b9ac-46c6-ad40-b2ed5b6121da" (UID: "551a978e-b9ac-46c6-ad40-b2ed5b6121da"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 12:25:57 crc kubenswrapper[4672]: I0930 12:25:57.903300 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/77878773-072e-4aa9-a9af-45aa9a984678-v4-0-config-user-template-login\") pod \"oauth-openshift-5795c8b5fb-fb68t\" (UID: \"77878773-072e-4aa9-a9af-45aa9a984678\") " pod="openshift-authentication/oauth-openshift-5795c8b5fb-fb68t" Sep 30 12:25:57 crc kubenswrapper[4672]: I0930 12:25:57.903357 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/551a978e-b9ac-46c6-ad40-b2ed5b6121da-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "551a978e-b9ac-46c6-ad40-b2ed5b6121da" (UID: "551a978e-b9ac-46c6-ad40-b2ed5b6121da"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 12:25:57 crc kubenswrapper[4672]: I0930 12:25:57.903376 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/551a978e-b9ac-46c6-ad40-b2ed5b6121da-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "551a978e-b9ac-46c6-ad40-b2ed5b6121da" (UID: "551a978e-b9ac-46c6-ad40-b2ed5b6121da"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 12:25:57 crc kubenswrapper[4672]: I0930 12:25:57.903387 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/77878773-072e-4aa9-a9af-45aa9a984678-v4-0-config-system-router-certs\") pod \"oauth-openshift-5795c8b5fb-fb68t\" (UID: \"77878773-072e-4aa9-a9af-45aa9a984678\") " pod="openshift-authentication/oauth-openshift-5795c8b5fb-fb68t" Sep 30 12:25:57 crc kubenswrapper[4672]: I0930 12:25:57.903431 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/551a978e-b9ac-46c6-ad40-b2ed5b6121da-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "551a978e-b9ac-46c6-ad40-b2ed5b6121da" (UID: "551a978e-b9ac-46c6-ad40-b2ed5b6121da"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 12:25:57 crc kubenswrapper[4672]: I0930 12:25:57.903435 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/77878773-072e-4aa9-a9af-45aa9a984678-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-5795c8b5fb-fb68t\" (UID: \"77878773-072e-4aa9-a9af-45aa9a984678\") " pod="openshift-authentication/oauth-openshift-5795c8b5fb-fb68t" Sep 30 12:25:57 crc kubenswrapper[4672]: I0930 12:25:57.903505 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/551a978e-b9ac-46c6-ad40-b2ed5b6121da-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "551a978e-b9ac-46c6-ad40-b2ed5b6121da" (UID: "551a978e-b9ac-46c6-ad40-b2ed5b6121da"). InnerVolumeSpecName "v4-0-config-system-cliconfig". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 12:25:57 crc kubenswrapper[4672]: I0930 12:25:57.903523 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/77878773-072e-4aa9-a9af-45aa9a984678-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-5795c8b5fb-fb68t\" (UID: \"77878773-072e-4aa9-a9af-45aa9a984678\") " pod="openshift-authentication/oauth-openshift-5795c8b5fb-fb68t" Sep 30 12:25:57 crc kubenswrapper[4672]: I0930 12:25:57.903691 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/77878773-072e-4aa9-a9af-45aa9a984678-v4-0-config-user-template-error\") pod \"oauth-openshift-5795c8b5fb-fb68t\" (UID: \"77878773-072e-4aa9-a9af-45aa9a984678\") " pod="openshift-authentication/oauth-openshift-5795c8b5fb-fb68t" Sep 30 12:25:57 crc kubenswrapper[4672]: I0930 12:25:57.903757 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/77878773-072e-4aa9-a9af-45aa9a984678-v4-0-config-system-serving-cert\") pod \"oauth-openshift-5795c8b5fb-fb68t\" (UID: \"77878773-072e-4aa9-a9af-45aa9a984678\") " pod="openshift-authentication/oauth-openshift-5795c8b5fb-fb68t" Sep 30 12:25:57 crc kubenswrapper[4672]: I0930 12:25:57.903823 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/77878773-072e-4aa9-a9af-45aa9a984678-v4-0-config-system-cliconfig\") pod \"oauth-openshift-5795c8b5fb-fb68t\" (UID: \"77878773-072e-4aa9-a9af-45aa9a984678\") " pod="openshift-authentication/oauth-openshift-5795c8b5fb-fb68t" Sep 30 12:25:57 crc kubenswrapper[4672]: I0930 12:25:57.903884 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/77878773-072e-4aa9-a9af-45aa9a984678-audit-policies\") pod \"oauth-openshift-5795c8b5fb-fb68t\" (UID: \"77878773-072e-4aa9-a9af-45aa9a984678\") " pod="openshift-authentication/oauth-openshift-5795c8b5fb-fb68t" Sep 30 12:25:57 crc kubenswrapper[4672]: I0930 12:25:57.903925 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/77878773-072e-4aa9-a9af-45aa9a984678-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-5795c8b5fb-fb68t\" (UID: \"77878773-072e-4aa9-a9af-45aa9a984678\") " pod="openshift-authentication/oauth-openshift-5795c8b5fb-fb68t" Sep 30 12:25:57 crc kubenswrapper[4672]: I0930 12:25:57.903963 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/77878773-072e-4aa9-a9af-45aa9a984678-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-5795c8b5fb-fb68t\" (UID: \"77878773-072e-4aa9-a9af-45aa9a984678\") " pod="openshift-authentication/oauth-openshift-5795c8b5fb-fb68t" Sep 30 12:25:57 crc kubenswrapper[4672]: I0930 12:25:57.904156 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/77878773-072e-4aa9-a9af-45aa9a984678-v4-0-config-system-service-ca\") pod \"oauth-openshift-5795c8b5fb-fb68t\" (UID: \"77878773-072e-4aa9-a9af-45aa9a984678\") " pod="openshift-authentication/oauth-openshift-5795c8b5fb-fb68t" Sep 30 12:25:57 crc kubenswrapper[4672]: I0930 12:25:57.904249 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-72lsg\" (UniqueName: \"kubernetes.io/projected/77878773-072e-4aa9-a9af-45aa9a984678-kube-api-access-72lsg\") pod \"oauth-openshift-5795c8b5fb-fb68t\" (UID: \"77878773-072e-4aa9-a9af-45aa9a984678\") " pod="openshift-authentication/oauth-openshift-5795c8b5fb-fb68t" Sep 30 12:25:57 crc kubenswrapper[4672]: I0930 12:25:57.904339 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/77878773-072e-4aa9-a9af-45aa9a984678-v4-0-config-system-session\") pod \"oauth-openshift-5795c8b5fb-fb68t\" (UID: \"77878773-072e-4aa9-a9af-45aa9a984678\") " pod="openshift-authentication/oauth-openshift-5795c8b5fb-fb68t" Sep 30 12:25:57 crc kubenswrapper[4672]: I0930 12:25:57.904394 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/77878773-072e-4aa9-a9af-45aa9a984678-audit-dir\") pod \"oauth-openshift-5795c8b5fb-fb68t\" (UID: \"77878773-072e-4aa9-a9af-45aa9a984678\") " pod="openshift-authentication/oauth-openshift-5795c8b5fb-fb68t" Sep 30 12:25:57 crc kubenswrapper[4672]: I0930 12:25:57.904522 4672 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/551a978e-b9ac-46c6-ad40-b2ed5b6121da-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Sep 30 12:25:57 crc kubenswrapper[4672]: I0930 12:25:57.904543 4672 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/551a978e-b9ac-46c6-ad40-b2ed5b6121da-audit-dir\") on node \"crc\" DevicePath \"\"" Sep 30 12:25:57 crc kubenswrapper[4672]: I0930 12:25:57.904554 4672 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/551a978e-b9ac-46c6-ad40-b2ed5b6121da-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Sep 30 12:25:57 crc kubenswrapper[4672]: I0930 12:25:57.904565 4672 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/551a978e-b9ac-46c6-ad40-b2ed5b6121da-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 12:25:57 crc kubenswrapper[4672]: I0930 12:25:57.904575 4672 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/551a978e-b9ac-46c6-ad40-b2ed5b6121da-audit-policies\") on node \"crc\" DevicePath \"\"" Sep 30 12:25:57 crc kubenswrapper[4672]: I0930 12:25:57.908145 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/551a978e-b9ac-46c6-ad40-b2ed5b6121da-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "551a978e-b9ac-46c6-ad40-b2ed5b6121da" (UID: "551a978e-b9ac-46c6-ad40-b2ed5b6121da"). InnerVolumeSpecName "v4-0-config-user-template-login". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:25:57 crc kubenswrapper[4672]: I0930 12:25:57.908367 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/551a978e-b9ac-46c6-ad40-b2ed5b6121da-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "551a978e-b9ac-46c6-ad40-b2ed5b6121da" (UID: "551a978e-b9ac-46c6-ad40-b2ed5b6121da"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:25:57 crc kubenswrapper[4672]: I0930 12:25:57.908433 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/551a978e-b9ac-46c6-ad40-b2ed5b6121da-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "551a978e-b9ac-46c6-ad40-b2ed5b6121da" (UID: "551a978e-b9ac-46c6-ad40-b2ed5b6121da"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:25:57 crc kubenswrapper[4672]: I0930 12:25:57.908454 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/551a978e-b9ac-46c6-ad40-b2ed5b6121da-kube-api-access-rbpj8" (OuterVolumeSpecName: "kube-api-access-rbpj8") pod "551a978e-b9ac-46c6-ad40-b2ed5b6121da" (UID: "551a978e-b9ac-46c6-ad40-b2ed5b6121da"). InnerVolumeSpecName "kube-api-access-rbpj8". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 12:25:57 crc kubenswrapper[4672]: I0930 12:25:57.908518 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/551a978e-b9ac-46c6-ad40-b2ed5b6121da-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "551a978e-b9ac-46c6-ad40-b2ed5b6121da" (UID: "551a978e-b9ac-46c6-ad40-b2ed5b6121da"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:25:57 crc kubenswrapper[4672]: I0930 12:25:57.908984 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/551a978e-b9ac-46c6-ad40-b2ed5b6121da-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "551a978e-b9ac-46c6-ad40-b2ed5b6121da" (UID: "551a978e-b9ac-46c6-ad40-b2ed5b6121da"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:25:57 crc kubenswrapper[4672]: I0930 12:25:57.909172 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/551a978e-b9ac-46c6-ad40-b2ed5b6121da-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "551a978e-b9ac-46c6-ad40-b2ed5b6121da" (UID: "551a978e-b9ac-46c6-ad40-b2ed5b6121da"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:25:57 crc kubenswrapper[4672]: I0930 12:25:57.909766 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/551a978e-b9ac-46c6-ad40-b2ed5b6121da-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "551a978e-b9ac-46c6-ad40-b2ed5b6121da" (UID: "551a978e-b9ac-46c6-ad40-b2ed5b6121da"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:25:57 crc kubenswrapper[4672]: I0930 12:25:57.914373 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/551a978e-b9ac-46c6-ad40-b2ed5b6121da-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "551a978e-b9ac-46c6-ad40-b2ed5b6121da" (UID: "551a978e-b9ac-46c6-ad40-b2ed5b6121da"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:25:58 crc kubenswrapper[4672]: I0930 12:25:58.005834 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/77878773-072e-4aa9-a9af-45aa9a984678-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-5795c8b5fb-fb68t\" (UID: \"77878773-072e-4aa9-a9af-45aa9a984678\") " pod="openshift-authentication/oauth-openshift-5795c8b5fb-fb68t" Sep 30 12:25:58 crc kubenswrapper[4672]: I0930 12:25:58.005925 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/77878773-072e-4aa9-a9af-45aa9a984678-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-5795c8b5fb-fb68t\" (UID: \"77878773-072e-4aa9-a9af-45aa9a984678\") " pod="openshift-authentication/oauth-openshift-5795c8b5fb-fb68t" Sep 30 12:25:58 crc kubenswrapper[4672]: I0930 12:25:58.006016 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/77878773-072e-4aa9-a9af-45aa9a984678-v4-0-config-system-service-ca\") pod \"oauth-openshift-5795c8b5fb-fb68t\" (UID: \"77878773-072e-4aa9-a9af-45aa9a984678\") " pod="openshift-authentication/oauth-openshift-5795c8b5fb-fb68t" Sep 30 12:25:58 crc kubenswrapper[4672]: I0930 12:25:58.006071 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/77878773-072e-4aa9-a9af-45aa9a984678-v4-0-config-system-session\") pod \"oauth-openshift-5795c8b5fb-fb68t\" (UID: \"77878773-072e-4aa9-a9af-45aa9a984678\") " pod="openshift-authentication/oauth-openshift-5795c8b5fb-fb68t" Sep 30 12:25:58 crc kubenswrapper[4672]: I0930 12:25:58.006113 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-72lsg\" (UniqueName: \"kubernetes.io/projected/77878773-072e-4aa9-a9af-45aa9a984678-kube-api-access-72lsg\") pod \"oauth-openshift-5795c8b5fb-fb68t\" (UID: \"77878773-072e-4aa9-a9af-45aa9a984678\") " pod="openshift-authentication/oauth-openshift-5795c8b5fb-fb68t" Sep 30 12:25:58 crc kubenswrapper[4672]: I0930 12:25:58.006165 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/77878773-072e-4aa9-a9af-45aa9a984678-audit-dir\") pod \"oauth-openshift-5795c8b5fb-fb68t\" (UID: \"77878773-072e-4aa9-a9af-45aa9a984678\") " pod="openshift-authentication/oauth-openshift-5795c8b5fb-fb68t" Sep 30 12:25:58 crc kubenswrapper[4672]: I0930 12:25:58.006253 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/77878773-072e-4aa9-a9af-45aa9a984678-v4-0-config-user-template-login\") pod \"oauth-openshift-5795c8b5fb-fb68t\" (UID: \"77878773-072e-4aa9-a9af-45aa9a984678\") " 
pod="openshift-authentication/oauth-openshift-5795c8b5fb-fb68t" Sep 30 12:25:58 crc kubenswrapper[4672]: I0930 12:25:58.006341 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/77878773-072e-4aa9-a9af-45aa9a984678-v4-0-config-system-router-certs\") pod \"oauth-openshift-5795c8b5fb-fb68t\" (UID: \"77878773-072e-4aa9-a9af-45aa9a984678\") " pod="openshift-authentication/oauth-openshift-5795c8b5fb-fb68t" Sep 30 12:25:58 crc kubenswrapper[4672]: I0930 12:25:58.006387 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/77878773-072e-4aa9-a9af-45aa9a984678-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-5795c8b5fb-fb68t\" (UID: \"77878773-072e-4aa9-a9af-45aa9a984678\") " pod="openshift-authentication/oauth-openshift-5795c8b5fb-fb68t" Sep 30 12:25:58 crc kubenswrapper[4672]: I0930 12:25:58.006443 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/77878773-072e-4aa9-a9af-45aa9a984678-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-5795c8b5fb-fb68t\" (UID: \"77878773-072e-4aa9-a9af-45aa9a984678\") " pod="openshift-authentication/oauth-openshift-5795c8b5fb-fb68t" Sep 30 12:25:58 crc kubenswrapper[4672]: I0930 12:25:58.006524 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/77878773-072e-4aa9-a9af-45aa9a984678-v4-0-config-user-template-error\") pod \"oauth-openshift-5795c8b5fb-fb68t\" (UID: \"77878773-072e-4aa9-a9af-45aa9a984678\") " pod="openshift-authentication/oauth-openshift-5795c8b5fb-fb68t" Sep 30 12:25:58 crc kubenswrapper[4672]: I0930 12:25:58.006574 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/77878773-072e-4aa9-a9af-45aa9a984678-v4-0-config-system-serving-cert\") pod \"oauth-openshift-5795c8b5fb-fb68t\" (UID: \"77878773-072e-4aa9-a9af-45aa9a984678\") " pod="openshift-authentication/oauth-openshift-5795c8b5fb-fb68t" Sep 30 12:25:58 crc kubenswrapper[4672]: I0930 12:25:58.006626 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/77878773-072e-4aa9-a9af-45aa9a984678-v4-0-config-system-cliconfig\") pod \"oauth-openshift-5795c8b5fb-fb68t\" (UID: \"77878773-072e-4aa9-a9af-45aa9a984678\") " pod="openshift-authentication/oauth-openshift-5795c8b5fb-fb68t" Sep 30 12:25:58 crc kubenswrapper[4672]: I0930 12:25:58.006681 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/77878773-072e-4aa9-a9af-45aa9a984678-audit-policies\") pod \"oauth-openshift-5795c8b5fb-fb68t\" (UID: \"77878773-072e-4aa9-a9af-45aa9a984678\") " pod="openshift-authentication/oauth-openshift-5795c8b5fb-fb68t" Sep 30 12:25:58 crc kubenswrapper[4672]: I0930 12:25:58.006774 4672 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/551a978e-b9ac-46c6-ad40-b2ed5b6121da-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Sep 30 12:25:58 crc kubenswrapper[4672]: I0930 12:25:58.006811 4672 
reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/551a978e-b9ac-46c6-ad40-b2ed5b6121da-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Sep 30 12:25:58 crc kubenswrapper[4672]: I0930 12:25:58.007030 4672 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/551a978e-b9ac-46c6-ad40-b2ed5b6121da-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Sep 30 12:25:58 crc kubenswrapper[4672]: I0930 12:25:58.007058 4672 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/551a978e-b9ac-46c6-ad40-b2ed5b6121da-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Sep 30 12:25:58 crc kubenswrapper[4672]: I0930 12:25:58.007088 4672 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/551a978e-b9ac-46c6-ad40-b2ed5b6121da-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Sep 30 12:25:58 crc kubenswrapper[4672]: I0930 12:25:58.007115 4672 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/551a978e-b9ac-46c6-ad40-b2ed5b6121da-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Sep 30 12:25:58 crc kubenswrapper[4672]: I0930 12:25:58.007143 4672 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/551a978e-b9ac-46c6-ad40-b2ed5b6121da-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Sep 30 12:25:58 crc kubenswrapper[4672]: I0930 12:25:58.007170 4672 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/551a978e-b9ac-46c6-ad40-b2ed5b6121da-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 12:25:58 crc kubenswrapper[4672]: I0930 12:25:58.007196 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rbpj8\" (UniqueName: \"kubernetes.io/projected/551a978e-b9ac-46c6-ad40-b2ed5b6121da-kube-api-access-rbpj8\") on node \"crc\" DevicePath \"\"" Sep 30 12:25:58 crc kubenswrapper[4672]: I0930 12:25:58.007919 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/77878773-072e-4aa9-a9af-45aa9a984678-v4-0-config-system-service-ca\") pod \"oauth-openshift-5795c8b5fb-fb68t\" (UID: \"77878773-072e-4aa9-a9af-45aa9a984678\") " pod="openshift-authentication/oauth-openshift-5795c8b5fb-fb68t" Sep 30 12:25:58 crc kubenswrapper[4672]: I0930 12:25:58.008604 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/77878773-072e-4aa9-a9af-45aa9a984678-audit-policies\") pod \"oauth-openshift-5795c8b5fb-fb68t\" (UID: \"77878773-072e-4aa9-a9af-45aa9a984678\") " pod="openshift-authentication/oauth-openshift-5795c8b5fb-fb68t" Sep 30 12:25:58 crc kubenswrapper[4672]: I0930 12:25:58.009462 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/77878773-072e-4aa9-a9af-45aa9a984678-audit-dir\") pod \"oauth-openshift-5795c8b5fb-fb68t\" (UID: \"77878773-072e-4aa9-a9af-45aa9a984678\") " 
pod="openshift-authentication/oauth-openshift-5795c8b5fb-fb68t" Sep 30 12:25:58 crc kubenswrapper[4672]: I0930 12:25:58.010611 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/77878773-072e-4aa9-a9af-45aa9a984678-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-5795c8b5fb-fb68t\" (UID: \"77878773-072e-4aa9-a9af-45aa9a984678\") " pod="openshift-authentication/oauth-openshift-5795c8b5fb-fb68t" Sep 30 12:25:58 crc kubenswrapper[4672]: I0930 12:25:58.010882 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/77878773-072e-4aa9-a9af-45aa9a984678-v4-0-config-system-cliconfig\") pod \"oauth-openshift-5795c8b5fb-fb68t\" (UID: \"77878773-072e-4aa9-a9af-45aa9a984678\") " pod="openshift-authentication/oauth-openshift-5795c8b5fb-fb68t" Sep 30 12:25:58 crc kubenswrapper[4672]: I0930 12:25:58.011490 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/77878773-072e-4aa9-a9af-45aa9a984678-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-5795c8b5fb-fb68t\" (UID: \"77878773-072e-4aa9-a9af-45aa9a984678\") " pod="openshift-authentication/oauth-openshift-5795c8b5fb-fb68t" Sep 30 12:25:58 crc kubenswrapper[4672]: I0930 12:25:58.012922 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/77878773-072e-4aa9-a9af-45aa9a984678-v4-0-config-user-template-error\") pod \"oauth-openshift-5795c8b5fb-fb68t\" (UID: \"77878773-072e-4aa9-a9af-45aa9a984678\") " pod="openshift-authentication/oauth-openshift-5795c8b5fb-fb68t" Sep 30 12:25:58 crc kubenswrapper[4672]: I0930 12:25:58.014082 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/77878773-072e-4aa9-a9af-45aa9a984678-v4-0-config-system-router-certs\") pod \"oauth-openshift-5795c8b5fb-fb68t\" (UID: \"77878773-072e-4aa9-a9af-45aa9a984678\") " pod="openshift-authentication/oauth-openshift-5795c8b5fb-fb68t" Sep 30 12:25:58 crc kubenswrapper[4672]: I0930 12:25:58.014519 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/77878773-072e-4aa9-a9af-45aa9a984678-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-5795c8b5fb-fb68t\" (UID: \"77878773-072e-4aa9-a9af-45aa9a984678\") " pod="openshift-authentication/oauth-openshift-5795c8b5fb-fb68t" Sep 30 12:25:58 crc kubenswrapper[4672]: I0930 12:25:58.015042 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/77878773-072e-4aa9-a9af-45aa9a984678-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-5795c8b5fb-fb68t\" (UID: \"77878773-072e-4aa9-a9af-45aa9a984678\") " pod="openshift-authentication/oauth-openshift-5795c8b5fb-fb68t" Sep 30 12:25:58 crc kubenswrapper[4672]: I0930 12:25:58.016689 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/77878773-072e-4aa9-a9af-45aa9a984678-v4-0-config-user-template-login\") pod \"oauth-openshift-5795c8b5fb-fb68t\" (UID: \"77878773-072e-4aa9-a9af-45aa9a984678\") " 
pod="openshift-authentication/oauth-openshift-5795c8b5fb-fb68t" Sep 30 12:25:58 crc kubenswrapper[4672]: I0930 12:25:58.018858 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/77878773-072e-4aa9-a9af-45aa9a984678-v4-0-config-system-serving-cert\") pod \"oauth-openshift-5795c8b5fb-fb68t\" (UID: \"77878773-072e-4aa9-a9af-45aa9a984678\") " pod="openshift-authentication/oauth-openshift-5795c8b5fb-fb68t" Sep 30 12:25:58 crc kubenswrapper[4672]: I0930 12:25:58.019336 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/77878773-072e-4aa9-a9af-45aa9a984678-v4-0-config-system-session\") pod \"oauth-openshift-5795c8b5fb-fb68t\" (UID: \"77878773-072e-4aa9-a9af-45aa9a984678\") " pod="openshift-authentication/oauth-openshift-5795c8b5fb-fb68t" Sep 30 12:25:58 crc kubenswrapper[4672]: I0930 12:25:58.038382 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-72lsg\" (UniqueName: \"kubernetes.io/projected/77878773-072e-4aa9-a9af-45aa9a984678-kube-api-access-72lsg\") pod \"oauth-openshift-5795c8b5fb-fb68t\" (UID: \"77878773-072e-4aa9-a9af-45aa9a984678\") " pod="openshift-authentication/oauth-openshift-5795c8b5fb-fb68t" Sep 30 12:25:58 crc kubenswrapper[4672]: I0930 12:25:58.110313 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-5795c8b5fb-fb68t" Sep 30 12:25:58 crc kubenswrapper[4672]: I0930 12:25:58.328881 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-5795c8b5fb-fb68t"] Sep 30 12:25:58 crc kubenswrapper[4672]: I0930 12:25:58.729298 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-vs8kw" Sep 30 12:25:58 crc kubenswrapper[4672]: I0930 12:25:58.729293 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-5795c8b5fb-fb68t" event={"ID":"77878773-072e-4aa9-a9af-45aa9a984678","Type":"ContainerStarted","Data":"c85eecf5b6c06858259b6317f4a26f9d4e2426b29242fe75b3c14edcd6c9a18f"} Sep 30 12:25:58 crc kubenswrapper[4672]: I0930 12:25:58.729857 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-5795c8b5fb-fb68t" Sep 30 12:25:58 crc kubenswrapper[4672]: I0930 12:25:58.729890 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-5795c8b5fb-fb68t" event={"ID":"77878773-072e-4aa9-a9af-45aa9a984678","Type":"ContainerStarted","Data":"f4347d9d6033064d38fe25dd79c50a4ec948b753d5057a5d011e23aa78f66bde"} Sep 30 12:25:58 crc kubenswrapper[4672]: I0930 12:25:58.761792 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-5795c8b5fb-fb68t" podStartSLOduration=26.761770869 podStartE2EDuration="26.761770869s" podCreationTimestamp="2025-09-30 12:25:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 12:25:58.758312946 +0000 UTC m=+250.027550582" watchObservedRunningTime="2025-09-30 12:25:58.761770869 +0000 UTC m=+250.031008515" Sep 30 12:25:58 crc kubenswrapper[4672]: I0930 12:25:58.777659 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-vs8kw"] Sep 30 12:25:58 crc kubenswrapper[4672]: I0930 12:25:58.783613 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-vs8kw"] Sep 30 12:25:59 crc kubenswrapper[4672]: I0930 12:25:59.175076 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-5795c8b5fb-fb68t" Sep 30 12:25:59 crc kubenswrapper[4672]: I0930 12:25:59.423910 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="551a978e-b9ac-46c6-ad40-b2ed5b6121da" path="/var/lib/kubelet/pods/551a978e-b9ac-46c6-ad40-b2ed5b6121da/volumes" Sep 30 12:26:14 crc kubenswrapper[4672]: I0930 12:26:14.692608 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-hflds"] Sep 30 12:26:14 crc kubenswrapper[4672]: I0930 12:26:14.694882 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-hflds" podUID="fc47647e-d94f-4cca-8e3f-50cab43126ab" containerName="registry-server" containerID="cri-o://acaf7ec96bfc15124aea7271441724d63e065e183d4557d1ca8ba5e828e1eb55" gracePeriod=30 Sep 30 12:26:14 crc kubenswrapper[4672]: I0930 12:26:14.698055 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-52587"] Sep 30 12:26:14 crc kubenswrapper[4672]: I0930 12:26:14.698321 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-52587" podUID="42146b78-abcf-44bd-8a8d-6048d9fa0a01" containerName="registry-server" containerID="cri-o://c7a0d3e7ea8ef4c0a9d823f655f516685cacec8b3344892b4fb898760d48f6db" gracePeriod=30 Sep 30 12:26:14 crc kubenswrapper[4672]: I0930 12:26:14.712810 4672 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-svmxm"] Sep 30 12:26:14 crc kubenswrapper[4672]: I0930 12:26:14.713106 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-svmxm" podUID="f4d96698-0412-4923-9a10-03b174e0ca6c" containerName="marketplace-operator" containerID="cri-o://23207ee4c031a8305a9e1413000eccf71dc3b87dec33539fdc9fb79c3b1d95f1" gracePeriod=30 Sep 30 12:26:14 crc kubenswrapper[4672]: I0930 12:26:14.735486 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-gvbdw"] Sep 30 12:26:14 crc kubenswrapper[4672]: I0930 12:26:14.735773 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-gvbdw" podUID="ac79497a-7f46-4ffd-9a7f-40153ab89a6d" containerName="registry-server" containerID="cri-o://2d8b4b2113cd5069ca3ea4d269f1197590ca7af0bcf33cdb3e43724094bac854" gracePeriod=30 Sep 30 12:26:14 crc kubenswrapper[4672]: I0930 12:26:14.742230 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-kxjq6"] Sep 30 12:26:14 crc kubenswrapper[4672]: I0930 12:26:14.743394 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-kxjq6" Sep 30 12:26:14 crc kubenswrapper[4672]: I0930 12:26:14.745173 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5h54f"] Sep 30 12:26:14 crc kubenswrapper[4672]: I0930 12:26:14.745591 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-5h54f" podUID="3588660c-1eb4-4173-87c4-7bf4f6200851" containerName="registry-server" containerID="cri-o://a159f542b6c6230950997e0f55d13f459b7455e5b98f78cbc2d5551ea9b5d50d" gracePeriod=30 Sep 30 12:26:14 crc kubenswrapper[4672]: I0930 12:26:14.750171 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-kxjq6"] Sep 30 12:26:14 crc kubenswrapper[4672]: I0930 12:26:14.835283 4672 generic.go:334] "Generic (PLEG): container finished" podID="fc47647e-d94f-4cca-8e3f-50cab43126ab" containerID="acaf7ec96bfc15124aea7271441724d63e065e183d4557d1ca8ba5e828e1eb55" exitCode=0 Sep 30 12:26:14 crc kubenswrapper[4672]: I0930 12:26:14.835390 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hflds" event={"ID":"fc47647e-d94f-4cca-8e3f-50cab43126ab","Type":"ContainerDied","Data":"acaf7ec96bfc15124aea7271441724d63e065e183d4557d1ca8ba5e828e1eb55"} Sep 30 12:26:14 crc kubenswrapper[4672]: I0930 12:26:14.838577 4672 generic.go:334] "Generic (PLEG): container finished" podID="42146b78-abcf-44bd-8a8d-6048d9fa0a01" containerID="c7a0d3e7ea8ef4c0a9d823f655f516685cacec8b3344892b4fb898760d48f6db" exitCode=0 Sep 30 12:26:14 crc kubenswrapper[4672]: I0930 12:26:14.838605 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-52587" event={"ID":"42146b78-abcf-44bd-8a8d-6048d9fa0a01","Type":"ContainerDied","Data":"c7a0d3e7ea8ef4c0a9d823f655f516685cacec8b3344892b4fb898760d48f6db"} Sep 30 12:26:14 crc kubenswrapper[4672]: I0930 12:26:14.856517 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wglkd\" (UniqueName: 
\"kubernetes.io/projected/08b96597-cb4d-4c38-9557-d60b937ab2c7-kube-api-access-wglkd\") pod \"marketplace-operator-79b997595-kxjq6\" (UID: \"08b96597-cb4d-4c38-9557-d60b937ab2c7\") " pod="openshift-marketplace/marketplace-operator-79b997595-kxjq6" Sep 30 12:26:14 crc kubenswrapper[4672]: I0930 12:26:14.856570 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/08b96597-cb4d-4c38-9557-d60b937ab2c7-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-kxjq6\" (UID: \"08b96597-cb4d-4c38-9557-d60b937ab2c7\") " pod="openshift-marketplace/marketplace-operator-79b997595-kxjq6" Sep 30 12:26:14 crc kubenswrapper[4672]: I0930 12:26:14.856607 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/08b96597-cb4d-4c38-9557-d60b937ab2c7-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-kxjq6\" (UID: \"08b96597-cb4d-4c38-9557-d60b937ab2c7\") " pod="openshift-marketplace/marketplace-operator-79b997595-kxjq6" Sep 30 12:26:14 crc kubenswrapper[4672]: I0930 12:26:14.961052 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/08b96597-cb4d-4c38-9557-d60b937ab2c7-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-kxjq6\" (UID: \"08b96597-cb4d-4c38-9557-d60b937ab2c7\") " pod="openshift-marketplace/marketplace-operator-79b997595-kxjq6" Sep 30 12:26:14 crc kubenswrapper[4672]: I0930 12:26:14.961148 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wglkd\" (UniqueName: \"kubernetes.io/projected/08b96597-cb4d-4c38-9557-d60b937ab2c7-kube-api-access-wglkd\") pod \"marketplace-operator-79b997595-kxjq6\" (UID: \"08b96597-cb4d-4c38-9557-d60b937ab2c7\") " pod="openshift-marketplace/marketplace-operator-79b997595-kxjq6" Sep 30 12:26:14 crc kubenswrapper[4672]: I0930 12:26:14.961201 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/08b96597-cb4d-4c38-9557-d60b937ab2c7-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-kxjq6\" (UID: \"08b96597-cb4d-4c38-9557-d60b937ab2c7\") " pod="openshift-marketplace/marketplace-operator-79b997595-kxjq6" Sep 30 12:26:14 crc kubenswrapper[4672]: I0930 12:26:14.962815 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/08b96597-cb4d-4c38-9557-d60b937ab2c7-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-kxjq6\" (UID: \"08b96597-cb4d-4c38-9557-d60b937ab2c7\") " pod="openshift-marketplace/marketplace-operator-79b997595-kxjq6" Sep 30 12:26:14 crc kubenswrapper[4672]: I0930 12:26:14.969717 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/08b96597-cb4d-4c38-9557-d60b937ab2c7-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-kxjq6\" (UID: \"08b96597-cb4d-4c38-9557-d60b937ab2c7\") " pod="openshift-marketplace/marketplace-operator-79b997595-kxjq6" Sep 30 12:26:14 crc kubenswrapper[4672]: I0930 12:26:14.988341 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wglkd\" (UniqueName: 
\"kubernetes.io/projected/08b96597-cb4d-4c38-9557-d60b937ab2c7-kube-api-access-wglkd\") pod \"marketplace-operator-79b997595-kxjq6\" (UID: \"08b96597-cb4d-4c38-9557-d60b937ab2c7\") " pod="openshift-marketplace/marketplace-operator-79b997595-kxjq6" Sep 30 12:26:15 crc kubenswrapper[4672]: I0930 12:26:15.142724 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-kxjq6" Sep 30 12:26:15 crc kubenswrapper[4672]: I0930 12:26:15.151052 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-52587" Sep 30 12:26:15 crc kubenswrapper[4672]: I0930 12:26:15.154538 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hflds" Sep 30 12:26:15 crc kubenswrapper[4672]: I0930 12:26:15.164584 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-svmxm" Sep 30 12:26:15 crc kubenswrapper[4672]: I0930 12:26:15.179467 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5h54f" Sep 30 12:26:15 crc kubenswrapper[4672]: I0930 12:26:15.185577 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gvbdw" Sep 30 12:26:15 crc kubenswrapper[4672]: I0930 12:26:15.268183 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f4d96698-0412-4923-9a10-03b174e0ca6c-marketplace-trusted-ca\") pod \"f4d96698-0412-4923-9a10-03b174e0ca6c\" (UID: \"f4d96698-0412-4923-9a10-03b174e0ca6c\") " Sep 30 12:26:15 crc kubenswrapper[4672]: I0930 12:26:15.268222 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9c4mf\" (UniqueName: \"kubernetes.io/projected/f4d96698-0412-4923-9a10-03b174e0ca6c-kube-api-access-9c4mf\") pod \"f4d96698-0412-4923-9a10-03b174e0ca6c\" (UID: \"f4d96698-0412-4923-9a10-03b174e0ca6c\") " Sep 30 12:26:15 crc kubenswrapper[4672]: I0930 12:26:15.268246 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-96cvg\" (UniqueName: \"kubernetes.io/projected/fc47647e-d94f-4cca-8e3f-50cab43126ab-kube-api-access-96cvg\") pod \"fc47647e-d94f-4cca-8e3f-50cab43126ab\" (UID: \"fc47647e-d94f-4cca-8e3f-50cab43126ab\") " Sep 30 12:26:15 crc kubenswrapper[4672]: I0930 12:26:15.268315 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc47647e-d94f-4cca-8e3f-50cab43126ab-utilities\") pod \"fc47647e-d94f-4cca-8e3f-50cab43126ab\" (UID: \"fc47647e-d94f-4cca-8e3f-50cab43126ab\") " Sep 30 12:26:15 crc kubenswrapper[4672]: I0930 12:26:15.268347 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/f4d96698-0412-4923-9a10-03b174e0ca6c-marketplace-operator-metrics\") pod \"f4d96698-0412-4923-9a10-03b174e0ca6c\" (UID: \"f4d96698-0412-4923-9a10-03b174e0ca6c\") " Sep 30 12:26:15 crc kubenswrapper[4672]: I0930 12:26:15.268364 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3588660c-1eb4-4173-87c4-7bf4f6200851-utilities\") pod 
\"3588660c-1eb4-4173-87c4-7bf4f6200851\" (UID: \"3588660c-1eb4-4173-87c4-7bf4f6200851\") " Sep 30 12:26:15 crc kubenswrapper[4672]: I0930 12:26:15.268380 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac79497a-7f46-4ffd-9a7f-40153ab89a6d-utilities\") pod \"ac79497a-7f46-4ffd-9a7f-40153ab89a6d\" (UID: \"ac79497a-7f46-4ffd-9a7f-40153ab89a6d\") " Sep 30 12:26:15 crc kubenswrapper[4672]: I0930 12:26:15.268397 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/42146b78-abcf-44bd-8a8d-6048d9fa0a01-utilities\") pod \"42146b78-abcf-44bd-8a8d-6048d9fa0a01\" (UID: \"42146b78-abcf-44bd-8a8d-6048d9fa0a01\") " Sep 30 12:26:15 crc kubenswrapper[4672]: I0930 12:26:15.268422 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac79497a-7f46-4ffd-9a7f-40153ab89a6d-catalog-content\") pod \"ac79497a-7f46-4ffd-9a7f-40153ab89a6d\" (UID: \"ac79497a-7f46-4ffd-9a7f-40153ab89a6d\") " Sep 30 12:26:15 crc kubenswrapper[4672]: I0930 12:26:15.268461 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2ccjh\" (UniqueName: \"kubernetes.io/projected/42146b78-abcf-44bd-8a8d-6048d9fa0a01-kube-api-access-2ccjh\") pod \"42146b78-abcf-44bd-8a8d-6048d9fa0a01\" (UID: \"42146b78-abcf-44bd-8a8d-6048d9fa0a01\") " Sep 30 12:26:15 crc kubenswrapper[4672]: I0930 12:26:15.268477 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-72pwh\" (UniqueName: \"kubernetes.io/projected/ac79497a-7f46-4ffd-9a7f-40153ab89a6d-kube-api-access-72pwh\") pod \"ac79497a-7f46-4ffd-9a7f-40153ab89a6d\" (UID: \"ac79497a-7f46-4ffd-9a7f-40153ab89a6d\") " Sep 30 12:26:15 crc kubenswrapper[4672]: I0930 12:26:15.268698 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc47647e-d94f-4cca-8e3f-50cab43126ab-catalog-content\") pod \"fc47647e-d94f-4cca-8e3f-50cab43126ab\" (UID: \"fc47647e-d94f-4cca-8e3f-50cab43126ab\") " Sep 30 12:26:15 crc kubenswrapper[4672]: I0930 12:26:15.268733 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8mst6\" (UniqueName: \"kubernetes.io/projected/3588660c-1eb4-4173-87c4-7bf4f6200851-kube-api-access-8mst6\") pod \"3588660c-1eb4-4173-87c4-7bf4f6200851\" (UID: \"3588660c-1eb4-4173-87c4-7bf4f6200851\") " Sep 30 12:26:15 crc kubenswrapper[4672]: I0930 12:26:15.268763 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3588660c-1eb4-4173-87c4-7bf4f6200851-catalog-content\") pod \"3588660c-1eb4-4173-87c4-7bf4f6200851\" (UID: \"3588660c-1eb4-4173-87c4-7bf4f6200851\") " Sep 30 12:26:15 crc kubenswrapper[4672]: I0930 12:26:15.268785 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/42146b78-abcf-44bd-8a8d-6048d9fa0a01-catalog-content\") pod \"42146b78-abcf-44bd-8a8d-6048d9fa0a01\" (UID: \"42146b78-abcf-44bd-8a8d-6048d9fa0a01\") " Sep 30 12:26:15 crc kubenswrapper[4672]: I0930 12:26:15.270738 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f4d96698-0412-4923-9a10-03b174e0ca6c-marketplace-trusted-ca" (OuterVolumeSpecName: 
"marketplace-trusted-ca") pod "f4d96698-0412-4923-9a10-03b174e0ca6c" (UID: "f4d96698-0412-4923-9a10-03b174e0ca6c"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 12:26:15 crc kubenswrapper[4672]: I0930 12:26:15.276077 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/42146b78-abcf-44bd-8a8d-6048d9fa0a01-utilities" (OuterVolumeSpecName: "utilities") pod "42146b78-abcf-44bd-8a8d-6048d9fa0a01" (UID: "42146b78-abcf-44bd-8a8d-6048d9fa0a01"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 12:26:15 crc kubenswrapper[4672]: I0930 12:26:15.276379 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3588660c-1eb4-4173-87c4-7bf4f6200851-utilities" (OuterVolumeSpecName: "utilities") pod "3588660c-1eb4-4173-87c4-7bf4f6200851" (UID: "3588660c-1eb4-4173-87c4-7bf4f6200851"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 12:26:15 crc kubenswrapper[4672]: I0930 12:26:15.277442 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ac79497a-7f46-4ffd-9a7f-40153ab89a6d-utilities" (OuterVolumeSpecName: "utilities") pod "ac79497a-7f46-4ffd-9a7f-40153ab89a6d" (UID: "ac79497a-7f46-4ffd-9a7f-40153ab89a6d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 12:26:15 crc kubenswrapper[4672]: I0930 12:26:15.278697 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f4d96698-0412-4923-9a10-03b174e0ca6c-kube-api-access-9c4mf" (OuterVolumeSpecName: "kube-api-access-9c4mf") pod "f4d96698-0412-4923-9a10-03b174e0ca6c" (UID: "f4d96698-0412-4923-9a10-03b174e0ca6c"). InnerVolumeSpecName "kube-api-access-9c4mf". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 12:26:15 crc kubenswrapper[4672]: I0930 12:26:15.280240 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fc47647e-d94f-4cca-8e3f-50cab43126ab-utilities" (OuterVolumeSpecName: "utilities") pod "fc47647e-d94f-4cca-8e3f-50cab43126ab" (UID: "fc47647e-d94f-4cca-8e3f-50cab43126ab"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 12:26:15 crc kubenswrapper[4672]: I0930 12:26:15.280387 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc47647e-d94f-4cca-8e3f-50cab43126ab-kube-api-access-96cvg" (OuterVolumeSpecName: "kube-api-access-96cvg") pod "fc47647e-d94f-4cca-8e3f-50cab43126ab" (UID: "fc47647e-d94f-4cca-8e3f-50cab43126ab"). InnerVolumeSpecName "kube-api-access-96cvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 12:26:15 crc kubenswrapper[4672]: I0930 12:26:15.280513 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4d96698-0412-4923-9a10-03b174e0ca6c-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "f4d96698-0412-4923-9a10-03b174e0ca6c" (UID: "f4d96698-0412-4923-9a10-03b174e0ca6c"). InnerVolumeSpecName "marketplace-operator-metrics". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:26:15 crc kubenswrapper[4672]: I0930 12:26:15.299348 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42146b78-abcf-44bd-8a8d-6048d9fa0a01-kube-api-access-2ccjh" (OuterVolumeSpecName: "kube-api-access-2ccjh") pod "42146b78-abcf-44bd-8a8d-6048d9fa0a01" (UID: "42146b78-abcf-44bd-8a8d-6048d9fa0a01"). InnerVolumeSpecName "kube-api-access-2ccjh". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 12:26:15 crc kubenswrapper[4672]: I0930 12:26:15.303365 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac79497a-7f46-4ffd-9a7f-40153ab89a6d-kube-api-access-72pwh" (OuterVolumeSpecName: "kube-api-access-72pwh") pod "ac79497a-7f46-4ffd-9a7f-40153ab89a6d" (UID: "ac79497a-7f46-4ffd-9a7f-40153ab89a6d"). InnerVolumeSpecName "kube-api-access-72pwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 12:26:15 crc kubenswrapper[4672]: I0930 12:26:15.304796 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ac79497a-7f46-4ffd-9a7f-40153ab89a6d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ac79497a-7f46-4ffd-9a7f-40153ab89a6d" (UID: "ac79497a-7f46-4ffd-9a7f-40153ab89a6d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 12:26:15 crc kubenswrapper[4672]: I0930 12:26:15.318763 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3588660c-1eb4-4173-87c4-7bf4f6200851-kube-api-access-8mst6" (OuterVolumeSpecName: "kube-api-access-8mst6") pod "3588660c-1eb4-4173-87c4-7bf4f6200851" (UID: "3588660c-1eb4-4173-87c4-7bf4f6200851"). InnerVolumeSpecName "kube-api-access-8mst6". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 12:26:15 crc kubenswrapper[4672]: I0930 12:26:15.344557 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/42146b78-abcf-44bd-8a8d-6048d9fa0a01-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "42146b78-abcf-44bd-8a8d-6048d9fa0a01" (UID: "42146b78-abcf-44bd-8a8d-6048d9fa0a01"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 12:26:15 crc kubenswrapper[4672]: I0930 12:26:15.359198 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fc47647e-d94f-4cca-8e3f-50cab43126ab-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fc47647e-d94f-4cca-8e3f-50cab43126ab" (UID: "fc47647e-d94f-4cca-8e3f-50cab43126ab"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 12:26:15 crc kubenswrapper[4672]: I0930 12:26:15.371637 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8mst6\" (UniqueName: \"kubernetes.io/projected/3588660c-1eb4-4173-87c4-7bf4f6200851-kube-api-access-8mst6\") on node \"crc\" DevicePath \"\"" Sep 30 12:26:15 crc kubenswrapper[4672]: I0930 12:26:15.371679 4672 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/42146b78-abcf-44bd-8a8d-6048d9fa0a01-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 12:26:15 crc kubenswrapper[4672]: I0930 12:26:15.371694 4672 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f4d96698-0412-4923-9a10-03b174e0ca6c-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Sep 30 12:26:15 crc kubenswrapper[4672]: I0930 12:26:15.371710 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9c4mf\" (UniqueName: \"kubernetes.io/projected/f4d96698-0412-4923-9a10-03b174e0ca6c-kube-api-access-9c4mf\") on node \"crc\" DevicePath \"\"" Sep 30 12:26:15 crc kubenswrapper[4672]: I0930 12:26:15.371724 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-96cvg\" (UniqueName: \"kubernetes.io/projected/fc47647e-d94f-4cca-8e3f-50cab43126ab-kube-api-access-96cvg\") on node \"crc\" DevicePath \"\"" Sep 30 12:26:15 crc kubenswrapper[4672]: I0930 12:26:15.371741 4672 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc47647e-d94f-4cca-8e3f-50cab43126ab-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 12:26:15 crc kubenswrapper[4672]: I0930 12:26:15.371755 4672 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/f4d96698-0412-4923-9a10-03b174e0ca6c-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Sep 30 12:26:15 crc kubenswrapper[4672]: I0930 12:26:15.371770 4672 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3588660c-1eb4-4173-87c4-7bf4f6200851-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 12:26:15 crc kubenswrapper[4672]: I0930 12:26:15.371783 4672 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac79497a-7f46-4ffd-9a7f-40153ab89a6d-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 12:26:15 crc kubenswrapper[4672]: I0930 12:26:15.371798 4672 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/42146b78-abcf-44bd-8a8d-6048d9fa0a01-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 12:26:15 crc kubenswrapper[4672]: I0930 12:26:15.371810 4672 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac79497a-7f46-4ffd-9a7f-40153ab89a6d-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 12:26:15 crc kubenswrapper[4672]: I0930 12:26:15.371823 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2ccjh\" (UniqueName: \"kubernetes.io/projected/42146b78-abcf-44bd-8a8d-6048d9fa0a01-kube-api-access-2ccjh\") on node \"crc\" DevicePath \"\"" Sep 30 12:26:15 crc kubenswrapper[4672]: I0930 12:26:15.371839 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-72pwh\" (UniqueName: 
\"kubernetes.io/projected/ac79497a-7f46-4ffd-9a7f-40153ab89a6d-kube-api-access-72pwh\") on node \"crc\" DevicePath \"\"" Sep 30 12:26:15 crc kubenswrapper[4672]: I0930 12:26:15.371851 4672 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc47647e-d94f-4cca-8e3f-50cab43126ab-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 12:26:15 crc kubenswrapper[4672]: I0930 12:26:15.381964 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3588660c-1eb4-4173-87c4-7bf4f6200851-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3588660c-1eb4-4173-87c4-7bf4f6200851" (UID: "3588660c-1eb4-4173-87c4-7bf4f6200851"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 12:26:15 crc kubenswrapper[4672]: I0930 12:26:15.415539 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-kxjq6"] Sep 30 12:26:15 crc kubenswrapper[4672]: W0930 12:26:15.419973 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod08b96597_cb4d_4c38_9557_d60b937ab2c7.slice/crio-abdf6bc3e73935397669ddc1be41a4785604a10b651a95d15cdc6a6b8b507cdb WatchSource:0}: Error finding container abdf6bc3e73935397669ddc1be41a4785604a10b651a95d15cdc6a6b8b507cdb: Status 404 returned error can't find the container with id abdf6bc3e73935397669ddc1be41a4785604a10b651a95d15cdc6a6b8b507cdb Sep 30 12:26:15 crc kubenswrapper[4672]: I0930 12:26:15.473018 4672 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3588660c-1eb4-4173-87c4-7bf4f6200851-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 12:26:15 crc kubenswrapper[4672]: I0930 12:26:15.849501 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5h54f" event={"ID":"3588660c-1eb4-4173-87c4-7bf4f6200851","Type":"ContainerDied","Data":"a159f542b6c6230950997e0f55d13f459b7455e5b98f78cbc2d5551ea9b5d50d"} Sep 30 12:26:15 crc kubenswrapper[4672]: I0930 12:26:15.849967 4672 scope.go:117] "RemoveContainer" containerID="a159f542b6c6230950997e0f55d13f459b7455e5b98f78cbc2d5551ea9b5d50d" Sep 30 12:26:15 crc kubenswrapper[4672]: I0930 12:26:15.849510 4672 generic.go:334] "Generic (PLEG): container finished" podID="3588660c-1eb4-4173-87c4-7bf4f6200851" containerID="a159f542b6c6230950997e0f55d13f459b7455e5b98f78cbc2d5551ea9b5d50d" exitCode=0 Sep 30 12:26:15 crc kubenswrapper[4672]: I0930 12:26:15.850099 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5h54f" event={"ID":"3588660c-1eb4-4173-87c4-7bf4f6200851","Type":"ContainerDied","Data":"8fa66ada2f9ef5befafd18480ec3a6754a119fcf8696c27e7038c7182669e9be"} Sep 30 12:26:15 crc kubenswrapper[4672]: I0930 12:26:15.851851 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5h54f" Sep 30 12:26:15 crc kubenswrapper[4672]: I0930 12:26:15.855055 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hflds" event={"ID":"fc47647e-d94f-4cca-8e3f-50cab43126ab","Type":"ContainerDied","Data":"3fb96796f81a7efde267ccdec4b8040ca014854970537805925f7c4a8d9dab7e"} Sep 30 12:26:15 crc kubenswrapper[4672]: I0930 12:26:15.855249 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-hflds" Sep 30 12:26:15 crc kubenswrapper[4672]: I0930 12:26:15.860727 4672 generic.go:334] "Generic (PLEG): container finished" podID="ac79497a-7f46-4ffd-9a7f-40153ab89a6d" containerID="2d8b4b2113cd5069ca3ea4d269f1197590ca7af0bcf33cdb3e43724094bac854" exitCode=0 Sep 30 12:26:15 crc kubenswrapper[4672]: I0930 12:26:15.860831 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gvbdw" event={"ID":"ac79497a-7f46-4ffd-9a7f-40153ab89a6d","Type":"ContainerDied","Data":"2d8b4b2113cd5069ca3ea4d269f1197590ca7af0bcf33cdb3e43724094bac854"} Sep 30 12:26:15 crc kubenswrapper[4672]: I0930 12:26:15.861250 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gvbdw" event={"ID":"ac79497a-7f46-4ffd-9a7f-40153ab89a6d","Type":"ContainerDied","Data":"c49a97d41fd5a70773794191be2d352bc23f3f359ab84b9c54e0b1de94813bc6"} Sep 30 12:26:15 crc kubenswrapper[4672]: I0930 12:26:15.860909 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gvbdw" Sep 30 12:26:15 crc kubenswrapper[4672]: I0930 12:26:15.864049 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-kxjq6" event={"ID":"08b96597-cb4d-4c38-9557-d60b937ab2c7","Type":"ContainerStarted","Data":"c0bcef0f0273bcef0ed3400ca3b56f76265b87569948821f54501b29feea42aa"} Sep 30 12:26:15 crc kubenswrapper[4672]: I0930 12:26:15.864389 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-kxjq6" event={"ID":"08b96597-cb4d-4c38-9557-d60b937ab2c7","Type":"ContainerStarted","Data":"abdf6bc3e73935397669ddc1be41a4785604a10b651a95d15cdc6a6b8b507cdb"} Sep 30 12:26:15 crc kubenswrapper[4672]: I0930 12:26:15.865931 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-kxjq6" Sep 30 12:26:15 crc kubenswrapper[4672]: I0930 12:26:15.867884 4672 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-kxjq6 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.55:8080/healthz\": dial tcp 10.217.0.55:8080: connect: connection refused" start-of-body= Sep 30 12:26:15 crc kubenswrapper[4672]: I0930 12:26:15.867952 4672 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-kxjq6" podUID="08b96597-cb4d-4c38-9557-d60b937ab2c7" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.55:8080/healthz\": dial tcp 10.217.0.55:8080: connect: connection refused" Sep 30 12:26:15 crc kubenswrapper[4672]: I0930 12:26:15.868425 4672 generic.go:334] "Generic (PLEG): container finished" podID="f4d96698-0412-4923-9a10-03b174e0ca6c" containerID="23207ee4c031a8305a9e1413000eccf71dc3b87dec33539fdc9fb79c3b1d95f1" exitCode=0 Sep 30 12:26:15 crc kubenswrapper[4672]: I0930 12:26:15.868513 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-svmxm" event={"ID":"f4d96698-0412-4923-9a10-03b174e0ca6c","Type":"ContainerDied","Data":"23207ee4c031a8305a9e1413000eccf71dc3b87dec33539fdc9fb79c3b1d95f1"} Sep 30 12:26:15 crc kubenswrapper[4672]: I0930 12:26:15.868539 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/marketplace-operator-79b997595-svmxm" event={"ID":"f4d96698-0412-4923-9a10-03b174e0ca6c","Type":"ContainerDied","Data":"4dafabaa7c11c84639fe41e7e6b9bb2b6e20a0ad04aad39f51af8f0390c9e2e3"} Sep 30 12:26:15 crc kubenswrapper[4672]: I0930 12:26:15.868689 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-svmxm" Sep 30 12:26:15 crc kubenswrapper[4672]: I0930 12:26:15.871602 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-52587" event={"ID":"42146b78-abcf-44bd-8a8d-6048d9fa0a01","Type":"ContainerDied","Data":"aa5ccf906ad5517fd4c28e14cd8a4d521c234a25f84d109a6c2e500164cb5a51"} Sep 30 12:26:15 crc kubenswrapper[4672]: I0930 12:26:15.871687 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-52587" Sep 30 12:26:15 crc kubenswrapper[4672]: I0930 12:26:15.895914 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-kxjq6" podStartSLOduration=1.8958889719999998 podStartE2EDuration="1.895888972s" podCreationTimestamp="2025-09-30 12:26:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 12:26:15.893830065 +0000 UTC m=+267.163067721" watchObservedRunningTime="2025-09-30 12:26:15.895888972 +0000 UTC m=+267.165126618" Sep 30 12:26:15 crc kubenswrapper[4672]: I0930 12:26:15.917645 4672 scope.go:117] "RemoveContainer" containerID="d9653bcf18b4986302f9f2d7e14d0363da97d1fa5506f34222da20be0b61d60c" Sep 30 12:26:15 crc kubenswrapper[4672]: I0930 12:26:15.927596 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-svmxm"] Sep 30 12:26:15 crc kubenswrapper[4672]: I0930 12:26:15.930479 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-svmxm"] Sep 30 12:26:15 crc kubenswrapper[4672]: I0930 12:26:15.937647 4672 scope.go:117] "RemoveContainer" containerID="7dbb3b313e6167a7f3bbe52643ddc3b40e3b4b0980dc0518b4f8b168781b6078" Sep 30 12:26:15 crc kubenswrapper[4672]: I0930 12:26:15.942337 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-gvbdw"] Sep 30 12:26:15 crc kubenswrapper[4672]: I0930 12:26:15.951089 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-gvbdw"] Sep 30 12:26:15 crc kubenswrapper[4672]: I0930 12:26:15.963678 4672 scope.go:117] "RemoveContainer" containerID="a159f542b6c6230950997e0f55d13f459b7455e5b98f78cbc2d5551ea9b5d50d" Sep 30 12:26:15 crc kubenswrapper[4672]: E0930 12:26:15.964723 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a159f542b6c6230950997e0f55d13f459b7455e5b98f78cbc2d5551ea9b5d50d\": container with ID starting with a159f542b6c6230950997e0f55d13f459b7455e5b98f78cbc2d5551ea9b5d50d not found: ID does not exist" containerID="a159f542b6c6230950997e0f55d13f459b7455e5b98f78cbc2d5551ea9b5d50d" Sep 30 12:26:15 crc kubenswrapper[4672]: I0930 12:26:15.964775 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a159f542b6c6230950997e0f55d13f459b7455e5b98f78cbc2d5551ea9b5d50d"} err="failed to get container status 
\"a159f542b6c6230950997e0f55d13f459b7455e5b98f78cbc2d5551ea9b5d50d\": rpc error: code = NotFound desc = could not find container \"a159f542b6c6230950997e0f55d13f459b7455e5b98f78cbc2d5551ea9b5d50d\": container with ID starting with a159f542b6c6230950997e0f55d13f459b7455e5b98f78cbc2d5551ea9b5d50d not found: ID does not exist" Sep 30 12:26:15 crc kubenswrapper[4672]: I0930 12:26:15.964808 4672 scope.go:117] "RemoveContainer" containerID="d9653bcf18b4986302f9f2d7e14d0363da97d1fa5506f34222da20be0b61d60c" Sep 30 12:26:15 crc kubenswrapper[4672]: E0930 12:26:15.965254 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d9653bcf18b4986302f9f2d7e14d0363da97d1fa5506f34222da20be0b61d60c\": container with ID starting with d9653bcf18b4986302f9f2d7e14d0363da97d1fa5506f34222da20be0b61d60c not found: ID does not exist" containerID="d9653bcf18b4986302f9f2d7e14d0363da97d1fa5506f34222da20be0b61d60c" Sep 30 12:26:15 crc kubenswrapper[4672]: I0930 12:26:15.965298 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d9653bcf18b4986302f9f2d7e14d0363da97d1fa5506f34222da20be0b61d60c"} err="failed to get container status \"d9653bcf18b4986302f9f2d7e14d0363da97d1fa5506f34222da20be0b61d60c\": rpc error: code = NotFound desc = could not find container \"d9653bcf18b4986302f9f2d7e14d0363da97d1fa5506f34222da20be0b61d60c\": container with ID starting with d9653bcf18b4986302f9f2d7e14d0363da97d1fa5506f34222da20be0b61d60c not found: ID does not exist" Sep 30 12:26:15 crc kubenswrapper[4672]: I0930 12:26:15.965316 4672 scope.go:117] "RemoveContainer" containerID="7dbb3b313e6167a7f3bbe52643ddc3b40e3b4b0980dc0518b4f8b168781b6078" Sep 30 12:26:15 crc kubenswrapper[4672]: I0930 12:26:15.965464 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5h54f"] Sep 30 12:26:15 crc kubenswrapper[4672]: E0930 12:26:15.965786 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7dbb3b313e6167a7f3bbe52643ddc3b40e3b4b0980dc0518b4f8b168781b6078\": container with ID starting with 7dbb3b313e6167a7f3bbe52643ddc3b40e3b4b0980dc0518b4f8b168781b6078 not found: ID does not exist" containerID="7dbb3b313e6167a7f3bbe52643ddc3b40e3b4b0980dc0518b4f8b168781b6078" Sep 30 12:26:15 crc kubenswrapper[4672]: I0930 12:26:15.965816 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7dbb3b313e6167a7f3bbe52643ddc3b40e3b4b0980dc0518b4f8b168781b6078"} err="failed to get container status \"7dbb3b313e6167a7f3bbe52643ddc3b40e3b4b0980dc0518b4f8b168781b6078\": rpc error: code = NotFound desc = could not find container \"7dbb3b313e6167a7f3bbe52643ddc3b40e3b4b0980dc0518b4f8b168781b6078\": container with ID starting with 7dbb3b313e6167a7f3bbe52643ddc3b40e3b4b0980dc0518b4f8b168781b6078 not found: ID does not exist" Sep 30 12:26:15 crc kubenswrapper[4672]: I0930 12:26:15.965832 4672 scope.go:117] "RemoveContainer" containerID="acaf7ec96bfc15124aea7271441724d63e065e183d4557d1ca8ba5e828e1eb55" Sep 30 12:26:15 crc kubenswrapper[4672]: I0930 12:26:15.974067 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-5h54f"] Sep 30 12:26:15 crc kubenswrapper[4672]: I0930 12:26:15.985497 4672 scope.go:117] "RemoveContainer" containerID="56d6f0db667bbd7a1303af5437346b22f735aec4c2a2e35e5a575fd931f1a058" Sep 30 12:26:15 crc kubenswrapper[4672]: I0930 
12:26:15.990858 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-hflds"] Sep 30 12:26:16 crc kubenswrapper[4672]: I0930 12:26:16.000113 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-hflds"] Sep 30 12:26:16 crc kubenswrapper[4672]: I0930 12:26:16.004199 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-52587"] Sep 30 12:26:16 crc kubenswrapper[4672]: I0930 12:26:16.008352 4672 scope.go:117] "RemoveContainer" containerID="e353d583cff2f10befd47bf90c35fe72c2b61bcf5fac8d60711ac0cad0fa7479" Sep 30 12:26:16 crc kubenswrapper[4672]: I0930 12:26:16.009664 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-52587"] Sep 30 12:26:16 crc kubenswrapper[4672]: I0930 12:26:16.026051 4672 scope.go:117] "RemoveContainer" containerID="2d8b4b2113cd5069ca3ea4d269f1197590ca7af0bcf33cdb3e43724094bac854" Sep 30 12:26:16 crc kubenswrapper[4672]: I0930 12:26:16.038513 4672 scope.go:117] "RemoveContainer" containerID="b4f28a2eade21afee05ad8fa191bf4f94c4fb46b684792f56e24a4380783156b" Sep 30 12:26:16 crc kubenswrapper[4672]: I0930 12:26:16.059628 4672 scope.go:117] "RemoveContainer" containerID="78a1732e943ded0724e4ec527e7aacf3df280d91967e2a4ffb63030c1d8002ca" Sep 30 12:26:16 crc kubenswrapper[4672]: I0930 12:26:16.075805 4672 scope.go:117] "RemoveContainer" containerID="2d8b4b2113cd5069ca3ea4d269f1197590ca7af0bcf33cdb3e43724094bac854" Sep 30 12:26:16 crc kubenswrapper[4672]: E0930 12:26:16.076405 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2d8b4b2113cd5069ca3ea4d269f1197590ca7af0bcf33cdb3e43724094bac854\": container with ID starting with 2d8b4b2113cd5069ca3ea4d269f1197590ca7af0bcf33cdb3e43724094bac854 not found: ID does not exist" containerID="2d8b4b2113cd5069ca3ea4d269f1197590ca7af0bcf33cdb3e43724094bac854" Sep 30 12:26:16 crc kubenswrapper[4672]: I0930 12:26:16.076459 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d8b4b2113cd5069ca3ea4d269f1197590ca7af0bcf33cdb3e43724094bac854"} err="failed to get container status \"2d8b4b2113cd5069ca3ea4d269f1197590ca7af0bcf33cdb3e43724094bac854\": rpc error: code = NotFound desc = could not find container \"2d8b4b2113cd5069ca3ea4d269f1197590ca7af0bcf33cdb3e43724094bac854\": container with ID starting with 2d8b4b2113cd5069ca3ea4d269f1197590ca7af0bcf33cdb3e43724094bac854 not found: ID does not exist" Sep 30 12:26:16 crc kubenswrapper[4672]: I0930 12:26:16.076495 4672 scope.go:117] "RemoveContainer" containerID="b4f28a2eade21afee05ad8fa191bf4f94c4fb46b684792f56e24a4380783156b" Sep 30 12:26:16 crc kubenswrapper[4672]: E0930 12:26:16.077032 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b4f28a2eade21afee05ad8fa191bf4f94c4fb46b684792f56e24a4380783156b\": container with ID starting with b4f28a2eade21afee05ad8fa191bf4f94c4fb46b684792f56e24a4380783156b not found: ID does not exist" containerID="b4f28a2eade21afee05ad8fa191bf4f94c4fb46b684792f56e24a4380783156b" Sep 30 12:26:16 crc kubenswrapper[4672]: I0930 12:26:16.077099 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b4f28a2eade21afee05ad8fa191bf4f94c4fb46b684792f56e24a4380783156b"} err="failed to get container status 
\"b4f28a2eade21afee05ad8fa191bf4f94c4fb46b684792f56e24a4380783156b\": rpc error: code = NotFound desc = could not find container \"b4f28a2eade21afee05ad8fa191bf4f94c4fb46b684792f56e24a4380783156b\": container with ID starting with b4f28a2eade21afee05ad8fa191bf4f94c4fb46b684792f56e24a4380783156b not found: ID does not exist" Sep 30 12:26:16 crc kubenswrapper[4672]: I0930 12:26:16.077145 4672 scope.go:117] "RemoveContainer" containerID="78a1732e943ded0724e4ec527e7aacf3df280d91967e2a4ffb63030c1d8002ca" Sep 30 12:26:16 crc kubenswrapper[4672]: E0930 12:26:16.077582 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"78a1732e943ded0724e4ec527e7aacf3df280d91967e2a4ffb63030c1d8002ca\": container with ID starting with 78a1732e943ded0724e4ec527e7aacf3df280d91967e2a4ffb63030c1d8002ca not found: ID does not exist" containerID="78a1732e943ded0724e4ec527e7aacf3df280d91967e2a4ffb63030c1d8002ca" Sep 30 12:26:16 crc kubenswrapper[4672]: I0930 12:26:16.077617 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"78a1732e943ded0724e4ec527e7aacf3df280d91967e2a4ffb63030c1d8002ca"} err="failed to get container status \"78a1732e943ded0724e4ec527e7aacf3df280d91967e2a4ffb63030c1d8002ca\": rpc error: code = NotFound desc = could not find container \"78a1732e943ded0724e4ec527e7aacf3df280d91967e2a4ffb63030c1d8002ca\": container with ID starting with 78a1732e943ded0724e4ec527e7aacf3df280d91967e2a4ffb63030c1d8002ca not found: ID does not exist" Sep 30 12:26:16 crc kubenswrapper[4672]: I0930 12:26:16.077638 4672 scope.go:117] "RemoveContainer" containerID="23207ee4c031a8305a9e1413000eccf71dc3b87dec33539fdc9fb79c3b1d95f1" Sep 30 12:26:16 crc kubenswrapper[4672]: I0930 12:26:16.107724 4672 scope.go:117] "RemoveContainer" containerID="23207ee4c031a8305a9e1413000eccf71dc3b87dec33539fdc9fb79c3b1d95f1" Sep 30 12:26:16 crc kubenswrapper[4672]: E0930 12:26:16.108609 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"23207ee4c031a8305a9e1413000eccf71dc3b87dec33539fdc9fb79c3b1d95f1\": container with ID starting with 23207ee4c031a8305a9e1413000eccf71dc3b87dec33539fdc9fb79c3b1d95f1 not found: ID does not exist" containerID="23207ee4c031a8305a9e1413000eccf71dc3b87dec33539fdc9fb79c3b1d95f1" Sep 30 12:26:16 crc kubenswrapper[4672]: I0930 12:26:16.108677 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"23207ee4c031a8305a9e1413000eccf71dc3b87dec33539fdc9fb79c3b1d95f1"} err="failed to get container status \"23207ee4c031a8305a9e1413000eccf71dc3b87dec33539fdc9fb79c3b1d95f1\": rpc error: code = NotFound desc = could not find container \"23207ee4c031a8305a9e1413000eccf71dc3b87dec33539fdc9fb79c3b1d95f1\": container with ID starting with 23207ee4c031a8305a9e1413000eccf71dc3b87dec33539fdc9fb79c3b1d95f1 not found: ID does not exist" Sep 30 12:26:16 crc kubenswrapper[4672]: I0930 12:26:16.108720 4672 scope.go:117] "RemoveContainer" containerID="c7a0d3e7ea8ef4c0a9d823f655f516685cacec8b3344892b4fb898760d48f6db" Sep 30 12:26:16 crc kubenswrapper[4672]: I0930 12:26:16.129170 4672 scope.go:117] "RemoveContainer" containerID="23db0ec014fcb558600841de2f8a7f8851589908811568d7f92ac33a3364129c" Sep 30 12:26:16 crc kubenswrapper[4672]: I0930 12:26:16.154242 4672 scope.go:117] "RemoveContainer" containerID="ce5ef87226c77462c432873c138486a80f3c3a7a0878b22325ccb202712bf8ef" Sep 30 12:26:16 crc 
kubenswrapper[4672]: I0930 12:26:16.889324 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-kxjq6" Sep 30 12:26:16 crc kubenswrapper[4672]: I0930 12:26:16.917577 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-xpjr4"] Sep 30 12:26:16 crc kubenswrapper[4672]: E0930 12:26:16.917890 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac79497a-7f46-4ffd-9a7f-40153ab89a6d" containerName="extract-content" Sep 30 12:26:16 crc kubenswrapper[4672]: I0930 12:26:16.917916 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac79497a-7f46-4ffd-9a7f-40153ab89a6d" containerName="extract-content" Sep 30 12:26:16 crc kubenswrapper[4672]: E0930 12:26:16.917936 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc47647e-d94f-4cca-8e3f-50cab43126ab" containerName="extract-content" Sep 30 12:26:16 crc kubenswrapper[4672]: I0930 12:26:16.917947 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc47647e-d94f-4cca-8e3f-50cab43126ab" containerName="extract-content" Sep 30 12:26:16 crc kubenswrapper[4672]: E0930 12:26:16.917961 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc47647e-d94f-4cca-8e3f-50cab43126ab" containerName="registry-server" Sep 30 12:26:16 crc kubenswrapper[4672]: I0930 12:26:16.917976 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc47647e-d94f-4cca-8e3f-50cab43126ab" containerName="registry-server" Sep 30 12:26:16 crc kubenswrapper[4672]: E0930 12:26:16.917994 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc47647e-d94f-4cca-8e3f-50cab43126ab" containerName="extract-utilities" Sep 30 12:26:16 crc kubenswrapper[4672]: I0930 12:26:16.918005 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc47647e-d94f-4cca-8e3f-50cab43126ab" containerName="extract-utilities" Sep 30 12:26:16 crc kubenswrapper[4672]: E0930 12:26:16.918019 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3588660c-1eb4-4173-87c4-7bf4f6200851" containerName="registry-server" Sep 30 12:26:16 crc kubenswrapper[4672]: I0930 12:26:16.918030 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="3588660c-1eb4-4173-87c4-7bf4f6200851" containerName="registry-server" Sep 30 12:26:16 crc kubenswrapper[4672]: E0930 12:26:16.918042 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3588660c-1eb4-4173-87c4-7bf4f6200851" containerName="extract-utilities" Sep 30 12:26:16 crc kubenswrapper[4672]: I0930 12:26:16.918055 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="3588660c-1eb4-4173-87c4-7bf4f6200851" containerName="extract-utilities" Sep 30 12:26:16 crc kubenswrapper[4672]: E0930 12:26:16.918074 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42146b78-abcf-44bd-8a8d-6048d9fa0a01" containerName="extract-content" Sep 30 12:26:16 crc kubenswrapper[4672]: I0930 12:26:16.918086 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="42146b78-abcf-44bd-8a8d-6048d9fa0a01" containerName="extract-content" Sep 30 12:26:16 crc kubenswrapper[4672]: E0930 12:26:16.918100 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac79497a-7f46-4ffd-9a7f-40153ab89a6d" containerName="extract-utilities" Sep 30 12:26:16 crc kubenswrapper[4672]: I0930 12:26:16.918111 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac79497a-7f46-4ffd-9a7f-40153ab89a6d" containerName="extract-utilities" Sep 30 12:26:16 
crc kubenswrapper[4672]: E0930 12:26:16.918129 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3588660c-1eb4-4173-87c4-7bf4f6200851" containerName="extract-content" Sep 30 12:26:16 crc kubenswrapper[4672]: I0930 12:26:16.918140 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="3588660c-1eb4-4173-87c4-7bf4f6200851" containerName="extract-content" Sep 30 12:26:16 crc kubenswrapper[4672]: E0930 12:26:16.918153 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42146b78-abcf-44bd-8a8d-6048d9fa0a01" containerName="extract-utilities" Sep 30 12:26:16 crc kubenswrapper[4672]: I0930 12:26:16.918164 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="42146b78-abcf-44bd-8a8d-6048d9fa0a01" containerName="extract-utilities" Sep 30 12:26:16 crc kubenswrapper[4672]: E0930 12:26:16.918182 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4d96698-0412-4923-9a10-03b174e0ca6c" containerName="marketplace-operator" Sep 30 12:26:16 crc kubenswrapper[4672]: I0930 12:26:16.918193 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4d96698-0412-4923-9a10-03b174e0ca6c" containerName="marketplace-operator" Sep 30 12:26:16 crc kubenswrapper[4672]: E0930 12:26:16.918206 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac79497a-7f46-4ffd-9a7f-40153ab89a6d" containerName="registry-server" Sep 30 12:26:16 crc kubenswrapper[4672]: I0930 12:26:16.918217 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac79497a-7f46-4ffd-9a7f-40153ab89a6d" containerName="registry-server" Sep 30 12:26:16 crc kubenswrapper[4672]: E0930 12:26:16.918234 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42146b78-abcf-44bd-8a8d-6048d9fa0a01" containerName="registry-server" Sep 30 12:26:16 crc kubenswrapper[4672]: I0930 12:26:16.918245 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="42146b78-abcf-44bd-8a8d-6048d9fa0a01" containerName="registry-server" Sep 30 12:26:16 crc kubenswrapper[4672]: I0930 12:26:16.918424 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="42146b78-abcf-44bd-8a8d-6048d9fa0a01" containerName="registry-server" Sep 30 12:26:16 crc kubenswrapper[4672]: I0930 12:26:16.918448 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="3588660c-1eb4-4173-87c4-7bf4f6200851" containerName="registry-server" Sep 30 12:26:16 crc kubenswrapper[4672]: I0930 12:26:16.918462 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4d96698-0412-4923-9a10-03b174e0ca6c" containerName="marketplace-operator" Sep 30 12:26:16 crc kubenswrapper[4672]: I0930 12:26:16.918476 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac79497a-7f46-4ffd-9a7f-40153ab89a6d" containerName="registry-server" Sep 30 12:26:16 crc kubenswrapper[4672]: I0930 12:26:16.918492 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc47647e-d94f-4cca-8e3f-50cab43126ab" containerName="registry-server" Sep 30 12:26:16 crc kubenswrapper[4672]: I0930 12:26:16.919679 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xpjr4" Sep 30 12:26:16 crc kubenswrapper[4672]: I0930 12:26:16.923634 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Sep 30 12:26:16 crc kubenswrapper[4672]: I0930 12:26:16.930895 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-xpjr4"] Sep 30 12:26:16 crc kubenswrapper[4672]: I0930 12:26:16.991941 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9a3da1a5-1d7f-4d33-9245-55038dd253d3-catalog-content\") pod \"redhat-marketplace-xpjr4\" (UID: \"9a3da1a5-1d7f-4d33-9245-55038dd253d3\") " pod="openshift-marketplace/redhat-marketplace-xpjr4" Sep 30 12:26:16 crc kubenswrapper[4672]: I0930 12:26:16.991983 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9a3da1a5-1d7f-4d33-9245-55038dd253d3-utilities\") pod \"redhat-marketplace-xpjr4\" (UID: \"9a3da1a5-1d7f-4d33-9245-55038dd253d3\") " pod="openshift-marketplace/redhat-marketplace-xpjr4" Sep 30 12:26:16 crc kubenswrapper[4672]: I0930 12:26:16.992024 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fhlpw\" (UniqueName: \"kubernetes.io/projected/9a3da1a5-1d7f-4d33-9245-55038dd253d3-kube-api-access-fhlpw\") pod \"redhat-marketplace-xpjr4\" (UID: \"9a3da1a5-1d7f-4d33-9245-55038dd253d3\") " pod="openshift-marketplace/redhat-marketplace-xpjr4" Sep 30 12:26:17 crc kubenswrapper[4672]: I0930 12:26:17.092833 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fhlpw\" (UniqueName: \"kubernetes.io/projected/9a3da1a5-1d7f-4d33-9245-55038dd253d3-kube-api-access-fhlpw\") pod \"redhat-marketplace-xpjr4\" (UID: \"9a3da1a5-1d7f-4d33-9245-55038dd253d3\") " pod="openshift-marketplace/redhat-marketplace-xpjr4" Sep 30 12:26:17 crc kubenswrapper[4672]: I0930 12:26:17.092927 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9a3da1a5-1d7f-4d33-9245-55038dd253d3-catalog-content\") pod \"redhat-marketplace-xpjr4\" (UID: \"9a3da1a5-1d7f-4d33-9245-55038dd253d3\") " pod="openshift-marketplace/redhat-marketplace-xpjr4" Sep 30 12:26:17 crc kubenswrapper[4672]: I0930 12:26:17.092944 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9a3da1a5-1d7f-4d33-9245-55038dd253d3-utilities\") pod \"redhat-marketplace-xpjr4\" (UID: \"9a3da1a5-1d7f-4d33-9245-55038dd253d3\") " pod="openshift-marketplace/redhat-marketplace-xpjr4" Sep 30 12:26:17 crc kubenswrapper[4672]: I0930 12:26:17.093340 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9a3da1a5-1d7f-4d33-9245-55038dd253d3-catalog-content\") pod \"redhat-marketplace-xpjr4\" (UID: \"9a3da1a5-1d7f-4d33-9245-55038dd253d3\") " pod="openshift-marketplace/redhat-marketplace-xpjr4" Sep 30 12:26:17 crc kubenswrapper[4672]: I0930 12:26:17.093413 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9a3da1a5-1d7f-4d33-9245-55038dd253d3-utilities\") pod \"redhat-marketplace-xpjr4\" (UID: 
\"9a3da1a5-1d7f-4d33-9245-55038dd253d3\") " pod="openshift-marketplace/redhat-marketplace-xpjr4" Sep 30 12:26:17 crc kubenswrapper[4672]: I0930 12:26:17.110253 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-khgj2"] Sep 30 12:26:17 crc kubenswrapper[4672]: I0930 12:26:17.111546 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-khgj2" Sep 30 12:26:17 crc kubenswrapper[4672]: I0930 12:26:17.113677 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Sep 30 12:26:17 crc kubenswrapper[4672]: I0930 12:26:17.116874 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fhlpw\" (UniqueName: \"kubernetes.io/projected/9a3da1a5-1d7f-4d33-9245-55038dd253d3-kube-api-access-fhlpw\") pod \"redhat-marketplace-xpjr4\" (UID: \"9a3da1a5-1d7f-4d33-9245-55038dd253d3\") " pod="openshift-marketplace/redhat-marketplace-xpjr4" Sep 30 12:26:17 crc kubenswrapper[4672]: I0930 12:26:17.122812 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-khgj2"] Sep 30 12:26:17 crc kubenswrapper[4672]: I0930 12:26:17.193958 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c706065f-5cf6-4719-96a3-ce442d33c58a-catalog-content\") pod \"community-operators-khgj2\" (UID: \"c706065f-5cf6-4719-96a3-ce442d33c58a\") " pod="openshift-marketplace/community-operators-khgj2" Sep 30 12:26:17 crc kubenswrapper[4672]: I0930 12:26:17.194052 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qw7vb\" (UniqueName: \"kubernetes.io/projected/c706065f-5cf6-4719-96a3-ce442d33c58a-kube-api-access-qw7vb\") pod \"community-operators-khgj2\" (UID: \"c706065f-5cf6-4719-96a3-ce442d33c58a\") " pod="openshift-marketplace/community-operators-khgj2" Sep 30 12:26:17 crc kubenswrapper[4672]: I0930 12:26:17.194109 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c706065f-5cf6-4719-96a3-ce442d33c58a-utilities\") pod \"community-operators-khgj2\" (UID: \"c706065f-5cf6-4719-96a3-ce442d33c58a\") " pod="openshift-marketplace/community-operators-khgj2" Sep 30 12:26:17 crc kubenswrapper[4672]: I0930 12:26:17.249909 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xpjr4" Sep 30 12:26:17 crc kubenswrapper[4672]: I0930 12:26:17.295307 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qw7vb\" (UniqueName: \"kubernetes.io/projected/c706065f-5cf6-4719-96a3-ce442d33c58a-kube-api-access-qw7vb\") pod \"community-operators-khgj2\" (UID: \"c706065f-5cf6-4719-96a3-ce442d33c58a\") " pod="openshift-marketplace/community-operators-khgj2" Sep 30 12:26:17 crc kubenswrapper[4672]: I0930 12:26:17.295383 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c706065f-5cf6-4719-96a3-ce442d33c58a-utilities\") pod \"community-operators-khgj2\" (UID: \"c706065f-5cf6-4719-96a3-ce442d33c58a\") " pod="openshift-marketplace/community-operators-khgj2" Sep 30 12:26:17 crc kubenswrapper[4672]: I0930 12:26:17.295422 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c706065f-5cf6-4719-96a3-ce442d33c58a-catalog-content\") pod \"community-operators-khgj2\" (UID: \"c706065f-5cf6-4719-96a3-ce442d33c58a\") " pod="openshift-marketplace/community-operators-khgj2" Sep 30 12:26:17 crc kubenswrapper[4672]: I0930 12:26:17.295903 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c706065f-5cf6-4719-96a3-ce442d33c58a-catalog-content\") pod \"community-operators-khgj2\" (UID: \"c706065f-5cf6-4719-96a3-ce442d33c58a\") " pod="openshift-marketplace/community-operators-khgj2" Sep 30 12:26:17 crc kubenswrapper[4672]: I0930 12:26:17.296499 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c706065f-5cf6-4719-96a3-ce442d33c58a-utilities\") pod \"community-operators-khgj2\" (UID: \"c706065f-5cf6-4719-96a3-ce442d33c58a\") " pod="openshift-marketplace/community-operators-khgj2" Sep 30 12:26:17 crc kubenswrapper[4672]: I0930 12:26:17.318616 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qw7vb\" (UniqueName: \"kubernetes.io/projected/c706065f-5cf6-4719-96a3-ce442d33c58a-kube-api-access-qw7vb\") pod \"community-operators-khgj2\" (UID: \"c706065f-5cf6-4719-96a3-ce442d33c58a\") " pod="openshift-marketplace/community-operators-khgj2" Sep 30 12:26:17 crc kubenswrapper[4672]: I0930 12:26:17.430131 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3588660c-1eb4-4173-87c4-7bf4f6200851" path="/var/lib/kubelet/pods/3588660c-1eb4-4173-87c4-7bf4f6200851/volumes" Sep 30 12:26:17 crc kubenswrapper[4672]: I0930 12:26:17.431309 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="42146b78-abcf-44bd-8a8d-6048d9fa0a01" path="/var/lib/kubelet/pods/42146b78-abcf-44bd-8a8d-6048d9fa0a01/volumes" Sep 30 12:26:17 crc kubenswrapper[4672]: I0930 12:26:17.432381 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ac79497a-7f46-4ffd-9a7f-40153ab89a6d" path="/var/lib/kubelet/pods/ac79497a-7f46-4ffd-9a7f-40153ab89a6d/volumes" Sep 30 12:26:17 crc kubenswrapper[4672]: I0930 12:26:17.433750 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4d96698-0412-4923-9a10-03b174e0ca6c" path="/var/lib/kubelet/pods/f4d96698-0412-4923-9a10-03b174e0ca6c/volumes" Sep 30 12:26:17 crc kubenswrapper[4672]: I0930 12:26:17.434380 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod 
volumes dir" podUID="fc47647e-d94f-4cca-8e3f-50cab43126ab" path="/var/lib/kubelet/pods/fc47647e-d94f-4cca-8e3f-50cab43126ab/volumes" Sep 30 12:26:17 crc kubenswrapper[4672]: I0930 12:26:17.447253 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-khgj2" Sep 30 12:26:17 crc kubenswrapper[4672]: I0930 12:26:17.474916 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-xpjr4"] Sep 30 12:26:17 crc kubenswrapper[4672]: W0930 12:26:17.480764 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9a3da1a5_1d7f_4d33_9245_55038dd253d3.slice/crio-d48afec7ba954f06f8079ddcac1ef3443b57fc01853cf2c5781324a65d2737ae WatchSource:0}: Error finding container d48afec7ba954f06f8079ddcac1ef3443b57fc01853cf2c5781324a65d2737ae: Status 404 returned error can't find the container with id d48afec7ba954f06f8079ddcac1ef3443b57fc01853cf2c5781324a65d2737ae Sep 30 12:26:17 crc kubenswrapper[4672]: I0930 12:26:17.693931 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-khgj2"] Sep 30 12:26:17 crc kubenswrapper[4672]: W0930 12:26:17.757536 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc706065f_5cf6_4719_96a3_ce442d33c58a.slice/crio-bc55e211b8e0c0d053448e20a52d30727543a69f007f5a5d747d7ba2f73a9e79 WatchSource:0}: Error finding container bc55e211b8e0c0d053448e20a52d30727543a69f007f5a5d747d7ba2f73a9e79: Status 404 returned error can't find the container with id bc55e211b8e0c0d053448e20a52d30727543a69f007f5a5d747d7ba2f73a9e79 Sep 30 12:26:17 crc kubenswrapper[4672]: I0930 12:26:17.901182 4672 generic.go:334] "Generic (PLEG): container finished" podID="9a3da1a5-1d7f-4d33-9245-55038dd253d3" containerID="f93374e3f962b0f8c632ead6c4365a7346ff9f09db5b5cd11fa7ed6b73ffa92c" exitCode=0 Sep 30 12:26:17 crc kubenswrapper[4672]: I0930 12:26:17.901237 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xpjr4" event={"ID":"9a3da1a5-1d7f-4d33-9245-55038dd253d3","Type":"ContainerDied","Data":"f93374e3f962b0f8c632ead6c4365a7346ff9f09db5b5cd11fa7ed6b73ffa92c"} Sep 30 12:26:17 crc kubenswrapper[4672]: I0930 12:26:17.901304 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xpjr4" event={"ID":"9a3da1a5-1d7f-4d33-9245-55038dd253d3","Type":"ContainerStarted","Data":"d48afec7ba954f06f8079ddcac1ef3443b57fc01853cf2c5781324a65d2737ae"} Sep 30 12:26:17 crc kubenswrapper[4672]: I0930 12:26:17.903552 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-khgj2" event={"ID":"c706065f-5cf6-4719-96a3-ce442d33c58a","Type":"ContainerStarted","Data":"bc55e211b8e0c0d053448e20a52d30727543a69f007f5a5d747d7ba2f73a9e79"} Sep 30 12:26:18 crc kubenswrapper[4672]: I0930 12:26:18.909969 4672 generic.go:334] "Generic (PLEG): container finished" podID="c706065f-5cf6-4719-96a3-ce442d33c58a" containerID="ca03667ffcd86234bd5ead67533f1d9d3ac2d16e52025613f5f28d14bbf08fd8" exitCode=0 Sep 30 12:26:18 crc kubenswrapper[4672]: I0930 12:26:18.910045 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-khgj2" event={"ID":"c706065f-5cf6-4719-96a3-ce442d33c58a","Type":"ContainerDied","Data":"ca03667ffcd86234bd5ead67533f1d9d3ac2d16e52025613f5f28d14bbf08fd8"} Sep 
30 12:26:18 crc kubenswrapper[4672]: I0930 12:26:18.916792 4672 generic.go:334] "Generic (PLEG): container finished" podID="9a3da1a5-1d7f-4d33-9245-55038dd253d3" containerID="702a08d9942fe94fb0cc0615bc869a4daf7ea2103d28ce32c6474af70cc7c271" exitCode=0 Sep 30 12:26:18 crc kubenswrapper[4672]: I0930 12:26:18.916899 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xpjr4" event={"ID":"9a3da1a5-1d7f-4d33-9245-55038dd253d3","Type":"ContainerDied","Data":"702a08d9942fe94fb0cc0615bc869a4daf7ea2103d28ce32c6474af70cc7c271"} Sep 30 12:26:19 crc kubenswrapper[4672]: I0930 12:26:19.312037 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-n27k6"] Sep 30 12:26:19 crc kubenswrapper[4672]: I0930 12:26:19.313207 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-n27k6" Sep 30 12:26:19 crc kubenswrapper[4672]: I0930 12:26:19.316516 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Sep 30 12:26:19 crc kubenswrapper[4672]: I0930 12:26:19.324568 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-n27k6"] Sep 30 12:26:19 crc kubenswrapper[4672]: I0930 12:26:19.327605 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a213a95b-eb38-4932-9c67-11f1b91d0202-catalog-content\") pod \"certified-operators-n27k6\" (UID: \"a213a95b-eb38-4932-9c67-11f1b91d0202\") " pod="openshift-marketplace/certified-operators-n27k6" Sep 30 12:26:19 crc kubenswrapper[4672]: I0930 12:26:19.327728 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a213a95b-eb38-4932-9c67-11f1b91d0202-utilities\") pod \"certified-operators-n27k6\" (UID: \"a213a95b-eb38-4932-9c67-11f1b91d0202\") " pod="openshift-marketplace/certified-operators-n27k6" Sep 30 12:26:19 crc kubenswrapper[4672]: I0930 12:26:19.327859 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-db2qg\" (UniqueName: \"kubernetes.io/projected/a213a95b-eb38-4932-9c67-11f1b91d0202-kube-api-access-db2qg\") pod \"certified-operators-n27k6\" (UID: \"a213a95b-eb38-4932-9c67-11f1b91d0202\") " pod="openshift-marketplace/certified-operators-n27k6" Sep 30 12:26:19 crc kubenswrapper[4672]: I0930 12:26:19.428497 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-db2qg\" (UniqueName: \"kubernetes.io/projected/a213a95b-eb38-4932-9c67-11f1b91d0202-kube-api-access-db2qg\") pod \"certified-operators-n27k6\" (UID: \"a213a95b-eb38-4932-9c67-11f1b91d0202\") " pod="openshift-marketplace/certified-operators-n27k6" Sep 30 12:26:19 crc kubenswrapper[4672]: I0930 12:26:19.428872 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a213a95b-eb38-4932-9c67-11f1b91d0202-catalog-content\") pod \"certified-operators-n27k6\" (UID: \"a213a95b-eb38-4932-9c67-11f1b91d0202\") " pod="openshift-marketplace/certified-operators-n27k6" Sep 30 12:26:19 crc kubenswrapper[4672]: I0930 12:26:19.428930 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/a213a95b-eb38-4932-9c67-11f1b91d0202-utilities\") pod \"certified-operators-n27k6\" (UID: \"a213a95b-eb38-4932-9c67-11f1b91d0202\") " pod="openshift-marketplace/certified-operators-n27k6" Sep 30 12:26:19 crc kubenswrapper[4672]: I0930 12:26:19.429322 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a213a95b-eb38-4932-9c67-11f1b91d0202-catalog-content\") pod \"certified-operators-n27k6\" (UID: \"a213a95b-eb38-4932-9c67-11f1b91d0202\") " pod="openshift-marketplace/certified-operators-n27k6" Sep 30 12:26:19 crc kubenswrapper[4672]: I0930 12:26:19.429348 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a213a95b-eb38-4932-9c67-11f1b91d0202-utilities\") pod \"certified-operators-n27k6\" (UID: \"a213a95b-eb38-4932-9c67-11f1b91d0202\") " pod="openshift-marketplace/certified-operators-n27k6" Sep 30 12:26:19 crc kubenswrapper[4672]: I0930 12:26:19.449828 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-db2qg\" (UniqueName: \"kubernetes.io/projected/a213a95b-eb38-4932-9c67-11f1b91d0202-kube-api-access-db2qg\") pod \"certified-operators-n27k6\" (UID: \"a213a95b-eb38-4932-9c67-11f1b91d0202\") " pod="openshift-marketplace/certified-operators-n27k6" Sep 30 12:26:19 crc kubenswrapper[4672]: I0930 12:26:19.512434 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-lkhn2"] Sep 30 12:26:19 crc kubenswrapper[4672]: I0930 12:26:19.513597 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lkhn2" Sep 30 12:26:19 crc kubenswrapper[4672]: I0930 12:26:19.516010 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Sep 30 12:26:19 crc kubenswrapper[4672]: I0930 12:26:19.525430 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-lkhn2"] Sep 30 12:26:19 crc kubenswrapper[4672]: I0930 12:26:19.530605 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-llm47\" (UniqueName: \"kubernetes.io/projected/f337a53e-90b5-44a2-a033-bf26d3498158-kube-api-access-llm47\") pod \"redhat-operators-lkhn2\" (UID: \"f337a53e-90b5-44a2-a033-bf26d3498158\") " pod="openshift-marketplace/redhat-operators-lkhn2" Sep 30 12:26:19 crc kubenswrapper[4672]: I0930 12:26:19.530679 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f337a53e-90b5-44a2-a033-bf26d3498158-utilities\") pod \"redhat-operators-lkhn2\" (UID: \"f337a53e-90b5-44a2-a033-bf26d3498158\") " pod="openshift-marketplace/redhat-operators-lkhn2" Sep 30 12:26:19 crc kubenswrapper[4672]: I0930 12:26:19.530750 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f337a53e-90b5-44a2-a033-bf26d3498158-catalog-content\") pod \"redhat-operators-lkhn2\" (UID: \"f337a53e-90b5-44a2-a033-bf26d3498158\") " pod="openshift-marketplace/redhat-operators-lkhn2" Sep 30 12:26:19 crc kubenswrapper[4672]: I0930 12:26:19.631582 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-n27k6" Sep 30 12:26:19 crc kubenswrapper[4672]: I0930 12:26:19.631739 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-llm47\" (UniqueName: \"kubernetes.io/projected/f337a53e-90b5-44a2-a033-bf26d3498158-kube-api-access-llm47\") pod \"redhat-operators-lkhn2\" (UID: \"f337a53e-90b5-44a2-a033-bf26d3498158\") " pod="openshift-marketplace/redhat-operators-lkhn2" Sep 30 12:26:19 crc kubenswrapper[4672]: I0930 12:26:19.631782 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f337a53e-90b5-44a2-a033-bf26d3498158-utilities\") pod \"redhat-operators-lkhn2\" (UID: \"f337a53e-90b5-44a2-a033-bf26d3498158\") " pod="openshift-marketplace/redhat-operators-lkhn2" Sep 30 12:26:19 crc kubenswrapper[4672]: I0930 12:26:19.631825 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f337a53e-90b5-44a2-a033-bf26d3498158-catalog-content\") pod \"redhat-operators-lkhn2\" (UID: \"f337a53e-90b5-44a2-a033-bf26d3498158\") " pod="openshift-marketplace/redhat-operators-lkhn2" Sep 30 12:26:19 crc kubenswrapper[4672]: I0930 12:26:19.632225 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f337a53e-90b5-44a2-a033-bf26d3498158-catalog-content\") pod \"redhat-operators-lkhn2\" (UID: \"f337a53e-90b5-44a2-a033-bf26d3498158\") " pod="openshift-marketplace/redhat-operators-lkhn2" Sep 30 12:26:19 crc kubenswrapper[4672]: I0930 12:26:19.632314 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f337a53e-90b5-44a2-a033-bf26d3498158-utilities\") pod \"redhat-operators-lkhn2\" (UID: \"f337a53e-90b5-44a2-a033-bf26d3498158\") " pod="openshift-marketplace/redhat-operators-lkhn2" Sep 30 12:26:19 crc kubenswrapper[4672]: I0930 12:26:19.648895 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-llm47\" (UniqueName: \"kubernetes.io/projected/f337a53e-90b5-44a2-a033-bf26d3498158-kube-api-access-llm47\") pod \"redhat-operators-lkhn2\" (UID: \"f337a53e-90b5-44a2-a033-bf26d3498158\") " pod="openshift-marketplace/redhat-operators-lkhn2" Sep 30 12:26:19 crc kubenswrapper[4672]: I0930 12:26:19.839345 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-lkhn2" Sep 30 12:26:19 crc kubenswrapper[4672]: I0930 12:26:19.845455 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-n27k6"] Sep 30 12:26:19 crc kubenswrapper[4672]: W0930 12:26:19.856168 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda213a95b_eb38_4932_9c67_11f1b91d0202.slice/crio-2eb904da02d36e9377d196557531bfd0a090b7c85c41a50eaa8875f2e1281bff WatchSource:0}: Error finding container 2eb904da02d36e9377d196557531bfd0a090b7c85c41a50eaa8875f2e1281bff: Status 404 returned error can't find the container with id 2eb904da02d36e9377d196557531bfd0a090b7c85c41a50eaa8875f2e1281bff Sep 30 12:26:19 crc kubenswrapper[4672]: I0930 12:26:19.933835 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-khgj2" event={"ID":"c706065f-5cf6-4719-96a3-ce442d33c58a","Type":"ContainerStarted","Data":"210ba3a9bb72870093f7357eafc74b919b9f24e752ebbb52bdc4d33b4231a1ca"} Sep 30 12:26:19 crc kubenswrapper[4672]: I0930 12:26:19.940453 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-n27k6" event={"ID":"a213a95b-eb38-4932-9c67-11f1b91d0202","Type":"ContainerStarted","Data":"2eb904da02d36e9377d196557531bfd0a090b7c85c41a50eaa8875f2e1281bff"} Sep 30 12:26:19 crc kubenswrapper[4672]: I0930 12:26:19.956242 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xpjr4" event={"ID":"9a3da1a5-1d7f-4d33-9245-55038dd253d3","Type":"ContainerStarted","Data":"962bc8165f6481dd7d1d4d711edc66ef5dc62adf445475a59ce2bb63bbc263a0"} Sep 30 12:26:20 crc kubenswrapper[4672]: I0930 12:26:20.057332 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-xpjr4" podStartSLOduration=2.5829653500000003 podStartE2EDuration="4.057310817s" podCreationTimestamp="2025-09-30 12:26:16 +0000 UTC" firstStartedPulling="2025-09-30 12:26:17.903181933 +0000 UTC m=+269.172419579" lastFinishedPulling="2025-09-30 12:26:19.37752739 +0000 UTC m=+270.646765046" observedRunningTime="2025-09-30 12:26:19.981231051 +0000 UTC m=+271.250468697" watchObservedRunningTime="2025-09-30 12:26:20.057310817 +0000 UTC m=+271.326548463" Sep 30 12:26:20 crc kubenswrapper[4672]: I0930 12:26:20.058551 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-lkhn2"] Sep 30 12:26:20 crc kubenswrapper[4672]: W0930 12:26:20.067445 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf337a53e_90b5_44a2_a033_bf26d3498158.slice/crio-f8c34b79b36abfdc05464e701ffa240c0f3ec4cfac57192b3ac73bd022610df1 WatchSource:0}: Error finding container f8c34b79b36abfdc05464e701ffa240c0f3ec4cfac57192b3ac73bd022610df1: Status 404 returned error can't find the container with id f8c34b79b36abfdc05464e701ffa240c0f3ec4cfac57192b3ac73bd022610df1 Sep 30 12:26:20 crc kubenswrapper[4672]: I0930 12:26:20.963516 4672 generic.go:334] "Generic (PLEG): container finished" podID="a213a95b-eb38-4932-9c67-11f1b91d0202" containerID="cc3a72217cf784b87a0ba14f2a312a7834388deb3dba9ab4da860caf6aa980ed" exitCode=0 Sep 30 12:26:20 crc kubenswrapper[4672]: I0930 12:26:20.963617 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-n27k6" 
event={"ID":"a213a95b-eb38-4932-9c67-11f1b91d0202","Type":"ContainerDied","Data":"cc3a72217cf784b87a0ba14f2a312a7834388deb3dba9ab4da860caf6aa980ed"} Sep 30 12:26:20 crc kubenswrapper[4672]: I0930 12:26:20.966442 4672 generic.go:334] "Generic (PLEG): container finished" podID="c706065f-5cf6-4719-96a3-ce442d33c58a" containerID="210ba3a9bb72870093f7357eafc74b919b9f24e752ebbb52bdc4d33b4231a1ca" exitCode=0 Sep 30 12:26:20 crc kubenswrapper[4672]: I0930 12:26:20.966497 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-khgj2" event={"ID":"c706065f-5cf6-4719-96a3-ce442d33c58a","Type":"ContainerDied","Data":"210ba3a9bb72870093f7357eafc74b919b9f24e752ebbb52bdc4d33b4231a1ca"} Sep 30 12:26:20 crc kubenswrapper[4672]: I0930 12:26:20.968699 4672 generic.go:334] "Generic (PLEG): container finished" podID="f337a53e-90b5-44a2-a033-bf26d3498158" containerID="4e32ff0a4f42581f5189aef789ba4f5cbf3be0017baff338ce3703afa13d2286" exitCode=0 Sep 30 12:26:20 crc kubenswrapper[4672]: I0930 12:26:20.968814 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lkhn2" event={"ID":"f337a53e-90b5-44a2-a033-bf26d3498158","Type":"ContainerDied","Data":"4e32ff0a4f42581f5189aef789ba4f5cbf3be0017baff338ce3703afa13d2286"} Sep 30 12:26:20 crc kubenswrapper[4672]: I0930 12:26:20.968839 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lkhn2" event={"ID":"f337a53e-90b5-44a2-a033-bf26d3498158","Type":"ContainerStarted","Data":"f8c34b79b36abfdc05464e701ffa240c0f3ec4cfac57192b3ac73bd022610df1"} Sep 30 12:26:21 crc kubenswrapper[4672]: I0930 12:26:21.976155 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-khgj2" event={"ID":"c706065f-5cf6-4719-96a3-ce442d33c58a","Type":"ContainerStarted","Data":"b21e8fa3fd4b11f7c5d091e24482505faff123d0c868f487a7fadac3c554069a"} Sep 30 12:26:21 crc kubenswrapper[4672]: I0930 12:26:21.982137 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lkhn2" event={"ID":"f337a53e-90b5-44a2-a033-bf26d3498158","Type":"ContainerStarted","Data":"dcb5c6a5e2a18802bffa1ab30ce431784304570eab733f2030fec41ebc156be9"} Sep 30 12:26:22 crc kubenswrapper[4672]: I0930 12:26:22.002293 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-khgj2" podStartSLOduration=2.517440698 podStartE2EDuration="5.002255617s" podCreationTimestamp="2025-09-30 12:26:17 +0000 UTC" firstStartedPulling="2025-09-30 12:26:18.911809363 +0000 UTC m=+270.181047009" lastFinishedPulling="2025-09-30 12:26:21.396624282 +0000 UTC m=+272.665861928" observedRunningTime="2025-09-30 12:26:22.00022978 +0000 UTC m=+273.269467436" watchObservedRunningTime="2025-09-30 12:26:22.002255617 +0000 UTC m=+273.271493263" Sep 30 12:26:22 crc kubenswrapper[4672]: E0930 12:26:22.186187 4672 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf337a53e_90b5_44a2_a033_bf26d3498158.slice/crio-dcb5c6a5e2a18802bffa1ab30ce431784304570eab733f2030fec41ebc156be9.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf337a53e_90b5_44a2_a033_bf26d3498158.slice/crio-conmon-dcb5c6a5e2a18802bffa1ab30ce431784304570eab733f2030fec41ebc156be9.scope\": RecentStats: unable to find data in memory cache]" 
Sep 30 12:26:22 crc kubenswrapper[4672]: E0930 12:26:22.186187 4672 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf337a53e_90b5_44a2_a033_bf26d3498158.slice/crio-dcb5c6a5e2a18802bffa1ab30ce431784304570eab733f2030fec41ebc156be9.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf337a53e_90b5_44a2_a033_bf26d3498158.slice/crio-conmon-dcb5c6a5e2a18802bffa1ab30ce431784304570eab733f2030fec41ebc156be9.scope\": RecentStats: unable to find data in memory cache]"
Sep 30 12:26:22 crc kubenswrapper[4672]: I0930 12:26:22.990309 4672 generic.go:334] "Generic (PLEG): container finished" podID="a213a95b-eb38-4932-9c67-11f1b91d0202" containerID="cf9da76ac1f043fa43616c5216c9f479e7692d73c52d008b56fb23019a5f8fed" exitCode=0
Sep 30 12:26:22 crc kubenswrapper[4672]: I0930 12:26:22.991121 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-n27k6" event={"ID":"a213a95b-eb38-4932-9c67-11f1b91d0202","Type":"ContainerDied","Data":"cf9da76ac1f043fa43616c5216c9f479e7692d73c52d008b56fb23019a5f8fed"}
Sep 30 12:26:22 crc kubenswrapper[4672]: I0930 12:26:22.999560 4672 generic.go:334] "Generic (PLEG): container finished" podID="f337a53e-90b5-44a2-a033-bf26d3498158" containerID="dcb5c6a5e2a18802bffa1ab30ce431784304570eab733f2030fec41ebc156be9" exitCode=0
Sep 30 12:26:23 crc kubenswrapper[4672]: I0930 12:26:23.000970 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lkhn2" event={"ID":"f337a53e-90b5-44a2-a033-bf26d3498158","Type":"ContainerDied","Data":"dcb5c6a5e2a18802bffa1ab30ce431784304570eab733f2030fec41ebc156be9"}
Sep 30 12:26:24 crc kubenswrapper[4672]: I0930 12:26:24.009288 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lkhn2" event={"ID":"f337a53e-90b5-44a2-a033-bf26d3498158","Type":"ContainerStarted","Data":"5899ea32ea82e4ff054dfaa20538063ef7ca599d0c9ef1c4cf37580a2c4c641b"}
Sep 30 12:26:24 crc kubenswrapper[4672]: I0930 12:26:24.012783 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-n27k6" event={"ID":"a213a95b-eb38-4932-9c67-11f1b91d0202","Type":"ContainerStarted","Data":"eb8faae85cb66c7cc0473bf4f3f44bb88738dd636860ac5ed468162386992f5f"}
Sep 30 12:26:24 crc kubenswrapper[4672]: I0930 12:26:24.031166 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-lkhn2" podStartSLOduration=2.583310094 podStartE2EDuration="5.031144325s" podCreationTimestamp="2025-09-30 12:26:19 +0000 UTC" firstStartedPulling="2025-09-30 12:26:20.972579406 +0000 UTC m=+272.241817052" lastFinishedPulling="2025-09-30 12:26:23.420413637 +0000 UTC m=+274.689651283" observedRunningTime="2025-09-30 12:26:24.028515931 +0000 UTC m=+275.297753587" watchObservedRunningTime="2025-09-30 12:26:24.031144325 +0000 UTC m=+275.300381961"
Sep 30 12:26:27 crc kubenswrapper[4672]: I0930 12:26:27.250491 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-xpjr4"
Sep 30 12:26:27 crc kubenswrapper[4672]: I0930 12:26:27.250970 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-xpjr4"
Sep 30 12:26:27 crc kubenswrapper[4672]: I0930 12:26:27.300701 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-xpjr4"
Sep 30 12:26:27 crc kubenswrapper[4672]: I0930 12:26:27.325204 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-n27k6" podStartSLOduration=5.770252588 podStartE2EDuration="8.325158974s" podCreationTimestamp="2025-09-30 12:26:19 +0000 UTC" firstStartedPulling="2025-09-30 12:26:20.96841822 +0000 UTC m=+272.237655866" lastFinishedPulling="2025-09-30 12:26:23.523324606 +0000 UTC m=+274.792562252" observedRunningTime="2025-09-30 12:26:24.051157217 +0000 UTC m=+275.320394863" watchObservedRunningTime="2025-09-30 12:26:27.325158974 +0000 UTC m=+278.594396660"
Sep 30 12:26:27 crc kubenswrapper[4672]: I0930 12:26:27.447518 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-khgj2"
Sep 30 12:26:27 crc kubenswrapper[4672]: I0930 12:26:27.447609 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-khgj2"
Sep 30 12:26:27 crc kubenswrapper[4672]: I0930 12:26:27.489329 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-khgj2"
Sep 30 12:26:28 crc kubenswrapper[4672]: I0930 12:26:28.083396 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-xpjr4"
Sep 30 12:26:28 crc kubenswrapper[4672]: I0930 12:26:28.097007 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-khgj2"
Sep 30 12:26:29 crc kubenswrapper[4672]: I0930 12:26:29.632540 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-n27k6"
Sep 30 12:26:29 crc kubenswrapper[4672]: I0930 12:26:29.632602 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-n27k6"
Sep 30 12:26:29 crc kubenswrapper[4672]: I0930 12:26:29.674037 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-n27k6"
Sep 30 12:26:29 crc kubenswrapper[4672]: I0930 12:26:29.840532 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-lkhn2"
Sep 30 12:26:29 crc kubenswrapper[4672]: I0930 12:26:29.840886 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-lkhn2"
Sep 30 12:26:29 crc kubenswrapper[4672]: I0930 12:26:29.892023 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-lkhn2"
Sep 30 12:26:30 crc kubenswrapper[4672]: I0930 12:26:30.103717 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-lkhn2"
Sep 30 12:26:30 crc kubenswrapper[4672]: I0930 12:26:30.105396 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-n27k6"
Sep 30 12:27:24 crc kubenswrapper[4672]: I0930 12:27:24.739534 4672 patch_prober.go:28] interesting pod/machine-config-daemon-dpqrd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Sep 30 12:27:24 crc kubenswrapper[4672]: I0930 12:27:24.740316 4672 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Sep 30 12:27:54 crc kubenswrapper[4672]: I0930 12:27:54.740093 4672 patch_prober.go:28] interesting pod/machine-config-daemon-dpqrd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Sep 30 12:27:54 crc kubenswrapper[4672]: I0930 12:27:54.740786 4672 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Sep 30 12:28:24 crc kubenswrapper[4672]: I0930 12:28:24.739731 4672 patch_prober.go:28] interesting pod/machine-config-daemon-dpqrd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Sep 30 12:28:24 crc kubenswrapper[4672]: I0930 12:28:24.740372 4672 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Sep 30 12:28:24 crc kubenswrapper[4672]: I0930 12:28:24.740441 4672 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd"
Sep 30 12:28:24 crc kubenswrapper[4672]: I0930 12:28:24.741228 4672 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"99d96054a65bb2e9be687c5412fe9d8ef3b73a9fb641b821f2a35007e1e4415e"} pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Sep 30 12:28:24 crc kubenswrapper[4672]: I0930 12:28:24.741330 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" containerName="machine-config-daemon" containerID="cri-o://99d96054a65bb2e9be687c5412fe9d8ef3b73a9fb641b821f2a35007e1e4415e" gracePeriod=600
Sep 30 12:28:25 crc kubenswrapper[4672]: I0930 12:28:25.770352 4672 generic.go:334] "Generic (PLEG): container finished" podID="95794952-d817-48f2-8956-f7a310f8d1d9" containerID="99d96054a65bb2e9be687c5412fe9d8ef3b73a9fb641b821f2a35007e1e4415e" exitCode=0
Sep 30 12:28:25 crc kubenswrapper[4672]: I0930 12:28:25.770459 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" event={"ID":"95794952-d817-48f2-8956-f7a310f8d1d9","Type":"ContainerDied","Data":"99d96054a65bb2e9be687c5412fe9d8ef3b73a9fb641b821f2a35007e1e4415e"}
Sep 30 12:28:25 crc kubenswrapper[4672]: I0930 12:28:25.770804 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" event={"ID":"95794952-d817-48f2-8956-f7a310f8d1d9","Type":"ContainerStarted","Data":"24ffe413e2febc5a1c43c44675bb37e97907e267680c76a2a11205a06222f9b4"}
Sep 30 12:28:25 crc kubenswrapper[4672]: I0930 12:28:25.770842 4672 scope.go:117] "RemoveContainer" containerID="ba615a53d8f50a3891080fd06384bb6e00eca07025bb60fdf1764d9c1d4a5f76"
pods=["openshift-image-registry/image-registry-66df7c8f76-nstmj"] Sep 30 12:29:28 crc kubenswrapper[4672]: I0930 12:29:28.764950 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-nstmj" Sep 30 12:29:28 crc kubenswrapper[4672]: I0930 12:29:28.779348 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-nstmj"] Sep 30 12:29:28 crc kubenswrapper[4672]: I0930 12:29:28.820360 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4b4701a7-e7d1-4cce-a463-c3de32ba3583-registry-tls\") pod \"image-registry-66df7c8f76-nstmj\" (UID: \"4b4701a7-e7d1-4cce-a463-c3de32ba3583\") " pod="openshift-image-registry/image-registry-66df7c8f76-nstmj" Sep 30 12:29:28 crc kubenswrapper[4672]: I0930 12:29:28.820410 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4b4701a7-e7d1-4cce-a463-c3de32ba3583-trusted-ca\") pod \"image-registry-66df7c8f76-nstmj\" (UID: \"4b4701a7-e7d1-4cce-a463-c3de32ba3583\") " pod="openshift-image-registry/image-registry-66df7c8f76-nstmj" Sep 30 12:29:28 crc kubenswrapper[4672]: I0930 12:29:28.820432 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h52gv\" (UniqueName: \"kubernetes.io/projected/4b4701a7-e7d1-4cce-a463-c3de32ba3583-kube-api-access-h52gv\") pod \"image-registry-66df7c8f76-nstmj\" (UID: \"4b4701a7-e7d1-4cce-a463-c3de32ba3583\") " pod="openshift-image-registry/image-registry-66df7c8f76-nstmj" Sep 30 12:29:28 crc kubenswrapper[4672]: I0930 12:29:28.820461 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4b4701a7-e7d1-4cce-a463-c3de32ba3583-registry-certificates\") pod \"image-registry-66df7c8f76-nstmj\" (UID: \"4b4701a7-e7d1-4cce-a463-c3de32ba3583\") " pod="openshift-image-registry/image-registry-66df7c8f76-nstmj" Sep 30 12:29:28 crc kubenswrapper[4672]: I0930 12:29:28.820557 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4b4701a7-e7d1-4cce-a463-c3de32ba3583-bound-sa-token\") pod \"image-registry-66df7c8f76-nstmj\" (UID: \"4b4701a7-e7d1-4cce-a463-c3de32ba3583\") " pod="openshift-image-registry/image-registry-66df7c8f76-nstmj" Sep 30 12:29:28 crc kubenswrapper[4672]: I0930 12:29:28.820619 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-nstmj\" (UID: \"4b4701a7-e7d1-4cce-a463-c3de32ba3583\") " pod="openshift-image-registry/image-registry-66df7c8f76-nstmj" Sep 30 12:29:28 crc kubenswrapper[4672]: I0930 12:29:28.820709 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4b4701a7-e7d1-4cce-a463-c3de32ba3583-ca-trust-extracted\") pod \"image-registry-66df7c8f76-nstmj\" (UID: \"4b4701a7-e7d1-4cce-a463-c3de32ba3583\") " pod="openshift-image-registry/image-registry-66df7c8f76-nstmj" Sep 30 12:29:28 crc kubenswrapper[4672]: I0930 
12:29:28.820758 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4b4701a7-e7d1-4cce-a463-c3de32ba3583-installation-pull-secrets\") pod \"image-registry-66df7c8f76-nstmj\" (UID: \"4b4701a7-e7d1-4cce-a463-c3de32ba3583\") " pod="openshift-image-registry/image-registry-66df7c8f76-nstmj" Sep 30 12:29:28 crc kubenswrapper[4672]: I0930 12:29:28.842630 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-nstmj\" (UID: \"4b4701a7-e7d1-4cce-a463-c3de32ba3583\") " pod="openshift-image-registry/image-registry-66df7c8f76-nstmj" Sep 30 12:29:28 crc kubenswrapper[4672]: I0930 12:29:28.922301 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4b4701a7-e7d1-4cce-a463-c3de32ba3583-registry-tls\") pod \"image-registry-66df7c8f76-nstmj\" (UID: \"4b4701a7-e7d1-4cce-a463-c3de32ba3583\") " pod="openshift-image-registry/image-registry-66df7c8f76-nstmj" Sep 30 12:29:28 crc kubenswrapper[4672]: I0930 12:29:28.922357 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4b4701a7-e7d1-4cce-a463-c3de32ba3583-trusted-ca\") pod \"image-registry-66df7c8f76-nstmj\" (UID: \"4b4701a7-e7d1-4cce-a463-c3de32ba3583\") " pod="openshift-image-registry/image-registry-66df7c8f76-nstmj" Sep 30 12:29:28 crc kubenswrapper[4672]: I0930 12:29:28.922378 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h52gv\" (UniqueName: \"kubernetes.io/projected/4b4701a7-e7d1-4cce-a463-c3de32ba3583-kube-api-access-h52gv\") pod \"image-registry-66df7c8f76-nstmj\" (UID: \"4b4701a7-e7d1-4cce-a463-c3de32ba3583\") " pod="openshift-image-registry/image-registry-66df7c8f76-nstmj" Sep 30 12:29:28 crc kubenswrapper[4672]: I0930 12:29:28.922406 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4b4701a7-e7d1-4cce-a463-c3de32ba3583-registry-certificates\") pod \"image-registry-66df7c8f76-nstmj\" (UID: \"4b4701a7-e7d1-4cce-a463-c3de32ba3583\") " pod="openshift-image-registry/image-registry-66df7c8f76-nstmj" Sep 30 12:29:28 crc kubenswrapper[4672]: I0930 12:29:28.922437 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4b4701a7-e7d1-4cce-a463-c3de32ba3583-bound-sa-token\") pod \"image-registry-66df7c8f76-nstmj\" (UID: \"4b4701a7-e7d1-4cce-a463-c3de32ba3583\") " pod="openshift-image-registry/image-registry-66df7c8f76-nstmj" Sep 30 12:29:28 crc kubenswrapper[4672]: I0930 12:29:28.922483 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4b4701a7-e7d1-4cce-a463-c3de32ba3583-ca-trust-extracted\") pod \"image-registry-66df7c8f76-nstmj\" (UID: \"4b4701a7-e7d1-4cce-a463-c3de32ba3583\") " pod="openshift-image-registry/image-registry-66df7c8f76-nstmj" Sep 30 12:29:28 crc kubenswrapper[4672]: I0930 12:29:28.922503 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: 
\"kubernetes.io/secret/4b4701a7-e7d1-4cce-a463-c3de32ba3583-installation-pull-secrets\") pod \"image-registry-66df7c8f76-nstmj\" (UID: \"4b4701a7-e7d1-4cce-a463-c3de32ba3583\") " pod="openshift-image-registry/image-registry-66df7c8f76-nstmj" Sep 30 12:29:28 crc kubenswrapper[4672]: I0930 12:29:28.923436 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4b4701a7-e7d1-4cce-a463-c3de32ba3583-ca-trust-extracted\") pod \"image-registry-66df7c8f76-nstmj\" (UID: \"4b4701a7-e7d1-4cce-a463-c3de32ba3583\") " pod="openshift-image-registry/image-registry-66df7c8f76-nstmj" Sep 30 12:29:28 crc kubenswrapper[4672]: I0930 12:29:28.924211 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4b4701a7-e7d1-4cce-a463-c3de32ba3583-registry-certificates\") pod \"image-registry-66df7c8f76-nstmj\" (UID: \"4b4701a7-e7d1-4cce-a463-c3de32ba3583\") " pod="openshift-image-registry/image-registry-66df7c8f76-nstmj" Sep 30 12:29:28 crc kubenswrapper[4672]: I0930 12:29:28.924476 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4b4701a7-e7d1-4cce-a463-c3de32ba3583-trusted-ca\") pod \"image-registry-66df7c8f76-nstmj\" (UID: \"4b4701a7-e7d1-4cce-a463-c3de32ba3583\") " pod="openshift-image-registry/image-registry-66df7c8f76-nstmj" Sep 30 12:29:28 crc kubenswrapper[4672]: I0930 12:29:28.928321 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4b4701a7-e7d1-4cce-a463-c3de32ba3583-installation-pull-secrets\") pod \"image-registry-66df7c8f76-nstmj\" (UID: \"4b4701a7-e7d1-4cce-a463-c3de32ba3583\") " pod="openshift-image-registry/image-registry-66df7c8f76-nstmj" Sep 30 12:29:28 crc kubenswrapper[4672]: I0930 12:29:28.930760 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4b4701a7-e7d1-4cce-a463-c3de32ba3583-registry-tls\") pod \"image-registry-66df7c8f76-nstmj\" (UID: \"4b4701a7-e7d1-4cce-a463-c3de32ba3583\") " pod="openshift-image-registry/image-registry-66df7c8f76-nstmj" Sep 30 12:29:28 crc kubenswrapper[4672]: I0930 12:29:28.939079 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h52gv\" (UniqueName: \"kubernetes.io/projected/4b4701a7-e7d1-4cce-a463-c3de32ba3583-kube-api-access-h52gv\") pod \"image-registry-66df7c8f76-nstmj\" (UID: \"4b4701a7-e7d1-4cce-a463-c3de32ba3583\") " pod="openshift-image-registry/image-registry-66df7c8f76-nstmj" Sep 30 12:29:28 crc kubenswrapper[4672]: I0930 12:29:28.939991 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4b4701a7-e7d1-4cce-a463-c3de32ba3583-bound-sa-token\") pod \"image-registry-66df7c8f76-nstmj\" (UID: \"4b4701a7-e7d1-4cce-a463-c3de32ba3583\") " pod="openshift-image-registry/image-registry-66df7c8f76-nstmj" Sep 30 12:29:29 crc kubenswrapper[4672]: I0930 12:29:29.084621 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-nstmj" Sep 30 12:29:29 crc kubenswrapper[4672]: I0930 12:29:29.340086 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-nstmj"] Sep 30 12:29:30 crc kubenswrapper[4672]: I0930 12:29:30.183577 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-nstmj" event={"ID":"4b4701a7-e7d1-4cce-a463-c3de32ba3583","Type":"ContainerStarted","Data":"a9f749280b55786cae827e45be18ceae32a0bf6e6eea99f27b81fdafd412b580"} Sep 30 12:29:30 crc kubenswrapper[4672]: I0930 12:29:30.184047 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-nstmj" Sep 30 12:29:30 crc kubenswrapper[4672]: I0930 12:29:30.184077 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-nstmj" event={"ID":"4b4701a7-e7d1-4cce-a463-c3de32ba3583","Type":"ContainerStarted","Data":"029ed39f8cd65badc88d78f5cf2e16e3028bcbcb1ab0ad0e1aff2ac8082bb48d"} Sep 30 12:29:30 crc kubenswrapper[4672]: I0930 12:29:30.205892 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-nstmj" podStartSLOduration=2.205869184 podStartE2EDuration="2.205869184s" podCreationTimestamp="2025-09-30 12:29:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 12:29:30.203509265 +0000 UTC m=+461.472746911" watchObservedRunningTime="2025-09-30 12:29:30.205869184 +0000 UTC m=+461.475106830" Sep 30 12:29:49 crc kubenswrapper[4672]: I0930 12:29:49.090609 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-nstmj" Sep 30 12:29:49 crc kubenswrapper[4672]: I0930 12:29:49.141561 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-mlrw8"] Sep 30 12:30:00 crc kubenswrapper[4672]: I0930 12:30:00.142609 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320590-97krm"] Sep 30 12:30:00 crc kubenswrapper[4672]: I0930 12:30:00.144816 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320590-97krm" Sep 30 12:30:00 crc kubenswrapper[4672]: I0930 12:30:00.148125 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Sep 30 12:30:00 crc kubenswrapper[4672]: I0930 12:30:00.148212 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Sep 30 12:30:00 crc kubenswrapper[4672]: I0930 12:30:00.157159 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320590-97krm"] Sep 30 12:30:00 crc kubenswrapper[4672]: I0930 12:30:00.218417 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1ab57cdf-bd45-48f0-97e9-e6cad9bb6554-config-volume\") pod \"collect-profiles-29320590-97krm\" (UID: \"1ab57cdf-bd45-48f0-97e9-e6cad9bb6554\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320590-97krm" Sep 30 12:30:00 crc kubenswrapper[4672]: I0930 12:30:00.218520 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1ab57cdf-bd45-48f0-97e9-e6cad9bb6554-secret-volume\") pod \"collect-profiles-29320590-97krm\" (UID: \"1ab57cdf-bd45-48f0-97e9-e6cad9bb6554\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320590-97krm" Sep 30 12:30:00 crc kubenswrapper[4672]: I0930 12:30:00.218850 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ttfhj\" (UniqueName: \"kubernetes.io/projected/1ab57cdf-bd45-48f0-97e9-e6cad9bb6554-kube-api-access-ttfhj\") pod \"collect-profiles-29320590-97krm\" (UID: \"1ab57cdf-bd45-48f0-97e9-e6cad9bb6554\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320590-97krm" Sep 30 12:30:00 crc kubenswrapper[4672]: I0930 12:30:00.319823 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ttfhj\" (UniqueName: \"kubernetes.io/projected/1ab57cdf-bd45-48f0-97e9-e6cad9bb6554-kube-api-access-ttfhj\") pod \"collect-profiles-29320590-97krm\" (UID: \"1ab57cdf-bd45-48f0-97e9-e6cad9bb6554\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320590-97krm" Sep 30 12:30:00 crc kubenswrapper[4672]: I0930 12:30:00.319912 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1ab57cdf-bd45-48f0-97e9-e6cad9bb6554-config-volume\") pod \"collect-profiles-29320590-97krm\" (UID: \"1ab57cdf-bd45-48f0-97e9-e6cad9bb6554\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320590-97krm" Sep 30 12:30:00 crc kubenswrapper[4672]: I0930 12:30:00.319959 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1ab57cdf-bd45-48f0-97e9-e6cad9bb6554-secret-volume\") pod \"collect-profiles-29320590-97krm\" (UID: \"1ab57cdf-bd45-48f0-97e9-e6cad9bb6554\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320590-97krm" Sep 30 12:30:00 crc kubenswrapper[4672]: I0930 12:30:00.321257 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1ab57cdf-bd45-48f0-97e9-e6cad9bb6554-config-volume\") pod 
\"collect-profiles-29320590-97krm\" (UID: \"1ab57cdf-bd45-48f0-97e9-e6cad9bb6554\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320590-97krm" Sep 30 12:30:00 crc kubenswrapper[4672]: I0930 12:30:00.331876 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1ab57cdf-bd45-48f0-97e9-e6cad9bb6554-secret-volume\") pod \"collect-profiles-29320590-97krm\" (UID: \"1ab57cdf-bd45-48f0-97e9-e6cad9bb6554\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320590-97krm" Sep 30 12:30:00 crc kubenswrapper[4672]: I0930 12:30:00.338413 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ttfhj\" (UniqueName: \"kubernetes.io/projected/1ab57cdf-bd45-48f0-97e9-e6cad9bb6554-kube-api-access-ttfhj\") pod \"collect-profiles-29320590-97krm\" (UID: \"1ab57cdf-bd45-48f0-97e9-e6cad9bb6554\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320590-97krm" Sep 30 12:30:00 crc kubenswrapper[4672]: I0930 12:30:00.465751 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320590-97krm" Sep 30 12:30:00 crc kubenswrapper[4672]: I0930 12:30:00.664116 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320590-97krm"] Sep 30 12:30:01 crc kubenswrapper[4672]: I0930 12:30:01.380382 4672 generic.go:334] "Generic (PLEG): container finished" podID="1ab57cdf-bd45-48f0-97e9-e6cad9bb6554" containerID="a8e7b7f94558a2836baa398db8c9e6afc179de0765c52d96e5a6e7a0d16e67d5" exitCode=0 Sep 30 12:30:01 crc kubenswrapper[4672]: I0930 12:30:01.380463 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320590-97krm" event={"ID":"1ab57cdf-bd45-48f0-97e9-e6cad9bb6554","Type":"ContainerDied","Data":"a8e7b7f94558a2836baa398db8c9e6afc179de0765c52d96e5a6e7a0d16e67d5"} Sep 30 12:30:01 crc kubenswrapper[4672]: I0930 12:30:01.380748 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320590-97krm" event={"ID":"1ab57cdf-bd45-48f0-97e9-e6cad9bb6554","Type":"ContainerStarted","Data":"c4e61c34804222f0972894e719bce1dfb779b3a335cfdf422d139e4f81eecb2a"} Sep 30 12:30:02 crc kubenswrapper[4672]: I0930 12:30:02.631168 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320590-97krm" Sep 30 12:30:02 crc kubenswrapper[4672]: I0930 12:30:02.750859 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1ab57cdf-bd45-48f0-97e9-e6cad9bb6554-config-volume\") pod \"1ab57cdf-bd45-48f0-97e9-e6cad9bb6554\" (UID: \"1ab57cdf-bd45-48f0-97e9-e6cad9bb6554\") " Sep 30 12:30:02 crc kubenswrapper[4672]: I0930 12:30:02.751052 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ttfhj\" (UniqueName: \"kubernetes.io/projected/1ab57cdf-bd45-48f0-97e9-e6cad9bb6554-kube-api-access-ttfhj\") pod \"1ab57cdf-bd45-48f0-97e9-e6cad9bb6554\" (UID: \"1ab57cdf-bd45-48f0-97e9-e6cad9bb6554\") " Sep 30 12:30:02 crc kubenswrapper[4672]: I0930 12:30:02.751119 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1ab57cdf-bd45-48f0-97e9-e6cad9bb6554-secret-volume\") pod \"1ab57cdf-bd45-48f0-97e9-e6cad9bb6554\" (UID: \"1ab57cdf-bd45-48f0-97e9-e6cad9bb6554\") " Sep 30 12:30:02 crc kubenswrapper[4672]: I0930 12:30:02.751823 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1ab57cdf-bd45-48f0-97e9-e6cad9bb6554-config-volume" (OuterVolumeSpecName: "config-volume") pod "1ab57cdf-bd45-48f0-97e9-e6cad9bb6554" (UID: "1ab57cdf-bd45-48f0-97e9-e6cad9bb6554"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 12:30:02 crc kubenswrapper[4672]: I0930 12:30:02.759959 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ab57cdf-bd45-48f0-97e9-e6cad9bb6554-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "1ab57cdf-bd45-48f0-97e9-e6cad9bb6554" (UID: "1ab57cdf-bd45-48f0-97e9-e6cad9bb6554"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:30:02 crc kubenswrapper[4672]: I0930 12:30:02.760543 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ab57cdf-bd45-48f0-97e9-e6cad9bb6554-kube-api-access-ttfhj" (OuterVolumeSpecName: "kube-api-access-ttfhj") pod "1ab57cdf-bd45-48f0-97e9-e6cad9bb6554" (UID: "1ab57cdf-bd45-48f0-97e9-e6cad9bb6554"). InnerVolumeSpecName "kube-api-access-ttfhj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 12:30:02 crc kubenswrapper[4672]: I0930 12:30:02.853041 4672 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1ab57cdf-bd45-48f0-97e9-e6cad9bb6554-secret-volume\") on node \"crc\" DevicePath \"\"" Sep 30 12:30:02 crc kubenswrapper[4672]: I0930 12:30:02.853090 4672 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1ab57cdf-bd45-48f0-97e9-e6cad9bb6554-config-volume\") on node \"crc\" DevicePath \"\"" Sep 30 12:30:02 crc kubenswrapper[4672]: I0930 12:30:02.853101 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ttfhj\" (UniqueName: \"kubernetes.io/projected/1ab57cdf-bd45-48f0-97e9-e6cad9bb6554-kube-api-access-ttfhj\") on node \"crc\" DevicePath \"\"" Sep 30 12:30:03 crc kubenswrapper[4672]: I0930 12:30:03.395908 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320590-97krm" event={"ID":"1ab57cdf-bd45-48f0-97e9-e6cad9bb6554","Type":"ContainerDied","Data":"c4e61c34804222f0972894e719bce1dfb779b3a335cfdf422d139e4f81eecb2a"} Sep 30 12:30:03 crc kubenswrapper[4672]: I0930 12:30:03.395959 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320590-97krm" Sep 30 12:30:03 crc kubenswrapper[4672]: I0930 12:30:03.395983 4672 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c4e61c34804222f0972894e719bce1dfb779b3a335cfdf422d139e4f81eecb2a" Sep 30 12:30:14 crc kubenswrapper[4672]: I0930 12:30:14.182829 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-mlrw8" podUID="59e30e9f-8395-4abd-8adf-b964ac1cbb0b" containerName="registry" containerID="cri-o://95c658cf4db73f2f4f16675700967a9012c555267cd52910ee18d6c8d3b0d17f" gracePeriod=30 Sep 30 12:30:14 crc kubenswrapper[4672]: I0930 12:30:14.465027 4672 generic.go:334] "Generic (PLEG): container finished" podID="59e30e9f-8395-4abd-8adf-b964ac1cbb0b" containerID="95c658cf4db73f2f4f16675700967a9012c555267cd52910ee18d6c8d3b0d17f" exitCode=0 Sep 30 12:30:14 crc kubenswrapper[4672]: I0930 12:30:14.465118 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-mlrw8" event={"ID":"59e30e9f-8395-4abd-8adf-b964ac1cbb0b","Type":"ContainerDied","Data":"95c658cf4db73f2f4f16675700967a9012c555267cd52910ee18d6c8d3b0d17f"} Sep 30 12:30:14 crc kubenswrapper[4672]: I0930 12:30:14.556812 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-mlrw8" Sep 30 12:30:14 crc kubenswrapper[4672]: I0930 12:30:14.718930 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/59e30e9f-8395-4abd-8adf-b964ac1cbb0b-bound-sa-token\") pod \"59e30e9f-8395-4abd-8adf-b964ac1cbb0b\" (UID: \"59e30e9f-8395-4abd-8adf-b964ac1cbb0b\") " Sep 30 12:30:14 crc kubenswrapper[4672]: I0930 12:30:14.718997 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/59e30e9f-8395-4abd-8adf-b964ac1cbb0b-registry-certificates\") pod \"59e30e9f-8395-4abd-8adf-b964ac1cbb0b\" (UID: \"59e30e9f-8395-4abd-8adf-b964ac1cbb0b\") " Sep 30 12:30:14 crc kubenswrapper[4672]: I0930 12:30:14.719148 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"59e30e9f-8395-4abd-8adf-b964ac1cbb0b\" (UID: \"59e30e9f-8395-4abd-8adf-b964ac1cbb0b\") " Sep 30 12:30:14 crc kubenswrapper[4672]: I0930 12:30:14.719190 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/59e30e9f-8395-4abd-8adf-b964ac1cbb0b-registry-tls\") pod \"59e30e9f-8395-4abd-8adf-b964ac1cbb0b\" (UID: \"59e30e9f-8395-4abd-8adf-b964ac1cbb0b\") " Sep 30 12:30:14 crc kubenswrapper[4672]: I0930 12:30:14.719226 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/59e30e9f-8395-4abd-8adf-b964ac1cbb0b-ca-trust-extracted\") pod \"59e30e9f-8395-4abd-8adf-b964ac1cbb0b\" (UID: \"59e30e9f-8395-4abd-8adf-b964ac1cbb0b\") " Sep 30 12:30:14 crc kubenswrapper[4672]: I0930 12:30:14.719300 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/59e30e9f-8395-4abd-8adf-b964ac1cbb0b-trusted-ca\") pod \"59e30e9f-8395-4abd-8adf-b964ac1cbb0b\" (UID: \"59e30e9f-8395-4abd-8adf-b964ac1cbb0b\") " Sep 30 12:30:14 crc kubenswrapper[4672]: I0930 12:30:14.719365 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-drvb8\" (UniqueName: \"kubernetes.io/projected/59e30e9f-8395-4abd-8adf-b964ac1cbb0b-kube-api-access-drvb8\") pod \"59e30e9f-8395-4abd-8adf-b964ac1cbb0b\" (UID: \"59e30e9f-8395-4abd-8adf-b964ac1cbb0b\") " Sep 30 12:30:14 crc kubenswrapper[4672]: I0930 12:30:14.719425 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/59e30e9f-8395-4abd-8adf-b964ac1cbb0b-installation-pull-secrets\") pod \"59e30e9f-8395-4abd-8adf-b964ac1cbb0b\" (UID: \"59e30e9f-8395-4abd-8adf-b964ac1cbb0b\") " Sep 30 12:30:14 crc kubenswrapper[4672]: I0930 12:30:14.720266 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/59e30e9f-8395-4abd-8adf-b964ac1cbb0b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "59e30e9f-8395-4abd-8adf-b964ac1cbb0b" (UID: "59e30e9f-8395-4abd-8adf-b964ac1cbb0b"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 12:30:14 crc kubenswrapper[4672]: I0930 12:30:14.721400 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/59e30e9f-8395-4abd-8adf-b964ac1cbb0b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "59e30e9f-8395-4abd-8adf-b964ac1cbb0b" (UID: "59e30e9f-8395-4abd-8adf-b964ac1cbb0b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 12:30:14 crc kubenswrapper[4672]: I0930 12:30:14.725380 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59e30e9f-8395-4abd-8adf-b964ac1cbb0b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "59e30e9f-8395-4abd-8adf-b964ac1cbb0b" (UID: "59e30e9f-8395-4abd-8adf-b964ac1cbb0b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:30:14 crc kubenswrapper[4672]: I0930 12:30:14.725971 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/59e30e9f-8395-4abd-8adf-b964ac1cbb0b-kube-api-access-drvb8" (OuterVolumeSpecName: "kube-api-access-drvb8") pod "59e30e9f-8395-4abd-8adf-b964ac1cbb0b" (UID: "59e30e9f-8395-4abd-8adf-b964ac1cbb0b"). InnerVolumeSpecName "kube-api-access-drvb8". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 12:30:14 crc kubenswrapper[4672]: I0930 12:30:14.726188 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/59e30e9f-8395-4abd-8adf-b964ac1cbb0b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "59e30e9f-8395-4abd-8adf-b964ac1cbb0b" (UID: "59e30e9f-8395-4abd-8adf-b964ac1cbb0b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 12:30:14 crc kubenswrapper[4672]: I0930 12:30:14.727408 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/59e30e9f-8395-4abd-8adf-b964ac1cbb0b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "59e30e9f-8395-4abd-8adf-b964ac1cbb0b" (UID: "59e30e9f-8395-4abd-8adf-b964ac1cbb0b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 12:30:14 crc kubenswrapper[4672]: I0930 12:30:14.730669 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "59e30e9f-8395-4abd-8adf-b964ac1cbb0b" (UID: "59e30e9f-8395-4abd-8adf-b964ac1cbb0b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Sep 30 12:30:14 crc kubenswrapper[4672]: I0930 12:30:14.740083 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/59e30e9f-8395-4abd-8adf-b964ac1cbb0b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "59e30e9f-8395-4abd-8adf-b964ac1cbb0b" (UID: "59e30e9f-8395-4abd-8adf-b964ac1cbb0b"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 12:30:14 crc kubenswrapper[4672]: I0930 12:30:14.821051 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-drvb8\" (UniqueName: \"kubernetes.io/projected/59e30e9f-8395-4abd-8adf-b964ac1cbb0b-kube-api-access-drvb8\") on node \"crc\" DevicePath \"\"" Sep 30 12:30:14 crc kubenswrapper[4672]: I0930 12:30:14.821248 4672 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/59e30e9f-8395-4abd-8adf-b964ac1cbb0b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Sep 30 12:30:14 crc kubenswrapper[4672]: I0930 12:30:14.821316 4672 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/59e30e9f-8395-4abd-8adf-b964ac1cbb0b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Sep 30 12:30:14 crc kubenswrapper[4672]: I0930 12:30:14.821365 4672 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/59e30e9f-8395-4abd-8adf-b964ac1cbb0b-registry-certificates\") on node \"crc\" DevicePath \"\"" Sep 30 12:30:14 crc kubenswrapper[4672]: I0930 12:30:14.821440 4672 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/59e30e9f-8395-4abd-8adf-b964ac1cbb0b-registry-tls\") on node \"crc\" DevicePath \"\"" Sep 30 12:30:14 crc kubenswrapper[4672]: I0930 12:30:14.821488 4672 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/59e30e9f-8395-4abd-8adf-b964ac1cbb0b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Sep 30 12:30:14 crc kubenswrapper[4672]: I0930 12:30:14.821533 4672 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/59e30e9f-8395-4abd-8adf-b964ac1cbb0b-trusted-ca\") on node \"crc\" DevicePath \"\"" Sep 30 12:30:15 crc kubenswrapper[4672]: I0930 12:30:15.475169 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-mlrw8" event={"ID":"59e30e9f-8395-4abd-8adf-b964ac1cbb0b","Type":"ContainerDied","Data":"046633e0e383dde4b9d62aeb6a13873e816c6bdb74bde3cb06316a537544f5ba"} Sep 30 12:30:15 crc kubenswrapper[4672]: I0930 12:30:15.475283 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-mlrw8" Sep 30 12:30:15 crc kubenswrapper[4672]: I0930 12:30:15.475707 4672 scope.go:117] "RemoveContainer" containerID="95c658cf4db73f2f4f16675700967a9012c555267cd52910ee18d6c8d3b0d17f" Sep 30 12:30:15 crc kubenswrapper[4672]: I0930 12:30:15.506087 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-mlrw8"] Sep 30 12:30:15 crc kubenswrapper[4672]: I0930 12:30:15.509810 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-mlrw8"] Sep 30 12:30:17 crc kubenswrapper[4672]: I0930 12:30:17.430443 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="59e30e9f-8395-4abd-8adf-b964ac1cbb0b" path="/var/lib/kubelet/pods/59e30e9f-8395-4abd-8adf-b964ac1cbb0b/volumes" Sep 30 12:30:49 crc kubenswrapper[4672]: I0930 12:30:49.570670 4672 scope.go:117] "RemoveContainer" containerID="53613a217d27e173b41b9b6f8cabf79cce9bfafd2cc0edc75a59c34679ab1350" Sep 30 12:30:54 crc kubenswrapper[4672]: I0930 12:30:54.739254 4672 patch_prober.go:28] interesting pod/machine-config-daemon-dpqrd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 12:30:54 crc kubenswrapper[4672]: I0930 12:30:54.740613 4672 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 12:31:24 crc kubenswrapper[4672]: I0930 12:31:24.740028 4672 patch_prober.go:28] interesting pod/machine-config-daemon-dpqrd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 12:31:24 crc kubenswrapper[4672]: I0930 12:31:24.740754 4672 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 12:31:38 crc kubenswrapper[4672]: I0930 12:31:38.871629 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-zcp7d"] Sep 30 12:31:38 crc kubenswrapper[4672]: E0930 12:31:38.872574 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59e30e9f-8395-4abd-8adf-b964ac1cbb0b" containerName="registry" Sep 30 12:31:38 crc kubenswrapper[4672]: I0930 12:31:38.872595 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="59e30e9f-8395-4abd-8adf-b964ac1cbb0b" containerName="registry" Sep 30 12:31:38 crc kubenswrapper[4672]: E0930 12:31:38.872612 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ab57cdf-bd45-48f0-97e9-e6cad9bb6554" containerName="collect-profiles" Sep 30 12:31:38 crc kubenswrapper[4672]: I0930 12:31:38.872619 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ab57cdf-bd45-48f0-97e9-e6cad9bb6554" containerName="collect-profiles" Sep 30 12:31:38 crc kubenswrapper[4672]: 
Sep 30 12:31:38 crc kubenswrapper[4672]: I0930 12:31:38.872726 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="59e30e9f-8395-4abd-8adf-b964ac1cbb0b" containerName="registry"
Sep 30 12:31:38 crc kubenswrapper[4672]: I0930 12:31:38.873118 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-zcp7d"
Sep 30 12:31:38 crc kubenswrapper[4672]: I0930 12:31:38.877152 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-5b446d88c5-46756"]
Sep 30 12:31:38 crc kubenswrapper[4672]: I0930 12:31:38.877793 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-46756"
Sep 30 12:31:38 crc kubenswrapper[4672]: I0930 12:31:38.879612 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt"
Sep 30 12:31:38 crc kubenswrapper[4672]: I0930 12:31:38.881074 4672 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-vcfv2"
Sep 30 12:31:38 crc kubenswrapper[4672]: I0930 12:31:38.881360 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt"
Sep 30 12:31:38 crc kubenswrapper[4672]: I0930 12:31:38.881675 4672 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-82s9b"
Sep 30 12:31:38 crc kubenswrapper[4672]: I0930 12:31:38.883577 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-zcp7d"]
Sep 30 12:31:38 crc kubenswrapper[4672]: I0930 12:31:38.894822 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-46756"]
Sep 30 12:31:38 crc kubenswrapper[4672]: I0930 12:31:38.910964 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-4r2wn"]
Sep 30 12:31:38 crc kubenswrapper[4672]: I0930 12:31:38.911604 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-4r2wn"
Sep 30 12:31:38 crc kubenswrapper[4672]: I0930 12:31:38.913705 4672 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-b4j9l"
Sep 30 12:31:38 crc kubenswrapper[4672]: I0930 12:31:38.956148 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-4r2wn"]
Sep 30 12:31:38 crc kubenswrapper[4672]: I0930 12:31:38.975901 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rfrw9\" (UniqueName: \"kubernetes.io/projected/3c4da14a-ed20-4c47-8147-2150a416c1c8-kube-api-access-rfrw9\") pod \"cert-manager-cainjector-7f985d654d-zcp7d\" (UID: \"3c4da14a-ed20-4c47-8147-2150a416c1c8\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-zcp7d"
Sep 30 12:31:38 crc kubenswrapper[4672]: I0930 12:31:38.976151 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f5qm2\" (UniqueName: \"kubernetes.io/projected/b8ab6541-957a-44c8-a773-788f725d7efb-kube-api-access-f5qm2\") pod \"cert-manager-5b446d88c5-46756\" (UID: \"b8ab6541-957a-44c8-a773-788f725d7efb\") " pod="cert-manager/cert-manager-5b446d88c5-46756"
Sep 30 12:31:38 crc kubenswrapper[4672]: I0930 12:31:38.976248 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rzxzq\" (UniqueName: \"kubernetes.io/projected/e53f49ff-ce7a-4699-977e-730d462910c8-kube-api-access-rzxzq\") pod \"cert-manager-webhook-5655c58dd6-4r2wn\" (UID: \"e53f49ff-ce7a-4699-977e-730d462910c8\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-4r2wn"
Sep 30 12:31:39 crc kubenswrapper[4672]: I0930 12:31:39.077345 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rzxzq\" (UniqueName: \"kubernetes.io/projected/e53f49ff-ce7a-4699-977e-730d462910c8-kube-api-access-rzxzq\") pod \"cert-manager-webhook-5655c58dd6-4r2wn\" (UID: \"e53f49ff-ce7a-4699-977e-730d462910c8\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-4r2wn"
Sep 30 12:31:39 crc kubenswrapper[4672]: I0930 12:31:39.077393 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rfrw9\" (UniqueName: \"kubernetes.io/projected/3c4da14a-ed20-4c47-8147-2150a416c1c8-kube-api-access-rfrw9\") pod \"cert-manager-cainjector-7f985d654d-zcp7d\" (UID: \"3c4da14a-ed20-4c47-8147-2150a416c1c8\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-zcp7d"
Sep 30 12:31:39 crc kubenswrapper[4672]: I0930 12:31:39.077459 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f5qm2\" (UniqueName: \"kubernetes.io/projected/b8ab6541-957a-44c8-a773-788f725d7efb-kube-api-access-f5qm2\") pod \"cert-manager-5b446d88c5-46756\" (UID: \"b8ab6541-957a-44c8-a773-788f725d7efb\") " pod="cert-manager/cert-manager-5b446d88c5-46756"
Sep 30 12:31:39 crc kubenswrapper[4672]: I0930 12:31:39.098822 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f5qm2\" (UniqueName: \"kubernetes.io/projected/b8ab6541-957a-44c8-a773-788f725d7efb-kube-api-access-f5qm2\") pod \"cert-manager-5b446d88c5-46756\" (UID: \"b8ab6541-957a-44c8-a773-788f725d7efb\") " pod="cert-manager/cert-manager-5b446d88c5-46756"
Sep 30 12:31:39 crc kubenswrapper[4672]: I0930 12:31:39.101396 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rfrw9\" (UniqueName: \"kubernetes.io/projected/3c4da14a-ed20-4c47-8147-2150a416c1c8-kube-api-access-rfrw9\") pod \"cert-manager-cainjector-7f985d654d-zcp7d\" (UID: \"3c4da14a-ed20-4c47-8147-2150a416c1c8\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-zcp7d"
Sep 30 12:31:39 crc kubenswrapper[4672]: I0930 12:31:39.101400 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rzxzq\" (UniqueName: \"kubernetes.io/projected/e53f49ff-ce7a-4699-977e-730d462910c8-kube-api-access-rzxzq\") pod \"cert-manager-webhook-5655c58dd6-4r2wn\" (UID: \"e53f49ff-ce7a-4699-977e-730d462910c8\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-4r2wn"
Sep 30 12:31:39 crc kubenswrapper[4672]: I0930 12:31:39.200067 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-zcp7d"
Sep 30 12:31:39 crc kubenswrapper[4672]: I0930 12:31:39.205915 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-46756"
Sep 30 12:31:39 crc kubenswrapper[4672]: I0930 12:31:39.229094 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-4r2wn"
Sep 30 12:31:39 crc kubenswrapper[4672]: I0930 12:31:39.630078 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-zcp7d"]
Sep 30 12:31:39 crc kubenswrapper[4672]: I0930 12:31:39.637483 4672 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Sep 30 12:31:39 crc kubenswrapper[4672]: I0930 12:31:39.699910 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-46756"]
Sep 30 12:31:39 crc kubenswrapper[4672]: I0930 12:31:39.702882 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-4r2wn"]
Sep 30 12:31:39 crc kubenswrapper[4672]: W0930 12:31:39.704356 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb8ab6541_957a_44c8_a773_788f725d7efb.slice/crio-830bf89384fcb70cef640cefc2d8849cfb381cfebff7fd6c4aefe22b0c58c8aa WatchSource:0}: Error finding container 830bf89384fcb70cef640cefc2d8849cfb381cfebff7fd6c4aefe22b0c58c8aa: Status 404 returned error can't find the container with id 830bf89384fcb70cef640cefc2d8849cfb381cfebff7fd6c4aefe22b0c58c8aa
Sep 30 12:31:39 crc kubenswrapper[4672]: W0930 12:31:39.706288 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode53f49ff_ce7a_4699_977e_730d462910c8.slice/crio-327b8dec948b42aa75f56bae68fa7fc3175ac9d12f711c99a3bc71df47f4a422 WatchSource:0}: Error finding container 327b8dec948b42aa75f56bae68fa7fc3175ac9d12f711c99a3bc71df47f4a422: Status 404 returned error can't find the container with id 327b8dec948b42aa75f56bae68fa7fc3175ac9d12f711c99a3bc71df47f4a422
Sep 30 12:31:40 crc kubenswrapper[4672]: I0930 12:31:40.018620 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-4r2wn" event={"ID":"e53f49ff-ce7a-4699-977e-730d462910c8","Type":"ContainerStarted","Data":"327b8dec948b42aa75f56bae68fa7fc3175ac9d12f711c99a3bc71df47f4a422"}
Sep 30 12:31:40 crc kubenswrapper[4672]: I0930 12:31:40.021362 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-46756" event={"ID":"b8ab6541-957a-44c8-a773-788f725d7efb","Type":"ContainerStarted","Data":"830bf89384fcb70cef640cefc2d8849cfb381cfebff7fd6c4aefe22b0c58c8aa"}
Sep 30 12:31:40 crc kubenswrapper[4672]: I0930 12:31:40.024843 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-zcp7d" event={"ID":"3c4da14a-ed20-4c47-8147-2150a416c1c8","Type":"ContainerStarted","Data":"b7bfa493de87adf576d765141bd7a421088970a2b00e902847609b4803692d81"}
Sep 30 12:31:44 crc kubenswrapper[4672]: I0930 12:31:44.048002 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-46756" event={"ID":"b8ab6541-957a-44c8-a773-788f725d7efb","Type":"ContainerStarted","Data":"6f6a0663e6de78501e55675fbea686721a93032a83c6d65619d2a4384395a679"}
Sep 30 12:31:44 crc kubenswrapper[4672]: I0930 12:31:44.050794 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-zcp7d" event={"ID":"3c4da14a-ed20-4c47-8147-2150a416c1c8","Type":"ContainerStarted","Data":"a8d215cab1d158909b0e7e5f6694291a121ef88b6889cb293caa5895a9b4475a"}
Sep 30 12:31:44 crc kubenswrapper[4672]: I0930 12:31:44.053173 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-4r2wn" event={"ID":"e53f49ff-ce7a-4699-977e-730d462910c8","Type":"ContainerStarted","Data":"5130e396ab46bce1833eae0a4dcddae312207eaaa9ded9b78d20b60f2762ec54"}
Sep 30 12:31:44 crc kubenswrapper[4672]: I0930 12:31:44.053319 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-5655c58dd6-4r2wn"
Sep 30 12:31:44 crc kubenswrapper[4672]: I0930 12:31:44.086895 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-5b446d88c5-46756" podStartSLOduration=2.680153565 podStartE2EDuration="6.08687704s" podCreationTimestamp="2025-09-30 12:31:38 +0000 UTC" firstStartedPulling="2025-09-30 12:31:39.707864196 +0000 UTC m=+590.977101842" lastFinishedPulling="2025-09-30 12:31:43.114587671 +0000 UTC m=+594.383825317" observedRunningTime="2025-09-30 12:31:44.071783911 +0000 UTC m=+595.341021557" watchObservedRunningTime="2025-09-30 12:31:44.08687704 +0000 UTC m=+595.356114686"
Sep 30 12:31:44 crc kubenswrapper[4672]: I0930 12:31:44.088570 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-5655c58dd6-4r2wn" podStartSLOduration=2.6887510690000003 podStartE2EDuration="6.08855857s" podCreationTimestamp="2025-09-30 12:31:38 +0000 UTC" firstStartedPulling="2025-09-30 12:31:39.710405486 +0000 UTC m=+590.979643132" lastFinishedPulling="2025-09-30 12:31:43.110212987 +0000 UTC m=+594.379450633" observedRunningTime="2025-09-30 12:31:44.08647817 +0000 UTC m=+595.355715826" watchObservedRunningTime="2025-09-30 12:31:44.08855857 +0000 UTC m=+595.357796226"
Sep 30 12:31:44 crc kubenswrapper[4672]: I0930 12:31:44.103970 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-7f985d654d-zcp7d" podStartSLOduration=2.543417764 podStartE2EDuration="6.103943376s" podCreationTimestamp="2025-09-30 12:31:38 +0000 UTC" firstStartedPulling="2025-09-30 12:31:39.637216156 +0000 UTC m=+590.906453802" lastFinishedPulling="2025-09-30 12:31:43.197741768 +0000 UTC m=+594.466979414" observedRunningTime="2025-09-30 12:31:44.102396309 +0000 UTC m=+595.371633965" watchObservedRunningTime="2025-09-30 12:31:44.103943376 +0000 UTC m=+595.373181032"
Sep 30 12:31:48 crc kubenswrapper[4672]: I0930 12:31:48.976153 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-nznsk"]
Sep 30 12:31:48 crc kubenswrapper[4672]: I0930 12:31:48.977081 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-nznsk" podUID="5da59bc9-84da-42f6-86e9-3399ecf31725" containerName="ovn-controller" containerID="cri-o://bb7a32827707f8654d36f56fef7e81aff6b80550ef51f7c090a37f53a0e53406" gracePeriod=30
Sep 30 12:31:48 crc kubenswrapper[4672]: I0930 12:31:48.977158 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-nznsk" podUID="5da59bc9-84da-42f6-86e9-3399ecf31725" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://c2131429928dbbd60e2d17b7ada4689ff5d161d1df921d74a8b9f71df9e11899" gracePeriod=30
Sep 30 12:31:48 crc kubenswrapper[4672]: I0930 12:31:48.977258 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-nznsk" podUID="5da59bc9-84da-42f6-86e9-3399ecf31725" containerName="kube-rbac-proxy-node" containerID="cri-o://c8e8be814fc5bfb8db3632cda1fcf749a02b642c8e7ff479e59ce5d9f987a756" gracePeriod=30
Sep 30 12:31:48 crc kubenswrapper[4672]: I0930 12:31:48.977317 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-nznsk" podUID="5da59bc9-84da-42f6-86e9-3399ecf31725" containerName="ovn-acl-logging" containerID="cri-o://9e10db0132d159755257aef8c0cee40e5de4a0d3727ce59883e1ff90650e4f35" gracePeriod=30
Sep 30 12:31:48 crc kubenswrapper[4672]: I0930 12:31:48.977293 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-nznsk" podUID="5da59bc9-84da-42f6-86e9-3399ecf31725" containerName="sbdb" containerID="cri-o://7058516ce3ff1d1d2929107a66d739fccb24b4f5589a11510ab42fe728a33196" gracePeriod=30
Sep 30 12:31:48 crc kubenswrapper[4672]: I0930 12:31:48.977449 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-nznsk" podUID="5da59bc9-84da-42f6-86e9-3399ecf31725" containerName="nbdb" containerID="cri-o://114d67fea982a6c6475e1925f670ba7dae14ace593022f77ef7c6441dde49687" gracePeriod=30
Sep 30 12:31:48 crc kubenswrapper[4672]: I0930 12:31:48.977238 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-nznsk" podUID="5da59bc9-84da-42f6-86e9-3399ecf31725" containerName="northd" containerID="cri-o://b73fba4d62613a1c27b69a74ebcca91fa6ea43a5267efddf783bd0d243d711f2" gracePeriod=30
Sep 30 12:31:49 crc kubenswrapper[4672]: I0930 12:31:49.020150 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-nznsk" podUID="5da59bc9-84da-42f6-86e9-3399ecf31725" containerName="ovnkube-controller" containerID="cri-o://6022fd1dd9491a4d7867913e1da93c0ff79f6dd19b56b4a53c65da9f0f690cb0" gracePeriod=30
Sep 30 12:31:49 crc kubenswrapper[4672]: I0930 12:31:49.082408 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-8q82q_6806ff3c-ab3a-402e-b1c5-cc37c0810a65/kube-multus/2.log"
Sep 30 12:31:49 crc kubenswrapper[4672]: I0930 12:31:49.083006 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-8q82q_6806ff3c-ab3a-402e-b1c5-cc37c0810a65/kube-multus/1.log"
Sep 30 12:31:49 crc kubenswrapper[4672]: I0930 12:31:49.083074 4672 generic.go:334] "Generic (PLEG): container finished" podID="6806ff3c-ab3a-402e-b1c5-cc37c0810a65" containerID="c88d61620e022b57cdffca5d4746f467e2a5f011df271eb6002e734d1db576ce" exitCode=2
Sep 30 12:31:49 crc kubenswrapper[4672]: I0930 12:31:49.083119 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-8q82q" event={"ID":"6806ff3c-ab3a-402e-b1c5-cc37c0810a65","Type":"ContainerDied","Data":"c88d61620e022b57cdffca5d4746f467e2a5f011df271eb6002e734d1db576ce"}
Sep 30 12:31:49 crc kubenswrapper[4672]: I0930 12:31:49.083180 4672 scope.go:117] "RemoveContainer" containerID="5f37b2a15da4c06842d9df2eabe13974fe3e8f8da3ffd7bc297b6f32f446dbc9"
Sep 30 12:31:49 crc kubenswrapper[4672]: I0930 12:31:49.083939 4672 scope.go:117] "RemoveContainer" containerID="c88d61620e022b57cdffca5d4746f467e2a5f011df271eb6002e734d1db576ce"
Sep 30 12:31:49 crc kubenswrapper[4672]: E0930 12:31:49.084377 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-8q82q_openshift-multus(6806ff3c-ab3a-402e-b1c5-cc37c0810a65)\"" pod="openshift-multus/multus-8q82q" podUID="6806ff3c-ab3a-402e-b1c5-cc37c0810a65"
Sep 30 12:31:49 crc kubenswrapper[4672]: I0930 12:31:49.231430 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-5655c58dd6-4r2wn"
Sep 30 12:31:49 crc kubenswrapper[4672]: I0930 12:31:49.264548 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nznsk_5da59bc9-84da-42f6-86e9-3399ecf31725/ovnkube-controller/3.log"
Sep 30 12:31:49 crc kubenswrapper[4672]: I0930 12:31:49.267748 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nznsk_5da59bc9-84da-42f6-86e9-3399ecf31725/ovn-acl-logging/0.log"
Sep 30 12:31:49 crc kubenswrapper[4672]: I0930 12:31:49.268330 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nznsk_5da59bc9-84da-42f6-86e9-3399ecf31725/ovn-controller/0.log"
Sep 30 12:31:49 crc kubenswrapper[4672]: I0930 12:31:49.268757 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-nznsk"
Sep 30 12:31:49 crc kubenswrapper[4672]: I0930 12:31:49.318701 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5da59bc9-84da-42f6-86e9-3399ecf31725-run-openvswitch\") pod \"5da59bc9-84da-42f6-86e9-3399ecf31725\" (UID: \"5da59bc9-84da-42f6-86e9-3399ecf31725\") "
Sep 30 12:31:49 crc kubenswrapper[4672]: I0930 12:31:49.318932 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5da59bc9-84da-42f6-86e9-3399ecf31725-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "5da59bc9-84da-42f6-86e9-3399ecf31725" (UID: "5da59bc9-84da-42f6-86e9-3399ecf31725"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue ""
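
The CrashLoopBackOff error above shows the kubelet's restart back-off for kube-multus at 20s. The back-off starts at 10s and doubles per consecutive failure up to a 5-minute cap, so 20s is consistent with a second straight failure (matching the 1.log/2.log rotations parsed just before it). A small model of that schedule; the constants are Kubernetes' documented defaults, not read from this log:

```go
package main

import (
	"fmt"
	"time"
)

// crashLoopDelay models the kubelet's container restart back-off: an initial
// 10s, doubled after each consecutive failure, capped at 5 minutes. The
// "back-off 20s" in the log therefore corresponds to a second failure in a row.
func crashLoopDelay(consecutiveFailures int) time.Duration {
	const initial, cap = 10 * time.Second, 5 * time.Minute
	d := initial
	for i := 1; i < consecutiveFailures; i++ {
		d *= 2
		if d >= cap {
			return cap
		}
	}
	return d
}

func main() {
	for n := 1; n <= 6; n++ {
		fmt.Printf("failure %d -> back-off %s\n", n, crashLoopDelay(n))
	}
	// failure 1 -> 10s, 2 -> 20s, 3 -> 40s, 4 -> 1m20s, 5 -> 2m40s, 6 -> 5m0s
}
```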
PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 12:31:49 crc kubenswrapper[4672]: I0930 12:31:49.319044 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/5da59bc9-84da-42f6-86e9-3399ecf31725-run-systemd\") pod \"5da59bc9-84da-42f6-86e9-3399ecf31725\" (UID: \"5da59bc9-84da-42f6-86e9-3399ecf31725\") " Sep 30 12:31:49 crc kubenswrapper[4672]: I0930 12:31:49.319079 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5da59bc9-84da-42f6-86e9-3399ecf31725-host-run-netns\") pod \"5da59bc9-84da-42f6-86e9-3399ecf31725\" (UID: \"5da59bc9-84da-42f6-86e9-3399ecf31725\") " Sep 30 12:31:49 crc kubenswrapper[4672]: I0930 12:31:49.319158 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/5da59bc9-84da-42f6-86e9-3399ecf31725-host-slash\") pod \"5da59bc9-84da-42f6-86e9-3399ecf31725\" (UID: \"5da59bc9-84da-42f6-86e9-3399ecf31725\") " Sep 30 12:31:49 crc kubenswrapper[4672]: I0930 12:31:49.319707 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/5da59bc9-84da-42f6-86e9-3399ecf31725-env-overrides\") pod \"5da59bc9-84da-42f6-86e9-3399ecf31725\" (UID: \"5da59bc9-84da-42f6-86e9-3399ecf31725\") " Sep 30 12:31:49 crc kubenswrapper[4672]: I0930 12:31:49.319775 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/5da59bc9-84da-42f6-86e9-3399ecf31725-node-log\") pod \"5da59bc9-84da-42f6-86e9-3399ecf31725\" (UID: \"5da59bc9-84da-42f6-86e9-3399ecf31725\") " Sep 30 12:31:49 crc kubenswrapper[4672]: I0930 12:31:49.319820 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/5da59bc9-84da-42f6-86e9-3399ecf31725-host-kubelet\") pod \"5da59bc9-84da-42f6-86e9-3399ecf31725\" (UID: \"5da59bc9-84da-42f6-86e9-3399ecf31725\") " Sep 30 12:31:49 crc kubenswrapper[4672]: I0930 12:31:49.319863 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5da59bc9-84da-42f6-86e9-3399ecf31725-host-var-lib-cni-networks-ovn-kubernetes\") pod \"5da59bc9-84da-42f6-86e9-3399ecf31725\" (UID: \"5da59bc9-84da-42f6-86e9-3399ecf31725\") " Sep 30 12:31:49 crc kubenswrapper[4672]: I0930 12:31:49.319885 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/5da59bc9-84da-42f6-86e9-3399ecf31725-log-socket\") pod \"5da59bc9-84da-42f6-86e9-3399ecf31725\" (UID: \"5da59bc9-84da-42f6-86e9-3399ecf31725\") " Sep 30 12:31:49 crc kubenswrapper[4672]: I0930 12:31:49.319914 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-llgtp\" (UniqueName: \"kubernetes.io/projected/5da59bc9-84da-42f6-86e9-3399ecf31725-kube-api-access-llgtp\") pod \"5da59bc9-84da-42f6-86e9-3399ecf31725\" (UID: \"5da59bc9-84da-42f6-86e9-3399ecf31725\") " Sep 30 12:31:49 crc kubenswrapper[4672]: I0930 12:31:49.319942 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/5da59bc9-84da-42f6-86e9-3399ecf31725-ovn-node-metrics-cert\") pod 
\"5da59bc9-84da-42f6-86e9-3399ecf31725\" (UID: \"5da59bc9-84da-42f6-86e9-3399ecf31725\") " Sep 30 12:31:49 crc kubenswrapper[4672]: I0930 12:31:49.319974 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/5da59bc9-84da-42f6-86e9-3399ecf31725-ovnkube-script-lib\") pod \"5da59bc9-84da-42f6-86e9-3399ecf31725\" (UID: \"5da59bc9-84da-42f6-86e9-3399ecf31725\") " Sep 30 12:31:49 crc kubenswrapper[4672]: I0930 12:31:49.320004 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5da59bc9-84da-42f6-86e9-3399ecf31725-host-cni-bin\") pod \"5da59bc9-84da-42f6-86e9-3399ecf31725\" (UID: \"5da59bc9-84da-42f6-86e9-3399ecf31725\") " Sep 30 12:31:49 crc kubenswrapper[4672]: I0930 12:31:49.320036 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/5da59bc9-84da-42f6-86e9-3399ecf31725-ovnkube-config\") pod \"5da59bc9-84da-42f6-86e9-3399ecf31725\" (UID: \"5da59bc9-84da-42f6-86e9-3399ecf31725\") " Sep 30 12:31:49 crc kubenswrapper[4672]: I0930 12:31:49.320056 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/5da59bc9-84da-42f6-86e9-3399ecf31725-host-cni-netd\") pod \"5da59bc9-84da-42f6-86e9-3399ecf31725\" (UID: \"5da59bc9-84da-42f6-86e9-3399ecf31725\") " Sep 30 12:31:49 crc kubenswrapper[4672]: I0930 12:31:49.320106 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/5da59bc9-84da-42f6-86e9-3399ecf31725-run-ovn\") pod \"5da59bc9-84da-42f6-86e9-3399ecf31725\" (UID: \"5da59bc9-84da-42f6-86e9-3399ecf31725\") " Sep 30 12:31:49 crc kubenswrapper[4672]: I0930 12:31:49.320126 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5da59bc9-84da-42f6-86e9-3399ecf31725-var-lib-openvswitch\") pod \"5da59bc9-84da-42f6-86e9-3399ecf31725\" (UID: \"5da59bc9-84da-42f6-86e9-3399ecf31725\") " Sep 30 12:31:49 crc kubenswrapper[4672]: I0930 12:31:49.320153 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5da59bc9-84da-42f6-86e9-3399ecf31725-host-run-ovn-kubernetes\") pod \"5da59bc9-84da-42f6-86e9-3399ecf31725\" (UID: \"5da59bc9-84da-42f6-86e9-3399ecf31725\") " Sep 30 12:31:49 crc kubenswrapper[4672]: I0930 12:31:49.320184 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/5da59bc9-84da-42f6-86e9-3399ecf31725-systemd-units\") pod \"5da59bc9-84da-42f6-86e9-3399ecf31725\" (UID: \"5da59bc9-84da-42f6-86e9-3399ecf31725\") " Sep 30 12:31:49 crc kubenswrapper[4672]: I0930 12:31:49.320215 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5da59bc9-84da-42f6-86e9-3399ecf31725-etc-openvswitch\") pod \"5da59bc9-84da-42f6-86e9-3399ecf31725\" (UID: \"5da59bc9-84da-42f6-86e9-3399ecf31725\") " Sep 30 12:31:49 crc kubenswrapper[4672]: I0930 12:31:49.320546 4672 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5da59bc9-84da-42f6-86e9-3399ecf31725-run-openvswitch\") on node \"crc\" 
DevicePath \"\"" Sep 30 12:31:49 crc kubenswrapper[4672]: I0930 12:31:49.319249 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5da59bc9-84da-42f6-86e9-3399ecf31725-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "5da59bc9-84da-42f6-86e9-3399ecf31725" (UID: "5da59bc9-84da-42f6-86e9-3399ecf31725"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 12:31:49 crc kubenswrapper[4672]: I0930 12:31:49.321384 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5da59bc9-84da-42f6-86e9-3399ecf31725-log-socket" (OuterVolumeSpecName: "log-socket") pod "5da59bc9-84da-42f6-86e9-3399ecf31725" (UID: "5da59bc9-84da-42f6-86e9-3399ecf31725"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 12:31:49 crc kubenswrapper[4672]: I0930 12:31:49.321423 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5da59bc9-84da-42f6-86e9-3399ecf31725-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "5da59bc9-84da-42f6-86e9-3399ecf31725" (UID: "5da59bc9-84da-42f6-86e9-3399ecf31725"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 12:31:49 crc kubenswrapper[4672]: I0930 12:31:49.321355 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5da59bc9-84da-42f6-86e9-3399ecf31725-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "5da59bc9-84da-42f6-86e9-3399ecf31725" (UID: "5da59bc9-84da-42f6-86e9-3399ecf31725"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 12:31:49 crc kubenswrapper[4672]: I0930 12:31:49.319395 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5da59bc9-84da-42f6-86e9-3399ecf31725-host-slash" (OuterVolumeSpecName: "host-slash") pod "5da59bc9-84da-42f6-86e9-3399ecf31725" (UID: "5da59bc9-84da-42f6-86e9-3399ecf31725"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 12:31:49 crc kubenswrapper[4672]: I0930 12:31:49.320890 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5da59bc9-84da-42f6-86e9-3399ecf31725-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "5da59bc9-84da-42f6-86e9-3399ecf31725" (UID: "5da59bc9-84da-42f6-86e9-3399ecf31725"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 12:31:49 crc kubenswrapper[4672]: I0930 12:31:49.320945 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5da59bc9-84da-42f6-86e9-3399ecf31725-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "5da59bc9-84da-42f6-86e9-3399ecf31725" (UID: "5da59bc9-84da-42f6-86e9-3399ecf31725"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 12:31:49 crc kubenswrapper[4672]: I0930 12:31:49.321490 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5da59bc9-84da-42f6-86e9-3399ecf31725-node-log" (OuterVolumeSpecName: "node-log") pod "5da59bc9-84da-42f6-86e9-3399ecf31725" (UID: "5da59bc9-84da-42f6-86e9-3399ecf31725"). InnerVolumeSpecName "node-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 12:31:49 crc kubenswrapper[4672]: I0930 12:31:49.321241 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5da59bc9-84da-42f6-86e9-3399ecf31725-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "5da59bc9-84da-42f6-86e9-3399ecf31725" (UID: "5da59bc9-84da-42f6-86e9-3399ecf31725"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 12:31:49 crc kubenswrapper[4672]: I0930 12:31:49.321340 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5da59bc9-84da-42f6-86e9-3399ecf31725-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "5da59bc9-84da-42f6-86e9-3399ecf31725" (UID: "5da59bc9-84da-42f6-86e9-3399ecf31725"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 12:31:49 crc kubenswrapper[4672]: I0930 12:31:49.321344 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5da59bc9-84da-42f6-86e9-3399ecf31725-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "5da59bc9-84da-42f6-86e9-3399ecf31725" (UID: "5da59bc9-84da-42f6-86e9-3399ecf31725"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 12:31:49 crc kubenswrapper[4672]: I0930 12:31:49.321370 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5da59bc9-84da-42f6-86e9-3399ecf31725-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "5da59bc9-84da-42f6-86e9-3399ecf31725" (UID: "5da59bc9-84da-42f6-86e9-3399ecf31725"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 12:31:49 crc kubenswrapper[4672]: I0930 12:31:49.321441 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5da59bc9-84da-42f6-86e9-3399ecf31725-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "5da59bc9-84da-42f6-86e9-3399ecf31725" (UID: "5da59bc9-84da-42f6-86e9-3399ecf31725"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 12:31:49 crc kubenswrapper[4672]: I0930 12:31:49.321604 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5da59bc9-84da-42f6-86e9-3399ecf31725-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "5da59bc9-84da-42f6-86e9-3399ecf31725" (UID: "5da59bc9-84da-42f6-86e9-3399ecf31725"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 12:31:49 crc kubenswrapper[4672]: I0930 12:31:49.321877 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5da59bc9-84da-42f6-86e9-3399ecf31725-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "5da59bc9-84da-42f6-86e9-3399ecf31725" (UID: "5da59bc9-84da-42f6-86e9-3399ecf31725"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 12:31:49 crc kubenswrapper[4672]: I0930 12:31:49.322310 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-j4l9d"] Sep 30 12:31:49 crc kubenswrapper[4672]: I0930 12:31:49.322325 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5da59bc9-84da-42f6-86e9-3399ecf31725-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "5da59bc9-84da-42f6-86e9-3399ecf31725" (UID: "5da59bc9-84da-42f6-86e9-3399ecf31725"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 12:31:49 crc kubenswrapper[4672]: E0930 12:31:49.322565 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5da59bc9-84da-42f6-86e9-3399ecf31725" containerName="kube-rbac-proxy-node" Sep 30 12:31:49 crc kubenswrapper[4672]: I0930 12:31:49.322586 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="5da59bc9-84da-42f6-86e9-3399ecf31725" containerName="kube-rbac-proxy-node" Sep 30 12:31:49 crc kubenswrapper[4672]: E0930 12:31:49.322602 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5da59bc9-84da-42f6-86e9-3399ecf31725" containerName="ovn-acl-logging" Sep 30 12:31:49 crc kubenswrapper[4672]: I0930 12:31:49.322610 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="5da59bc9-84da-42f6-86e9-3399ecf31725" containerName="ovn-acl-logging" Sep 30 12:31:49 crc kubenswrapper[4672]: E0930 12:31:49.322623 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5da59bc9-84da-42f6-86e9-3399ecf31725" containerName="sbdb" Sep 30 12:31:49 crc kubenswrapper[4672]: I0930 12:31:49.322631 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="5da59bc9-84da-42f6-86e9-3399ecf31725" containerName="sbdb" Sep 30 12:31:49 crc kubenswrapper[4672]: E0930 12:31:49.322645 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5da59bc9-84da-42f6-86e9-3399ecf31725" containerName="ovnkube-controller" Sep 30 12:31:49 crc kubenswrapper[4672]: I0930 12:31:49.322653 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="5da59bc9-84da-42f6-86e9-3399ecf31725" containerName="ovnkube-controller" Sep 30 12:31:49 crc kubenswrapper[4672]: E0930 12:31:49.322668 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5da59bc9-84da-42f6-86e9-3399ecf31725" containerName="kube-rbac-proxy-ovn-metrics" Sep 30 12:31:49 crc kubenswrapper[4672]: I0930 12:31:49.322676 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="5da59bc9-84da-42f6-86e9-3399ecf31725" containerName="kube-rbac-proxy-ovn-metrics" Sep 30 12:31:49 crc kubenswrapper[4672]: E0930 12:31:49.322687 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5da59bc9-84da-42f6-86e9-3399ecf31725" containerName="ovnkube-controller" Sep 30 12:31:49 crc kubenswrapper[4672]: I0930 12:31:49.322696 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="5da59bc9-84da-42f6-86e9-3399ecf31725" containerName="ovnkube-controller" Sep 30 12:31:49 crc kubenswrapper[4672]: E0930 12:31:49.322708 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5da59bc9-84da-42f6-86e9-3399ecf31725" containerName="ovnkube-controller" Sep 30 12:31:49 crc kubenswrapper[4672]: I0930 12:31:49.322716 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="5da59bc9-84da-42f6-86e9-3399ecf31725" containerName="ovnkube-controller" Sep 30 12:31:49 crc kubenswrapper[4672]: E0930 12:31:49.322727 4672 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5da59bc9-84da-42f6-86e9-3399ecf31725" containerName="nbdb" Sep 30 12:31:49 crc kubenswrapper[4672]: I0930 12:31:49.322735 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="5da59bc9-84da-42f6-86e9-3399ecf31725" containerName="nbdb" Sep 30 12:31:49 crc kubenswrapper[4672]: E0930 12:31:49.322749 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5da59bc9-84da-42f6-86e9-3399ecf31725" containerName="kubecfg-setup" Sep 30 12:31:49 crc kubenswrapper[4672]: I0930 12:31:49.322757 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="5da59bc9-84da-42f6-86e9-3399ecf31725" containerName="kubecfg-setup" Sep 30 12:31:49 crc kubenswrapper[4672]: E0930 12:31:49.322770 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5da59bc9-84da-42f6-86e9-3399ecf31725" containerName="ovn-controller" Sep 30 12:31:49 crc kubenswrapper[4672]: I0930 12:31:49.322779 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="5da59bc9-84da-42f6-86e9-3399ecf31725" containerName="ovn-controller" Sep 30 12:31:49 crc kubenswrapper[4672]: E0930 12:31:49.322798 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5da59bc9-84da-42f6-86e9-3399ecf31725" containerName="ovnkube-controller" Sep 30 12:31:49 crc kubenswrapper[4672]: I0930 12:31:49.322809 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="5da59bc9-84da-42f6-86e9-3399ecf31725" containerName="ovnkube-controller" Sep 30 12:31:49 crc kubenswrapper[4672]: E0930 12:31:49.322821 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5da59bc9-84da-42f6-86e9-3399ecf31725" containerName="northd" Sep 30 12:31:49 crc kubenswrapper[4672]: I0930 12:31:49.322831 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="5da59bc9-84da-42f6-86e9-3399ecf31725" containerName="northd" Sep 30 12:31:49 crc kubenswrapper[4672]: I0930 12:31:49.322968 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="5da59bc9-84da-42f6-86e9-3399ecf31725" containerName="ovn-controller" Sep 30 12:31:49 crc kubenswrapper[4672]: I0930 12:31:49.323016 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="5da59bc9-84da-42f6-86e9-3399ecf31725" containerName="nbdb" Sep 30 12:31:49 crc kubenswrapper[4672]: I0930 12:31:49.323030 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="5da59bc9-84da-42f6-86e9-3399ecf31725" containerName="ovnkube-controller" Sep 30 12:31:49 crc kubenswrapper[4672]: I0930 12:31:49.323039 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="5da59bc9-84da-42f6-86e9-3399ecf31725" containerName="ovn-acl-logging" Sep 30 12:31:49 crc kubenswrapper[4672]: I0930 12:31:49.323050 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="5da59bc9-84da-42f6-86e9-3399ecf31725" containerName="kube-rbac-proxy-node" Sep 30 12:31:49 crc kubenswrapper[4672]: I0930 12:31:49.323062 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="5da59bc9-84da-42f6-86e9-3399ecf31725" containerName="ovnkube-controller" Sep 30 12:31:49 crc kubenswrapper[4672]: I0930 12:31:49.323072 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="5da59bc9-84da-42f6-86e9-3399ecf31725" containerName="ovnkube-controller" Sep 30 12:31:49 crc kubenswrapper[4672]: I0930 12:31:49.323081 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="5da59bc9-84da-42f6-86e9-3399ecf31725" containerName="northd" Sep 30 12:31:49 crc kubenswrapper[4672]: I0930 
12:31:49.323095 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="5da59bc9-84da-42f6-86e9-3399ecf31725" containerName="kube-rbac-proxy-ovn-metrics" Sep 30 12:31:49 crc kubenswrapper[4672]: I0930 12:31:49.323105 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="5da59bc9-84da-42f6-86e9-3399ecf31725" containerName="sbdb" Sep 30 12:31:49 crc kubenswrapper[4672]: E0930 12:31:49.323222 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5da59bc9-84da-42f6-86e9-3399ecf31725" containerName="ovnkube-controller" Sep 30 12:31:49 crc kubenswrapper[4672]: I0930 12:31:49.323233 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="5da59bc9-84da-42f6-86e9-3399ecf31725" containerName="ovnkube-controller" Sep 30 12:31:49 crc kubenswrapper[4672]: I0930 12:31:49.323403 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="5da59bc9-84da-42f6-86e9-3399ecf31725" containerName="ovnkube-controller" Sep 30 12:31:49 crc kubenswrapper[4672]: I0930 12:31:49.323416 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="5da59bc9-84da-42f6-86e9-3399ecf31725" containerName="ovnkube-controller" Sep 30 12:31:49 crc kubenswrapper[4672]: I0930 12:31:49.326113 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-j4l9d" Sep 30 12:31:49 crc kubenswrapper[4672]: I0930 12:31:49.326731 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5da59bc9-84da-42f6-86e9-3399ecf31725-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "5da59bc9-84da-42f6-86e9-3399ecf31725" (UID: "5da59bc9-84da-42f6-86e9-3399ecf31725"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:31:49 crc kubenswrapper[4672]: I0930 12:31:49.331107 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5da59bc9-84da-42f6-86e9-3399ecf31725-kube-api-access-llgtp" (OuterVolumeSpecName: "kube-api-access-llgtp") pod "5da59bc9-84da-42f6-86e9-3399ecf31725" (UID: "5da59bc9-84da-42f6-86e9-3399ecf31725"). InnerVolumeSpecName "kube-api-access-llgtp". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 12:31:49 crc kubenswrapper[4672]: I0930 12:31:49.341815 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5da59bc9-84da-42f6-86e9-3399ecf31725-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "5da59bc9-84da-42f6-86e9-3399ecf31725" (UID: "5da59bc9-84da-42f6-86e9-3399ecf31725"). InnerVolumeSpecName "run-systemd". 
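
The cpu_manager/memory_manager bursts above fire while admitting the replacement pod ovnkube-node-j4l9d: both resource managers sweep their checkpointed per-container assignments and drop every entry belonging to pods that no longer exist, which is why each container of the deleted ovnkube-node-nznsk shows up once per stale checkpoint entry (ovnkube-controller several times, apparently once per recorded restart). A rough sketch of that sweep, with invented types rather than kubelet's real state structures:

```go
package main

import "fmt"

// key identifies a checkpointed assignment, mirroring the podUID/containerName
// pairs printed in the RemoveStaleState log lines.
type key struct{ podUID, container string }

// removeStaleState drops checkpoint entries whose pod is no longer active,
// the same reconciliation the CPU and memory managers log at pod admission.
func removeStaleState(state map[key]string, activePods map[string]bool) {
	for k := range state { // deleting during range is safe in Go
		if !activePods[k.podUID] {
			fmt.Printf("RemoveStaleState: removing container podUID=%q containerName=%q\n",
				k.podUID, k.container)
			delete(state, k)
		}
	}
}

func main() {
	state := map[key]string{
		{"5da59bc9-84da-42f6-86e9-3399ecf31725", "northd"}: "cpuset",
		{"5da59bc9-84da-42f6-86e9-3399ecf31725", "nbdb"}:   "cpuset",
	}
	// Only the replacement pod is still active; the old ovnkube-node pod is gone.
	removeStaleState(state, map[string]bool{"8741dbc9-838f-450f-b8e5-b3049748dc63": true})
}
```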
PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 12:31:49 crc kubenswrapper[4672]: I0930 12:31:49.421556 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/8741dbc9-838f-450f-b8e5-b3049748dc63-systemd-units\") pod \"ovnkube-node-j4l9d\" (UID: \"8741dbc9-838f-450f-b8e5-b3049748dc63\") " pod="openshift-ovn-kubernetes/ovnkube-node-j4l9d" Sep 30 12:31:49 crc kubenswrapper[4672]: I0930 12:31:49.422111 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5rhbh\" (UniqueName: \"kubernetes.io/projected/8741dbc9-838f-450f-b8e5-b3049748dc63-kube-api-access-5rhbh\") pod \"ovnkube-node-j4l9d\" (UID: \"8741dbc9-838f-450f-b8e5-b3049748dc63\") " pod="openshift-ovn-kubernetes/ovnkube-node-j4l9d" Sep 30 12:31:49 crc kubenswrapper[4672]: I0930 12:31:49.422153 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/8741dbc9-838f-450f-b8e5-b3049748dc63-run-systemd\") pod \"ovnkube-node-j4l9d\" (UID: \"8741dbc9-838f-450f-b8e5-b3049748dc63\") " pod="openshift-ovn-kubernetes/ovnkube-node-j4l9d" Sep 30 12:31:49 crc kubenswrapper[4672]: I0930 12:31:49.422241 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/8741dbc9-838f-450f-b8e5-b3049748dc63-host-cni-netd\") pod \"ovnkube-node-j4l9d\" (UID: \"8741dbc9-838f-450f-b8e5-b3049748dc63\") " pod="openshift-ovn-kubernetes/ovnkube-node-j4l9d" Sep 30 12:31:49 crc kubenswrapper[4672]: I0930 12:31:49.422332 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8741dbc9-838f-450f-b8e5-b3049748dc63-host-run-ovn-kubernetes\") pod \"ovnkube-node-j4l9d\" (UID: \"8741dbc9-838f-450f-b8e5-b3049748dc63\") " pod="openshift-ovn-kubernetes/ovnkube-node-j4l9d" Sep 30 12:31:49 crc kubenswrapper[4672]: I0930 12:31:49.422397 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/8741dbc9-838f-450f-b8e5-b3049748dc63-ovnkube-config\") pod \"ovnkube-node-j4l9d\" (UID: \"8741dbc9-838f-450f-b8e5-b3049748dc63\") " pod="openshift-ovn-kubernetes/ovnkube-node-j4l9d" Sep 30 12:31:49 crc kubenswrapper[4672]: I0930 12:31:49.422437 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/8741dbc9-838f-450f-b8e5-b3049748dc63-host-kubelet\") pod \"ovnkube-node-j4l9d\" (UID: \"8741dbc9-838f-450f-b8e5-b3049748dc63\") " pod="openshift-ovn-kubernetes/ovnkube-node-j4l9d" Sep 30 12:31:49 crc kubenswrapper[4672]: I0930 12:31:49.422475 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/8741dbc9-838f-450f-b8e5-b3049748dc63-host-slash\") pod \"ovnkube-node-j4l9d\" (UID: \"8741dbc9-838f-450f-b8e5-b3049748dc63\") " pod="openshift-ovn-kubernetes/ovnkube-node-j4l9d" Sep 30 12:31:49 crc kubenswrapper[4672]: I0930 12:31:49.422505 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: 
\"kubernetes.io/configmap/8741dbc9-838f-450f-b8e5-b3049748dc63-ovnkube-script-lib\") pod \"ovnkube-node-j4l9d\" (UID: \"8741dbc9-838f-450f-b8e5-b3049748dc63\") " pod="openshift-ovn-kubernetes/ovnkube-node-j4l9d" Sep 30 12:31:49 crc kubenswrapper[4672]: I0930 12:31:49.422553 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8741dbc9-838f-450f-b8e5-b3049748dc63-var-lib-openvswitch\") pod \"ovnkube-node-j4l9d\" (UID: \"8741dbc9-838f-450f-b8e5-b3049748dc63\") " pod="openshift-ovn-kubernetes/ovnkube-node-j4l9d" Sep 30 12:31:49 crc kubenswrapper[4672]: I0930 12:31:49.422598 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8741dbc9-838f-450f-b8e5-b3049748dc63-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-j4l9d\" (UID: \"8741dbc9-838f-450f-b8e5-b3049748dc63\") " pod="openshift-ovn-kubernetes/ovnkube-node-j4l9d" Sep 30 12:31:49 crc kubenswrapper[4672]: I0930 12:31:49.422634 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8741dbc9-838f-450f-b8e5-b3049748dc63-host-run-netns\") pod \"ovnkube-node-j4l9d\" (UID: \"8741dbc9-838f-450f-b8e5-b3049748dc63\") " pod="openshift-ovn-kubernetes/ovnkube-node-j4l9d" Sep 30 12:31:49 crc kubenswrapper[4672]: I0930 12:31:49.422654 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8741dbc9-838f-450f-b8e5-b3049748dc63-run-openvswitch\") pod \"ovnkube-node-j4l9d\" (UID: \"8741dbc9-838f-450f-b8e5-b3049748dc63\") " pod="openshift-ovn-kubernetes/ovnkube-node-j4l9d" Sep 30 12:31:49 crc kubenswrapper[4672]: I0930 12:31:49.422678 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/8741dbc9-838f-450f-b8e5-b3049748dc63-run-ovn\") pod \"ovnkube-node-j4l9d\" (UID: \"8741dbc9-838f-450f-b8e5-b3049748dc63\") " pod="openshift-ovn-kubernetes/ovnkube-node-j4l9d" Sep 30 12:31:49 crc kubenswrapper[4672]: I0930 12:31:49.422701 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/8741dbc9-838f-450f-b8e5-b3049748dc63-env-overrides\") pod \"ovnkube-node-j4l9d\" (UID: \"8741dbc9-838f-450f-b8e5-b3049748dc63\") " pod="openshift-ovn-kubernetes/ovnkube-node-j4l9d" Sep 30 12:31:49 crc kubenswrapper[4672]: I0930 12:31:49.422776 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8741dbc9-838f-450f-b8e5-b3049748dc63-etc-openvswitch\") pod \"ovnkube-node-j4l9d\" (UID: \"8741dbc9-838f-450f-b8e5-b3049748dc63\") " pod="openshift-ovn-kubernetes/ovnkube-node-j4l9d" Sep 30 12:31:49 crc kubenswrapper[4672]: I0930 12:31:49.422878 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/8741dbc9-838f-450f-b8e5-b3049748dc63-ovn-node-metrics-cert\") pod \"ovnkube-node-j4l9d\" (UID: \"8741dbc9-838f-450f-b8e5-b3049748dc63\") " pod="openshift-ovn-kubernetes/ovnkube-node-j4l9d" Sep 30 12:31:49 crc kubenswrapper[4672]: I0930 
12:31:49.422954 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8741dbc9-838f-450f-b8e5-b3049748dc63-host-cni-bin\") pod \"ovnkube-node-j4l9d\" (UID: \"8741dbc9-838f-450f-b8e5-b3049748dc63\") " pod="openshift-ovn-kubernetes/ovnkube-node-j4l9d" Sep 30 12:31:49 crc kubenswrapper[4672]: I0930 12:31:49.423065 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/8741dbc9-838f-450f-b8e5-b3049748dc63-node-log\") pod \"ovnkube-node-j4l9d\" (UID: \"8741dbc9-838f-450f-b8e5-b3049748dc63\") " pod="openshift-ovn-kubernetes/ovnkube-node-j4l9d" Sep 30 12:31:49 crc kubenswrapper[4672]: I0930 12:31:49.423158 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/8741dbc9-838f-450f-b8e5-b3049748dc63-log-socket\") pod \"ovnkube-node-j4l9d\" (UID: \"8741dbc9-838f-450f-b8e5-b3049748dc63\") " pod="openshift-ovn-kubernetes/ovnkube-node-j4l9d" Sep 30 12:31:49 crc kubenswrapper[4672]: I0930 12:31:49.423300 4672 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/5da59bc9-84da-42f6-86e9-3399ecf31725-env-overrides\") on node \"crc\" DevicePath \"\"" Sep 30 12:31:49 crc kubenswrapper[4672]: I0930 12:31:49.423336 4672 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/5da59bc9-84da-42f6-86e9-3399ecf31725-node-log\") on node \"crc\" DevicePath \"\"" Sep 30 12:31:49 crc kubenswrapper[4672]: I0930 12:31:49.423356 4672 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/5da59bc9-84da-42f6-86e9-3399ecf31725-host-kubelet\") on node \"crc\" DevicePath \"\"" Sep 30 12:31:49 crc kubenswrapper[4672]: I0930 12:31:49.423376 4672 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5da59bc9-84da-42f6-86e9-3399ecf31725-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Sep 30 12:31:49 crc kubenswrapper[4672]: I0930 12:31:49.423395 4672 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/5da59bc9-84da-42f6-86e9-3399ecf31725-log-socket\") on node \"crc\" DevicePath \"\"" Sep 30 12:31:49 crc kubenswrapper[4672]: I0930 12:31:49.423413 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-llgtp\" (UniqueName: \"kubernetes.io/projected/5da59bc9-84da-42f6-86e9-3399ecf31725-kube-api-access-llgtp\") on node \"crc\" DevicePath \"\"" Sep 30 12:31:49 crc kubenswrapper[4672]: I0930 12:31:49.423432 4672 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/5da59bc9-84da-42f6-86e9-3399ecf31725-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Sep 30 12:31:49 crc kubenswrapper[4672]: I0930 12:31:49.423449 4672 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/5da59bc9-84da-42f6-86e9-3399ecf31725-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Sep 30 12:31:49 crc kubenswrapper[4672]: I0930 12:31:49.423469 4672 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/5da59bc9-84da-42f6-86e9-3399ecf31725-host-cni-bin\") on node \"crc\" DevicePath \"\"" Sep 30 12:31:49 crc kubenswrapper[4672]: I0930 12:31:49.423486 4672 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/5da59bc9-84da-42f6-86e9-3399ecf31725-ovnkube-config\") on node \"crc\" DevicePath \"\"" Sep 30 12:31:49 crc kubenswrapper[4672]: I0930 12:31:49.423502 4672 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/5da59bc9-84da-42f6-86e9-3399ecf31725-host-cni-netd\") on node \"crc\" DevicePath \"\"" Sep 30 12:31:49 crc kubenswrapper[4672]: I0930 12:31:49.423522 4672 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5da59bc9-84da-42f6-86e9-3399ecf31725-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Sep 30 12:31:49 crc kubenswrapper[4672]: I0930 12:31:49.423539 4672 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/5da59bc9-84da-42f6-86e9-3399ecf31725-run-ovn\") on node \"crc\" DevicePath \"\"" Sep 30 12:31:49 crc kubenswrapper[4672]: I0930 12:31:49.423557 4672 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5da59bc9-84da-42f6-86e9-3399ecf31725-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Sep 30 12:31:49 crc kubenswrapper[4672]: I0930 12:31:49.423573 4672 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/5da59bc9-84da-42f6-86e9-3399ecf31725-systemd-units\") on node \"crc\" DevicePath \"\"" Sep 30 12:31:49 crc kubenswrapper[4672]: I0930 12:31:49.423589 4672 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5da59bc9-84da-42f6-86e9-3399ecf31725-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Sep 30 12:31:49 crc kubenswrapper[4672]: I0930 12:31:49.423605 4672 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/5da59bc9-84da-42f6-86e9-3399ecf31725-run-systemd\") on node \"crc\" DevicePath \"\"" Sep 30 12:31:49 crc kubenswrapper[4672]: I0930 12:31:49.423621 4672 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5da59bc9-84da-42f6-86e9-3399ecf31725-host-run-netns\") on node \"crc\" DevicePath \"\"" Sep 30 12:31:49 crc kubenswrapper[4672]: I0930 12:31:49.423638 4672 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/5da59bc9-84da-42f6-86e9-3399ecf31725-host-slash\") on node \"crc\" DevicePath \"\"" Sep 30 12:31:49 crc kubenswrapper[4672]: I0930 12:31:49.525147 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/8741dbc9-838f-450f-b8e5-b3049748dc63-ovnkube-script-lib\") pod \"ovnkube-node-j4l9d\" (UID: \"8741dbc9-838f-450f-b8e5-b3049748dc63\") " pod="openshift-ovn-kubernetes/ovnkube-node-j4l9d" Sep 30 12:31:49 crc kubenswrapper[4672]: I0930 12:31:49.525183 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8741dbc9-838f-450f-b8e5-b3049748dc63-var-lib-openvswitch\") pod \"ovnkube-node-j4l9d\" (UID: \"8741dbc9-838f-450f-b8e5-b3049748dc63\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-j4l9d" Sep 30 12:31:49 crc kubenswrapper[4672]: I0930 12:31:49.525218 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8741dbc9-838f-450f-b8e5-b3049748dc63-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-j4l9d\" (UID: \"8741dbc9-838f-450f-b8e5-b3049748dc63\") " pod="openshift-ovn-kubernetes/ovnkube-node-j4l9d" Sep 30 12:31:49 crc kubenswrapper[4672]: I0930 12:31:49.525237 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8741dbc9-838f-450f-b8e5-b3049748dc63-host-run-netns\") pod \"ovnkube-node-j4l9d\" (UID: \"8741dbc9-838f-450f-b8e5-b3049748dc63\") " pod="openshift-ovn-kubernetes/ovnkube-node-j4l9d" Sep 30 12:31:49 crc kubenswrapper[4672]: I0930 12:31:49.525252 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8741dbc9-838f-450f-b8e5-b3049748dc63-run-openvswitch\") pod \"ovnkube-node-j4l9d\" (UID: \"8741dbc9-838f-450f-b8e5-b3049748dc63\") " pod="openshift-ovn-kubernetes/ovnkube-node-j4l9d" Sep 30 12:31:49 crc kubenswrapper[4672]: I0930 12:31:49.525287 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/8741dbc9-838f-450f-b8e5-b3049748dc63-run-ovn\") pod \"ovnkube-node-j4l9d\" (UID: \"8741dbc9-838f-450f-b8e5-b3049748dc63\") " pod="openshift-ovn-kubernetes/ovnkube-node-j4l9d" Sep 30 12:31:49 crc kubenswrapper[4672]: I0930 12:31:49.525305 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/8741dbc9-838f-450f-b8e5-b3049748dc63-env-overrides\") pod \"ovnkube-node-j4l9d\" (UID: \"8741dbc9-838f-450f-b8e5-b3049748dc63\") " pod="openshift-ovn-kubernetes/ovnkube-node-j4l9d" Sep 30 12:31:49 crc kubenswrapper[4672]: I0930 12:31:49.525322 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8741dbc9-838f-450f-b8e5-b3049748dc63-etc-openvswitch\") pod \"ovnkube-node-j4l9d\" (UID: \"8741dbc9-838f-450f-b8e5-b3049748dc63\") " pod="openshift-ovn-kubernetes/ovnkube-node-j4l9d" Sep 30 12:31:49 crc kubenswrapper[4672]: I0930 12:31:49.525340 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/8741dbc9-838f-450f-b8e5-b3049748dc63-ovn-node-metrics-cert\") pod \"ovnkube-node-j4l9d\" (UID: \"8741dbc9-838f-450f-b8e5-b3049748dc63\") " pod="openshift-ovn-kubernetes/ovnkube-node-j4l9d" Sep 30 12:31:49 crc kubenswrapper[4672]: I0930 12:31:49.525360 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8741dbc9-838f-450f-b8e5-b3049748dc63-host-cni-bin\") pod \"ovnkube-node-j4l9d\" (UID: \"8741dbc9-838f-450f-b8e5-b3049748dc63\") " pod="openshift-ovn-kubernetes/ovnkube-node-j4l9d" Sep 30 12:31:49 crc kubenswrapper[4672]: I0930 12:31:49.525406 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/8741dbc9-838f-450f-b8e5-b3049748dc63-node-log\") pod \"ovnkube-node-j4l9d\" (UID: \"8741dbc9-838f-450f-b8e5-b3049748dc63\") " pod="openshift-ovn-kubernetes/ovnkube-node-j4l9d" Sep 30 
Sep 30 12:31:49 crc kubenswrapper[4672]: I0930 12:31:49.525424 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/8741dbc9-838f-450f-b8e5-b3049748dc63-log-socket\") pod \"ovnkube-node-j4l9d\" (UID: \"8741dbc9-838f-450f-b8e5-b3049748dc63\") " pod="openshift-ovn-kubernetes/ovnkube-node-j4l9d"
Sep 30 12:31:49 crc kubenswrapper[4672]: I0930 12:31:49.525430 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8741dbc9-838f-450f-b8e5-b3049748dc63-host-run-netns\") pod \"ovnkube-node-j4l9d\" (UID: \"8741dbc9-838f-450f-b8e5-b3049748dc63\") " pod="openshift-ovn-kubernetes/ovnkube-node-j4l9d"
Sep 30 12:31:49 crc kubenswrapper[4672]: I0930 12:31:49.525447 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/8741dbc9-838f-450f-b8e5-b3049748dc63-systemd-units\") pod \"ovnkube-node-j4l9d\" (UID: \"8741dbc9-838f-450f-b8e5-b3049748dc63\") " pod="openshift-ovn-kubernetes/ovnkube-node-j4l9d"
Sep 30 12:31:49 crc kubenswrapper[4672]: I0930 12:31:49.525497 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8741dbc9-838f-450f-b8e5-b3049748dc63-host-cni-bin\") pod \"ovnkube-node-j4l9d\" (UID: \"8741dbc9-838f-450f-b8e5-b3049748dc63\") " pod="openshift-ovn-kubernetes/ovnkube-node-j4l9d"
Sep 30 12:31:49 crc kubenswrapper[4672]: I0930 12:31:49.525536 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/8741dbc9-838f-450f-b8e5-b3049748dc63-node-log\") pod \"ovnkube-node-j4l9d\" (UID: \"8741dbc9-838f-450f-b8e5-b3049748dc63\") " pod="openshift-ovn-kubernetes/ovnkube-node-j4l9d"
Sep 30 12:31:49 crc kubenswrapper[4672]: I0930 12:31:49.525562 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/8741dbc9-838f-450f-b8e5-b3049748dc63-run-systemd\") pod \"ovnkube-node-j4l9d\" (UID: \"8741dbc9-838f-450f-b8e5-b3049748dc63\") " pod="openshift-ovn-kubernetes/ovnkube-node-j4l9d"
Sep 30 12:31:49 crc kubenswrapper[4672]: I0930 12:31:49.525597 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5rhbh\" (UniqueName: \"kubernetes.io/projected/8741dbc9-838f-450f-b8e5-b3049748dc63-kube-api-access-5rhbh\") pod \"ovnkube-node-j4l9d\" (UID: \"8741dbc9-838f-450f-b8e5-b3049748dc63\") " pod="openshift-ovn-kubernetes/ovnkube-node-j4l9d"
Sep 30 12:31:49 crc kubenswrapper[4672]: I0930 12:31:49.525614 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/8741dbc9-838f-450f-b8e5-b3049748dc63-host-cni-netd\") pod \"ovnkube-node-j4l9d\" (UID: \"8741dbc9-838f-450f-b8e5-b3049748dc63\") " pod="openshift-ovn-kubernetes/ovnkube-node-j4l9d"
Sep 30 12:31:49 crc kubenswrapper[4672]: I0930 12:31:49.525643 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8741dbc9-838f-450f-b8e5-b3049748dc63-host-run-ovn-kubernetes\") pod \"ovnkube-node-j4l9d\" (UID: \"8741dbc9-838f-450f-b8e5-b3049748dc63\") " pod="openshift-ovn-kubernetes/ovnkube-node-j4l9d"
Sep 30 12:31:49 crc kubenswrapper[4672]: I0930 12:31:49.525680 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/8741dbc9-838f-450f-b8e5-b3049748dc63-ovnkube-config\") pod \"ovnkube-node-j4l9d\" (UID: \"8741dbc9-838f-450f-b8e5-b3049748dc63\") " pod="openshift-ovn-kubernetes/ovnkube-node-j4l9d"
Sep 30 12:31:49 crc kubenswrapper[4672]: I0930 12:31:49.525706 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/8741dbc9-838f-450f-b8e5-b3049748dc63-host-kubelet\") pod \"ovnkube-node-j4l9d\" (UID: \"8741dbc9-838f-450f-b8e5-b3049748dc63\") " pod="openshift-ovn-kubernetes/ovnkube-node-j4l9d"
Sep 30 12:31:49 crc kubenswrapper[4672]: I0930 12:31:49.525752 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/8741dbc9-838f-450f-b8e5-b3049748dc63-run-systemd\") pod \"ovnkube-node-j4l9d\" (UID: \"8741dbc9-838f-450f-b8e5-b3049748dc63\") " pod="openshift-ovn-kubernetes/ovnkube-node-j4l9d"
Sep 30 12:31:49 crc kubenswrapper[4672]: I0930 12:31:49.525794 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/8741dbc9-838f-450f-b8e5-b3049748dc63-host-slash\") pod \"ovnkube-node-j4l9d\" (UID: \"8741dbc9-838f-450f-b8e5-b3049748dc63\") " pod="openshift-ovn-kubernetes/ovnkube-node-j4l9d"
Sep 30 12:31:49 crc kubenswrapper[4672]: I0930 12:31:49.525774 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/8741dbc9-838f-450f-b8e5-b3049748dc63-host-slash\") pod \"ovnkube-node-j4l9d\" (UID: \"8741dbc9-838f-450f-b8e5-b3049748dc63\") " pod="openshift-ovn-kubernetes/ovnkube-node-j4l9d"
Sep 30 12:31:49 crc kubenswrapper[4672]: I0930 12:31:49.525977 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8741dbc9-838f-450f-b8e5-b3049748dc63-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-j4l9d\" (UID: \"8741dbc9-838f-450f-b8e5-b3049748dc63\") " pod="openshift-ovn-kubernetes/ovnkube-node-j4l9d"
Sep 30 12:31:49 crc kubenswrapper[4672]: I0930 12:31:49.525481 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/8741dbc9-838f-450f-b8e5-b3049748dc63-systemd-units\") pod \"ovnkube-node-j4l9d\" (UID: \"8741dbc9-838f-450f-b8e5-b3049748dc63\") " pod="openshift-ovn-kubernetes/ovnkube-node-j4l9d"
Sep 30 12:31:49 crc kubenswrapper[4672]: I0930 12:31:49.526079 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/8741dbc9-838f-450f-b8e5-b3049748dc63-host-cni-netd\") pod \"ovnkube-node-j4l9d\" (UID: \"8741dbc9-838f-450f-b8e5-b3049748dc63\") " pod="openshift-ovn-kubernetes/ovnkube-node-j4l9d"
Sep 30 12:31:49 crc kubenswrapper[4672]: I0930 12:31:49.526114 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/8741dbc9-838f-450f-b8e5-b3049748dc63-run-ovn\") pod \"ovnkube-node-j4l9d\" (UID: \"8741dbc9-838f-450f-b8e5-b3049748dc63\") " pod="openshift-ovn-kubernetes/ovnkube-node-j4l9d"
Sep 30 12:31:49 crc kubenswrapper[4672]: I0930 12:31:49.526145 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8741dbc9-838f-450f-b8e5-b3049748dc63-host-run-ovn-kubernetes\") pod \"ovnkube-node-j4l9d\" (UID: \"8741dbc9-838f-450f-b8e5-b3049748dc63\") " pod="openshift-ovn-kubernetes/ovnkube-node-j4l9d"
\"kubernetes.io/host-path/8741dbc9-838f-450f-b8e5-b3049748dc63-host-run-ovn-kubernetes\") pod \"ovnkube-node-j4l9d\" (UID: \"8741dbc9-838f-450f-b8e5-b3049748dc63\") " pod="openshift-ovn-kubernetes/ovnkube-node-j4l9d" Sep 30 12:31:49 crc kubenswrapper[4672]: I0930 12:31:49.526172 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8741dbc9-838f-450f-b8e5-b3049748dc63-run-openvswitch\") pod \"ovnkube-node-j4l9d\" (UID: \"8741dbc9-838f-450f-b8e5-b3049748dc63\") " pod="openshift-ovn-kubernetes/ovnkube-node-j4l9d" Sep 30 12:31:49 crc kubenswrapper[4672]: I0930 12:31:49.526117 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8741dbc9-838f-450f-b8e5-b3049748dc63-var-lib-openvswitch\") pod \"ovnkube-node-j4l9d\" (UID: \"8741dbc9-838f-450f-b8e5-b3049748dc63\") " pod="openshift-ovn-kubernetes/ovnkube-node-j4l9d" Sep 30 12:31:49 crc kubenswrapper[4672]: I0930 12:31:49.526458 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8741dbc9-838f-450f-b8e5-b3049748dc63-etc-openvswitch\") pod \"ovnkube-node-j4l9d\" (UID: \"8741dbc9-838f-450f-b8e5-b3049748dc63\") " pod="openshift-ovn-kubernetes/ovnkube-node-j4l9d" Sep 30 12:31:49 crc kubenswrapper[4672]: I0930 12:31:49.526742 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/8741dbc9-838f-450f-b8e5-b3049748dc63-ovnkube-script-lib\") pod \"ovnkube-node-j4l9d\" (UID: \"8741dbc9-838f-450f-b8e5-b3049748dc63\") " pod="openshift-ovn-kubernetes/ovnkube-node-j4l9d" Sep 30 12:31:49 crc kubenswrapper[4672]: I0930 12:31:49.526848 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/8741dbc9-838f-450f-b8e5-b3049748dc63-env-overrides\") pod \"ovnkube-node-j4l9d\" (UID: \"8741dbc9-838f-450f-b8e5-b3049748dc63\") " pod="openshift-ovn-kubernetes/ovnkube-node-j4l9d" Sep 30 12:31:49 crc kubenswrapper[4672]: I0930 12:31:49.526869 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/8741dbc9-838f-450f-b8e5-b3049748dc63-ovnkube-config\") pod \"ovnkube-node-j4l9d\" (UID: \"8741dbc9-838f-450f-b8e5-b3049748dc63\") " pod="openshift-ovn-kubernetes/ovnkube-node-j4l9d" Sep 30 12:31:49 crc kubenswrapper[4672]: I0930 12:31:49.526883 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/8741dbc9-838f-450f-b8e5-b3049748dc63-host-kubelet\") pod \"ovnkube-node-j4l9d\" (UID: \"8741dbc9-838f-450f-b8e5-b3049748dc63\") " pod="openshift-ovn-kubernetes/ovnkube-node-j4l9d" Sep 30 12:31:49 crc kubenswrapper[4672]: I0930 12:31:49.525601 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/8741dbc9-838f-450f-b8e5-b3049748dc63-log-socket\") pod \"ovnkube-node-j4l9d\" (UID: \"8741dbc9-838f-450f-b8e5-b3049748dc63\") " pod="openshift-ovn-kubernetes/ovnkube-node-j4l9d" Sep 30 12:31:49 crc kubenswrapper[4672]: I0930 12:31:49.531409 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/8741dbc9-838f-450f-b8e5-b3049748dc63-ovn-node-metrics-cert\") pod \"ovnkube-node-j4l9d\" (UID: 
\"8741dbc9-838f-450f-b8e5-b3049748dc63\") " pod="openshift-ovn-kubernetes/ovnkube-node-j4l9d" Sep 30 12:31:49 crc kubenswrapper[4672]: I0930 12:31:49.544355 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5rhbh\" (UniqueName: \"kubernetes.io/projected/8741dbc9-838f-450f-b8e5-b3049748dc63-kube-api-access-5rhbh\") pod \"ovnkube-node-j4l9d\" (UID: \"8741dbc9-838f-450f-b8e5-b3049748dc63\") " pod="openshift-ovn-kubernetes/ovnkube-node-j4l9d" Sep 30 12:31:49 crc kubenswrapper[4672]: I0930 12:31:49.634910 4672 scope.go:117] "RemoveContainer" containerID="7058516ce3ff1d1d2929107a66d739fccb24b4f5589a11510ab42fe728a33196" Sep 30 12:31:49 crc kubenswrapper[4672]: I0930 12:31:49.649670 4672 scope.go:117] "RemoveContainer" containerID="9e10db0132d159755257aef8c0cee40e5de4a0d3727ce59883e1ff90650e4f35" Sep 30 12:31:49 crc kubenswrapper[4672]: I0930 12:31:49.664501 4672 scope.go:117] "RemoveContainer" containerID="fc00b1a05de9fc4d6f328ce800afd75975c71d8f64d79a84798a19dcd9882e63" Sep 30 12:31:49 crc kubenswrapper[4672]: I0930 12:31:49.664886 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-j4l9d" Sep 30 12:31:49 crc kubenswrapper[4672]: I0930 12:31:49.684608 4672 scope.go:117] "RemoveContainer" containerID="6022fd1dd9491a4d7867913e1da93c0ff79f6dd19b56b4a53c65da9f0f690cb0" Sep 30 12:31:49 crc kubenswrapper[4672]: W0930 12:31:49.693847 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8741dbc9_838f_450f_b8e5_b3049748dc63.slice/crio-1096cd543a2b0f88eea8730be0528cd71f0d57b5ed7f688fe48cc6faaf499d92 WatchSource:0}: Error finding container 1096cd543a2b0f88eea8730be0528cd71f0d57b5ed7f688fe48cc6faaf499d92: Status 404 returned error can't find the container with id 1096cd543a2b0f88eea8730be0528cd71f0d57b5ed7f688fe48cc6faaf499d92 Sep 30 12:31:49 crc kubenswrapper[4672]: I0930 12:31:49.701936 4672 scope.go:117] "RemoveContainer" containerID="b73fba4d62613a1c27b69a74ebcca91fa6ea43a5267efddf783bd0d243d711f2" Sep 30 12:31:49 crc kubenswrapper[4672]: I0930 12:31:49.724857 4672 scope.go:117] "RemoveContainer" containerID="bb7a32827707f8654d36f56fef7e81aff6b80550ef51f7c090a37f53a0e53406" Sep 30 12:31:49 crc kubenswrapper[4672]: I0930 12:31:49.741720 4672 scope.go:117] "RemoveContainer" containerID="c2131429928dbbd60e2d17b7ada4689ff5d161d1df921d74a8b9f71df9e11899" Sep 30 12:31:49 crc kubenswrapper[4672]: I0930 12:31:49.762359 4672 scope.go:117] "RemoveContainer" containerID="4aa532f2ebe19e36977596bd41e8b116c84bcd7d5cd3470986e3192d4d62dea7" Sep 30 12:31:49 crc kubenswrapper[4672]: I0930 12:31:49.785928 4672 scope.go:117] "RemoveContainer" containerID="c8e8be814fc5bfb8db3632cda1fcf749a02b642c8e7ff479e59ce5d9f987a756" Sep 30 12:31:49 crc kubenswrapper[4672]: I0930 12:31:49.804481 4672 scope.go:117] "RemoveContainer" containerID="114d67fea982a6c6475e1925f670ba7dae14ace593022f77ef7c6441dde49687" Sep 30 12:31:50 crc kubenswrapper[4672]: I0930 12:31:50.093840 4672 generic.go:334] "Generic (PLEG): container finished" podID="8741dbc9-838f-450f-b8e5-b3049748dc63" containerID="7e089c09009cb0f18c2a40ac80e4cb9a28b4a881ab2934105d6c26fd8f438838" exitCode=0 Sep 30 12:31:50 crc kubenswrapper[4672]: I0930 12:31:50.093941 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j4l9d" 
event={"ID":"8741dbc9-838f-450f-b8e5-b3049748dc63","Type":"ContainerDied","Data":"7e089c09009cb0f18c2a40ac80e4cb9a28b4a881ab2934105d6c26fd8f438838"} Sep 30 12:31:50 crc kubenswrapper[4672]: I0930 12:31:50.093978 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j4l9d" event={"ID":"8741dbc9-838f-450f-b8e5-b3049748dc63","Type":"ContainerStarted","Data":"1096cd543a2b0f88eea8730be0528cd71f0d57b5ed7f688fe48cc6faaf499d92"} Sep 30 12:31:50 crc kubenswrapper[4672]: I0930 12:31:50.097379 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-8q82q_6806ff3c-ab3a-402e-b1c5-cc37c0810a65/kube-multus/2.log" Sep 30 12:31:50 crc kubenswrapper[4672]: I0930 12:31:50.097475 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nznsk" event={"ID":"5da59bc9-84da-42f6-86e9-3399ecf31725","Type":"ContainerDied","Data":"6022fd1dd9491a4d7867913e1da93c0ff79f6dd19b56b4a53c65da9f0f690cb0"} Sep 30 12:31:50 crc kubenswrapper[4672]: I0930 12:31:50.097515 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nznsk" event={"ID":"5da59bc9-84da-42f6-86e9-3399ecf31725","Type":"ContainerDied","Data":"7058516ce3ff1d1d2929107a66d739fccb24b4f5589a11510ab42fe728a33196"} Sep 30 12:31:50 crc kubenswrapper[4672]: I0930 12:31:50.097532 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nznsk" event={"ID":"5da59bc9-84da-42f6-86e9-3399ecf31725","Type":"ContainerDied","Data":"114d67fea982a6c6475e1925f670ba7dae14ace593022f77ef7c6441dde49687"} Sep 30 12:31:50 crc kubenswrapper[4672]: I0930 12:31:50.097550 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nznsk" event={"ID":"5da59bc9-84da-42f6-86e9-3399ecf31725","Type":"ContainerDied","Data":"b73fba4d62613a1c27b69a74ebcca91fa6ea43a5267efddf783bd0d243d711f2"} Sep 30 12:31:50 crc kubenswrapper[4672]: I0930 12:31:50.097566 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nznsk" event={"ID":"5da59bc9-84da-42f6-86e9-3399ecf31725","Type":"ContainerDied","Data":"c2131429928dbbd60e2d17b7ada4689ff5d161d1df921d74a8b9f71df9e11899"} Sep 30 12:31:50 crc kubenswrapper[4672]: I0930 12:31:50.097582 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nznsk" event={"ID":"5da59bc9-84da-42f6-86e9-3399ecf31725","Type":"ContainerDied","Data":"c8e8be814fc5bfb8db3632cda1fcf749a02b642c8e7ff479e59ce5d9f987a756"} Sep 30 12:31:50 crc kubenswrapper[4672]: I0930 12:31:50.097615 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nznsk" event={"ID":"5da59bc9-84da-42f6-86e9-3399ecf31725","Type":"ContainerDied","Data":"9e10db0132d159755257aef8c0cee40e5de4a0d3727ce59883e1ff90650e4f35"} Sep 30 12:31:50 crc kubenswrapper[4672]: I0930 12:31:50.097629 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nznsk" event={"ID":"5da59bc9-84da-42f6-86e9-3399ecf31725","Type":"ContainerDied","Data":"bb7a32827707f8654d36f56fef7e81aff6b80550ef51f7c090a37f53a0e53406"} Sep 30 12:31:50 crc kubenswrapper[4672]: I0930 12:31:50.097645 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nznsk" event={"ID":"5da59bc9-84da-42f6-86e9-3399ecf31725","Type":"ContainerDied","Data":"76994a5d9cde6b9534d626b604ff2f6a0c5f17d81c62a75301d172c331c06d93"} Sep 30 12:31:50 crc 
kubenswrapper[4672]: I0930 12:31:50.097566 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-nznsk" Sep 30 12:31:50 crc kubenswrapper[4672]: I0930 12:31:50.182533 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-nznsk"] Sep 30 12:31:50 crc kubenswrapper[4672]: I0930 12:31:50.185347 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-nznsk"] Sep 30 12:31:51 crc kubenswrapper[4672]: I0930 12:31:51.105645 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j4l9d" event={"ID":"8741dbc9-838f-450f-b8e5-b3049748dc63","Type":"ContainerStarted","Data":"2bf820cb15c50c65cc6d108e12730a3d5d00e26d341b30804359d5e1e352a314"} Sep 30 12:31:51 crc kubenswrapper[4672]: I0930 12:31:51.105824 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j4l9d" event={"ID":"8741dbc9-838f-450f-b8e5-b3049748dc63","Type":"ContainerStarted","Data":"eeb029ebe942045e825a17e3daf356b7277b1ebc1913c747c44e63d44148e8a2"} Sep 30 12:31:51 crc kubenswrapper[4672]: I0930 12:31:51.105836 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j4l9d" event={"ID":"8741dbc9-838f-450f-b8e5-b3049748dc63","Type":"ContainerStarted","Data":"00cb680211851d2e0fd8873ff3744bcf69ec956a6fcb4803c616ab11a2bf107e"} Sep 30 12:31:51 crc kubenswrapper[4672]: I0930 12:31:51.105845 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j4l9d" event={"ID":"8741dbc9-838f-450f-b8e5-b3049748dc63","Type":"ContainerStarted","Data":"9c776b047083f435de67b69f1b09dd1a2095d1f693923bc4f6ee12082131256e"} Sep 30 12:31:51 crc kubenswrapper[4672]: I0930 12:31:51.105855 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j4l9d" event={"ID":"8741dbc9-838f-450f-b8e5-b3049748dc63","Type":"ContainerStarted","Data":"ac4a43fea8329cf90148467f37a80084097cb2f62a34dbaa99287afa0b38c0a6"} Sep 30 12:31:51 crc kubenswrapper[4672]: I0930 12:31:51.105865 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j4l9d" event={"ID":"8741dbc9-838f-450f-b8e5-b3049748dc63","Type":"ContainerStarted","Data":"c3ae940d8173b51ba1cd6ad09005ac47bdc0d6b5ccce60e980e352400c720941"} Sep 30 12:31:51 crc kubenswrapper[4672]: I0930 12:31:51.434165 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5da59bc9-84da-42f6-86e9-3399ecf31725" path="/var/lib/kubelet/pods/5da59bc9-84da-42f6-86e9-3399ecf31725/volumes" Sep 30 12:31:54 crc kubenswrapper[4672]: I0930 12:31:54.129183 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j4l9d" event={"ID":"8741dbc9-838f-450f-b8e5-b3049748dc63","Type":"ContainerStarted","Data":"ab2f471c369af120ab3163c45388fa9edbbbe79d75df539258b0743457fc5d34"} Sep 30 12:31:54 crc kubenswrapper[4672]: I0930 12:31:54.739747 4672 patch_prober.go:28] interesting pod/machine-config-daemon-dpqrd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 12:31:54 crc kubenswrapper[4672]: I0930 12:31:54.739831 4672 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" 
podUID="95794952-d817-48f2-8956-f7a310f8d1d9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 12:31:54 crc kubenswrapper[4672]: I0930 12:31:54.739887 4672 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" Sep 30 12:31:54 crc kubenswrapper[4672]: I0930 12:31:54.740610 4672 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"24ffe413e2febc5a1c43c44675bb37e97907e267680c76a2a11205a06222f9b4"} pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 30 12:31:54 crc kubenswrapper[4672]: I0930 12:31:54.740694 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" containerName="machine-config-daemon" containerID="cri-o://24ffe413e2febc5a1c43c44675bb37e97907e267680c76a2a11205a06222f9b4" gracePeriod=600 Sep 30 12:31:55 crc kubenswrapper[4672]: I0930 12:31:55.138154 4672 generic.go:334] "Generic (PLEG): container finished" podID="95794952-d817-48f2-8956-f7a310f8d1d9" containerID="24ffe413e2febc5a1c43c44675bb37e97907e267680c76a2a11205a06222f9b4" exitCode=0 Sep 30 12:31:55 crc kubenswrapper[4672]: I0930 12:31:55.138252 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" event={"ID":"95794952-d817-48f2-8956-f7a310f8d1d9","Type":"ContainerDied","Data":"24ffe413e2febc5a1c43c44675bb37e97907e267680c76a2a11205a06222f9b4"} Sep 30 12:31:55 crc kubenswrapper[4672]: I0930 12:31:55.138567 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" event={"ID":"95794952-d817-48f2-8956-f7a310f8d1d9","Type":"ContainerStarted","Data":"f277ec5275d8b860ce8dcb4c0f3ecdd12eaede7bb4ef094f520236f925a1f1a1"} Sep 30 12:31:55 crc kubenswrapper[4672]: I0930 12:31:55.138589 4672 scope.go:117] "RemoveContainer" containerID="99d96054a65bb2e9be687c5412fe9d8ef3b73a9fb641b821f2a35007e1e4415e" Sep 30 12:31:56 crc kubenswrapper[4672]: I0930 12:31:56.158487 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j4l9d" event={"ID":"8741dbc9-838f-450f-b8e5-b3049748dc63","Type":"ContainerStarted","Data":"a4ede2ba95e7f347be22d613149e6a758d2fea4ab416d2db2341fb7ccd44693c"} Sep 30 12:31:56 crc kubenswrapper[4672]: I0930 12:31:56.189106 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-j4l9d" podStartSLOduration=7.189088495 podStartE2EDuration="7.189088495s" podCreationTimestamp="2025-09-30 12:31:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 12:31:56.185409538 +0000 UTC m=+607.454647194" watchObservedRunningTime="2025-09-30 12:31:56.189088495 +0000 UTC m=+607.458326141" Sep 30 12:31:57 crc kubenswrapper[4672]: I0930 12:31:57.169946 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-j4l9d" Sep 30 12:31:57 crc kubenswrapper[4672]: I0930 12:31:57.170311 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-ovn-kubernetes/ovnkube-node-j4l9d" Sep 30 12:31:57 crc kubenswrapper[4672]: I0930 12:31:57.170328 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-j4l9d" Sep 30 12:31:57 crc kubenswrapper[4672]: I0930 12:31:57.208691 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-j4l9d" Sep 30 12:31:57 crc kubenswrapper[4672]: I0930 12:31:57.212126 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-j4l9d" Sep 30 12:32:00 crc kubenswrapper[4672]: I0930 12:32:00.417029 4672 scope.go:117] "RemoveContainer" containerID="c88d61620e022b57cdffca5d4746f467e2a5f011df271eb6002e734d1db576ce" Sep 30 12:32:00 crc kubenswrapper[4672]: E0930 12:32:00.417964 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-8q82q_openshift-multus(6806ff3c-ab3a-402e-b1c5-cc37c0810a65)\"" pod="openshift-multus/multus-8q82q" podUID="6806ff3c-ab3a-402e-b1c5-cc37c0810a65" Sep 30 12:32:13 crc kubenswrapper[4672]: I0930 12:32:13.417904 4672 scope.go:117] "RemoveContainer" containerID="c88d61620e022b57cdffca5d4746f467e2a5f011df271eb6002e734d1db576ce" Sep 30 12:32:14 crc kubenswrapper[4672]: I0930 12:32:14.284378 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-8q82q_6806ff3c-ab3a-402e-b1c5-cc37c0810a65/kube-multus/2.log" Sep 30 12:32:14 crc kubenswrapper[4672]: I0930 12:32:14.284654 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-8q82q" event={"ID":"6806ff3c-ab3a-402e-b1c5-cc37c0810a65","Type":"ContainerStarted","Data":"a5b644b45b2afa21efbcdd145cc9309aac84243722eb1958ff2daa597466d044"} Sep 30 12:32:18 crc kubenswrapper[4672]: I0930 12:32:18.635033 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2ddnm8k"] Sep 30 12:32:18 crc kubenswrapper[4672]: I0930 12:32:18.636518 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2ddnm8k" Sep 30 12:32:18 crc kubenswrapper[4672]: I0930 12:32:18.639422 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Sep 30 12:32:18 crc kubenswrapper[4672]: I0930 12:32:18.646526 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2ddnm8k"] Sep 30 12:32:18 crc kubenswrapper[4672]: I0930 12:32:18.734610 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mqkkn\" (UniqueName: \"kubernetes.io/projected/9d05223e-1c34-4132-92a1-1b96ef8c1a8b-kube-api-access-mqkkn\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2ddnm8k\" (UID: \"9d05223e-1c34-4132-92a1-1b96ef8c1a8b\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2ddnm8k" Sep 30 12:32:18 crc kubenswrapper[4672]: I0930 12:32:18.734665 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9d05223e-1c34-4132-92a1-1b96ef8c1a8b-util\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2ddnm8k\" (UID: \"9d05223e-1c34-4132-92a1-1b96ef8c1a8b\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2ddnm8k" Sep 30 12:32:18 crc kubenswrapper[4672]: I0930 12:32:18.734695 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9d05223e-1c34-4132-92a1-1b96ef8c1a8b-bundle\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2ddnm8k\" (UID: \"9d05223e-1c34-4132-92a1-1b96ef8c1a8b\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2ddnm8k" Sep 30 12:32:18 crc kubenswrapper[4672]: I0930 12:32:18.835894 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mqkkn\" (UniqueName: \"kubernetes.io/projected/9d05223e-1c34-4132-92a1-1b96ef8c1a8b-kube-api-access-mqkkn\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2ddnm8k\" (UID: \"9d05223e-1c34-4132-92a1-1b96ef8c1a8b\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2ddnm8k" Sep 30 12:32:18 crc kubenswrapper[4672]: I0930 12:32:18.835955 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9d05223e-1c34-4132-92a1-1b96ef8c1a8b-util\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2ddnm8k\" (UID: \"9d05223e-1c34-4132-92a1-1b96ef8c1a8b\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2ddnm8k" Sep 30 12:32:18 crc kubenswrapper[4672]: I0930 12:32:18.835995 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9d05223e-1c34-4132-92a1-1b96ef8c1a8b-bundle\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2ddnm8k\" (UID: \"9d05223e-1c34-4132-92a1-1b96ef8c1a8b\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2ddnm8k" Sep 30 12:32:18 crc kubenswrapper[4672]: I0930 12:32:18.836474 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/9d05223e-1c34-4132-92a1-1b96ef8c1a8b-bundle\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2ddnm8k\" (UID: \"9d05223e-1c34-4132-92a1-1b96ef8c1a8b\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2ddnm8k" Sep 30 12:32:18 crc kubenswrapper[4672]: I0930 12:32:18.836804 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9d05223e-1c34-4132-92a1-1b96ef8c1a8b-util\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2ddnm8k\" (UID: \"9d05223e-1c34-4132-92a1-1b96ef8c1a8b\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2ddnm8k" Sep 30 12:32:18 crc kubenswrapper[4672]: I0930 12:32:18.855221 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mqkkn\" (UniqueName: \"kubernetes.io/projected/9d05223e-1c34-4132-92a1-1b96ef8c1a8b-kube-api-access-mqkkn\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2ddnm8k\" (UID: \"9d05223e-1c34-4132-92a1-1b96ef8c1a8b\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2ddnm8k" Sep 30 12:32:18 crc kubenswrapper[4672]: I0930 12:32:18.956843 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2ddnm8k" Sep 30 12:32:19 crc kubenswrapper[4672]: I0930 12:32:19.157690 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2ddnm8k"] Sep 30 12:32:19 crc kubenswrapper[4672]: W0930 12:32:19.162226 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d05223e_1c34_4132_92a1_1b96ef8c1a8b.slice/crio-fa7187c369ea22f41d90b6f889c53085faed84fed2be0d3c8e9d75f5f3f33834 WatchSource:0}: Error finding container fa7187c369ea22f41d90b6f889c53085faed84fed2be0d3c8e9d75f5f3f33834: Status 404 returned error can't find the container with id fa7187c369ea22f41d90b6f889c53085faed84fed2be0d3c8e9d75f5f3f33834 Sep 30 12:32:19 crc kubenswrapper[4672]: I0930 12:32:19.310923 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2ddnm8k" event={"ID":"9d05223e-1c34-4132-92a1-1b96ef8c1a8b","Type":"ContainerStarted","Data":"fa7187c369ea22f41d90b6f889c53085faed84fed2be0d3c8e9d75f5f3f33834"} Sep 30 12:32:19 crc kubenswrapper[4672]: I0930 12:32:19.695053 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-j4l9d" Sep 30 12:32:20 crc kubenswrapper[4672]: I0930 12:32:20.323521 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2ddnm8k" event={"ID":"9d05223e-1c34-4132-92a1-1b96ef8c1a8b","Type":"ContainerStarted","Data":"7106066c2118d16c04ec647e234106cd33231ff635712e98f5897ad92ac343ef"} Sep 30 12:32:21 crc kubenswrapper[4672]: I0930 12:32:21.332348 4672 generic.go:334] "Generic (PLEG): container finished" podID="9d05223e-1c34-4132-92a1-1b96ef8c1a8b" containerID="7106066c2118d16c04ec647e234106cd33231ff635712e98f5897ad92ac343ef" exitCode=0 Sep 30 12:32:21 crc kubenswrapper[4672]: I0930 12:32:21.332401 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2ddnm8k" event={"ID":"9d05223e-1c34-4132-92a1-1b96ef8c1a8b","Type":"ContainerDied","Data":"7106066c2118d16c04ec647e234106cd33231ff635712e98f5897ad92ac343ef"} Sep 30 12:32:23 crc kubenswrapper[4672]: I0930 12:32:23.349702 4672 generic.go:334] "Generic (PLEG): container finished" podID="9d05223e-1c34-4132-92a1-1b96ef8c1a8b" containerID="42c84359a0c405f48fdc0ef9d8a8a216693d8264104a05125574f2bb655bc146" exitCode=0 Sep 30 12:32:23 crc kubenswrapper[4672]: I0930 12:32:23.349779 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2ddnm8k" event={"ID":"9d05223e-1c34-4132-92a1-1b96ef8c1a8b","Type":"ContainerDied","Data":"42c84359a0c405f48fdc0ef9d8a8a216693d8264104a05125574f2bb655bc146"} Sep 30 12:32:24 crc kubenswrapper[4672]: I0930 12:32:24.362908 4672 generic.go:334] "Generic (PLEG): container finished" podID="9d05223e-1c34-4132-92a1-1b96ef8c1a8b" containerID="140a2a7d2294f1a3d99f80e65feb89230d53036d01f33e93dc7f565bef0b8fcc" exitCode=0 Sep 30 12:32:24 crc kubenswrapper[4672]: I0930 12:32:24.362989 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2ddnm8k" event={"ID":"9d05223e-1c34-4132-92a1-1b96ef8c1a8b","Type":"ContainerDied","Data":"140a2a7d2294f1a3d99f80e65feb89230d53036d01f33e93dc7f565bef0b8fcc"} Sep 30 12:32:25 crc kubenswrapper[4672]: I0930 12:32:25.595877 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2ddnm8k" Sep 30 12:32:25 crc kubenswrapper[4672]: I0930 12:32:25.726843 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9d05223e-1c34-4132-92a1-1b96ef8c1a8b-util\") pod \"9d05223e-1c34-4132-92a1-1b96ef8c1a8b\" (UID: \"9d05223e-1c34-4132-92a1-1b96ef8c1a8b\") " Sep 30 12:32:25 crc kubenswrapper[4672]: I0930 12:32:25.726920 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9d05223e-1c34-4132-92a1-1b96ef8c1a8b-bundle\") pod \"9d05223e-1c34-4132-92a1-1b96ef8c1a8b\" (UID: \"9d05223e-1c34-4132-92a1-1b96ef8c1a8b\") " Sep 30 12:32:25 crc kubenswrapper[4672]: I0930 12:32:25.726997 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mqkkn\" (UniqueName: \"kubernetes.io/projected/9d05223e-1c34-4132-92a1-1b96ef8c1a8b-kube-api-access-mqkkn\") pod \"9d05223e-1c34-4132-92a1-1b96ef8c1a8b\" (UID: \"9d05223e-1c34-4132-92a1-1b96ef8c1a8b\") " Sep 30 12:32:25 crc kubenswrapper[4672]: I0930 12:32:25.729298 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9d05223e-1c34-4132-92a1-1b96ef8c1a8b-bundle" (OuterVolumeSpecName: "bundle") pod "9d05223e-1c34-4132-92a1-1b96ef8c1a8b" (UID: "9d05223e-1c34-4132-92a1-1b96ef8c1a8b"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 12:32:25 crc kubenswrapper[4672]: I0930 12:32:25.736762 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d05223e-1c34-4132-92a1-1b96ef8c1a8b-kube-api-access-mqkkn" (OuterVolumeSpecName: "kube-api-access-mqkkn") pod "9d05223e-1c34-4132-92a1-1b96ef8c1a8b" (UID: "9d05223e-1c34-4132-92a1-1b96ef8c1a8b"). 
InnerVolumeSpecName "kube-api-access-mqkkn". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 12:32:25 crc kubenswrapper[4672]: I0930 12:32:25.738299 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9d05223e-1c34-4132-92a1-1b96ef8c1a8b-util" (OuterVolumeSpecName: "util") pod "9d05223e-1c34-4132-92a1-1b96ef8c1a8b" (UID: "9d05223e-1c34-4132-92a1-1b96ef8c1a8b"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 12:32:25 crc kubenswrapper[4672]: I0930 12:32:25.829359 4672 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9d05223e-1c34-4132-92a1-1b96ef8c1a8b-util\") on node \"crc\" DevicePath \"\"" Sep 30 12:32:25 crc kubenswrapper[4672]: I0930 12:32:25.829407 4672 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9d05223e-1c34-4132-92a1-1b96ef8c1a8b-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 12:32:25 crc kubenswrapper[4672]: I0930 12:32:25.829425 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mqkkn\" (UniqueName: \"kubernetes.io/projected/9d05223e-1c34-4132-92a1-1b96ef8c1a8b-kube-api-access-mqkkn\") on node \"crc\" DevicePath \"\"" Sep 30 12:32:26 crc kubenswrapper[4672]: I0930 12:32:26.380168 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2ddnm8k" event={"ID":"9d05223e-1c34-4132-92a1-1b96ef8c1a8b","Type":"ContainerDied","Data":"fa7187c369ea22f41d90b6f889c53085faed84fed2be0d3c8e9d75f5f3f33834"} Sep 30 12:32:26 crc kubenswrapper[4672]: I0930 12:32:26.380568 4672 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fa7187c369ea22f41d90b6f889c53085faed84fed2be0d3c8e9d75f5f3f33834" Sep 30 12:32:26 crc kubenswrapper[4672]: I0930 12:32:26.380342 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2ddnm8k" Sep 30 12:32:35 crc kubenswrapper[4672]: I0930 12:32:35.696918 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-7c8cf85677-k7drs"] Sep 30 12:32:35 crc kubenswrapper[4672]: E0930 12:32:35.698462 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d05223e-1c34-4132-92a1-1b96ef8c1a8b" containerName="extract" Sep 30 12:32:35 crc kubenswrapper[4672]: I0930 12:32:35.698542 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d05223e-1c34-4132-92a1-1b96ef8c1a8b" containerName="extract" Sep 30 12:32:35 crc kubenswrapper[4672]: E0930 12:32:35.698610 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d05223e-1c34-4132-92a1-1b96ef8c1a8b" containerName="pull" Sep 30 12:32:35 crc kubenswrapper[4672]: I0930 12:32:35.698660 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d05223e-1c34-4132-92a1-1b96ef8c1a8b" containerName="pull" Sep 30 12:32:35 crc kubenswrapper[4672]: E0930 12:32:35.698716 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d05223e-1c34-4132-92a1-1b96ef8c1a8b" containerName="util" Sep 30 12:32:35 crc kubenswrapper[4672]: I0930 12:32:35.698767 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d05223e-1c34-4132-92a1-1b96ef8c1a8b" containerName="util" Sep 30 12:32:35 crc kubenswrapper[4672]: I0930 12:32:35.698910 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d05223e-1c34-4132-92a1-1b96ef8c1a8b" containerName="extract" Sep 30 12:32:35 crc kubenswrapper[4672]: I0930 12:32:35.699352 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-k7drs" Sep 30 12:32:35 crc kubenswrapper[4672]: I0930 12:32:35.700938 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-dockercfg-q9lxw" Sep 30 12:32:35 crc kubenswrapper[4672]: I0930 12:32:35.702351 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"openshift-service-ca.crt" Sep 30 12:32:35 crc kubenswrapper[4672]: I0930 12:32:35.702545 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"kube-root-ca.crt" Sep 30 12:32:35 crc kubenswrapper[4672]: I0930 12:32:35.708734 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-7c8cf85677-k7drs"] Sep 30 12:32:35 crc kubenswrapper[4672]: I0930 12:32:35.800069 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-td4mr\" (UniqueName: \"kubernetes.io/projected/cf46fa05-32de-4c26-82e7-769052afcaa1-kube-api-access-td4mr\") pod \"obo-prometheus-operator-7c8cf85677-k7drs\" (UID: \"cf46fa05-32de-4c26-82e7-769052afcaa1\") " pod="openshift-operators/obo-prometheus-operator-7c8cf85677-k7drs" Sep 30 12:32:35 crc kubenswrapper[4672]: I0930 12:32:35.819334 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-67cf9f7c94-vw6lq"] Sep 30 12:32:35 crc kubenswrapper[4672]: I0930 12:32:35.820354 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-67cf9f7c94-vw6lq" Sep 30 12:32:35 crc kubenswrapper[4672]: I0930 12:32:35.824835 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-dockercfg-qkf9c" Sep 30 12:32:35 crc kubenswrapper[4672]: I0930 12:32:35.824835 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-service-cert" Sep 30 12:32:35 crc kubenswrapper[4672]: I0930 12:32:35.835156 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-67cf9f7c94-vw6lq"] Sep 30 12:32:35 crc kubenswrapper[4672]: I0930 12:32:35.839130 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-67cf9f7c94-kf944"] Sep 30 12:32:35 crc kubenswrapper[4672]: I0930 12:32:35.872729 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-67cf9f7c94-kf944"] Sep 30 12:32:35 crc kubenswrapper[4672]: I0930 12:32:35.872845 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-67cf9f7c94-kf944" Sep 30 12:32:35 crc kubenswrapper[4672]: I0930 12:32:35.900940 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/95eba142-c439-4920-914e-af904642acc2-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-67cf9f7c94-vw6lq\" (UID: \"95eba142-c439-4920-914e-af904642acc2\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-67cf9f7c94-vw6lq" Sep 30 12:32:35 crc kubenswrapper[4672]: I0930 12:32:35.901012 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/eaec0e45-413d-4fad-a35b-68a28486053a-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-67cf9f7c94-kf944\" (UID: \"eaec0e45-413d-4fad-a35b-68a28486053a\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-67cf9f7c94-kf944" Sep 30 12:32:35 crc kubenswrapper[4672]: I0930 12:32:35.901052 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/95eba142-c439-4920-914e-af904642acc2-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-67cf9f7c94-vw6lq\" (UID: \"95eba142-c439-4920-914e-af904642acc2\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-67cf9f7c94-vw6lq" Sep 30 12:32:35 crc kubenswrapper[4672]: I0930 12:32:35.901099 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-td4mr\" (UniqueName: \"kubernetes.io/projected/cf46fa05-32de-4c26-82e7-769052afcaa1-kube-api-access-td4mr\") pod \"obo-prometheus-operator-7c8cf85677-k7drs\" (UID: \"cf46fa05-32de-4c26-82e7-769052afcaa1\") " pod="openshift-operators/obo-prometheus-operator-7c8cf85677-k7drs" Sep 30 12:32:35 crc kubenswrapper[4672]: I0930 12:32:35.901130 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/eaec0e45-413d-4fad-a35b-68a28486053a-apiservice-cert\") pod 
\"obo-prometheus-operator-admission-webhook-67cf9f7c94-kf944\" (UID: \"eaec0e45-413d-4fad-a35b-68a28486053a\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-67cf9f7c94-kf944" Sep 30 12:32:35 crc kubenswrapper[4672]: I0930 12:32:35.926851 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-td4mr\" (UniqueName: \"kubernetes.io/projected/cf46fa05-32de-4c26-82e7-769052afcaa1-kube-api-access-td4mr\") pod \"obo-prometheus-operator-7c8cf85677-k7drs\" (UID: \"cf46fa05-32de-4c26-82e7-769052afcaa1\") " pod="openshift-operators/obo-prometheus-operator-7c8cf85677-k7drs" Sep 30 12:32:36 crc kubenswrapper[4672]: I0930 12:32:36.002222 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/eaec0e45-413d-4fad-a35b-68a28486053a-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-67cf9f7c94-kf944\" (UID: \"eaec0e45-413d-4fad-a35b-68a28486053a\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-67cf9f7c94-kf944" Sep 30 12:32:36 crc kubenswrapper[4672]: I0930 12:32:36.002289 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/95eba142-c439-4920-914e-af904642acc2-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-67cf9f7c94-vw6lq\" (UID: \"95eba142-c439-4920-914e-af904642acc2\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-67cf9f7c94-vw6lq" Sep 30 12:32:36 crc kubenswrapper[4672]: I0930 12:32:36.002328 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/eaec0e45-413d-4fad-a35b-68a28486053a-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-67cf9f7c94-kf944\" (UID: \"eaec0e45-413d-4fad-a35b-68a28486053a\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-67cf9f7c94-kf944" Sep 30 12:32:36 crc kubenswrapper[4672]: I0930 12:32:36.002357 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/95eba142-c439-4920-914e-af904642acc2-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-67cf9f7c94-vw6lq\" (UID: \"95eba142-c439-4920-914e-af904642acc2\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-67cf9f7c94-vw6lq" Sep 30 12:32:36 crc kubenswrapper[4672]: I0930 12:32:36.007991 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/95eba142-c439-4920-914e-af904642acc2-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-67cf9f7c94-vw6lq\" (UID: \"95eba142-c439-4920-914e-af904642acc2\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-67cf9f7c94-vw6lq" Sep 30 12:32:36 crc kubenswrapper[4672]: I0930 12:32:36.009482 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/eaec0e45-413d-4fad-a35b-68a28486053a-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-67cf9f7c94-kf944\" (UID: \"eaec0e45-413d-4fad-a35b-68a28486053a\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-67cf9f7c94-kf944" Sep 30 12:32:36 crc kubenswrapper[4672]: I0930 12:32:36.009628 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/95eba142-c439-4920-914e-af904642acc2-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-67cf9f7c94-vw6lq\" (UID: \"95eba142-c439-4920-914e-af904642acc2\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-67cf9f7c94-vw6lq" Sep 30 12:32:36 crc kubenswrapper[4672]: I0930 12:32:36.009628 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/eaec0e45-413d-4fad-a35b-68a28486053a-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-67cf9f7c94-kf944\" (UID: \"eaec0e45-413d-4fad-a35b-68a28486053a\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-67cf9f7c94-kf944" Sep 30 12:32:36 crc kubenswrapper[4672]: I0930 12:32:36.014759 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-k7drs" Sep 30 12:32:36 crc kubenswrapper[4672]: I0930 12:32:36.038292 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-operator-cc5f78dfc-fzmsr"] Sep 30 12:32:36 crc kubenswrapper[4672]: I0930 12:32:36.039780 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-cc5f78dfc-fzmsr" Sep 30 12:32:36 crc kubenswrapper[4672]: I0930 12:32:36.044193 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-sa-dockercfg-dvn2h" Sep 30 12:32:36 crc kubenswrapper[4672]: I0930 12:32:36.044484 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-tls" Sep 30 12:32:36 crc kubenswrapper[4672]: I0930 12:32:36.050585 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-cc5f78dfc-fzmsr"] Sep 30 12:32:36 crc kubenswrapper[4672]: I0930 12:32:36.139763 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-67cf9f7c94-vw6lq" Sep 30 12:32:36 crc kubenswrapper[4672]: I0930 12:32:36.185798 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-67cf9f7c94-kf944" Sep 30 12:32:36 crc kubenswrapper[4672]: I0930 12:32:36.204663 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2w9zv\" (UniqueName: \"kubernetes.io/projected/a4a3a18a-31ce-496c-b863-bdc8ff9774cb-kube-api-access-2w9zv\") pod \"observability-operator-cc5f78dfc-fzmsr\" (UID: \"a4a3a18a-31ce-496c-b863-bdc8ff9774cb\") " pod="openshift-operators/observability-operator-cc5f78dfc-fzmsr" Sep 30 12:32:36 crc kubenswrapper[4672]: I0930 12:32:36.204751 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/a4a3a18a-31ce-496c-b863-bdc8ff9774cb-observability-operator-tls\") pod \"observability-operator-cc5f78dfc-fzmsr\" (UID: \"a4a3a18a-31ce-496c-b863-bdc8ff9774cb\") " pod="openshift-operators/observability-operator-cc5f78dfc-fzmsr" Sep 30 12:32:36 crc kubenswrapper[4672]: I0930 12:32:36.243078 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/perses-operator-54bc95c9fb-27rgt"] Sep 30 12:32:36 crc kubenswrapper[4672]: I0930 12:32:36.244537 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/perses-operator-54bc95c9fb-27rgt" Sep 30 12:32:36 crc kubenswrapper[4672]: I0930 12:32:36.262019 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"perses-operator-dockercfg-4zw8q" Sep 30 12:32:36 crc kubenswrapper[4672]: I0930 12:32:36.263331 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-54bc95c9fb-27rgt"] Sep 30 12:32:36 crc kubenswrapper[4672]: I0930 12:32:36.306189 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j7swm\" (UniqueName: \"kubernetes.io/projected/961561c7-4ef8-4592-bb9a-53ef762e38ea-kube-api-access-j7swm\") pod \"perses-operator-54bc95c9fb-27rgt\" (UID: \"961561c7-4ef8-4592-bb9a-53ef762e38ea\") " pod="openshift-operators/perses-operator-54bc95c9fb-27rgt" Sep 30 12:32:36 crc kubenswrapper[4672]: I0930 12:32:36.306253 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/a4a3a18a-31ce-496c-b863-bdc8ff9774cb-observability-operator-tls\") pod \"observability-operator-cc5f78dfc-fzmsr\" (UID: \"a4a3a18a-31ce-496c-b863-bdc8ff9774cb\") " pod="openshift-operators/observability-operator-cc5f78dfc-fzmsr" Sep 30 12:32:36 crc kubenswrapper[4672]: I0930 12:32:36.306289 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/961561c7-4ef8-4592-bb9a-53ef762e38ea-openshift-service-ca\") pod \"perses-operator-54bc95c9fb-27rgt\" (UID: \"961561c7-4ef8-4592-bb9a-53ef762e38ea\") " pod="openshift-operators/perses-operator-54bc95c9fb-27rgt" Sep 30 12:32:36 crc kubenswrapper[4672]: I0930 12:32:36.306323 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2w9zv\" (UniqueName: \"kubernetes.io/projected/a4a3a18a-31ce-496c-b863-bdc8ff9774cb-kube-api-access-2w9zv\") pod \"observability-operator-cc5f78dfc-fzmsr\" (UID: \"a4a3a18a-31ce-496c-b863-bdc8ff9774cb\") " pod="openshift-operators/observability-operator-cc5f78dfc-fzmsr" Sep 30 12:32:36 crc kubenswrapper[4672]: I0930 12:32:36.311591 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/a4a3a18a-31ce-496c-b863-bdc8ff9774cb-observability-operator-tls\") pod \"observability-operator-cc5f78dfc-fzmsr\" (UID: \"a4a3a18a-31ce-496c-b863-bdc8ff9774cb\") " pod="openshift-operators/observability-operator-cc5f78dfc-fzmsr" Sep 30 12:32:36 crc kubenswrapper[4672]: I0930 12:32:36.330139 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2w9zv\" (UniqueName: \"kubernetes.io/projected/a4a3a18a-31ce-496c-b863-bdc8ff9774cb-kube-api-access-2w9zv\") pod \"observability-operator-cc5f78dfc-fzmsr\" (UID: \"a4a3a18a-31ce-496c-b863-bdc8ff9774cb\") " pod="openshift-operators/observability-operator-cc5f78dfc-fzmsr" Sep 30 12:32:36 crc kubenswrapper[4672]: I0930 12:32:36.392152 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/observability-operator-cc5f78dfc-fzmsr" Sep 30 12:32:36 crc kubenswrapper[4672]: I0930 12:32:36.407676 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j7swm\" (UniqueName: \"kubernetes.io/projected/961561c7-4ef8-4592-bb9a-53ef762e38ea-kube-api-access-j7swm\") pod \"perses-operator-54bc95c9fb-27rgt\" (UID: \"961561c7-4ef8-4592-bb9a-53ef762e38ea\") " pod="openshift-operators/perses-operator-54bc95c9fb-27rgt" Sep 30 12:32:36 crc kubenswrapper[4672]: I0930 12:32:36.407744 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/961561c7-4ef8-4592-bb9a-53ef762e38ea-openshift-service-ca\") pod \"perses-operator-54bc95c9fb-27rgt\" (UID: \"961561c7-4ef8-4592-bb9a-53ef762e38ea\") " pod="openshift-operators/perses-operator-54bc95c9fb-27rgt" Sep 30 12:32:36 crc kubenswrapper[4672]: I0930 12:32:36.408810 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/961561c7-4ef8-4592-bb9a-53ef762e38ea-openshift-service-ca\") pod \"perses-operator-54bc95c9fb-27rgt\" (UID: \"961561c7-4ef8-4592-bb9a-53ef762e38ea\") " pod="openshift-operators/perses-operator-54bc95c9fb-27rgt" Sep 30 12:32:36 crc kubenswrapper[4672]: I0930 12:32:36.433994 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j7swm\" (UniqueName: \"kubernetes.io/projected/961561c7-4ef8-4592-bb9a-53ef762e38ea-kube-api-access-j7swm\") pod \"perses-operator-54bc95c9fb-27rgt\" (UID: \"961561c7-4ef8-4592-bb9a-53ef762e38ea\") " pod="openshift-operators/perses-operator-54bc95c9fb-27rgt" Sep 30 12:32:36 crc kubenswrapper[4672]: I0930 12:32:36.542709 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-7c8cf85677-k7drs"] Sep 30 12:32:36 crc kubenswrapper[4672]: I0930 12:32:36.583505 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/perses-operator-54bc95c9fb-27rgt" Sep 30 12:32:36 crc kubenswrapper[4672]: I0930 12:32:36.667550 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-cc5f78dfc-fzmsr"] Sep 30 12:32:36 crc kubenswrapper[4672]: I0930 12:32:36.693448 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-67cf9f7c94-vw6lq"] Sep 30 12:32:36 crc kubenswrapper[4672]: I0930 12:32:36.846971 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-67cf9f7c94-kf944"] Sep 30 12:32:37 crc kubenswrapper[4672]: I0930 12:32:37.048351 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-54bc95c9fb-27rgt"] Sep 30 12:32:37 crc kubenswrapper[4672]: W0930 12:32:37.060451 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod961561c7_4ef8_4592_bb9a_53ef762e38ea.slice/crio-2062c5c5655bc9d837a5d01467b460c868a82bbe2341764f083e1376c26afa03 WatchSource:0}: Error finding container 2062c5c5655bc9d837a5d01467b460c868a82bbe2341764f083e1376c26afa03: Status 404 returned error can't find the container with id 2062c5c5655bc9d837a5d01467b460c868a82bbe2341764f083e1376c26afa03 Sep 30 12:32:37 crc kubenswrapper[4672]: I0930 12:32:37.445302 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-67cf9f7c94-kf944" event={"ID":"eaec0e45-413d-4fad-a35b-68a28486053a","Type":"ContainerStarted","Data":"d3cf28b25ed7fd2b8155646892a8c65109fd540ecd23a5a00e8b607c3d69c715"} Sep 30 12:32:37 crc kubenswrapper[4672]: I0930 12:32:37.446999 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-k7drs" event={"ID":"cf46fa05-32de-4c26-82e7-769052afcaa1","Type":"ContainerStarted","Data":"6e52feeb0d434eef426cfd717538006e4d8228bc5632eb948e2ff960fd492092"} Sep 30 12:32:37 crc kubenswrapper[4672]: I0930 12:32:37.451521 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-cc5f78dfc-fzmsr" event={"ID":"a4a3a18a-31ce-496c-b863-bdc8ff9774cb","Type":"ContainerStarted","Data":"ae3a8e24235eae441e626624682107dfac256037b00d217ae36e6fb34a74c08d"} Sep 30 12:32:37 crc kubenswrapper[4672]: I0930 12:32:37.461728 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-54bc95c9fb-27rgt" event={"ID":"961561c7-4ef8-4592-bb9a-53ef762e38ea","Type":"ContainerStarted","Data":"2062c5c5655bc9d837a5d01467b460c868a82bbe2341764f083e1376c26afa03"} Sep 30 12:32:37 crc kubenswrapper[4672]: I0930 12:32:37.474128 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-67cf9f7c94-vw6lq" event={"ID":"95eba142-c439-4920-914e-af904642acc2","Type":"ContainerStarted","Data":"a4d0eaf04643564759637a6c61d58cf1be699dd29fb45ce226486d16e38626c8"} Sep 30 12:32:51 crc kubenswrapper[4672]: E0930 12:32:51.845455 4672 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/cluster-observability-operator/obo-prometheus-rhel9-operator@sha256:e2681bce57dc9c15701f5591532c2dfe8f19778606661339553a28dc003dbca5" Sep 30 12:32:51 crc kubenswrapper[4672]: E0930 
12:32:51.846238 4672 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:prometheus-operator,Image:registry.redhat.io/cluster-observability-operator/obo-prometheus-rhel9-operator@sha256:e2681bce57dc9c15701f5591532c2dfe8f19778606661339553a28dc003dbca5,Command:[],Args:[--prometheus-config-reloader=$(RELATED_IMAGE_PROMETHEUS_CONFIG_RELOADER) --prometheus-instance-selector=app.kubernetes.io/managed-by=observability-operator --alertmanager-instance-selector=app.kubernetes.io/managed-by=observability-operator --thanos-ruler-instance-selector=app.kubernetes.io/managed-by=observability-operator],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:http,HostPort:0,ContainerPort:8080,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:GOGC,Value:30,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_PROMETHEUS_CONFIG_RELOADER,Value:registry.redhat.io/cluster-observability-operator/obo-prometheus-operator-prometheus-config-reloader-rhel9@sha256:8597c48fc71fc6ec8e87dbe40dace4dbb7b817c1039db608af76a0d90f7ac2d0,ValueFrom:nil,},EnvVar{Name:OPERATOR_CONDITION_NAME,Value:cluster-observability-operator.v1.2.2,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{100 -3} {} 100m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{157286400 0} {} 150Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-td4mr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod obo-prometheus-operator-7c8cf85677-k7drs_openshift-operators(cf46fa05-32de-4c26-82e7-769052afcaa1): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Sep 30 12:32:51 crc kubenswrapper[4672]: E0930 12:32:51.848553 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"prometheus-operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-k7drs" podUID="cf46fa05-32de-4c26-82e7-769052afcaa1" Sep 30 12:32:51 crc kubenswrapper[4672]: E0930 12:32:51.888512 4672 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/cluster-observability-operator/obo-prometheus-operator-admission-webhook-rhel9@sha256:e54c1e1301be66933f3ecb01d5a0ca27f58aabfd905b18b7d057bbf23bdb7b0d" Sep 30 12:32:51 crc kubenswrapper[4672]: E0930 12:32:51.888696 4672 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:prometheus-operator-admission-webhook,Image:registry.redhat.io/cluster-observability-operator/obo-prometheus-operator-admission-webhook-rhel9@sha256:e54c1e1301be66933f3ecb01d5a0ca27f58aabfd905b18b7d057bbf23bdb7b0d,Command:[],Args:[--web.enable-tls=true --web.cert-file=/tmp/k8s-webhook-server/serving-certs/tls.crt --web.key-file=/tmp/k8s-webhook-server/serving-certs/tls.key],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_CONDITION_NAME,Value:cluster-observability-operator.v1.2.2,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{209715200 0} {} BinarySI},},Requests:ResourceList{cpu: {{50 -3} {} 50m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:apiservice-cert,ReadOnly:false,MountPath:/apiserver.local.config/certificates,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/tmp/k8s-webhook-server/serving-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod obo-prometheus-operator-admission-webhook-67cf9f7c94-vw6lq_openshift-operators(95eba142-c439-4920-914e-af904642acc2): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Sep 30 12:32:51 crc kubenswrapper[4672]: E0930 12:32:51.890343 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"prometheus-operator-admission-webhook\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-67cf9f7c94-vw6lq" podUID="95eba142-c439-4920-914e-af904642acc2" Sep 30 12:32:52 crc kubenswrapper[4672]: I0930 12:32:52.593059 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-67cf9f7c94-kf944" event={"ID":"eaec0e45-413d-4fad-a35b-68a28486053a","Type":"ContainerStarted","Data":"4968bb3ff769183d0b9a7b5c8a516bbd4808a90c8458c53b0820d8cc5ec56daa"} Sep 30 12:32:52 crc kubenswrapper[4672]: I0930 12:32:52.594606 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-cc5f78dfc-fzmsr" event={"ID":"a4a3a18a-31ce-496c-b863-bdc8ff9774cb","Type":"ContainerStarted","Data":"8f37d1b1cf4d528bba5984be5674187d46c3c30fa2ee0a77aa225a5c01ed5443"} Sep 30 12:32:52 crc kubenswrapper[4672]: I0930 12:32:52.594789 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/observability-operator-cc5f78dfc-fzmsr" Sep 30 12:32:52 crc kubenswrapper[4672]: I0930 12:32:52.598801 4672 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-54bc95c9fb-27rgt" event={"ID":"961561c7-4ef8-4592-bb9a-53ef762e38ea","Type":"ContainerStarted","Data":"dacc28fde16dc0c863ea289db4c891dd0aa420f124e260fe8268c2d33a6346a4"} Sep 30 12:32:52 crc kubenswrapper[4672]: E0930 12:32:52.600682 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"prometheus-operator\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/cluster-observability-operator/obo-prometheus-rhel9-operator@sha256:e2681bce57dc9c15701f5591532c2dfe8f19778606661339553a28dc003dbca5\\\"\"" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-k7drs" podUID="cf46fa05-32de-4c26-82e7-769052afcaa1" Sep 30 12:32:52 crc kubenswrapper[4672]: I0930 12:32:52.631465 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-67cf9f7c94-kf944" podStartSLOduration=2.615155583 podStartE2EDuration="17.631446126s" podCreationTimestamp="2025-09-30 12:32:35 +0000 UTC" firstStartedPulling="2025-09-30 12:32:36.873709608 +0000 UTC m=+648.142947254" lastFinishedPulling="2025-09-30 12:32:51.890000141 +0000 UTC m=+663.159237797" observedRunningTime="2025-09-30 12:32:52.627639329 +0000 UTC m=+663.896876995" watchObservedRunningTime="2025-09-30 12:32:52.631446126 +0000 UTC m=+663.900683782" Sep 30 12:32:52 crc kubenswrapper[4672]: I0930 12:32:52.641935 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/observability-operator-cc5f78dfc-fzmsr" Sep 30 12:32:52 crc kubenswrapper[4672]: I0930 12:32:52.664276 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/perses-operator-54bc95c9fb-27rgt" podStartSLOduration=1.8394577779999999 podStartE2EDuration="16.664238898s" podCreationTimestamp="2025-09-30 12:32:36 +0000 UTC" firstStartedPulling="2025-09-30 12:32:37.065653862 +0000 UTC m=+648.334891498" lastFinishedPulling="2025-09-30 12:32:51.890434972 +0000 UTC m=+663.159672618" observedRunningTime="2025-09-30 12:32:52.659733493 +0000 UTC m=+663.928971139" watchObservedRunningTime="2025-09-30 12:32:52.664238898 +0000 UTC m=+663.933476534" Sep 30 12:32:52 crc kubenswrapper[4672]: I0930 12:32:52.682857 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-operator-cc5f78dfc-fzmsr" podStartSLOduration=1.472446096 podStartE2EDuration="16.682832189s" podCreationTimestamp="2025-09-30 12:32:36 +0000 UTC" firstStartedPulling="2025-09-30 12:32:36.707792403 +0000 UTC m=+647.977030049" lastFinishedPulling="2025-09-30 12:32:51.918178486 +0000 UTC m=+663.187416142" observedRunningTime="2025-09-30 12:32:52.678730315 +0000 UTC m=+663.947967961" watchObservedRunningTime="2025-09-30 12:32:52.682832189 +0000 UTC m=+663.952069835" Sep 30 12:32:53 crc kubenswrapper[4672]: I0930 12:32:53.605970 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-67cf9f7c94-vw6lq" event={"ID":"95eba142-c439-4920-914e-af904642acc2","Type":"ContainerStarted","Data":"cca410c6a3c3bc6605532c0f5d284d8686a55dbe6a4a3e9f14cfd7fd4662566f"} Sep 30 12:32:53 crc kubenswrapper[4672]: I0930 12:32:53.606439 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/perses-operator-54bc95c9fb-27rgt" Sep 30 12:32:53 crc kubenswrapper[4672]: I0930 12:32:53.625098 4672 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-67cf9f7c94-vw6lq" podStartSLOduration=-9223372018.2297 podStartE2EDuration="18.625075736s" podCreationTimestamp="2025-09-30 12:32:35 +0000 UTC" firstStartedPulling="2025-09-30 12:32:36.746366991 +0000 UTC m=+648.015604637" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 12:32:53.622147802 +0000 UTC m=+664.891385448" watchObservedRunningTime="2025-09-30 12:32:53.625075736 +0000 UTC m=+664.894313382" Sep 30 12:33:05 crc kubenswrapper[4672]: I0930 12:33:05.671960 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-k7drs" event={"ID":"cf46fa05-32de-4c26-82e7-769052afcaa1","Type":"ContainerStarted","Data":"d97698f0c78d8432b6fd2d8b86a7e665e5891e885f3daaa5388ba49d290f9260"} Sep 30 12:33:05 crc kubenswrapper[4672]: I0930 12:33:05.695229 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-k7drs" podStartSLOduration=2.468162275 podStartE2EDuration="30.695207397s" podCreationTimestamp="2025-09-30 12:32:35 +0000 UTC" firstStartedPulling="2025-09-30 12:32:36.563370829 +0000 UTC m=+647.832608465" lastFinishedPulling="2025-09-30 12:33:04.790415941 +0000 UTC m=+676.059653587" observedRunningTime="2025-09-30 12:33:05.689459281 +0000 UTC m=+676.958696937" watchObservedRunningTime="2025-09-30 12:33:05.695207397 +0000 UTC m=+676.964445043" Sep 30 12:33:06 crc kubenswrapper[4672]: I0930 12:33:06.587303 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/perses-operator-54bc95c9fb-27rgt" Sep 30 12:33:23 crc kubenswrapper[4672]: I0930 12:33:23.727746 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc7tdmb"] Sep 30 12:33:23 crc kubenswrapper[4672]: I0930 12:33:23.729289 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc7tdmb" Sep 30 12:33:23 crc kubenswrapper[4672]: I0930 12:33:23.731339 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Sep 30 12:33:23 crc kubenswrapper[4672]: I0930 12:33:23.755881 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc7tdmb"] Sep 30 12:33:23 crc kubenswrapper[4672]: I0930 12:33:23.776748 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rt78x\" (UniqueName: \"kubernetes.io/projected/138ae4d9-a29c-4679-b3fa-7953a95cee51-kube-api-access-rt78x\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc7tdmb\" (UID: \"138ae4d9-a29c-4679-b3fa-7953a95cee51\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc7tdmb" Sep 30 12:33:23 crc kubenswrapper[4672]: I0930 12:33:23.776823 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/138ae4d9-a29c-4679-b3fa-7953a95cee51-util\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc7tdmb\" (UID: \"138ae4d9-a29c-4679-b3fa-7953a95cee51\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc7tdmb" Sep 30 12:33:23 crc kubenswrapper[4672]: I0930 12:33:23.776865 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/138ae4d9-a29c-4679-b3fa-7953a95cee51-bundle\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc7tdmb\" (UID: \"138ae4d9-a29c-4679-b3fa-7953a95cee51\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc7tdmb" Sep 30 12:33:23 crc kubenswrapper[4672]: I0930 12:33:23.878064 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rt78x\" (UniqueName: \"kubernetes.io/projected/138ae4d9-a29c-4679-b3fa-7953a95cee51-kube-api-access-rt78x\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc7tdmb\" (UID: \"138ae4d9-a29c-4679-b3fa-7953a95cee51\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc7tdmb" Sep 30 12:33:23 crc kubenswrapper[4672]: I0930 12:33:23.878117 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/138ae4d9-a29c-4679-b3fa-7953a95cee51-util\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc7tdmb\" (UID: \"138ae4d9-a29c-4679-b3fa-7953a95cee51\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc7tdmb" Sep 30 12:33:23 crc kubenswrapper[4672]: I0930 12:33:23.878146 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/138ae4d9-a29c-4679-b3fa-7953a95cee51-bundle\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc7tdmb\" (UID: \"138ae4d9-a29c-4679-b3fa-7953a95cee51\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc7tdmb" Sep 30 12:33:23 crc kubenswrapper[4672]: I0930 12:33:23.878613 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/138ae4d9-a29c-4679-b3fa-7953a95cee51-bundle\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc7tdmb\" (UID: \"138ae4d9-a29c-4679-b3fa-7953a95cee51\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc7tdmb" Sep 30 12:33:23 crc kubenswrapper[4672]: I0930 12:33:23.878809 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/138ae4d9-a29c-4679-b3fa-7953a95cee51-util\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc7tdmb\" (UID: \"138ae4d9-a29c-4679-b3fa-7953a95cee51\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc7tdmb" Sep 30 12:33:23 crc kubenswrapper[4672]: I0930 12:33:23.900909 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rt78x\" (UniqueName: \"kubernetes.io/projected/138ae4d9-a29c-4679-b3fa-7953a95cee51-kube-api-access-rt78x\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc7tdmb\" (UID: \"138ae4d9-a29c-4679-b3fa-7953a95cee51\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc7tdmb" Sep 30 12:33:24 crc kubenswrapper[4672]: I0930 12:33:24.046160 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc7tdmb" Sep 30 12:33:24 crc kubenswrapper[4672]: I0930 12:33:24.512529 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc7tdmb"] Sep 30 12:33:24 crc kubenswrapper[4672]: I0930 12:33:24.795047 4672 generic.go:334] "Generic (PLEG): container finished" podID="138ae4d9-a29c-4679-b3fa-7953a95cee51" containerID="25d98950128b28dd02eab5cf623e7ab5ed499dfc4deb066d9fb0a5fc02b42d2e" exitCode=0 Sep 30 12:33:24 crc kubenswrapper[4672]: I0930 12:33:24.795106 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc7tdmb" event={"ID":"138ae4d9-a29c-4679-b3fa-7953a95cee51","Type":"ContainerDied","Data":"25d98950128b28dd02eab5cf623e7ab5ed499dfc4deb066d9fb0a5fc02b42d2e"} Sep 30 12:33:24 crc kubenswrapper[4672]: I0930 12:33:24.795140 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc7tdmb" event={"ID":"138ae4d9-a29c-4679-b3fa-7953a95cee51","Type":"ContainerStarted","Data":"70e71b542e834557ba689990445b65775d88864ac45246bfd045574481dd5371"} Sep 30 12:33:26 crc kubenswrapper[4672]: I0930 12:33:26.811289 4672 generic.go:334] "Generic (PLEG): container finished" podID="138ae4d9-a29c-4679-b3fa-7953a95cee51" containerID="ff8ab6f7bd7642bb4762900a81aa295a213216959fe55c145b7687afd7683b58" exitCode=0 Sep 30 12:33:26 crc kubenswrapper[4672]: I0930 12:33:26.811361 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc7tdmb" event={"ID":"138ae4d9-a29c-4679-b3fa-7953a95cee51","Type":"ContainerDied","Data":"ff8ab6f7bd7642bb4762900a81aa295a213216959fe55c145b7687afd7683b58"} Sep 30 12:33:27 crc kubenswrapper[4672]: I0930 12:33:27.820739 4672 generic.go:334] "Generic (PLEG): container finished" podID="138ae4d9-a29c-4679-b3fa-7953a95cee51" containerID="f05a26551c948bbb760e58a9950f72be5213e0d4fcf53144eda3f1563c2993e6" exitCode=0 Sep 30 12:33:27 crc kubenswrapper[4672]: I0930 
12:33:27.821098 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc7tdmb" event={"ID":"138ae4d9-a29c-4679-b3fa-7953a95cee51","Type":"ContainerDied","Data":"f05a26551c948bbb760e58a9950f72be5213e0d4fcf53144eda3f1563c2993e6"} Sep 30 12:33:29 crc kubenswrapper[4672]: I0930 12:33:29.080549 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc7tdmb" Sep 30 12:33:29 crc kubenswrapper[4672]: I0930 12:33:29.152937 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/138ae4d9-a29c-4679-b3fa-7953a95cee51-bundle\") pod \"138ae4d9-a29c-4679-b3fa-7953a95cee51\" (UID: \"138ae4d9-a29c-4679-b3fa-7953a95cee51\") " Sep 30 12:33:29 crc kubenswrapper[4672]: I0930 12:33:29.153009 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/138ae4d9-a29c-4679-b3fa-7953a95cee51-util\") pod \"138ae4d9-a29c-4679-b3fa-7953a95cee51\" (UID: \"138ae4d9-a29c-4679-b3fa-7953a95cee51\") " Sep 30 12:33:29 crc kubenswrapper[4672]: I0930 12:33:29.153061 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rt78x\" (UniqueName: \"kubernetes.io/projected/138ae4d9-a29c-4679-b3fa-7953a95cee51-kube-api-access-rt78x\") pod \"138ae4d9-a29c-4679-b3fa-7953a95cee51\" (UID: \"138ae4d9-a29c-4679-b3fa-7953a95cee51\") " Sep 30 12:33:29 crc kubenswrapper[4672]: I0930 12:33:29.154047 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/138ae4d9-a29c-4679-b3fa-7953a95cee51-bundle" (OuterVolumeSpecName: "bundle") pod "138ae4d9-a29c-4679-b3fa-7953a95cee51" (UID: "138ae4d9-a29c-4679-b3fa-7953a95cee51"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 12:33:29 crc kubenswrapper[4672]: I0930 12:33:29.159481 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/138ae4d9-a29c-4679-b3fa-7953a95cee51-kube-api-access-rt78x" (OuterVolumeSpecName: "kube-api-access-rt78x") pod "138ae4d9-a29c-4679-b3fa-7953a95cee51" (UID: "138ae4d9-a29c-4679-b3fa-7953a95cee51"). InnerVolumeSpecName "kube-api-access-rt78x". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 12:33:29 crc kubenswrapper[4672]: I0930 12:33:29.181461 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/138ae4d9-a29c-4679-b3fa-7953a95cee51-util" (OuterVolumeSpecName: "util") pod "138ae4d9-a29c-4679-b3fa-7953a95cee51" (UID: "138ae4d9-a29c-4679-b3fa-7953a95cee51"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 12:33:29 crc kubenswrapper[4672]: I0930 12:33:29.254506 4672 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/138ae4d9-a29c-4679-b3fa-7953a95cee51-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 12:33:29 crc kubenswrapper[4672]: I0930 12:33:29.254552 4672 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/138ae4d9-a29c-4679-b3fa-7953a95cee51-util\") on node \"crc\" DevicePath \"\"" Sep 30 12:33:29 crc kubenswrapper[4672]: I0930 12:33:29.254564 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rt78x\" (UniqueName: \"kubernetes.io/projected/138ae4d9-a29c-4679-b3fa-7953a95cee51-kube-api-access-rt78x\") on node \"crc\" DevicePath \"\"" Sep 30 12:33:29 crc kubenswrapper[4672]: I0930 12:33:29.837530 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc7tdmb" event={"ID":"138ae4d9-a29c-4679-b3fa-7953a95cee51","Type":"ContainerDied","Data":"70e71b542e834557ba689990445b65775d88864ac45246bfd045574481dd5371"} Sep 30 12:33:29 crc kubenswrapper[4672]: I0930 12:33:29.837572 4672 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="70e71b542e834557ba689990445b65775d88864ac45246bfd045574481dd5371" Sep 30 12:33:29 crc kubenswrapper[4672]: I0930 12:33:29.837649 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc7tdmb" Sep 30 12:33:30 crc kubenswrapper[4672]: I0930 12:33:30.995923 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-5d6f6cfd66-qn8f5"] Sep 30 12:33:30 crc kubenswrapper[4672]: E0930 12:33:30.996446 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="138ae4d9-a29c-4679-b3fa-7953a95cee51" containerName="util" Sep 30 12:33:30 crc kubenswrapper[4672]: I0930 12:33:30.996458 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="138ae4d9-a29c-4679-b3fa-7953a95cee51" containerName="util" Sep 30 12:33:30 crc kubenswrapper[4672]: E0930 12:33:30.996476 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="138ae4d9-a29c-4679-b3fa-7953a95cee51" containerName="pull" Sep 30 12:33:30 crc kubenswrapper[4672]: I0930 12:33:30.996482 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="138ae4d9-a29c-4679-b3fa-7953a95cee51" containerName="pull" Sep 30 12:33:30 crc kubenswrapper[4672]: E0930 12:33:30.996493 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="138ae4d9-a29c-4679-b3fa-7953a95cee51" containerName="extract" Sep 30 12:33:30 crc kubenswrapper[4672]: I0930 12:33:30.996499 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="138ae4d9-a29c-4679-b3fa-7953a95cee51" containerName="extract" Sep 30 12:33:30 crc kubenswrapper[4672]: I0930 12:33:30.996590 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="138ae4d9-a29c-4679-b3fa-7953a95cee51" containerName="extract" Sep 30 12:33:30 crc kubenswrapper[4672]: I0930 12:33:30.997004 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-qn8f5" Sep 30 12:33:30 crc kubenswrapper[4672]: I0930 12:33:30.999130 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-x54tf" Sep 30 12:33:31 crc kubenswrapper[4672]: I0930 12:33:31.005935 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-5d6f6cfd66-qn8f5"] Sep 30 12:33:31 crc kubenswrapper[4672]: I0930 12:33:31.009927 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Sep 30 12:33:31 crc kubenswrapper[4672]: I0930 12:33:31.010241 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Sep 30 12:33:31 crc kubenswrapper[4672]: I0930 12:33:31.078098 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nqtzd\" (UniqueName: \"kubernetes.io/projected/845f56f6-18db-419a-9901-e2c4c186ad88-kube-api-access-nqtzd\") pod \"nmstate-operator-5d6f6cfd66-qn8f5\" (UID: \"845f56f6-18db-419a-9901-e2c4c186ad88\") " pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-qn8f5" Sep 30 12:33:31 crc kubenswrapper[4672]: I0930 12:33:31.179829 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nqtzd\" (UniqueName: \"kubernetes.io/projected/845f56f6-18db-419a-9901-e2c4c186ad88-kube-api-access-nqtzd\") pod \"nmstate-operator-5d6f6cfd66-qn8f5\" (UID: \"845f56f6-18db-419a-9901-e2c4c186ad88\") " pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-qn8f5" Sep 30 12:33:31 crc kubenswrapper[4672]: I0930 12:33:31.195528 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nqtzd\" (UniqueName: \"kubernetes.io/projected/845f56f6-18db-419a-9901-e2c4c186ad88-kube-api-access-nqtzd\") pod \"nmstate-operator-5d6f6cfd66-qn8f5\" (UID: \"845f56f6-18db-419a-9901-e2c4c186ad88\") " pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-qn8f5" Sep 30 12:33:31 crc kubenswrapper[4672]: I0930 12:33:31.310753 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-qn8f5" Sep 30 12:33:31 crc kubenswrapper[4672]: I0930 12:33:31.516916 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-5d6f6cfd66-qn8f5"] Sep 30 12:33:31 crc kubenswrapper[4672]: I0930 12:33:31.850699 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-qn8f5" event={"ID":"845f56f6-18db-419a-9901-e2c4c186ad88","Type":"ContainerStarted","Data":"e62a85d7573f2b016c7347e0182ea4814c3d032a18b7973e273900cddc7bcc17"} Sep 30 12:33:33 crc kubenswrapper[4672]: I0930 12:33:33.862861 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-qn8f5" event={"ID":"845f56f6-18db-419a-9901-e2c4c186ad88","Type":"ContainerStarted","Data":"2b88d743c80deb41de06b93835c8d7dc52c52593f76837dc54e72329863c15c2"} Sep 30 12:33:33 crc kubenswrapper[4672]: I0930 12:33:33.878928 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-qn8f5" podStartSLOduration=1.739686882 podStartE2EDuration="3.878906258s" podCreationTimestamp="2025-09-30 12:33:30 +0000 UTC" firstStartedPulling="2025-09-30 12:33:31.532473057 +0000 UTC m=+702.801710703" lastFinishedPulling="2025-09-30 12:33:33.671692443 +0000 UTC m=+704.940930079" observedRunningTime="2025-09-30 12:33:33.87821163 +0000 UTC m=+705.147449276" watchObservedRunningTime="2025-09-30 12:33:33.878906258 +0000 UTC m=+705.148143914" Sep 30 12:33:35 crc kubenswrapper[4672]: I0930 12:33:35.046761 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-58fcddf996-tqkht"] Sep 30 12:33:35 crc kubenswrapper[4672]: I0930 12:33:35.047716 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-58fcddf996-tqkht" Sep 30 12:33:35 crc kubenswrapper[4672]: I0930 12:33:35.050008 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-fgn6m" Sep 30 12:33:35 crc kubenswrapper[4672]: I0930 12:33:35.059255 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-58fcddf996-tqkht"] Sep 30 12:33:35 crc kubenswrapper[4672]: I0930 12:33:35.076480 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-6d689559c5-4plc5"] Sep 30 12:33:35 crc kubenswrapper[4672]: I0930 12:33:35.077347 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-6d689559c5-4plc5" Sep 30 12:33:35 crc kubenswrapper[4672]: I0930 12:33:35.078983 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Sep 30 12:33:35 crc kubenswrapper[4672]: I0930 12:33:35.118683 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-d6nbx"] Sep 30 12:33:35 crc kubenswrapper[4672]: I0930 12:33:35.119654 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-d6nbx" Sep 30 12:33:35 crc kubenswrapper[4672]: I0930 12:33:35.136324 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-6d689559c5-4plc5"] Sep 30 12:33:35 crc kubenswrapper[4672]: I0930 12:33:35.151165 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/fa1a3970-c37a-4cdf-ba19-b868c581c02e-nmstate-lock\") pod \"nmstate-handler-d6nbx\" (UID: \"fa1a3970-c37a-4cdf-ba19-b868c581c02e\") " pod="openshift-nmstate/nmstate-handler-d6nbx" Sep 30 12:33:35 crc kubenswrapper[4672]: I0930 12:33:35.151217 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4jx2w\" (UniqueName: \"kubernetes.io/projected/6573b5fd-c58a-4016-84ad-a21aa5622e2a-kube-api-access-4jx2w\") pod \"nmstate-webhook-6d689559c5-4plc5\" (UID: \"6573b5fd-c58a-4016-84ad-a21aa5622e2a\") " pod="openshift-nmstate/nmstate-webhook-6d689559c5-4plc5" Sep 30 12:33:35 crc kubenswrapper[4672]: I0930 12:33:35.151298 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/fa1a3970-c37a-4cdf-ba19-b868c581c02e-dbus-socket\") pod \"nmstate-handler-d6nbx\" (UID: \"fa1a3970-c37a-4cdf-ba19-b868c581c02e\") " pod="openshift-nmstate/nmstate-handler-d6nbx" Sep 30 12:33:35 crc kubenswrapper[4672]: I0930 12:33:35.151327 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/fa1a3970-c37a-4cdf-ba19-b868c581c02e-ovs-socket\") pod \"nmstate-handler-d6nbx\" (UID: \"fa1a3970-c37a-4cdf-ba19-b868c581c02e\") " pod="openshift-nmstate/nmstate-handler-d6nbx" Sep 30 12:33:35 crc kubenswrapper[4672]: I0930 12:33:35.151363 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hlvxn\" (UniqueName: \"kubernetes.io/projected/fa1a3970-c37a-4cdf-ba19-b868c581c02e-kube-api-access-hlvxn\") pod \"nmstate-handler-d6nbx\" (UID: \"fa1a3970-c37a-4cdf-ba19-b868c581c02e\") " pod="openshift-nmstate/nmstate-handler-d6nbx" Sep 30 12:33:35 crc kubenswrapper[4672]: I0930 12:33:35.151443 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4gvxf\" (UniqueName: \"kubernetes.io/projected/a767db76-9b74-4ab7-a541-5f5981850723-kube-api-access-4gvxf\") pod \"nmstate-metrics-58fcddf996-tqkht\" (UID: \"a767db76-9b74-4ab7-a541-5f5981850723\") " pod="openshift-nmstate/nmstate-metrics-58fcddf996-tqkht" Sep 30 12:33:35 crc kubenswrapper[4672]: I0930 12:33:35.151479 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/6573b5fd-c58a-4016-84ad-a21aa5622e2a-tls-key-pair\") pod \"nmstate-webhook-6d689559c5-4plc5\" (UID: \"6573b5fd-c58a-4016-84ad-a21aa5622e2a\") " pod="openshift-nmstate/nmstate-webhook-6d689559c5-4plc5" Sep 30 12:33:35 crc kubenswrapper[4672]: I0930 12:33:35.241549 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-864bb6dfb5-fphxt"] Sep 30 12:33:35 crc kubenswrapper[4672]: I0930 12:33:35.242450 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-fphxt" Sep 30 12:33:35 crc kubenswrapper[4672]: I0930 12:33:35.245422 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Sep 30 12:33:35 crc kubenswrapper[4672]: I0930 12:33:35.245423 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-6tn2p" Sep 30 12:33:35 crc kubenswrapper[4672]: I0930 12:33:35.245520 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Sep 30 12:33:35 crc kubenswrapper[4672]: I0930 12:33:35.253896 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4gvxf\" (UniqueName: \"kubernetes.io/projected/a767db76-9b74-4ab7-a541-5f5981850723-kube-api-access-4gvxf\") pod \"nmstate-metrics-58fcddf996-tqkht\" (UID: \"a767db76-9b74-4ab7-a541-5f5981850723\") " pod="openshift-nmstate/nmstate-metrics-58fcddf996-tqkht" Sep 30 12:33:35 crc kubenswrapper[4672]: I0930 12:33:35.253945 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/6573b5fd-c58a-4016-84ad-a21aa5622e2a-tls-key-pair\") pod \"nmstate-webhook-6d689559c5-4plc5\" (UID: \"6573b5fd-c58a-4016-84ad-a21aa5622e2a\") " pod="openshift-nmstate/nmstate-webhook-6d689559c5-4plc5" Sep 30 12:33:35 crc kubenswrapper[4672]: I0930 12:33:35.253985 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/fa1a3970-c37a-4cdf-ba19-b868c581c02e-nmstate-lock\") pod \"nmstate-handler-d6nbx\" (UID: \"fa1a3970-c37a-4cdf-ba19-b868c581c02e\") " pod="openshift-nmstate/nmstate-handler-d6nbx" Sep 30 12:33:35 crc kubenswrapper[4672]: I0930 12:33:35.254011 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4jx2w\" (UniqueName: \"kubernetes.io/projected/6573b5fd-c58a-4016-84ad-a21aa5622e2a-kube-api-access-4jx2w\") pod \"nmstate-webhook-6d689559c5-4plc5\" (UID: \"6573b5fd-c58a-4016-84ad-a21aa5622e2a\") " pod="openshift-nmstate/nmstate-webhook-6d689559c5-4plc5" Sep 30 12:33:35 crc kubenswrapper[4672]: I0930 12:33:35.254033 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/fa1a3970-c37a-4cdf-ba19-b868c581c02e-dbus-socket\") pod \"nmstate-handler-d6nbx\" (UID: \"fa1a3970-c37a-4cdf-ba19-b868c581c02e\") " pod="openshift-nmstate/nmstate-handler-d6nbx" Sep 30 12:33:35 crc kubenswrapper[4672]: I0930 12:33:35.254048 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/fa1a3970-c37a-4cdf-ba19-b868c581c02e-ovs-socket\") pod \"nmstate-handler-d6nbx\" (UID: \"fa1a3970-c37a-4cdf-ba19-b868c581c02e\") " pod="openshift-nmstate/nmstate-handler-d6nbx" Sep 30 12:33:35 crc kubenswrapper[4672]: I0930 12:33:35.254077 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hlvxn\" (UniqueName: \"kubernetes.io/projected/fa1a3970-c37a-4cdf-ba19-b868c581c02e-kube-api-access-hlvxn\") pod \"nmstate-handler-d6nbx\" (UID: \"fa1a3970-c37a-4cdf-ba19-b868c581c02e\") " pod="openshift-nmstate/nmstate-handler-d6nbx" Sep 30 12:33:35 crc kubenswrapper[4672]: I0930 12:33:35.254376 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-864bb6dfb5-fphxt"] Sep 
30 12:33:35 crc kubenswrapper[4672]: E0930 12:33:35.254554 4672 secret.go:188] Couldn't get secret openshift-nmstate/openshift-nmstate-webhook: secret "openshift-nmstate-webhook" not found Sep 30 12:33:35 crc kubenswrapper[4672]: E0930 12:33:35.254602 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6573b5fd-c58a-4016-84ad-a21aa5622e2a-tls-key-pair podName:6573b5fd-c58a-4016-84ad-a21aa5622e2a nodeName:}" failed. No retries permitted until 2025-09-30 12:33:35.754585917 +0000 UTC m=+707.023823563 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-key-pair" (UniqueName: "kubernetes.io/secret/6573b5fd-c58a-4016-84ad-a21aa5622e2a-tls-key-pair") pod "nmstate-webhook-6d689559c5-4plc5" (UID: "6573b5fd-c58a-4016-84ad-a21aa5622e2a") : secret "openshift-nmstate-webhook" not found Sep 30 12:33:35 crc kubenswrapper[4672]: I0930 12:33:35.254715 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/fa1a3970-c37a-4cdf-ba19-b868c581c02e-nmstate-lock\") pod \"nmstate-handler-d6nbx\" (UID: \"fa1a3970-c37a-4cdf-ba19-b868c581c02e\") " pod="openshift-nmstate/nmstate-handler-d6nbx" Sep 30 12:33:35 crc kubenswrapper[4672]: I0930 12:33:35.255064 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/fa1a3970-c37a-4cdf-ba19-b868c581c02e-dbus-socket\") pod \"nmstate-handler-d6nbx\" (UID: \"fa1a3970-c37a-4cdf-ba19-b868c581c02e\") " pod="openshift-nmstate/nmstate-handler-d6nbx" Sep 30 12:33:35 crc kubenswrapper[4672]: I0930 12:33:35.255095 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/fa1a3970-c37a-4cdf-ba19-b868c581c02e-ovs-socket\") pod \"nmstate-handler-d6nbx\" (UID: \"fa1a3970-c37a-4cdf-ba19-b868c581c02e\") " pod="openshift-nmstate/nmstate-handler-d6nbx" Sep 30 12:33:35 crc kubenswrapper[4672]: I0930 12:33:35.277758 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hlvxn\" (UniqueName: \"kubernetes.io/projected/fa1a3970-c37a-4cdf-ba19-b868c581c02e-kube-api-access-hlvxn\") pod \"nmstate-handler-d6nbx\" (UID: \"fa1a3970-c37a-4cdf-ba19-b868c581c02e\") " pod="openshift-nmstate/nmstate-handler-d6nbx" Sep 30 12:33:35 crc kubenswrapper[4672]: I0930 12:33:35.277915 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4gvxf\" (UniqueName: \"kubernetes.io/projected/a767db76-9b74-4ab7-a541-5f5981850723-kube-api-access-4gvxf\") pod \"nmstate-metrics-58fcddf996-tqkht\" (UID: \"a767db76-9b74-4ab7-a541-5f5981850723\") " pod="openshift-nmstate/nmstate-metrics-58fcddf996-tqkht" Sep 30 12:33:35 crc kubenswrapper[4672]: I0930 12:33:35.279913 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4jx2w\" (UniqueName: \"kubernetes.io/projected/6573b5fd-c58a-4016-84ad-a21aa5622e2a-kube-api-access-4jx2w\") pod \"nmstate-webhook-6d689559c5-4plc5\" (UID: \"6573b5fd-c58a-4016-84ad-a21aa5622e2a\") " pod="openshift-nmstate/nmstate-webhook-6d689559c5-4plc5" Sep 30 12:33:35 crc kubenswrapper[4672]: I0930 12:33:35.355207 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/8394a3ed-db36-4579-ac2b-8e3f1ce579d1-plugin-serving-cert\") pod \"nmstate-console-plugin-864bb6dfb5-fphxt\" (UID: \"8394a3ed-db36-4579-ac2b-8e3f1ce579d1\") " 
pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-fphxt" Sep 30 12:33:35 crc kubenswrapper[4672]: I0930 12:33:35.355277 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/8394a3ed-db36-4579-ac2b-8e3f1ce579d1-nginx-conf\") pod \"nmstate-console-plugin-864bb6dfb5-fphxt\" (UID: \"8394a3ed-db36-4579-ac2b-8e3f1ce579d1\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-fphxt" Sep 30 12:33:35 crc kubenswrapper[4672]: I0930 12:33:35.355301 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-trs66\" (UniqueName: \"kubernetes.io/projected/8394a3ed-db36-4579-ac2b-8e3f1ce579d1-kube-api-access-trs66\") pod \"nmstate-console-plugin-864bb6dfb5-fphxt\" (UID: \"8394a3ed-db36-4579-ac2b-8e3f1ce579d1\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-fphxt" Sep 30 12:33:35 crc kubenswrapper[4672]: I0930 12:33:35.412317 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-58fcddf996-tqkht" Sep 30 12:33:35 crc kubenswrapper[4672]: I0930 12:33:35.435039 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-d6nbx" Sep 30 12:33:35 crc kubenswrapper[4672]: I0930 12:33:35.457219 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/8394a3ed-db36-4579-ac2b-8e3f1ce579d1-plugin-serving-cert\") pod \"nmstate-console-plugin-864bb6dfb5-fphxt\" (UID: \"8394a3ed-db36-4579-ac2b-8e3f1ce579d1\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-fphxt" Sep 30 12:33:35 crc kubenswrapper[4672]: I0930 12:33:35.457309 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/8394a3ed-db36-4579-ac2b-8e3f1ce579d1-nginx-conf\") pod \"nmstate-console-plugin-864bb6dfb5-fphxt\" (UID: \"8394a3ed-db36-4579-ac2b-8e3f1ce579d1\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-fphxt" Sep 30 12:33:35 crc kubenswrapper[4672]: I0930 12:33:35.457332 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-trs66\" (UniqueName: \"kubernetes.io/projected/8394a3ed-db36-4579-ac2b-8e3f1ce579d1-kube-api-access-trs66\") pod \"nmstate-console-plugin-864bb6dfb5-fphxt\" (UID: \"8394a3ed-db36-4579-ac2b-8e3f1ce579d1\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-fphxt" Sep 30 12:33:35 crc kubenswrapper[4672]: I0930 12:33:35.458743 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/8394a3ed-db36-4579-ac2b-8e3f1ce579d1-nginx-conf\") pod \"nmstate-console-plugin-864bb6dfb5-fphxt\" (UID: \"8394a3ed-db36-4579-ac2b-8e3f1ce579d1\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-fphxt" Sep 30 12:33:35 crc kubenswrapper[4672]: I0930 12:33:35.474949 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/8394a3ed-db36-4579-ac2b-8e3f1ce579d1-plugin-serving-cert\") pod \"nmstate-console-plugin-864bb6dfb5-fphxt\" (UID: \"8394a3ed-db36-4579-ac2b-8e3f1ce579d1\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-fphxt" Sep 30 12:33:35 crc kubenswrapper[4672]: I0930 12:33:35.479610 4672 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-console/console-7498646b45-d9fzd"] Sep 30 12:33:35 crc kubenswrapper[4672]: I0930 12:33:35.481253 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7498646b45-d9fzd" Sep 30 12:33:35 crc kubenswrapper[4672]: I0930 12:33:35.491801 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-trs66\" (UniqueName: \"kubernetes.io/projected/8394a3ed-db36-4579-ac2b-8e3f1ce579d1-kube-api-access-trs66\") pod \"nmstate-console-plugin-864bb6dfb5-fphxt\" (UID: \"8394a3ed-db36-4579-ac2b-8e3f1ce579d1\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-fphxt" Sep 30 12:33:35 crc kubenswrapper[4672]: I0930 12:33:35.492387 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7498646b45-d9fzd"] Sep 30 12:33:35 crc kubenswrapper[4672]: I0930 12:33:35.558188 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7e3b5d4d-263d-46b8-8943-469ec9c62de4-trusted-ca-bundle\") pod \"console-7498646b45-d9fzd\" (UID: \"7e3b5d4d-263d-46b8-8943-469ec9c62de4\") " pod="openshift-console/console-7498646b45-d9fzd" Sep 30 12:33:35 crc kubenswrapper[4672]: I0930 12:33:35.558235 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mblvr\" (UniqueName: \"kubernetes.io/projected/7e3b5d4d-263d-46b8-8943-469ec9c62de4-kube-api-access-mblvr\") pod \"console-7498646b45-d9fzd\" (UID: \"7e3b5d4d-263d-46b8-8943-469ec9c62de4\") " pod="openshift-console/console-7498646b45-d9fzd" Sep 30 12:33:35 crc kubenswrapper[4672]: I0930 12:33:35.558434 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7e3b5d4d-263d-46b8-8943-469ec9c62de4-console-config\") pod \"console-7498646b45-d9fzd\" (UID: \"7e3b5d4d-263d-46b8-8943-469ec9c62de4\") " pod="openshift-console/console-7498646b45-d9fzd" Sep 30 12:33:35 crc kubenswrapper[4672]: I0930 12:33:35.558516 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7e3b5d4d-263d-46b8-8943-469ec9c62de4-console-serving-cert\") pod \"console-7498646b45-d9fzd\" (UID: \"7e3b5d4d-263d-46b8-8943-469ec9c62de4\") " pod="openshift-console/console-7498646b45-d9fzd" Sep 30 12:33:35 crc kubenswrapper[4672]: I0930 12:33:35.558618 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7e3b5d4d-263d-46b8-8943-469ec9c62de4-service-ca\") pod \"console-7498646b45-d9fzd\" (UID: \"7e3b5d4d-263d-46b8-8943-469ec9c62de4\") " pod="openshift-console/console-7498646b45-d9fzd" Sep 30 12:33:35 crc kubenswrapper[4672]: I0930 12:33:35.558689 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7e3b5d4d-263d-46b8-8943-469ec9c62de4-oauth-serving-cert\") pod \"console-7498646b45-d9fzd\" (UID: \"7e3b5d4d-263d-46b8-8943-469ec9c62de4\") " pod="openshift-console/console-7498646b45-d9fzd" Sep 30 12:33:35 crc kubenswrapper[4672]: I0930 12:33:35.558892 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/7e3b5d4d-263d-46b8-8943-469ec9c62de4-console-oauth-config\") pod \"console-7498646b45-d9fzd\" (UID: \"7e3b5d4d-263d-46b8-8943-469ec9c62de4\") " pod="openshift-console/console-7498646b45-d9fzd" Sep 30 12:33:35 crc kubenswrapper[4672]: I0930 12:33:35.571906 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-fphxt" Sep 30 12:33:35 crc kubenswrapper[4672]: I0930 12:33:35.660913 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7e3b5d4d-263d-46b8-8943-469ec9c62de4-console-oauth-config\") pod \"console-7498646b45-d9fzd\" (UID: \"7e3b5d4d-263d-46b8-8943-469ec9c62de4\") " pod="openshift-console/console-7498646b45-d9fzd" Sep 30 12:33:35 crc kubenswrapper[4672]: I0930 12:33:35.660978 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7e3b5d4d-263d-46b8-8943-469ec9c62de4-trusted-ca-bundle\") pod \"console-7498646b45-d9fzd\" (UID: \"7e3b5d4d-263d-46b8-8943-469ec9c62de4\") " pod="openshift-console/console-7498646b45-d9fzd" Sep 30 12:33:35 crc kubenswrapper[4672]: I0930 12:33:35.661003 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mblvr\" (UniqueName: \"kubernetes.io/projected/7e3b5d4d-263d-46b8-8943-469ec9c62de4-kube-api-access-mblvr\") pod \"console-7498646b45-d9fzd\" (UID: \"7e3b5d4d-263d-46b8-8943-469ec9c62de4\") " pod="openshift-console/console-7498646b45-d9fzd" Sep 30 12:33:35 crc kubenswrapper[4672]: I0930 12:33:35.661032 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7e3b5d4d-263d-46b8-8943-469ec9c62de4-console-config\") pod \"console-7498646b45-d9fzd\" (UID: \"7e3b5d4d-263d-46b8-8943-469ec9c62de4\") " pod="openshift-console/console-7498646b45-d9fzd" Sep 30 12:33:35 crc kubenswrapper[4672]: I0930 12:33:35.661083 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7e3b5d4d-263d-46b8-8943-469ec9c62de4-console-serving-cert\") pod \"console-7498646b45-d9fzd\" (UID: \"7e3b5d4d-263d-46b8-8943-469ec9c62de4\") " pod="openshift-console/console-7498646b45-d9fzd" Sep 30 12:33:35 crc kubenswrapper[4672]: I0930 12:33:35.661146 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7e3b5d4d-263d-46b8-8943-469ec9c62de4-service-ca\") pod \"console-7498646b45-d9fzd\" (UID: \"7e3b5d4d-263d-46b8-8943-469ec9c62de4\") " pod="openshift-console/console-7498646b45-d9fzd" Sep 30 12:33:35 crc kubenswrapper[4672]: I0930 12:33:35.661171 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7e3b5d4d-263d-46b8-8943-469ec9c62de4-oauth-serving-cert\") pod \"console-7498646b45-d9fzd\" (UID: \"7e3b5d4d-263d-46b8-8943-469ec9c62de4\") " pod="openshift-console/console-7498646b45-d9fzd" Sep 30 12:33:35 crc kubenswrapper[4672]: I0930 12:33:35.662422 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7e3b5d4d-263d-46b8-8943-469ec9c62de4-console-config\") pod \"console-7498646b45-d9fzd\" (UID: \"7e3b5d4d-263d-46b8-8943-469ec9c62de4\") " 
pod="openshift-console/console-7498646b45-d9fzd" Sep 30 12:33:35 crc kubenswrapper[4672]: I0930 12:33:35.662441 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7e3b5d4d-263d-46b8-8943-469ec9c62de4-oauth-serving-cert\") pod \"console-7498646b45-d9fzd\" (UID: \"7e3b5d4d-263d-46b8-8943-469ec9c62de4\") " pod="openshift-console/console-7498646b45-d9fzd" Sep 30 12:33:35 crc kubenswrapper[4672]: I0930 12:33:35.662680 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7e3b5d4d-263d-46b8-8943-469ec9c62de4-service-ca\") pod \"console-7498646b45-d9fzd\" (UID: \"7e3b5d4d-263d-46b8-8943-469ec9c62de4\") " pod="openshift-console/console-7498646b45-d9fzd" Sep 30 12:33:35 crc kubenswrapper[4672]: I0930 12:33:35.664026 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7e3b5d4d-263d-46b8-8943-469ec9c62de4-trusted-ca-bundle\") pod \"console-7498646b45-d9fzd\" (UID: \"7e3b5d4d-263d-46b8-8943-469ec9c62de4\") " pod="openshift-console/console-7498646b45-d9fzd" Sep 30 12:33:35 crc kubenswrapper[4672]: I0930 12:33:35.665134 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7e3b5d4d-263d-46b8-8943-469ec9c62de4-console-oauth-config\") pod \"console-7498646b45-d9fzd\" (UID: \"7e3b5d4d-263d-46b8-8943-469ec9c62de4\") " pod="openshift-console/console-7498646b45-d9fzd" Sep 30 12:33:35 crc kubenswrapper[4672]: I0930 12:33:35.667328 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7e3b5d4d-263d-46b8-8943-469ec9c62de4-console-serving-cert\") pod \"console-7498646b45-d9fzd\" (UID: \"7e3b5d4d-263d-46b8-8943-469ec9c62de4\") " pod="openshift-console/console-7498646b45-d9fzd" Sep 30 12:33:35 crc kubenswrapper[4672]: I0930 12:33:35.684356 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mblvr\" (UniqueName: \"kubernetes.io/projected/7e3b5d4d-263d-46b8-8943-469ec9c62de4-kube-api-access-mblvr\") pod \"console-7498646b45-d9fzd\" (UID: \"7e3b5d4d-263d-46b8-8943-469ec9c62de4\") " pod="openshift-console/console-7498646b45-d9fzd" Sep 30 12:33:35 crc kubenswrapper[4672]: I0930 12:33:35.762575 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/6573b5fd-c58a-4016-84ad-a21aa5622e2a-tls-key-pair\") pod \"nmstate-webhook-6d689559c5-4plc5\" (UID: \"6573b5fd-c58a-4016-84ad-a21aa5622e2a\") " pod="openshift-nmstate/nmstate-webhook-6d689559c5-4plc5" Sep 30 12:33:35 crc kubenswrapper[4672]: I0930 12:33:35.766039 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/6573b5fd-c58a-4016-84ad-a21aa5622e2a-tls-key-pair\") pod \"nmstate-webhook-6d689559c5-4plc5\" (UID: \"6573b5fd-c58a-4016-84ad-a21aa5622e2a\") " pod="openshift-nmstate/nmstate-webhook-6d689559c5-4plc5" Sep 30 12:33:35 crc kubenswrapper[4672]: I0930 12:33:35.835316 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7498646b45-d9fzd" Sep 30 12:33:35 crc kubenswrapper[4672]: I0930 12:33:35.862392 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-864bb6dfb5-fphxt"] Sep 30 12:33:35 crc kubenswrapper[4672]: I0930 12:33:35.880772 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-d6nbx" event={"ID":"fa1a3970-c37a-4cdf-ba19-b868c581c02e","Type":"ContainerStarted","Data":"778a0a5b4c4c4e02c9ef28be862834fe78043c5c620310f57e90615bf8be98ea"} Sep 30 12:33:35 crc kubenswrapper[4672]: I0930 12:33:35.930981 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-58fcddf996-tqkht"] Sep 30 12:33:35 crc kubenswrapper[4672]: W0930 12:33:35.947518 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda767db76_9b74_4ab7_a541_5f5981850723.slice/crio-5e4fbae2a7b932ec75cd874a7770dece00b56016aa1c8b041727192bb86fe733 WatchSource:0}: Error finding container 5e4fbae2a7b932ec75cd874a7770dece00b56016aa1c8b041727192bb86fe733: Status 404 returned error can't find the container with id 5e4fbae2a7b932ec75cd874a7770dece00b56016aa1c8b041727192bb86fe733 Sep 30 12:33:36 crc kubenswrapper[4672]: I0930 12:33:36.020034 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-6d689559c5-4plc5" Sep 30 12:33:36 crc kubenswrapper[4672]: I0930 12:33:36.056777 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7498646b45-d9fzd"] Sep 30 12:33:36 crc kubenswrapper[4672]: W0930 12:33:36.078855 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7e3b5d4d_263d_46b8_8943_469ec9c62de4.slice/crio-855b8603dde57bf9d3add5def633dc2eba9ee69dc83f72ce453b7e3b0492a704 WatchSource:0}: Error finding container 855b8603dde57bf9d3add5def633dc2eba9ee69dc83f72ce453b7e3b0492a704: Status 404 returned error can't find the container with id 855b8603dde57bf9d3add5def633dc2eba9ee69dc83f72ce453b7e3b0492a704 Sep 30 12:33:36 crc kubenswrapper[4672]: I0930 12:33:36.236335 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-6d689559c5-4plc5"] Sep 30 12:33:36 crc kubenswrapper[4672]: I0930 12:33:36.886976 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-6d689559c5-4plc5" event={"ID":"6573b5fd-c58a-4016-84ad-a21aa5622e2a","Type":"ContainerStarted","Data":"0ad997a2c4d9f38c7f54da40a2d3aea88cf10db13981a41d61c1688cbc35f032"} Sep 30 12:33:36 crc kubenswrapper[4672]: I0930 12:33:36.888283 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-58fcddf996-tqkht" event={"ID":"a767db76-9b74-4ab7-a541-5f5981850723","Type":"ContainerStarted","Data":"5e4fbae2a7b932ec75cd874a7770dece00b56016aa1c8b041727192bb86fe733"} Sep 30 12:33:36 crc kubenswrapper[4672]: I0930 12:33:36.889171 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-fphxt" event={"ID":"8394a3ed-db36-4579-ac2b-8e3f1ce579d1","Type":"ContainerStarted","Data":"8269287e941f251d2481e1666a0bc8ff2038a2967d78b0d205d0ef44e969ca13"} Sep 30 12:33:36 crc kubenswrapper[4672]: I0930 12:33:36.890636 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7498646b45-d9fzd" 
event={"ID":"7e3b5d4d-263d-46b8-8943-469ec9c62de4","Type":"ContainerStarted","Data":"7bf4a7b82684ab229547b79af0cee8929aa549abdbbd37e46e04dac6de917c49"} Sep 30 12:33:36 crc kubenswrapper[4672]: I0930 12:33:36.890683 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7498646b45-d9fzd" event={"ID":"7e3b5d4d-263d-46b8-8943-469ec9c62de4","Type":"ContainerStarted","Data":"855b8603dde57bf9d3add5def633dc2eba9ee69dc83f72ce453b7e3b0492a704"} Sep 30 12:33:36 crc kubenswrapper[4672]: I0930 12:33:36.909934 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-7498646b45-d9fzd" podStartSLOduration=1.9099153100000001 podStartE2EDuration="1.90991531s" podCreationTimestamp="2025-09-30 12:33:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 12:33:36.909204192 +0000 UTC m=+708.178441868" watchObservedRunningTime="2025-09-30 12:33:36.90991531 +0000 UTC m=+708.179152956" Sep 30 12:33:39 crc kubenswrapper[4672]: I0930 12:33:39.927314 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-d6nbx" event={"ID":"fa1a3970-c37a-4cdf-ba19-b868c581c02e","Type":"ContainerStarted","Data":"2ad411b3e70013e29b9ff749d34e3c7236221f8e3ce846fa9655e4aecccede0f"} Sep 30 12:33:39 crc kubenswrapper[4672]: I0930 12:33:39.928149 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-d6nbx" Sep 30 12:33:39 crc kubenswrapper[4672]: I0930 12:33:39.928959 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-58fcddf996-tqkht" event={"ID":"a767db76-9b74-4ab7-a541-5f5981850723","Type":"ContainerStarted","Data":"fdde27a98b3ef14c858b106fa6b651ef58c5bff9d239a4fb110ec440c94bb9e5"} Sep 30 12:33:39 crc kubenswrapper[4672]: I0930 12:33:39.930598 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-fphxt" event={"ID":"8394a3ed-db36-4579-ac2b-8e3f1ce579d1","Type":"ContainerStarted","Data":"fe04436bc0e080c5a0d72c3f8d7fd4bbcb20cd9a6b07802a6652872c5f627f1d"} Sep 30 12:33:39 crc kubenswrapper[4672]: I0930 12:33:39.932692 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-6d689559c5-4plc5" event={"ID":"6573b5fd-c58a-4016-84ad-a21aa5622e2a","Type":"ContainerStarted","Data":"aa24101173eaeb4005e50ae91599d8373344bc9d28cfe2d6f48e97f3480c94f3"} Sep 30 12:33:39 crc kubenswrapper[4672]: I0930 12:33:39.933038 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-6d689559c5-4plc5" Sep 30 12:33:39 crc kubenswrapper[4672]: I0930 12:33:39.950637 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-d6nbx" podStartSLOduration=1.7610263339999999 podStartE2EDuration="4.950614428s" podCreationTimestamp="2025-09-30 12:33:35 +0000 UTC" firstStartedPulling="2025-09-30 12:33:35.48574455 +0000 UTC m=+706.754982196" lastFinishedPulling="2025-09-30 12:33:38.675332644 +0000 UTC m=+709.944570290" observedRunningTime="2025-09-30 12:33:39.943318383 +0000 UTC m=+711.212556029" watchObservedRunningTime="2025-09-30 12:33:39.950614428 +0000 UTC m=+711.219852074" Sep 30 12:33:39 crc kubenswrapper[4672]: I0930 12:33:39.962837 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-6d689559c5-4plc5" 
podStartSLOduration=2.526353075 podStartE2EDuration="4.962821288s" podCreationTimestamp="2025-09-30 12:33:35 +0000 UTC" firstStartedPulling="2025-09-30 12:33:36.243636452 +0000 UTC m=+707.512874108" lastFinishedPulling="2025-09-30 12:33:38.680104675 +0000 UTC m=+709.949342321" observedRunningTime="2025-09-30 12:33:39.961213677 +0000 UTC m=+711.230451323" watchObservedRunningTime="2025-09-30 12:33:39.962821288 +0000 UTC m=+711.232058934" Sep 30 12:33:41 crc kubenswrapper[4672]: I0930 12:33:41.950997 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-58fcddf996-tqkht" event={"ID":"a767db76-9b74-4ab7-a541-5f5981850723","Type":"ContainerStarted","Data":"7edf602671ef1abe0df73c3f1b11ee3fafc3c7c8ec4d12e05b9b371b68161a10"} Sep 30 12:33:41 crc kubenswrapper[4672]: I0930 12:33:41.980393 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-fphxt" podStartSLOduration=4.178725152 podStartE2EDuration="6.980361306s" podCreationTimestamp="2025-09-30 12:33:35 +0000 UTC" firstStartedPulling="2025-09-30 12:33:35.873441963 +0000 UTC m=+707.142679609" lastFinishedPulling="2025-09-30 12:33:38.675078117 +0000 UTC m=+709.944315763" observedRunningTime="2025-09-30 12:33:40.020990573 +0000 UTC m=+711.290228219" watchObservedRunningTime="2025-09-30 12:33:41.980361306 +0000 UTC m=+713.249598972" Sep 30 12:33:45 crc kubenswrapper[4672]: I0930 12:33:45.469487 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-d6nbx" Sep 30 12:33:45 crc kubenswrapper[4672]: I0930 12:33:45.498689 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-58fcddf996-tqkht" podStartSLOduration=5.429328818 podStartE2EDuration="10.498641396s" podCreationTimestamp="2025-09-30 12:33:35 +0000 UTC" firstStartedPulling="2025-09-30 12:33:35.952632471 +0000 UTC m=+707.221870117" lastFinishedPulling="2025-09-30 12:33:41.021945049 +0000 UTC m=+712.291182695" observedRunningTime="2025-09-30 12:33:41.979456553 +0000 UTC m=+713.248694279" watchObservedRunningTime="2025-09-30 12:33:45.498641396 +0000 UTC m=+716.767879052" Sep 30 12:33:45 crc kubenswrapper[4672]: I0930 12:33:45.836588 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-7498646b45-d9fzd" Sep 30 12:33:45 crc kubenswrapper[4672]: I0930 12:33:45.837665 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-7498646b45-d9fzd" Sep 30 12:33:45 crc kubenswrapper[4672]: I0930 12:33:45.846077 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-7498646b45-d9fzd" Sep 30 12:33:45 crc kubenswrapper[4672]: I0930 12:33:45.993513 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-7498646b45-d9fzd" Sep 30 12:33:46 crc kubenswrapper[4672]: I0930 12:33:46.066723 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-x8stp"] Sep 30 12:33:56 crc kubenswrapper[4672]: I0930 12:33:56.029359 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-6d689559c5-4plc5" Sep 30 12:34:09 crc kubenswrapper[4672]: I0930 12:34:09.998611 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96s4dl2"] Sep 30 12:34:10 
crc kubenswrapper[4672]: I0930 12:34:10.000387 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96s4dl2" Sep 30 12:34:10 crc kubenswrapper[4672]: I0930 12:34:10.002890 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Sep 30 12:34:10 crc kubenswrapper[4672]: I0930 12:34:10.010023 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96s4dl2"] Sep 30 12:34:10 crc kubenswrapper[4672]: I0930 12:34:10.085296 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/24166562-adf7-422d-abfa-b1b7176f0124-util\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96s4dl2\" (UID: \"24166562-adf7-422d-abfa-b1b7176f0124\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96s4dl2" Sep 30 12:34:10 crc kubenswrapper[4672]: I0930 12:34:10.085864 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nlb76\" (UniqueName: \"kubernetes.io/projected/24166562-adf7-422d-abfa-b1b7176f0124-kube-api-access-nlb76\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96s4dl2\" (UID: \"24166562-adf7-422d-abfa-b1b7176f0124\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96s4dl2" Sep 30 12:34:10 crc kubenswrapper[4672]: I0930 12:34:10.086050 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/24166562-adf7-422d-abfa-b1b7176f0124-bundle\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96s4dl2\" (UID: \"24166562-adf7-422d-abfa-b1b7176f0124\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96s4dl2" Sep 30 12:34:10 crc kubenswrapper[4672]: I0930 12:34:10.187345 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nlb76\" (UniqueName: \"kubernetes.io/projected/24166562-adf7-422d-abfa-b1b7176f0124-kube-api-access-nlb76\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96s4dl2\" (UID: \"24166562-adf7-422d-abfa-b1b7176f0124\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96s4dl2" Sep 30 12:34:10 crc kubenswrapper[4672]: I0930 12:34:10.187402 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/24166562-adf7-422d-abfa-b1b7176f0124-bundle\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96s4dl2\" (UID: \"24166562-adf7-422d-abfa-b1b7176f0124\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96s4dl2" Sep 30 12:34:10 crc kubenswrapper[4672]: I0930 12:34:10.187449 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/24166562-adf7-422d-abfa-b1b7176f0124-util\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96s4dl2\" (UID: \"24166562-adf7-422d-abfa-b1b7176f0124\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96s4dl2" Sep 30 12:34:10 crc kubenswrapper[4672]: I0930 12:34:10.187925 4672 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/24166562-adf7-422d-abfa-b1b7176f0124-bundle\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96s4dl2\" (UID: \"24166562-adf7-422d-abfa-b1b7176f0124\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96s4dl2" Sep 30 12:34:10 crc kubenswrapper[4672]: I0930 12:34:10.187999 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/24166562-adf7-422d-abfa-b1b7176f0124-util\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96s4dl2\" (UID: \"24166562-adf7-422d-abfa-b1b7176f0124\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96s4dl2" Sep 30 12:34:10 crc kubenswrapper[4672]: I0930 12:34:10.207706 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nlb76\" (UniqueName: \"kubernetes.io/projected/24166562-adf7-422d-abfa-b1b7176f0124-kube-api-access-nlb76\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96s4dl2\" (UID: \"24166562-adf7-422d-abfa-b1b7176f0124\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96s4dl2" Sep 30 12:34:10 crc kubenswrapper[4672]: I0930 12:34:10.364647 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96s4dl2" Sep 30 12:34:10 crc kubenswrapper[4672]: I0930 12:34:10.599351 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96s4dl2"] Sep 30 12:34:11 crc kubenswrapper[4672]: I0930 12:34:11.125647 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-x8stp" podUID="6eecbbdb-82a8-4b0d-860d-f6c3f4152a04" containerName="console" containerID="cri-o://219877c4491df0e9e329f42dbd27544b69842ff59b113f8d4a027bf770f9ff42" gracePeriod=15 Sep 30 12:34:11 crc kubenswrapper[4672]: I0930 12:34:11.167471 4672 generic.go:334] "Generic (PLEG): container finished" podID="24166562-adf7-422d-abfa-b1b7176f0124" containerID="16a24353e7a794be11d8b4751a62a866f953f9acc064da1260c2d67dc3d5fe6a" exitCode=0 Sep 30 12:34:11 crc kubenswrapper[4672]: I0930 12:34:11.167531 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96s4dl2" event={"ID":"24166562-adf7-422d-abfa-b1b7176f0124","Type":"ContainerDied","Data":"16a24353e7a794be11d8b4751a62a866f953f9acc064da1260c2d67dc3d5fe6a"} Sep 30 12:34:11 crc kubenswrapper[4672]: I0930 12:34:11.167848 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96s4dl2" event={"ID":"24166562-adf7-422d-abfa-b1b7176f0124","Type":"ContainerStarted","Data":"0977467c8f45f2b3c620e2f0aa5a3676e42e671cdfaac74fa8261042113849a1"} Sep 30 12:34:11 crc kubenswrapper[4672]: I0930 12:34:11.522045 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-x8stp_6eecbbdb-82a8-4b0d-860d-f6c3f4152a04/console/0.log" Sep 30 12:34:11 crc kubenswrapper[4672]: I0930 12:34:11.522112 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-x8stp" Sep 30 12:34:11 crc kubenswrapper[4672]: I0930 12:34:11.609067 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6eecbbdb-82a8-4b0d-860d-f6c3f4152a04-console-oauth-config\") pod \"6eecbbdb-82a8-4b0d-860d-f6c3f4152a04\" (UID: \"6eecbbdb-82a8-4b0d-860d-f6c3f4152a04\") " Sep 30 12:34:11 crc kubenswrapper[4672]: I0930 12:34:11.609131 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6eecbbdb-82a8-4b0d-860d-f6c3f4152a04-oauth-serving-cert\") pod \"6eecbbdb-82a8-4b0d-860d-f6c3f4152a04\" (UID: \"6eecbbdb-82a8-4b0d-860d-f6c3f4152a04\") " Sep 30 12:34:11 crc kubenswrapper[4672]: I0930 12:34:11.609159 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6eecbbdb-82a8-4b0d-860d-f6c3f4152a04-trusted-ca-bundle\") pod \"6eecbbdb-82a8-4b0d-860d-f6c3f4152a04\" (UID: \"6eecbbdb-82a8-4b0d-860d-f6c3f4152a04\") " Sep 30 12:34:11 crc kubenswrapper[4672]: I0930 12:34:11.609408 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6eecbbdb-82a8-4b0d-860d-f6c3f4152a04-console-serving-cert\") pod \"6eecbbdb-82a8-4b0d-860d-f6c3f4152a04\" (UID: \"6eecbbdb-82a8-4b0d-860d-f6c3f4152a04\") " Sep 30 12:34:11 crc kubenswrapper[4672]: I0930 12:34:11.609433 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6eecbbdb-82a8-4b0d-860d-f6c3f4152a04-console-config\") pod \"6eecbbdb-82a8-4b0d-860d-f6c3f4152a04\" (UID: \"6eecbbdb-82a8-4b0d-860d-f6c3f4152a04\") " Sep 30 12:34:11 crc kubenswrapper[4672]: I0930 12:34:11.609494 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k7stp\" (UniqueName: \"kubernetes.io/projected/6eecbbdb-82a8-4b0d-860d-f6c3f4152a04-kube-api-access-k7stp\") pod \"6eecbbdb-82a8-4b0d-860d-f6c3f4152a04\" (UID: \"6eecbbdb-82a8-4b0d-860d-f6c3f4152a04\") " Sep 30 12:34:11 crc kubenswrapper[4672]: I0930 12:34:11.609565 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6eecbbdb-82a8-4b0d-860d-f6c3f4152a04-service-ca\") pod \"6eecbbdb-82a8-4b0d-860d-f6c3f4152a04\" (UID: \"6eecbbdb-82a8-4b0d-860d-f6c3f4152a04\") " Sep 30 12:34:11 crc kubenswrapper[4672]: I0930 12:34:11.610287 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6eecbbdb-82a8-4b0d-860d-f6c3f4152a04-service-ca" (OuterVolumeSpecName: "service-ca") pod "6eecbbdb-82a8-4b0d-860d-f6c3f4152a04" (UID: "6eecbbdb-82a8-4b0d-860d-f6c3f4152a04"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 12:34:11 crc kubenswrapper[4672]: I0930 12:34:11.610320 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6eecbbdb-82a8-4b0d-860d-f6c3f4152a04-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "6eecbbdb-82a8-4b0d-860d-f6c3f4152a04" (UID: "6eecbbdb-82a8-4b0d-860d-f6c3f4152a04"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 12:34:11 crc kubenswrapper[4672]: I0930 12:34:11.610306 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6eecbbdb-82a8-4b0d-860d-f6c3f4152a04-console-config" (OuterVolumeSpecName: "console-config") pod "6eecbbdb-82a8-4b0d-860d-f6c3f4152a04" (UID: "6eecbbdb-82a8-4b0d-860d-f6c3f4152a04"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 12:34:11 crc kubenswrapper[4672]: I0930 12:34:11.610712 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6eecbbdb-82a8-4b0d-860d-f6c3f4152a04-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6eecbbdb-82a8-4b0d-860d-f6c3f4152a04" (UID: "6eecbbdb-82a8-4b0d-860d-f6c3f4152a04"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 12:34:11 crc kubenswrapper[4672]: I0930 12:34:11.614871 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6eecbbdb-82a8-4b0d-860d-f6c3f4152a04-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "6eecbbdb-82a8-4b0d-860d-f6c3f4152a04" (UID: "6eecbbdb-82a8-4b0d-860d-f6c3f4152a04"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:34:11 crc kubenswrapper[4672]: I0930 12:34:11.614975 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6eecbbdb-82a8-4b0d-860d-f6c3f4152a04-kube-api-access-k7stp" (OuterVolumeSpecName: "kube-api-access-k7stp") pod "6eecbbdb-82a8-4b0d-860d-f6c3f4152a04" (UID: "6eecbbdb-82a8-4b0d-860d-f6c3f4152a04"). InnerVolumeSpecName "kube-api-access-k7stp". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 12:34:11 crc kubenswrapper[4672]: I0930 12:34:11.615463 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6eecbbdb-82a8-4b0d-860d-f6c3f4152a04-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "6eecbbdb-82a8-4b0d-860d-f6c3f4152a04" (UID: "6eecbbdb-82a8-4b0d-860d-f6c3f4152a04"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:34:11 crc kubenswrapper[4672]: I0930 12:34:11.715155 4672 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6eecbbdb-82a8-4b0d-860d-f6c3f4152a04-console-oauth-config\") on node \"crc\" DevicePath \"\"" Sep 30 12:34:11 crc kubenswrapper[4672]: I0930 12:34:11.715199 4672 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6eecbbdb-82a8-4b0d-860d-f6c3f4152a04-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 12:34:11 crc kubenswrapper[4672]: I0930 12:34:11.715208 4672 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6eecbbdb-82a8-4b0d-860d-f6c3f4152a04-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 12:34:11 crc kubenswrapper[4672]: I0930 12:34:11.715216 4672 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6eecbbdb-82a8-4b0d-860d-f6c3f4152a04-console-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 12:34:11 crc kubenswrapper[4672]: I0930 12:34:11.715228 4672 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6eecbbdb-82a8-4b0d-860d-f6c3f4152a04-console-config\") on node \"crc\" DevicePath \"\"" Sep 30 12:34:11 crc kubenswrapper[4672]: I0930 12:34:11.715236 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k7stp\" (UniqueName: \"kubernetes.io/projected/6eecbbdb-82a8-4b0d-860d-f6c3f4152a04-kube-api-access-k7stp\") on node \"crc\" DevicePath \"\"" Sep 30 12:34:11 crc kubenswrapper[4672]: I0930 12:34:11.715245 4672 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6eecbbdb-82a8-4b0d-860d-f6c3f4152a04-service-ca\") on node \"crc\" DevicePath \"\"" Sep 30 12:34:12 crc kubenswrapper[4672]: I0930 12:34:12.178580 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-x8stp_6eecbbdb-82a8-4b0d-860d-f6c3f4152a04/console/0.log" Sep 30 12:34:12 crc kubenswrapper[4672]: I0930 12:34:12.178907 4672 generic.go:334] "Generic (PLEG): container finished" podID="6eecbbdb-82a8-4b0d-860d-f6c3f4152a04" containerID="219877c4491df0e9e329f42dbd27544b69842ff59b113f8d4a027bf770f9ff42" exitCode=2 Sep 30 12:34:12 crc kubenswrapper[4672]: I0930 12:34:12.178943 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-x8stp" event={"ID":"6eecbbdb-82a8-4b0d-860d-f6c3f4152a04","Type":"ContainerDied","Data":"219877c4491df0e9e329f42dbd27544b69842ff59b113f8d4a027bf770f9ff42"} Sep 30 12:34:12 crc kubenswrapper[4672]: I0930 12:34:12.178975 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-x8stp" event={"ID":"6eecbbdb-82a8-4b0d-860d-f6c3f4152a04","Type":"ContainerDied","Data":"989fb02e2f74f13b0c46e80355643bca415644562e39e30c9bf47f4554bfeb45"} Sep 30 12:34:12 crc kubenswrapper[4672]: I0930 12:34:12.178984 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-x8stp" Sep 30 12:34:12 crc kubenswrapper[4672]: I0930 12:34:12.178996 4672 scope.go:117] "RemoveContainer" containerID="219877c4491df0e9e329f42dbd27544b69842ff59b113f8d4a027bf770f9ff42" Sep 30 12:34:12 crc kubenswrapper[4672]: I0930 12:34:12.211611 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-x8stp"] Sep 30 12:34:12 crc kubenswrapper[4672]: I0930 12:34:12.214511 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-x8stp"] Sep 30 12:34:12 crc kubenswrapper[4672]: I0930 12:34:12.263472 4672 scope.go:117] "RemoveContainer" containerID="219877c4491df0e9e329f42dbd27544b69842ff59b113f8d4a027bf770f9ff42" Sep 30 12:34:12 crc kubenswrapper[4672]: E0930 12:34:12.263964 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"219877c4491df0e9e329f42dbd27544b69842ff59b113f8d4a027bf770f9ff42\": container with ID starting with 219877c4491df0e9e329f42dbd27544b69842ff59b113f8d4a027bf770f9ff42 not found: ID does not exist" containerID="219877c4491df0e9e329f42dbd27544b69842ff59b113f8d4a027bf770f9ff42" Sep 30 12:34:12 crc kubenswrapper[4672]: I0930 12:34:12.263990 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"219877c4491df0e9e329f42dbd27544b69842ff59b113f8d4a027bf770f9ff42"} err="failed to get container status \"219877c4491df0e9e329f42dbd27544b69842ff59b113f8d4a027bf770f9ff42\": rpc error: code = NotFound desc = could not find container \"219877c4491df0e9e329f42dbd27544b69842ff59b113f8d4a027bf770f9ff42\": container with ID starting with 219877c4491df0e9e329f42dbd27544b69842ff59b113f8d4a027bf770f9ff42 not found: ID does not exist" Sep 30 12:34:13 crc kubenswrapper[4672]: I0930 12:34:13.186252 4672 generic.go:334] "Generic (PLEG): container finished" podID="24166562-adf7-422d-abfa-b1b7176f0124" containerID="b4fb3e304b76c3426e99a73540dfca30f23fe93f96cf05bcc6044e66829d48bb" exitCode=0 Sep 30 12:34:13 crc kubenswrapper[4672]: I0930 12:34:13.186318 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96s4dl2" event={"ID":"24166562-adf7-422d-abfa-b1b7176f0124","Type":"ContainerDied","Data":"b4fb3e304b76c3426e99a73540dfca30f23fe93f96cf05bcc6044e66829d48bb"} Sep 30 12:34:13 crc kubenswrapper[4672]: I0930 12:34:13.425718 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6eecbbdb-82a8-4b0d-860d-f6c3f4152a04" path="/var/lib/kubelet/pods/6eecbbdb-82a8-4b0d-860d-f6c3f4152a04/volumes" Sep 30 12:34:14 crc kubenswrapper[4672]: I0930 12:34:14.195210 4672 generic.go:334] "Generic (PLEG): container finished" podID="24166562-adf7-422d-abfa-b1b7176f0124" containerID="81b23713c9a5cdff7073c634d2999b1e23a3e837ebac3a68c71bb8af1164c346" exitCode=0 Sep 30 12:34:14 crc kubenswrapper[4672]: I0930 12:34:14.195293 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96s4dl2" event={"ID":"24166562-adf7-422d-abfa-b1b7176f0124","Type":"ContainerDied","Data":"81b23713c9a5cdff7073c634d2999b1e23a3e837ebac3a68c71bb8af1164c346"} Sep 30 12:34:15 crc kubenswrapper[4672]: I0930 12:34:15.512901 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96s4dl2" Sep 30 12:34:15 crc kubenswrapper[4672]: I0930 12:34:15.568775 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nlb76\" (UniqueName: \"kubernetes.io/projected/24166562-adf7-422d-abfa-b1b7176f0124-kube-api-access-nlb76\") pod \"24166562-adf7-422d-abfa-b1b7176f0124\" (UID: \"24166562-adf7-422d-abfa-b1b7176f0124\") " Sep 30 12:34:15 crc kubenswrapper[4672]: I0930 12:34:15.568844 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/24166562-adf7-422d-abfa-b1b7176f0124-bundle\") pod \"24166562-adf7-422d-abfa-b1b7176f0124\" (UID: \"24166562-adf7-422d-abfa-b1b7176f0124\") " Sep 30 12:34:15 crc kubenswrapper[4672]: I0930 12:34:15.568884 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/24166562-adf7-422d-abfa-b1b7176f0124-util\") pod \"24166562-adf7-422d-abfa-b1b7176f0124\" (UID: \"24166562-adf7-422d-abfa-b1b7176f0124\") " Sep 30 12:34:15 crc kubenswrapper[4672]: I0930 12:34:15.570820 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/24166562-adf7-422d-abfa-b1b7176f0124-bundle" (OuterVolumeSpecName: "bundle") pod "24166562-adf7-422d-abfa-b1b7176f0124" (UID: "24166562-adf7-422d-abfa-b1b7176f0124"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 12:34:15 crc kubenswrapper[4672]: I0930 12:34:15.579333 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/24166562-adf7-422d-abfa-b1b7176f0124-kube-api-access-nlb76" (OuterVolumeSpecName: "kube-api-access-nlb76") pod "24166562-adf7-422d-abfa-b1b7176f0124" (UID: "24166562-adf7-422d-abfa-b1b7176f0124"). InnerVolumeSpecName "kube-api-access-nlb76". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 12:34:15 crc kubenswrapper[4672]: I0930 12:34:15.590656 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/24166562-adf7-422d-abfa-b1b7176f0124-util" (OuterVolumeSpecName: "util") pod "24166562-adf7-422d-abfa-b1b7176f0124" (UID: "24166562-adf7-422d-abfa-b1b7176f0124"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 12:34:15 crc kubenswrapper[4672]: I0930 12:34:15.670597 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nlb76\" (UniqueName: \"kubernetes.io/projected/24166562-adf7-422d-abfa-b1b7176f0124-kube-api-access-nlb76\") on node \"crc\" DevicePath \"\"" Sep 30 12:34:15 crc kubenswrapper[4672]: I0930 12:34:15.670635 4672 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/24166562-adf7-422d-abfa-b1b7176f0124-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 12:34:15 crc kubenswrapper[4672]: I0930 12:34:15.670643 4672 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/24166562-adf7-422d-abfa-b1b7176f0124-util\") on node \"crc\" DevicePath \"\"" Sep 30 12:34:16 crc kubenswrapper[4672]: I0930 12:34:16.207371 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96s4dl2" Sep 30 12:34:16 crc kubenswrapper[4672]: I0930 12:34:16.207384 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96s4dl2" event={"ID":"24166562-adf7-422d-abfa-b1b7176f0124","Type":"ContainerDied","Data":"0977467c8f45f2b3c620e2f0aa5a3676e42e671cdfaac74fa8261042113849a1"} Sep 30 12:34:16 crc kubenswrapper[4672]: I0930 12:34:16.207424 4672 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0977467c8f45f2b3c620e2f0aa5a3676e42e671cdfaac74fa8261042113849a1" Sep 30 12:34:19 crc kubenswrapper[4672]: I0930 12:34:19.457307 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-kxhzb"] Sep 30 12:34:19 crc kubenswrapper[4672]: I0930 12:34:19.457975 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-kxhzb" podUID="bdc37a72-d709-408f-b636-dd62ad023b8d" containerName="controller-manager" containerID="cri-o://ffd2f6a06a2aff4928ee9630df15fa41ebc6ecbe6c9be1f4a284a88e2b84fdfd" gracePeriod=30 Sep 30 12:34:19 crc kubenswrapper[4672]: I0930 12:34:19.568074 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-675r8"] Sep 30 12:34:19 crc kubenswrapper[4672]: I0930 12:34:19.568341 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-675r8" podUID="01f3df1a-e96a-4e9d-9af8-334a144d7cc4" containerName="route-controller-manager" containerID="cri-o://d3f154d930db4d4361e08b6a569e4910b5149f810382c1f90e0fb4c5d999bcbb" gracePeriod=30 Sep 30 12:34:19 crc kubenswrapper[4672]: I0930 12:34:19.895763 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-kxhzb" Sep 30 12:34:20 crc kubenswrapper[4672]: I0930 12:34:20.013630 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-675r8" Sep 30 12:34:20 crc kubenswrapper[4672]: I0930 12:34:20.028436 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bdc37a72-d709-408f-b636-dd62ad023b8d-serving-cert\") pod \"bdc37a72-d709-408f-b636-dd62ad023b8d\" (UID: \"bdc37a72-d709-408f-b636-dd62ad023b8d\") " Sep 30 12:34:20 crc kubenswrapper[4672]: I0930 12:34:20.030583 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bdc37a72-d709-408f-b636-dd62ad023b8d-proxy-ca-bundles\") pod \"bdc37a72-d709-408f-b636-dd62ad023b8d\" (UID: \"bdc37a72-d709-408f-b636-dd62ad023b8d\") " Sep 30 12:34:20 crc kubenswrapper[4672]: I0930 12:34:20.030626 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-csbmd\" (UniqueName: \"kubernetes.io/projected/bdc37a72-d709-408f-b636-dd62ad023b8d-kube-api-access-csbmd\") pod \"bdc37a72-d709-408f-b636-dd62ad023b8d\" (UID: \"bdc37a72-d709-408f-b636-dd62ad023b8d\") " Sep 30 12:34:20 crc kubenswrapper[4672]: I0930 12:34:20.030677 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bdc37a72-d709-408f-b636-dd62ad023b8d-client-ca\") pod \"bdc37a72-d709-408f-b636-dd62ad023b8d\" (UID: \"bdc37a72-d709-408f-b636-dd62ad023b8d\") " Sep 30 12:34:20 crc kubenswrapper[4672]: I0930 12:34:20.030744 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bdc37a72-d709-408f-b636-dd62ad023b8d-config\") pod \"bdc37a72-d709-408f-b636-dd62ad023b8d\" (UID: \"bdc37a72-d709-408f-b636-dd62ad023b8d\") " Sep 30 12:34:20 crc kubenswrapper[4672]: I0930 12:34:20.032026 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bdc37a72-d709-408f-b636-dd62ad023b8d-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "bdc37a72-d709-408f-b636-dd62ad023b8d" (UID: "bdc37a72-d709-408f-b636-dd62ad023b8d"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 12:34:20 crc kubenswrapper[4672]: I0930 12:34:20.032042 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bdc37a72-d709-408f-b636-dd62ad023b8d-client-ca" (OuterVolumeSpecName: "client-ca") pod "bdc37a72-d709-408f-b636-dd62ad023b8d" (UID: "bdc37a72-d709-408f-b636-dd62ad023b8d"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 12:34:20 crc kubenswrapper[4672]: I0930 12:34:20.035374 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bdc37a72-d709-408f-b636-dd62ad023b8d-config" (OuterVolumeSpecName: "config") pod "bdc37a72-d709-408f-b636-dd62ad023b8d" (UID: "bdc37a72-d709-408f-b636-dd62ad023b8d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 12:34:20 crc kubenswrapper[4672]: I0930 12:34:20.039038 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bdc37a72-d709-408f-b636-dd62ad023b8d-kube-api-access-csbmd" (OuterVolumeSpecName: "kube-api-access-csbmd") pod "bdc37a72-d709-408f-b636-dd62ad023b8d" (UID: "bdc37a72-d709-408f-b636-dd62ad023b8d"). 
InnerVolumeSpecName "kube-api-access-csbmd". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 12:34:20 crc kubenswrapper[4672]: I0930 12:34:20.039321 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bdc37a72-d709-408f-b636-dd62ad023b8d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bdc37a72-d709-408f-b636-dd62ad023b8d" (UID: "bdc37a72-d709-408f-b636-dd62ad023b8d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:34:20 crc kubenswrapper[4672]: I0930 12:34:20.131673 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sczd4\" (UniqueName: \"kubernetes.io/projected/01f3df1a-e96a-4e9d-9af8-334a144d7cc4-kube-api-access-sczd4\") pod \"01f3df1a-e96a-4e9d-9af8-334a144d7cc4\" (UID: \"01f3df1a-e96a-4e9d-9af8-334a144d7cc4\") " Sep 30 12:34:20 crc kubenswrapper[4672]: I0930 12:34:20.131733 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01f3df1a-e96a-4e9d-9af8-334a144d7cc4-config\") pod \"01f3df1a-e96a-4e9d-9af8-334a144d7cc4\" (UID: \"01f3df1a-e96a-4e9d-9af8-334a144d7cc4\") " Sep 30 12:34:20 crc kubenswrapper[4672]: I0930 12:34:20.131798 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01f3df1a-e96a-4e9d-9af8-334a144d7cc4-serving-cert\") pod \"01f3df1a-e96a-4e9d-9af8-334a144d7cc4\" (UID: \"01f3df1a-e96a-4e9d-9af8-334a144d7cc4\") " Sep 30 12:34:20 crc kubenswrapper[4672]: I0930 12:34:20.131833 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/01f3df1a-e96a-4e9d-9af8-334a144d7cc4-client-ca\") pod \"01f3df1a-e96a-4e9d-9af8-334a144d7cc4\" (UID: \"01f3df1a-e96a-4e9d-9af8-334a144d7cc4\") " Sep 30 12:34:20 crc kubenswrapper[4672]: I0930 12:34:20.132103 4672 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bdc37a72-d709-408f-b636-dd62ad023b8d-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 12:34:20 crc kubenswrapper[4672]: I0930 12:34:20.132119 4672 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bdc37a72-d709-408f-b636-dd62ad023b8d-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Sep 30 12:34:20 crc kubenswrapper[4672]: I0930 12:34:20.132130 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-csbmd\" (UniqueName: \"kubernetes.io/projected/bdc37a72-d709-408f-b636-dd62ad023b8d-kube-api-access-csbmd\") on node \"crc\" DevicePath \"\"" Sep 30 12:34:20 crc kubenswrapper[4672]: I0930 12:34:20.132142 4672 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bdc37a72-d709-408f-b636-dd62ad023b8d-client-ca\") on node \"crc\" DevicePath \"\"" Sep 30 12:34:20 crc kubenswrapper[4672]: I0930 12:34:20.132150 4672 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bdc37a72-d709-408f-b636-dd62ad023b8d-config\") on node \"crc\" DevicePath \"\"" Sep 30 12:34:20 crc kubenswrapper[4672]: I0930 12:34:20.132682 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01f3df1a-e96a-4e9d-9af8-334a144d7cc4-client-ca" (OuterVolumeSpecName: "client-ca") pod "01f3df1a-e96a-4e9d-9af8-334a144d7cc4" (UID: 
"01f3df1a-e96a-4e9d-9af8-334a144d7cc4"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 12:34:20 crc kubenswrapper[4672]: I0930 12:34:20.132717 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01f3df1a-e96a-4e9d-9af8-334a144d7cc4-config" (OuterVolumeSpecName: "config") pod "01f3df1a-e96a-4e9d-9af8-334a144d7cc4" (UID: "01f3df1a-e96a-4e9d-9af8-334a144d7cc4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 12:34:20 crc kubenswrapper[4672]: I0930 12:34:20.136527 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01f3df1a-e96a-4e9d-9af8-334a144d7cc4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01f3df1a-e96a-4e9d-9af8-334a144d7cc4" (UID: "01f3df1a-e96a-4e9d-9af8-334a144d7cc4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:34:20 crc kubenswrapper[4672]: I0930 12:34:20.136610 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01f3df1a-e96a-4e9d-9af8-334a144d7cc4-kube-api-access-sczd4" (OuterVolumeSpecName: "kube-api-access-sczd4") pod "01f3df1a-e96a-4e9d-9af8-334a144d7cc4" (UID: "01f3df1a-e96a-4e9d-9af8-334a144d7cc4"). InnerVolumeSpecName "kube-api-access-sczd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 12:34:20 crc kubenswrapper[4672]: I0930 12:34:20.229157 4672 generic.go:334] "Generic (PLEG): container finished" podID="bdc37a72-d709-408f-b636-dd62ad023b8d" containerID="ffd2f6a06a2aff4928ee9630df15fa41ebc6ecbe6c9be1f4a284a88e2b84fdfd" exitCode=0 Sep 30 12:34:20 crc kubenswrapper[4672]: I0930 12:34:20.229243 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-kxhzb" Sep 30 12:34:20 crc kubenswrapper[4672]: I0930 12:34:20.229230 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-kxhzb" event={"ID":"bdc37a72-d709-408f-b636-dd62ad023b8d","Type":"ContainerDied","Data":"ffd2f6a06a2aff4928ee9630df15fa41ebc6ecbe6c9be1f4a284a88e2b84fdfd"} Sep 30 12:34:20 crc kubenswrapper[4672]: I0930 12:34:20.229624 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-kxhzb" event={"ID":"bdc37a72-d709-408f-b636-dd62ad023b8d","Type":"ContainerDied","Data":"4a8e35f787608e9704e4a2ca0691ff903df2d4dcb6702b4939733ec92e6c144d"} Sep 30 12:34:20 crc kubenswrapper[4672]: I0930 12:34:20.229656 4672 scope.go:117] "RemoveContainer" containerID="ffd2f6a06a2aff4928ee9630df15fa41ebc6ecbe6c9be1f4a284a88e2b84fdfd" Sep 30 12:34:20 crc kubenswrapper[4672]: I0930 12:34:20.233062 4672 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01f3df1a-e96a-4e9d-9af8-334a144d7cc4-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 12:34:20 crc kubenswrapper[4672]: I0930 12:34:20.233091 4672 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/01f3df1a-e96a-4e9d-9af8-334a144d7cc4-client-ca\") on node \"crc\" DevicePath \"\"" Sep 30 12:34:20 crc kubenswrapper[4672]: I0930 12:34:20.233102 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sczd4\" (UniqueName: \"kubernetes.io/projected/01f3df1a-e96a-4e9d-9af8-334a144d7cc4-kube-api-access-sczd4\") on node \"crc\" DevicePath \"\"" Sep 30 12:34:20 crc kubenswrapper[4672]: I0930 12:34:20.233112 4672 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01f3df1a-e96a-4e9d-9af8-334a144d7cc4-config\") on node \"crc\" DevicePath \"\"" Sep 30 12:34:20 crc kubenswrapper[4672]: I0930 12:34:20.233192 4672 generic.go:334] "Generic (PLEG): container finished" podID="01f3df1a-e96a-4e9d-9af8-334a144d7cc4" containerID="d3f154d930db4d4361e08b6a569e4910b5149f810382c1f90e0fb4c5d999bcbb" exitCode=0 Sep 30 12:34:20 crc kubenswrapper[4672]: I0930 12:34:20.233227 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-675r8" event={"ID":"01f3df1a-e96a-4e9d-9af8-334a144d7cc4","Type":"ContainerDied","Data":"d3f154d930db4d4361e08b6a569e4910b5149f810382c1f90e0fb4c5d999bcbb"} Sep 30 12:34:20 crc kubenswrapper[4672]: I0930 12:34:20.233256 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-675r8" event={"ID":"01f3df1a-e96a-4e9d-9af8-334a144d7cc4","Type":"ContainerDied","Data":"8bf35f955afd73191dc5458efc832caf0d36336c31b956c6a6b9799e4338c4dc"} Sep 30 12:34:20 crc kubenswrapper[4672]: I0930 12:34:20.233237 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-675r8" Sep 30 12:34:20 crc kubenswrapper[4672]: I0930 12:34:20.250346 4672 scope.go:117] "RemoveContainer" containerID="ffd2f6a06a2aff4928ee9630df15fa41ebc6ecbe6c9be1f4a284a88e2b84fdfd" Sep 30 12:34:20 crc kubenswrapper[4672]: E0930 12:34:20.251752 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ffd2f6a06a2aff4928ee9630df15fa41ebc6ecbe6c9be1f4a284a88e2b84fdfd\": container with ID starting with ffd2f6a06a2aff4928ee9630df15fa41ebc6ecbe6c9be1f4a284a88e2b84fdfd not found: ID does not exist" containerID="ffd2f6a06a2aff4928ee9630df15fa41ebc6ecbe6c9be1f4a284a88e2b84fdfd" Sep 30 12:34:20 crc kubenswrapper[4672]: I0930 12:34:20.251882 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ffd2f6a06a2aff4928ee9630df15fa41ebc6ecbe6c9be1f4a284a88e2b84fdfd"} err="failed to get container status \"ffd2f6a06a2aff4928ee9630df15fa41ebc6ecbe6c9be1f4a284a88e2b84fdfd\": rpc error: code = NotFound desc = could not find container \"ffd2f6a06a2aff4928ee9630df15fa41ebc6ecbe6c9be1f4a284a88e2b84fdfd\": container with ID starting with ffd2f6a06a2aff4928ee9630df15fa41ebc6ecbe6c9be1f4a284a88e2b84fdfd not found: ID does not exist" Sep 30 12:34:20 crc kubenswrapper[4672]: I0930 12:34:20.251980 4672 scope.go:117] "RemoveContainer" containerID="d3f154d930db4d4361e08b6a569e4910b5149f810382c1f90e0fb4c5d999bcbb" Sep 30 12:34:20 crc kubenswrapper[4672]: I0930 12:34:20.261900 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-kxhzb"] Sep 30 12:34:20 crc kubenswrapper[4672]: I0930 12:34:20.266124 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-kxhzb"] Sep 30 12:34:20 crc kubenswrapper[4672]: I0930 12:34:20.275914 4672 scope.go:117] "RemoveContainer" containerID="d3f154d930db4d4361e08b6a569e4910b5149f810382c1f90e0fb4c5d999bcbb" Sep 30 12:34:20 crc kubenswrapper[4672]: E0930 12:34:20.276428 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d3f154d930db4d4361e08b6a569e4910b5149f810382c1f90e0fb4c5d999bcbb\": container with ID starting with d3f154d930db4d4361e08b6a569e4910b5149f810382c1f90e0fb4c5d999bcbb not found: ID does not exist" containerID="d3f154d930db4d4361e08b6a569e4910b5149f810382c1f90e0fb4c5d999bcbb" Sep 30 12:34:20 crc kubenswrapper[4672]: I0930 12:34:20.276592 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d3f154d930db4d4361e08b6a569e4910b5149f810382c1f90e0fb4c5d999bcbb"} err="failed to get container status \"d3f154d930db4d4361e08b6a569e4910b5149f810382c1f90e0fb4c5d999bcbb\": rpc error: code = NotFound desc = could not find container \"d3f154d930db4d4361e08b6a569e4910b5149f810382c1f90e0fb4c5d999bcbb\": container with ID starting with d3f154d930db4d4361e08b6a569e4910b5149f810382c1f90e0fb4c5d999bcbb not found: ID does not exist" Sep 30 12:34:20 crc kubenswrapper[4672]: I0930 12:34:20.289690 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-675r8"] Sep 30 12:34:20 crc kubenswrapper[4672]: I0930 12:34:20.291558 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-675r8"] Sep 30 
12:34:21 crc kubenswrapper[4672]: I0930 12:34:21.262390 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-7b47858c59-vjxmv"] Sep 30 12:34:21 crc kubenswrapper[4672]: E0930 12:34:21.262703 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6eecbbdb-82a8-4b0d-860d-f6c3f4152a04" containerName="console" Sep 30 12:34:21 crc kubenswrapper[4672]: I0930 12:34:21.262721 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="6eecbbdb-82a8-4b0d-860d-f6c3f4152a04" containerName="console" Sep 30 12:34:21 crc kubenswrapper[4672]: E0930 12:34:21.262734 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24166562-adf7-422d-abfa-b1b7176f0124" containerName="util" Sep 30 12:34:21 crc kubenswrapper[4672]: I0930 12:34:21.262741 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="24166562-adf7-422d-abfa-b1b7176f0124" containerName="util" Sep 30 12:34:21 crc kubenswrapper[4672]: E0930 12:34:21.262754 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24166562-adf7-422d-abfa-b1b7176f0124" containerName="pull" Sep 30 12:34:21 crc kubenswrapper[4672]: I0930 12:34:21.262761 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="24166562-adf7-422d-abfa-b1b7176f0124" containerName="pull" Sep 30 12:34:21 crc kubenswrapper[4672]: E0930 12:34:21.262775 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bdc37a72-d709-408f-b636-dd62ad023b8d" containerName="controller-manager" Sep 30 12:34:21 crc kubenswrapper[4672]: I0930 12:34:21.262782 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="bdc37a72-d709-408f-b636-dd62ad023b8d" containerName="controller-manager" Sep 30 12:34:21 crc kubenswrapper[4672]: E0930 12:34:21.262802 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01f3df1a-e96a-4e9d-9af8-334a144d7cc4" containerName="route-controller-manager" Sep 30 12:34:21 crc kubenswrapper[4672]: I0930 12:34:21.262809 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="01f3df1a-e96a-4e9d-9af8-334a144d7cc4" containerName="route-controller-manager" Sep 30 12:34:21 crc kubenswrapper[4672]: E0930 12:34:21.262821 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24166562-adf7-422d-abfa-b1b7176f0124" containerName="extract" Sep 30 12:34:21 crc kubenswrapper[4672]: I0930 12:34:21.262827 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="24166562-adf7-422d-abfa-b1b7176f0124" containerName="extract" Sep 30 12:34:21 crc kubenswrapper[4672]: I0930 12:34:21.262953 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="6eecbbdb-82a8-4b0d-860d-f6c3f4152a04" containerName="console" Sep 30 12:34:21 crc kubenswrapper[4672]: I0930 12:34:21.262968 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="bdc37a72-d709-408f-b636-dd62ad023b8d" containerName="controller-manager" Sep 30 12:34:21 crc kubenswrapper[4672]: I0930 12:34:21.262978 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="01f3df1a-e96a-4e9d-9af8-334a144d7cc4" containerName="route-controller-manager" Sep 30 12:34:21 crc kubenswrapper[4672]: I0930 12:34:21.262989 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="24166562-adf7-422d-abfa-b1b7176f0124" containerName="extract" Sep 30 12:34:21 crc kubenswrapper[4672]: I0930 12:34:21.263577 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7b47858c59-vjxmv" Sep 30 12:34:21 crc kubenswrapper[4672]: I0930 12:34:21.271119 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Sep 30 12:34:21 crc kubenswrapper[4672]: I0930 12:34:21.271556 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Sep 30 12:34:21 crc kubenswrapper[4672]: I0930 12:34:21.271566 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Sep 30 12:34:21 crc kubenswrapper[4672]: I0930 12:34:21.271665 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Sep 30 12:34:21 crc kubenswrapper[4672]: I0930 12:34:21.271656 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Sep 30 12:34:21 crc kubenswrapper[4672]: I0930 12:34:21.274408 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Sep 30 12:34:21 crc kubenswrapper[4672]: I0930 12:34:21.282017 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7bd888bb4-lmkf8"] Sep 30 12:34:21 crc kubenswrapper[4672]: I0930 12:34:21.283698 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7bd888bb4-lmkf8" Sep 30 12:34:21 crc kubenswrapper[4672]: I0930 12:34:21.285694 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Sep 30 12:34:21 crc kubenswrapper[4672]: I0930 12:34:21.286514 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Sep 30 12:34:21 crc kubenswrapper[4672]: I0930 12:34:21.286972 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Sep 30 12:34:21 crc kubenswrapper[4672]: I0930 12:34:21.287420 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Sep 30 12:34:21 crc kubenswrapper[4672]: I0930 12:34:21.287600 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Sep 30 12:34:21 crc kubenswrapper[4672]: I0930 12:34:21.287687 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Sep 30 12:34:21 crc kubenswrapper[4672]: I0930 12:34:21.290089 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Sep 30 12:34:21 crc kubenswrapper[4672]: I0930 12:34:21.298731 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7b47858c59-vjxmv"] Sep 30 12:34:21 crc kubenswrapper[4672]: I0930 12:34:21.323549 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7bd888bb4-lmkf8"] Sep 30 12:34:21 crc kubenswrapper[4672]: I0930 12:34:21.425329 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01f3df1a-e96a-4e9d-9af8-334a144d7cc4" 
path="/var/lib/kubelet/pods/01f3df1a-e96a-4e9d-9af8-334a144d7cc4/volumes" Sep 30 12:34:21 crc kubenswrapper[4672]: I0930 12:34:21.425919 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bdc37a72-d709-408f-b636-dd62ad023b8d" path="/var/lib/kubelet/pods/bdc37a72-d709-408f-b636-dd62ad023b8d/volumes" Sep 30 12:34:21 crc kubenswrapper[4672]: I0930 12:34:21.450158 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5f80ec12-f6df-47d0-8895-29dc259de391-config\") pod \"controller-manager-7b47858c59-vjxmv\" (UID: \"5f80ec12-f6df-47d0-8895-29dc259de391\") " pod="openshift-controller-manager/controller-manager-7b47858c59-vjxmv" Sep 30 12:34:21 crc kubenswrapper[4672]: I0930 12:34:21.450230 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1afadbb6-9bda-421e-9859-4afdb0e9cdfb-client-ca\") pod \"route-controller-manager-7bd888bb4-lmkf8\" (UID: \"1afadbb6-9bda-421e-9859-4afdb0e9cdfb\") " pod="openshift-route-controller-manager/route-controller-manager-7bd888bb4-lmkf8" Sep 30 12:34:21 crc kubenswrapper[4672]: I0930 12:34:21.450285 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1afadbb6-9bda-421e-9859-4afdb0e9cdfb-config\") pod \"route-controller-manager-7bd888bb4-lmkf8\" (UID: \"1afadbb6-9bda-421e-9859-4afdb0e9cdfb\") " pod="openshift-route-controller-manager/route-controller-manager-7bd888bb4-lmkf8" Sep 30 12:34:21 crc kubenswrapper[4672]: I0930 12:34:21.450312 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5f80ec12-f6df-47d0-8895-29dc259de391-serving-cert\") pod \"controller-manager-7b47858c59-vjxmv\" (UID: \"5f80ec12-f6df-47d0-8895-29dc259de391\") " pod="openshift-controller-manager/controller-manager-7b47858c59-vjxmv" Sep 30 12:34:21 crc kubenswrapper[4672]: I0930 12:34:21.450344 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5f80ec12-f6df-47d0-8895-29dc259de391-client-ca\") pod \"controller-manager-7b47858c59-vjxmv\" (UID: \"5f80ec12-f6df-47d0-8895-29dc259de391\") " pod="openshift-controller-manager/controller-manager-7b47858c59-vjxmv" Sep 30 12:34:21 crc kubenswrapper[4672]: I0930 12:34:21.450376 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xf78q\" (UniqueName: \"kubernetes.io/projected/5f80ec12-f6df-47d0-8895-29dc259de391-kube-api-access-xf78q\") pod \"controller-manager-7b47858c59-vjxmv\" (UID: \"5f80ec12-f6df-47d0-8895-29dc259de391\") " pod="openshift-controller-manager/controller-manager-7b47858c59-vjxmv" Sep 30 12:34:21 crc kubenswrapper[4672]: I0930 12:34:21.450408 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sljqv\" (UniqueName: \"kubernetes.io/projected/1afadbb6-9bda-421e-9859-4afdb0e9cdfb-kube-api-access-sljqv\") pod \"route-controller-manager-7bd888bb4-lmkf8\" (UID: \"1afadbb6-9bda-421e-9859-4afdb0e9cdfb\") " pod="openshift-route-controller-manager/route-controller-manager-7bd888bb4-lmkf8" Sep 30 12:34:21 crc kubenswrapper[4672]: I0930 12:34:21.450438 4672 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5f80ec12-f6df-47d0-8895-29dc259de391-proxy-ca-bundles\") pod \"controller-manager-7b47858c59-vjxmv\" (UID: \"5f80ec12-f6df-47d0-8895-29dc259de391\") " pod="openshift-controller-manager/controller-manager-7b47858c59-vjxmv" Sep 30 12:34:21 crc kubenswrapper[4672]: I0930 12:34:21.450461 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1afadbb6-9bda-421e-9859-4afdb0e9cdfb-serving-cert\") pod \"route-controller-manager-7bd888bb4-lmkf8\" (UID: \"1afadbb6-9bda-421e-9859-4afdb0e9cdfb\") " pod="openshift-route-controller-manager/route-controller-manager-7bd888bb4-lmkf8" Sep 30 12:34:21 crc kubenswrapper[4672]: I0930 12:34:21.552254 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5f80ec12-f6df-47d0-8895-29dc259de391-config\") pod \"controller-manager-7b47858c59-vjxmv\" (UID: \"5f80ec12-f6df-47d0-8895-29dc259de391\") " pod="openshift-controller-manager/controller-manager-7b47858c59-vjxmv" Sep 30 12:34:21 crc kubenswrapper[4672]: I0930 12:34:21.552339 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1afadbb6-9bda-421e-9859-4afdb0e9cdfb-client-ca\") pod \"route-controller-manager-7bd888bb4-lmkf8\" (UID: \"1afadbb6-9bda-421e-9859-4afdb0e9cdfb\") " pod="openshift-route-controller-manager/route-controller-manager-7bd888bb4-lmkf8" Sep 30 12:34:21 crc kubenswrapper[4672]: I0930 12:34:21.552391 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1afadbb6-9bda-421e-9859-4afdb0e9cdfb-config\") pod \"route-controller-manager-7bd888bb4-lmkf8\" (UID: \"1afadbb6-9bda-421e-9859-4afdb0e9cdfb\") " pod="openshift-route-controller-manager/route-controller-manager-7bd888bb4-lmkf8" Sep 30 12:34:21 crc kubenswrapper[4672]: I0930 12:34:21.552425 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5f80ec12-f6df-47d0-8895-29dc259de391-serving-cert\") pod \"controller-manager-7b47858c59-vjxmv\" (UID: \"5f80ec12-f6df-47d0-8895-29dc259de391\") " pod="openshift-controller-manager/controller-manager-7b47858c59-vjxmv" Sep 30 12:34:21 crc kubenswrapper[4672]: I0930 12:34:21.552451 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5f80ec12-f6df-47d0-8895-29dc259de391-client-ca\") pod \"controller-manager-7b47858c59-vjxmv\" (UID: \"5f80ec12-f6df-47d0-8895-29dc259de391\") " pod="openshift-controller-manager/controller-manager-7b47858c59-vjxmv" Sep 30 12:34:21 crc kubenswrapper[4672]: I0930 12:34:21.552505 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xf78q\" (UniqueName: \"kubernetes.io/projected/5f80ec12-f6df-47d0-8895-29dc259de391-kube-api-access-xf78q\") pod \"controller-manager-7b47858c59-vjxmv\" (UID: \"5f80ec12-f6df-47d0-8895-29dc259de391\") " pod="openshift-controller-manager/controller-manager-7b47858c59-vjxmv" Sep 30 12:34:21 crc kubenswrapper[4672]: I0930 12:34:21.552543 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sljqv\" (UniqueName: 
\"kubernetes.io/projected/1afadbb6-9bda-421e-9859-4afdb0e9cdfb-kube-api-access-sljqv\") pod \"route-controller-manager-7bd888bb4-lmkf8\" (UID: \"1afadbb6-9bda-421e-9859-4afdb0e9cdfb\") " pod="openshift-route-controller-manager/route-controller-manager-7bd888bb4-lmkf8" Sep 30 12:34:21 crc kubenswrapper[4672]: I0930 12:34:21.552570 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5f80ec12-f6df-47d0-8895-29dc259de391-proxy-ca-bundles\") pod \"controller-manager-7b47858c59-vjxmv\" (UID: \"5f80ec12-f6df-47d0-8895-29dc259de391\") " pod="openshift-controller-manager/controller-manager-7b47858c59-vjxmv" Sep 30 12:34:21 crc kubenswrapper[4672]: I0930 12:34:21.552593 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1afadbb6-9bda-421e-9859-4afdb0e9cdfb-serving-cert\") pod \"route-controller-manager-7bd888bb4-lmkf8\" (UID: \"1afadbb6-9bda-421e-9859-4afdb0e9cdfb\") " pod="openshift-route-controller-manager/route-controller-manager-7bd888bb4-lmkf8" Sep 30 12:34:21 crc kubenswrapper[4672]: I0930 12:34:21.554189 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1afadbb6-9bda-421e-9859-4afdb0e9cdfb-client-ca\") pod \"route-controller-manager-7bd888bb4-lmkf8\" (UID: \"1afadbb6-9bda-421e-9859-4afdb0e9cdfb\") " pod="openshift-route-controller-manager/route-controller-manager-7bd888bb4-lmkf8" Sep 30 12:34:21 crc kubenswrapper[4672]: I0930 12:34:21.554367 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5f80ec12-f6df-47d0-8895-29dc259de391-client-ca\") pod \"controller-manager-7b47858c59-vjxmv\" (UID: \"5f80ec12-f6df-47d0-8895-29dc259de391\") " pod="openshift-controller-manager/controller-manager-7b47858c59-vjxmv" Sep 30 12:34:21 crc kubenswrapper[4672]: I0930 12:34:21.554712 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5f80ec12-f6df-47d0-8895-29dc259de391-proxy-ca-bundles\") pod \"controller-manager-7b47858c59-vjxmv\" (UID: \"5f80ec12-f6df-47d0-8895-29dc259de391\") " pod="openshift-controller-manager/controller-manager-7b47858c59-vjxmv" Sep 30 12:34:21 crc kubenswrapper[4672]: I0930 12:34:21.554752 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1afadbb6-9bda-421e-9859-4afdb0e9cdfb-config\") pod \"route-controller-manager-7bd888bb4-lmkf8\" (UID: \"1afadbb6-9bda-421e-9859-4afdb0e9cdfb\") " pod="openshift-route-controller-manager/route-controller-manager-7bd888bb4-lmkf8" Sep 30 12:34:21 crc kubenswrapper[4672]: I0930 12:34:21.554890 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5f80ec12-f6df-47d0-8895-29dc259de391-config\") pod \"controller-manager-7b47858c59-vjxmv\" (UID: \"5f80ec12-f6df-47d0-8895-29dc259de391\") " pod="openshift-controller-manager/controller-manager-7b47858c59-vjxmv" Sep 30 12:34:21 crc kubenswrapper[4672]: I0930 12:34:21.560719 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1afadbb6-9bda-421e-9859-4afdb0e9cdfb-serving-cert\") pod \"route-controller-manager-7bd888bb4-lmkf8\" (UID: \"1afadbb6-9bda-421e-9859-4afdb0e9cdfb\") " 
pod="openshift-route-controller-manager/route-controller-manager-7bd888bb4-lmkf8" Sep 30 12:34:21 crc kubenswrapper[4672]: I0930 12:34:21.563923 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5f80ec12-f6df-47d0-8895-29dc259de391-serving-cert\") pod \"controller-manager-7b47858c59-vjxmv\" (UID: \"5f80ec12-f6df-47d0-8895-29dc259de391\") " pod="openshift-controller-manager/controller-manager-7b47858c59-vjxmv" Sep 30 12:34:21 crc kubenswrapper[4672]: I0930 12:34:21.617255 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xf78q\" (UniqueName: \"kubernetes.io/projected/5f80ec12-f6df-47d0-8895-29dc259de391-kube-api-access-xf78q\") pod \"controller-manager-7b47858c59-vjxmv\" (UID: \"5f80ec12-f6df-47d0-8895-29dc259de391\") " pod="openshift-controller-manager/controller-manager-7b47858c59-vjxmv" Sep 30 12:34:21 crc kubenswrapper[4672]: I0930 12:34:21.629125 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sljqv\" (UniqueName: \"kubernetes.io/projected/1afadbb6-9bda-421e-9859-4afdb0e9cdfb-kube-api-access-sljqv\") pod \"route-controller-manager-7bd888bb4-lmkf8\" (UID: \"1afadbb6-9bda-421e-9859-4afdb0e9cdfb\") " pod="openshift-route-controller-manager/route-controller-manager-7bd888bb4-lmkf8" Sep 30 12:34:21 crc kubenswrapper[4672]: I0930 12:34:21.898973 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7b47858c59-vjxmv" Sep 30 12:34:21 crc kubenswrapper[4672]: I0930 12:34:21.919569 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7bd888bb4-lmkf8" Sep 30 12:34:22 crc kubenswrapper[4672]: I0930 12:34:22.230731 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7bd888bb4-lmkf8"] Sep 30 12:34:22 crc kubenswrapper[4672]: I0930 12:34:22.457184 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7b47858c59-vjxmv"] Sep 30 12:34:22 crc kubenswrapper[4672]: W0930 12:34:22.466765 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5f80ec12_f6df_47d0_8895_29dc259de391.slice/crio-c5b8b3b8c2cb19901f57aaef7f571b52d799c82d97dcccc94612c9bca4f37c08 WatchSource:0}: Error finding container c5b8b3b8c2cb19901f57aaef7f571b52d799c82d97dcccc94612c9bca4f37c08: Status 404 returned error can't find the container with id c5b8b3b8c2cb19901f57aaef7f571b52d799c82d97dcccc94612c9bca4f37c08 Sep 30 12:34:23 crc kubenswrapper[4672]: I0930 12:34:23.256842 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7bd888bb4-lmkf8" event={"ID":"1afadbb6-9bda-421e-9859-4afdb0e9cdfb","Type":"ContainerStarted","Data":"c5feea0344dbc10f30b59cecbc16712e8aeab3a59153e6d2b47ea26591f1d427"} Sep 30 12:34:23 crc kubenswrapper[4672]: I0930 12:34:23.256897 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7bd888bb4-lmkf8" event={"ID":"1afadbb6-9bda-421e-9859-4afdb0e9cdfb","Type":"ContainerStarted","Data":"fdd6a480feea6a4292057b1de9fbdf27ca66055662976abc5f5b8ddd24921fec"} Sep 30 12:34:23 crc kubenswrapper[4672]: I0930 12:34:23.257069 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-route-controller-manager/route-controller-manager-7bd888bb4-lmkf8" Sep 30 12:34:23 crc kubenswrapper[4672]: I0930 12:34:23.259026 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7b47858c59-vjxmv" event={"ID":"5f80ec12-f6df-47d0-8895-29dc259de391","Type":"ContainerStarted","Data":"a6c7214b809383b78540f7339cff7cb0dc0a0081f04d2e0b295c7693a9cfbd00"} Sep 30 12:34:23 crc kubenswrapper[4672]: I0930 12:34:23.259061 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7b47858c59-vjxmv" event={"ID":"5f80ec12-f6df-47d0-8895-29dc259de391","Type":"ContainerStarted","Data":"c5b8b3b8c2cb19901f57aaef7f571b52d799c82d97dcccc94612c9bca4f37c08"} Sep 30 12:34:23 crc kubenswrapper[4672]: I0930 12:34:23.259367 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-7b47858c59-vjxmv" Sep 30 12:34:23 crc kubenswrapper[4672]: I0930 12:34:23.261485 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-7bd888bb4-lmkf8" Sep 30 12:34:23 crc kubenswrapper[4672]: I0930 12:34:23.264384 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-7b47858c59-vjxmv" Sep 30 12:34:23 crc kubenswrapper[4672]: I0930 12:34:23.290836 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-7bd888bb4-lmkf8" podStartSLOduration=4.290814472 podStartE2EDuration="4.290814472s" podCreationTimestamp="2025-09-30 12:34:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 12:34:23.273779287 +0000 UTC m=+754.543016943" watchObservedRunningTime="2025-09-30 12:34:23.290814472 +0000 UTC m=+754.560052118" Sep 30 12:34:23 crc kubenswrapper[4672]: I0930 12:34:23.322850 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-7b47858c59-vjxmv" podStartSLOduration=4.322828329 podStartE2EDuration="4.322828329s" podCreationTimestamp="2025-09-30 12:34:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 12:34:23.320706835 +0000 UTC m=+754.589944471" watchObservedRunningTime="2025-09-30 12:34:23.322828329 +0000 UTC m=+754.592065975" Sep 30 12:34:23 crc kubenswrapper[4672]: I0930 12:34:23.980709 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-879d84ff8-vhxdg"] Sep 30 12:34:23 crc kubenswrapper[4672]: I0930 12:34:23.981669 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-879d84ff8-vhxdg" Sep 30 12:34:23 crc kubenswrapper[4672]: I0930 12:34:23.986626 4672 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-v96q9" Sep 30 12:34:23 crc kubenswrapper[4672]: I0930 12:34:23.986675 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Sep 30 12:34:23 crc kubenswrapper[4672]: I0930 12:34:23.986932 4672 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Sep 30 12:34:23 crc kubenswrapper[4672]: I0930 12:34:23.987217 4672 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Sep 30 12:34:23 crc kubenswrapper[4672]: I0930 12:34:23.991912 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Sep 30 12:34:24 crc kubenswrapper[4672]: I0930 12:34:24.007076 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-879d84ff8-vhxdg"] Sep 30 12:34:24 crc kubenswrapper[4672]: I0930 12:34:24.094444 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rzdpm\" (UniqueName: \"kubernetes.io/projected/2831267a-a276-41c3-afaf-c262071b60c7-kube-api-access-rzdpm\") pod \"metallb-operator-controller-manager-879d84ff8-vhxdg\" (UID: \"2831267a-a276-41c3-afaf-c262071b60c7\") " pod="metallb-system/metallb-operator-controller-manager-879d84ff8-vhxdg" Sep 30 12:34:24 crc kubenswrapper[4672]: I0930 12:34:24.094543 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2831267a-a276-41c3-afaf-c262071b60c7-webhook-cert\") pod \"metallb-operator-controller-manager-879d84ff8-vhxdg\" (UID: \"2831267a-a276-41c3-afaf-c262071b60c7\") " pod="metallb-system/metallb-operator-controller-manager-879d84ff8-vhxdg" Sep 30 12:34:24 crc kubenswrapper[4672]: I0930 12:34:24.094579 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2831267a-a276-41c3-afaf-c262071b60c7-apiservice-cert\") pod \"metallb-operator-controller-manager-879d84ff8-vhxdg\" (UID: \"2831267a-a276-41c3-afaf-c262071b60c7\") " pod="metallb-system/metallb-operator-controller-manager-879d84ff8-vhxdg" Sep 30 12:34:24 crc kubenswrapper[4672]: I0930 12:34:24.195632 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2831267a-a276-41c3-afaf-c262071b60c7-webhook-cert\") pod \"metallb-operator-controller-manager-879d84ff8-vhxdg\" (UID: \"2831267a-a276-41c3-afaf-c262071b60c7\") " pod="metallb-system/metallb-operator-controller-manager-879d84ff8-vhxdg" Sep 30 12:34:24 crc kubenswrapper[4672]: I0930 12:34:24.195681 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2831267a-a276-41c3-afaf-c262071b60c7-apiservice-cert\") pod \"metallb-operator-controller-manager-879d84ff8-vhxdg\" (UID: \"2831267a-a276-41c3-afaf-c262071b60c7\") " pod="metallb-system/metallb-operator-controller-manager-879d84ff8-vhxdg" Sep 30 12:34:24 crc kubenswrapper[4672]: I0930 12:34:24.195712 4672 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-rzdpm\" (UniqueName: \"kubernetes.io/projected/2831267a-a276-41c3-afaf-c262071b60c7-kube-api-access-rzdpm\") pod \"metallb-operator-controller-manager-879d84ff8-vhxdg\" (UID: \"2831267a-a276-41c3-afaf-c262071b60c7\") " pod="metallb-system/metallb-operator-controller-manager-879d84ff8-vhxdg" Sep 30 12:34:24 crc kubenswrapper[4672]: I0930 12:34:24.211254 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2831267a-a276-41c3-afaf-c262071b60c7-webhook-cert\") pod \"metallb-operator-controller-manager-879d84ff8-vhxdg\" (UID: \"2831267a-a276-41c3-afaf-c262071b60c7\") " pod="metallb-system/metallb-operator-controller-manager-879d84ff8-vhxdg" Sep 30 12:34:24 crc kubenswrapper[4672]: I0930 12:34:24.214907 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rzdpm\" (UniqueName: \"kubernetes.io/projected/2831267a-a276-41c3-afaf-c262071b60c7-kube-api-access-rzdpm\") pod \"metallb-operator-controller-manager-879d84ff8-vhxdg\" (UID: \"2831267a-a276-41c3-afaf-c262071b60c7\") " pod="metallb-system/metallb-operator-controller-manager-879d84ff8-vhxdg" Sep 30 12:34:24 crc kubenswrapper[4672]: I0930 12:34:24.215853 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2831267a-a276-41c3-afaf-c262071b60c7-apiservice-cert\") pod \"metallb-operator-controller-manager-879d84ff8-vhxdg\" (UID: \"2831267a-a276-41c3-afaf-c262071b60c7\") " pod="metallb-system/metallb-operator-controller-manager-879d84ff8-vhxdg" Sep 30 12:34:24 crc kubenswrapper[4672]: I0930 12:34:24.303352 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-879d84ff8-vhxdg" Sep 30 12:34:24 crc kubenswrapper[4672]: I0930 12:34:24.330241 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-5dddf5dfdb-xtn27"] Sep 30 12:34:24 crc kubenswrapper[4672]: I0930 12:34:24.331062 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-5dddf5dfdb-xtn27" Sep 30 12:34:24 crc kubenswrapper[4672]: I0930 12:34:24.339878 4672 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-jgvmn" Sep 30 12:34:24 crc kubenswrapper[4672]: I0930 12:34:24.340154 4672 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Sep 30 12:34:24 crc kubenswrapper[4672]: I0930 12:34:24.340241 4672 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Sep 30 12:34:24 crc kubenswrapper[4672]: I0930 12:34:24.355054 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-5dddf5dfdb-xtn27"] Sep 30 12:34:24 crc kubenswrapper[4672]: I0930 12:34:24.501179 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nnp2k\" (UniqueName: \"kubernetes.io/projected/bbcc3167-a28b-47c0-93a5-cab38ea7d13b-kube-api-access-nnp2k\") pod \"metallb-operator-webhook-server-5dddf5dfdb-xtn27\" (UID: \"bbcc3167-a28b-47c0-93a5-cab38ea7d13b\") " pod="metallb-system/metallb-operator-webhook-server-5dddf5dfdb-xtn27" Sep 30 12:34:24 crc kubenswrapper[4672]: I0930 12:34:24.502397 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/bbcc3167-a28b-47c0-93a5-cab38ea7d13b-apiservice-cert\") pod \"metallb-operator-webhook-server-5dddf5dfdb-xtn27\" (UID: \"bbcc3167-a28b-47c0-93a5-cab38ea7d13b\") " pod="metallb-system/metallb-operator-webhook-server-5dddf5dfdb-xtn27" Sep 30 12:34:24 crc kubenswrapper[4672]: I0930 12:34:24.502455 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/bbcc3167-a28b-47c0-93a5-cab38ea7d13b-webhook-cert\") pod \"metallb-operator-webhook-server-5dddf5dfdb-xtn27\" (UID: \"bbcc3167-a28b-47c0-93a5-cab38ea7d13b\") " pod="metallb-system/metallb-operator-webhook-server-5dddf5dfdb-xtn27" Sep 30 12:34:24 crc kubenswrapper[4672]: I0930 12:34:24.590573 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-879d84ff8-vhxdg"] Sep 30 12:34:24 crc kubenswrapper[4672]: I0930 12:34:24.605069 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nnp2k\" (UniqueName: \"kubernetes.io/projected/bbcc3167-a28b-47c0-93a5-cab38ea7d13b-kube-api-access-nnp2k\") pod \"metallb-operator-webhook-server-5dddf5dfdb-xtn27\" (UID: \"bbcc3167-a28b-47c0-93a5-cab38ea7d13b\") " pod="metallb-system/metallb-operator-webhook-server-5dddf5dfdb-xtn27" Sep 30 12:34:24 crc kubenswrapper[4672]: I0930 12:34:24.605169 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/bbcc3167-a28b-47c0-93a5-cab38ea7d13b-apiservice-cert\") pod \"metallb-operator-webhook-server-5dddf5dfdb-xtn27\" (UID: \"bbcc3167-a28b-47c0-93a5-cab38ea7d13b\") " pod="metallb-system/metallb-operator-webhook-server-5dddf5dfdb-xtn27" Sep 30 12:34:24 crc kubenswrapper[4672]: I0930 12:34:24.605192 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/bbcc3167-a28b-47c0-93a5-cab38ea7d13b-webhook-cert\") pod 
\"metallb-operator-webhook-server-5dddf5dfdb-xtn27\" (UID: \"bbcc3167-a28b-47c0-93a5-cab38ea7d13b\") " pod="metallb-system/metallb-operator-webhook-server-5dddf5dfdb-xtn27" Sep 30 12:34:24 crc kubenswrapper[4672]: I0930 12:34:24.612166 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/bbcc3167-a28b-47c0-93a5-cab38ea7d13b-apiservice-cert\") pod \"metallb-operator-webhook-server-5dddf5dfdb-xtn27\" (UID: \"bbcc3167-a28b-47c0-93a5-cab38ea7d13b\") " pod="metallb-system/metallb-operator-webhook-server-5dddf5dfdb-xtn27" Sep 30 12:34:24 crc kubenswrapper[4672]: I0930 12:34:24.612487 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/bbcc3167-a28b-47c0-93a5-cab38ea7d13b-webhook-cert\") pod \"metallb-operator-webhook-server-5dddf5dfdb-xtn27\" (UID: \"bbcc3167-a28b-47c0-93a5-cab38ea7d13b\") " pod="metallb-system/metallb-operator-webhook-server-5dddf5dfdb-xtn27" Sep 30 12:34:24 crc kubenswrapper[4672]: I0930 12:34:24.629723 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nnp2k\" (UniqueName: \"kubernetes.io/projected/bbcc3167-a28b-47c0-93a5-cab38ea7d13b-kube-api-access-nnp2k\") pod \"metallb-operator-webhook-server-5dddf5dfdb-xtn27\" (UID: \"bbcc3167-a28b-47c0-93a5-cab38ea7d13b\") " pod="metallb-system/metallb-operator-webhook-server-5dddf5dfdb-xtn27" Sep 30 12:34:24 crc kubenswrapper[4672]: I0930 12:34:24.670716 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-5dddf5dfdb-xtn27" Sep 30 12:34:24 crc kubenswrapper[4672]: I0930 12:34:24.739314 4672 patch_prober.go:28] interesting pod/machine-config-daemon-dpqrd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 12:34:24 crc kubenswrapper[4672]: I0930 12:34:24.739366 4672 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 12:34:24 crc kubenswrapper[4672]: I0930 12:34:24.890155 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-5dddf5dfdb-xtn27"] Sep 30 12:34:24 crc kubenswrapper[4672]: W0930 12:34:24.897243 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbbcc3167_a28b_47c0_93a5_cab38ea7d13b.slice/crio-b2c36de5824f69755cf407d5da5b1e33a9c66c6980e4ae47aa055794058caad6 WatchSource:0}: Error finding container b2c36de5824f69755cf407d5da5b1e33a9c66c6980e4ae47aa055794058caad6: Status 404 returned error can't find the container with id b2c36de5824f69755cf407d5da5b1e33a9c66c6980e4ae47aa055794058caad6 Sep 30 12:34:25 crc kubenswrapper[4672]: I0930 12:34:25.273361 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-5dddf5dfdb-xtn27" event={"ID":"bbcc3167-a28b-47c0-93a5-cab38ea7d13b","Type":"ContainerStarted","Data":"b2c36de5824f69755cf407d5da5b1e33a9c66c6980e4ae47aa055794058caad6"} Sep 30 12:34:25 crc kubenswrapper[4672]: I0930 12:34:25.274993 4672 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-879d84ff8-vhxdg" event={"ID":"2831267a-a276-41c3-afaf-c262071b60c7","Type":"ContainerStarted","Data":"1ede19525d32bf996a2c8f9b07edb1484ebc2ff4edcf0458cde873dbf920fc64"} Sep 30 12:34:27 crc kubenswrapper[4672]: I0930 12:34:27.885703 4672 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Sep 30 12:34:28 crc kubenswrapper[4672]: I0930 12:34:28.309637 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-879d84ff8-vhxdg" event={"ID":"2831267a-a276-41c3-afaf-c262071b60c7","Type":"ContainerStarted","Data":"37f1170b1e67ce17a71b6dd0afdcfae28c04b1d33f02369bca49b31bcdb70f41"} Sep 30 12:34:28 crc kubenswrapper[4672]: I0930 12:34:28.310247 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-879d84ff8-vhxdg" Sep 30 12:34:28 crc kubenswrapper[4672]: I0930 12:34:28.352608 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-879d84ff8-vhxdg" podStartSLOduration=2.589643058 podStartE2EDuration="5.352591634s" podCreationTimestamp="2025-09-30 12:34:23 +0000 UTC" firstStartedPulling="2025-09-30 12:34:24.604870172 +0000 UTC m=+755.874107818" lastFinishedPulling="2025-09-30 12:34:27.367818748 +0000 UTC m=+758.637056394" observedRunningTime="2025-09-30 12:34:28.349380912 +0000 UTC m=+759.618618558" watchObservedRunningTime="2025-09-30 12:34:28.352591634 +0000 UTC m=+759.621829280" Sep 30 12:34:32 crc kubenswrapper[4672]: I0930 12:34:32.344099 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-5dddf5dfdb-xtn27" event={"ID":"bbcc3167-a28b-47c0-93a5-cab38ea7d13b","Type":"ContainerStarted","Data":"c5569f8aa6b8620f689baa7ec9aaf415d40ac4eccafeaa828cc20e247bbf8170"} Sep 30 12:34:32 crc kubenswrapper[4672]: I0930 12:34:32.344912 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-5dddf5dfdb-xtn27" Sep 30 12:34:32 crc kubenswrapper[4672]: I0930 12:34:32.376392 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-5dddf5dfdb-xtn27" podStartSLOduration=2.113162391 podStartE2EDuration="8.376374822s" podCreationTimestamp="2025-09-30 12:34:24 +0000 UTC" firstStartedPulling="2025-09-30 12:34:24.901368691 +0000 UTC m=+756.170606337" lastFinishedPulling="2025-09-30 12:34:31.164581122 +0000 UTC m=+762.433818768" observedRunningTime="2025-09-30 12:34:32.373821247 +0000 UTC m=+763.643058903" watchObservedRunningTime="2025-09-30 12:34:32.376374822 +0000 UTC m=+763.645612478" Sep 30 12:34:44 crc kubenswrapper[4672]: I0930 12:34:44.677320 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-5dddf5dfdb-xtn27" Sep 30 12:34:52 crc kubenswrapper[4672]: I0930 12:34:52.297199 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-8t79b"] Sep 30 12:34:52 crc kubenswrapper[4672]: I0930 12:34:52.299164 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-8t79b" Sep 30 12:34:52 crc kubenswrapper[4672]: I0930 12:34:52.310777 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8t79b"] Sep 30 12:34:52 crc kubenswrapper[4672]: I0930 12:34:52.399677 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lcxpt\" (UniqueName: \"kubernetes.io/projected/477eb768-e69a-428f-970c-73f3e596a9db-kube-api-access-lcxpt\") pod \"redhat-operators-8t79b\" (UID: \"477eb768-e69a-428f-970c-73f3e596a9db\") " pod="openshift-marketplace/redhat-operators-8t79b" Sep 30 12:34:52 crc kubenswrapper[4672]: I0930 12:34:52.399757 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/477eb768-e69a-428f-970c-73f3e596a9db-utilities\") pod \"redhat-operators-8t79b\" (UID: \"477eb768-e69a-428f-970c-73f3e596a9db\") " pod="openshift-marketplace/redhat-operators-8t79b" Sep 30 12:34:52 crc kubenswrapper[4672]: I0930 12:34:52.399975 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/477eb768-e69a-428f-970c-73f3e596a9db-catalog-content\") pod \"redhat-operators-8t79b\" (UID: \"477eb768-e69a-428f-970c-73f3e596a9db\") " pod="openshift-marketplace/redhat-operators-8t79b" Sep 30 12:34:52 crc kubenswrapper[4672]: I0930 12:34:52.501307 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/477eb768-e69a-428f-970c-73f3e596a9db-catalog-content\") pod \"redhat-operators-8t79b\" (UID: \"477eb768-e69a-428f-970c-73f3e596a9db\") " pod="openshift-marketplace/redhat-operators-8t79b" Sep 30 12:34:52 crc kubenswrapper[4672]: I0930 12:34:52.501407 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lcxpt\" (UniqueName: \"kubernetes.io/projected/477eb768-e69a-428f-970c-73f3e596a9db-kube-api-access-lcxpt\") pod \"redhat-operators-8t79b\" (UID: \"477eb768-e69a-428f-970c-73f3e596a9db\") " pod="openshift-marketplace/redhat-operators-8t79b" Sep 30 12:34:52 crc kubenswrapper[4672]: I0930 12:34:52.501441 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/477eb768-e69a-428f-970c-73f3e596a9db-utilities\") pod \"redhat-operators-8t79b\" (UID: \"477eb768-e69a-428f-970c-73f3e596a9db\") " pod="openshift-marketplace/redhat-operators-8t79b" Sep 30 12:34:52 crc kubenswrapper[4672]: I0930 12:34:52.502219 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/477eb768-e69a-428f-970c-73f3e596a9db-catalog-content\") pod \"redhat-operators-8t79b\" (UID: \"477eb768-e69a-428f-970c-73f3e596a9db\") " pod="openshift-marketplace/redhat-operators-8t79b" Sep 30 12:34:52 crc kubenswrapper[4672]: I0930 12:34:52.502387 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/477eb768-e69a-428f-970c-73f3e596a9db-utilities\") pod \"redhat-operators-8t79b\" (UID: \"477eb768-e69a-428f-970c-73f3e596a9db\") " pod="openshift-marketplace/redhat-operators-8t79b" Sep 30 12:34:52 crc kubenswrapper[4672]: I0930 12:34:52.536691 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-lcxpt\" (UniqueName: \"kubernetes.io/projected/477eb768-e69a-428f-970c-73f3e596a9db-kube-api-access-lcxpt\") pod \"redhat-operators-8t79b\" (UID: \"477eb768-e69a-428f-970c-73f3e596a9db\") " pod="openshift-marketplace/redhat-operators-8t79b" Sep 30 12:34:52 crc kubenswrapper[4672]: I0930 12:34:52.633793 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8t79b" Sep 30 12:34:53 crc kubenswrapper[4672]: I0930 12:34:53.081579 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8t79b"] Sep 30 12:34:53 crc kubenswrapper[4672]: I0930 12:34:53.483034 4672 generic.go:334] "Generic (PLEG): container finished" podID="477eb768-e69a-428f-970c-73f3e596a9db" containerID="7fc40a8fc35d8e1be9bf7c0915893f403396dd2221edc1d8befc53dfb1f9dfa6" exitCode=0 Sep 30 12:34:53 crc kubenswrapper[4672]: I0930 12:34:53.483134 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8t79b" event={"ID":"477eb768-e69a-428f-970c-73f3e596a9db","Type":"ContainerDied","Data":"7fc40a8fc35d8e1be9bf7c0915893f403396dd2221edc1d8befc53dfb1f9dfa6"} Sep 30 12:34:53 crc kubenswrapper[4672]: I0930 12:34:53.483298 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8t79b" event={"ID":"477eb768-e69a-428f-970c-73f3e596a9db","Type":"ContainerStarted","Data":"5c5d136b3e15655bd2ce1a28358c2112cafe62182552bee20fdfd15ce7e9209d"} Sep 30 12:34:54 crc kubenswrapper[4672]: I0930 12:34:54.495544 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8t79b" event={"ID":"477eb768-e69a-428f-970c-73f3e596a9db","Type":"ContainerStarted","Data":"da45ace129d6ed9fae02506b44ba7617340d42e8e817bec7b1b2de9771fbd15d"} Sep 30 12:34:54 crc kubenswrapper[4672]: I0930 12:34:54.739147 4672 patch_prober.go:28] interesting pod/machine-config-daemon-dpqrd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 12:34:54 crc kubenswrapper[4672]: I0930 12:34:54.739248 4672 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 12:34:55 crc kubenswrapper[4672]: I0930 12:34:55.506633 4672 generic.go:334] "Generic (PLEG): container finished" podID="477eb768-e69a-428f-970c-73f3e596a9db" containerID="da45ace129d6ed9fae02506b44ba7617340d42e8e817bec7b1b2de9771fbd15d" exitCode=0 Sep 30 12:34:55 crc kubenswrapper[4672]: I0930 12:34:55.506685 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8t79b" event={"ID":"477eb768-e69a-428f-970c-73f3e596a9db","Type":"ContainerDied","Data":"da45ace129d6ed9fae02506b44ba7617340d42e8e817bec7b1b2de9771fbd15d"} Sep 30 12:34:56 crc kubenswrapper[4672]: I0930 12:34:56.517249 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8t79b" event={"ID":"477eb768-e69a-428f-970c-73f3e596a9db","Type":"ContainerStarted","Data":"6adecb4d331a89417c740b0b9e53dc20246885a1822715acd3f4ef58cf8b5d81"} Sep 30 12:34:56 crc kubenswrapper[4672]: I0930 12:34:56.542615 4672 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-8t79b" podStartSLOduration=1.978446159 podStartE2EDuration="4.54259644s" podCreationTimestamp="2025-09-30 12:34:52 +0000 UTC" firstStartedPulling="2025-09-30 12:34:53.484255046 +0000 UTC m=+784.753492692" lastFinishedPulling="2025-09-30 12:34:56.048405287 +0000 UTC m=+787.317642973" observedRunningTime="2025-09-30 12:34:56.539947373 +0000 UTC m=+787.809185029" watchObservedRunningTime="2025-09-30 12:34:56.54259644 +0000 UTC m=+787.811834086" Sep 30 12:35:02 crc kubenswrapper[4672]: I0930 12:35:02.634876 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-8t79b" Sep 30 12:35:02 crc kubenswrapper[4672]: I0930 12:35:02.635293 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-8t79b" Sep 30 12:35:02 crc kubenswrapper[4672]: I0930 12:35:02.672029 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-8t79b" Sep 30 12:35:03 crc kubenswrapper[4672]: I0930 12:35:03.598685 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-8t79b" Sep 30 12:35:03 crc kubenswrapper[4672]: I0930 12:35:03.662131 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-8t79b"] Sep 30 12:35:04 crc kubenswrapper[4672]: I0930 12:35:04.307530 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-879d84ff8-vhxdg" Sep 30 12:35:05 crc kubenswrapper[4672]: I0930 12:35:05.203867 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-zchdp"] Sep 30 12:35:05 crc kubenswrapper[4672]: I0930 12:35:05.206792 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-zchdp" Sep 30 12:35:05 crc kubenswrapper[4672]: I0930 12:35:05.210045 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Sep 30 12:35:05 crc kubenswrapper[4672]: I0930 12:35:05.210119 4672 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Sep 30 12:35:05 crc kubenswrapper[4672]: I0930 12:35:05.212074 4672 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-c88fd" Sep 30 12:35:05 crc kubenswrapper[4672]: I0930 12:35:05.214569 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-5478bdb765-bpwqj"] Sep 30 12:35:05 crc kubenswrapper[4672]: I0930 12:35:05.215317 4672 util.go:30] "No sandbox for pod can be found. 
Sep 30 12:35:05 crc kubenswrapper[4672]: I0930 12:35:05.217328 4672 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert"
Sep 30 12:35:05 crc kubenswrapper[4672]: I0930 12:35:05.239717 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-5478bdb765-bpwqj"]
Sep 30 12:35:05 crc kubenswrapper[4672]: I0930 12:35:05.281044 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/d6349248-bcbd-486b-8143-90b66a52f017-frr-startup\") pod \"frr-k8s-zchdp\" (UID: \"d6349248-bcbd-486b-8143-90b66a52f017\") " pod="metallb-system/frr-k8s-zchdp"
Sep 30 12:35:05 crc kubenswrapper[4672]: I0930 12:35:05.281427 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/d6349248-bcbd-486b-8143-90b66a52f017-metrics\") pod \"frr-k8s-zchdp\" (UID: \"d6349248-bcbd-486b-8143-90b66a52f017\") " pod="metallb-system/frr-k8s-zchdp"
Sep 30 12:35:05 crc kubenswrapper[4672]: I0930 12:35:05.281527 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hphx7\" (UniqueName: \"kubernetes.io/projected/d6349248-bcbd-486b-8143-90b66a52f017-kube-api-access-hphx7\") pod \"frr-k8s-zchdp\" (UID: \"d6349248-bcbd-486b-8143-90b66a52f017\") " pod="metallb-system/frr-k8s-zchdp"
Sep 30 12:35:05 crc kubenswrapper[4672]: I0930 12:35:05.281598 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6fr28\" (UniqueName: \"kubernetes.io/projected/8a1bda96-fd76-4372-bb9f-ae56e6602caf-kube-api-access-6fr28\") pod \"frr-k8s-webhook-server-5478bdb765-bpwqj\" (UID: \"8a1bda96-fd76-4372-bb9f-ae56e6602caf\") " pod="metallb-system/frr-k8s-webhook-server-5478bdb765-bpwqj"
Sep 30 12:35:05 crc kubenswrapper[4672]: I0930 12:35:05.281631 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/d6349248-bcbd-486b-8143-90b66a52f017-reloader\") pod \"frr-k8s-zchdp\" (UID: \"d6349248-bcbd-486b-8143-90b66a52f017\") " pod="metallb-system/frr-k8s-zchdp"
Sep 30 12:35:05 crc kubenswrapper[4672]: I0930 12:35:05.281701 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8a1bda96-fd76-4372-bb9f-ae56e6602caf-cert\") pod \"frr-k8s-webhook-server-5478bdb765-bpwqj\" (UID: \"8a1bda96-fd76-4372-bb9f-ae56e6602caf\") " pod="metallb-system/frr-k8s-webhook-server-5478bdb765-bpwqj"
Sep 30 12:35:05 crc kubenswrapper[4672]: I0930 12:35:05.281921 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/d6349248-bcbd-486b-8143-90b66a52f017-frr-conf\") pod \"frr-k8s-zchdp\" (UID: \"d6349248-bcbd-486b-8143-90b66a52f017\") " pod="metallb-system/frr-k8s-zchdp"
Sep 30 12:35:05 crc kubenswrapper[4672]: I0930 12:35:05.282122 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/d6349248-bcbd-486b-8143-90b66a52f017-frr-sockets\") pod \"frr-k8s-zchdp\" (UID: \"d6349248-bcbd-486b-8143-90b66a52f017\") " pod="metallb-system/frr-k8s-zchdp"
Sep 30 12:35:05 crc kubenswrapper[4672]: I0930 12:35:05.282153 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d6349248-bcbd-486b-8143-90b66a52f017-metrics-certs\") pod \"frr-k8s-zchdp\" (UID: \"d6349248-bcbd-486b-8143-90b66a52f017\") " pod="metallb-system/frr-k8s-zchdp"
Sep 30 12:35:05 crc kubenswrapper[4672]: I0930 12:35:05.322425 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-v66pl"]
Sep 30 12:35:05 crc kubenswrapper[4672]: I0930 12:35:05.323621 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-v66pl"
Sep 30 12:35:05 crc kubenswrapper[4672]: I0930 12:35:05.325046 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-5d688f5ffc-9d8qs"]
Sep 30 12:35:05 crc kubenswrapper[4672]: I0930 12:35:05.326022 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-5d688f5ffc-9d8qs"
Sep 30 12:35:05 crc kubenswrapper[4672]: I0930 12:35:05.326761 4672 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-2l7p8"
Sep 30 12:35:05 crc kubenswrapper[4672]: I0930 12:35:05.327063 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2"
Sep 30 12:35:05 crc kubenswrapper[4672]: I0930 12:35:05.327142 4672 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist"
Sep 30 12:35:05 crc kubenswrapper[4672]: I0930 12:35:05.327116 4672 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret"
Sep 30 12:35:05 crc kubenswrapper[4672]: I0930 12:35:05.331425 4672 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret"
Sep 30 12:35:05 crc kubenswrapper[4672]: I0930 12:35:05.338892 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-5d688f5ffc-9d8qs"]
Sep 30 12:35:05 crc kubenswrapper[4672]: I0930 12:35:05.383182 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6fr28\" (UniqueName: \"kubernetes.io/projected/8a1bda96-fd76-4372-bb9f-ae56e6602caf-kube-api-access-6fr28\") pod \"frr-k8s-webhook-server-5478bdb765-bpwqj\" (UID: \"8a1bda96-fd76-4372-bb9f-ae56e6602caf\") " pod="metallb-system/frr-k8s-webhook-server-5478bdb765-bpwqj"
Sep 30 12:35:05 crc kubenswrapper[4672]: I0930 12:35:05.383229 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/d6349248-bcbd-486b-8143-90b66a52f017-reloader\") pod \"frr-k8s-zchdp\" (UID: \"d6349248-bcbd-486b-8143-90b66a52f017\") " pod="metallb-system/frr-k8s-zchdp"
Sep 30 12:35:05 crc kubenswrapper[4672]: I0930 12:35:05.383261 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/07eaf50f-6d5e-4e3e-8c3d-1e28769bae68-metrics-certs\") pod \"speaker-v66pl\" (UID: \"07eaf50f-6d5e-4e3e-8c3d-1e28769bae68\") " pod="metallb-system/speaker-v66pl"
Sep 30 12:35:05 crc kubenswrapper[4672]: I0930 12:35:05.383301 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8a1bda96-fd76-4372-bb9f-ae56e6602caf-cert\") pod \"frr-k8s-webhook-server-5478bdb765-bpwqj\" (UID: \"8a1bda96-fd76-4372-bb9f-ae56e6602caf\") " pod="metallb-system/frr-k8s-webhook-server-5478bdb765-bpwqj"
Sep 30 12:35:05 crc kubenswrapper[4672]: I0930 12:35:05.383330 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/d6349248-bcbd-486b-8143-90b66a52f017-frr-conf\") pod \"frr-k8s-zchdp\" (UID: \"d6349248-bcbd-486b-8143-90b66a52f017\") " pod="metallb-system/frr-k8s-zchdp"
Sep 30 12:35:05 crc kubenswrapper[4672]: I0930 12:35:05.383357 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/d6349248-bcbd-486b-8143-90b66a52f017-frr-sockets\") pod \"frr-k8s-zchdp\" (UID: \"d6349248-bcbd-486b-8143-90b66a52f017\") " pod="metallb-system/frr-k8s-zchdp"
Sep 30 12:35:05 crc kubenswrapper[4672]: I0930 12:35:05.383371 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d6349248-bcbd-486b-8143-90b66a52f017-metrics-certs\") pod \"frr-k8s-zchdp\" (UID: \"d6349248-bcbd-486b-8143-90b66a52f017\") " pod="metallb-system/frr-k8s-zchdp"
Sep 30 12:35:05 crc kubenswrapper[4672]: I0930 12:35:05.383389 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/56c8e0ed-01f5-4d61-84ad-78ce7c5a66d1-cert\") pod \"controller-5d688f5ffc-9d8qs\" (UID: \"56c8e0ed-01f5-4d61-84ad-78ce7c5a66d1\") " pod="metallb-system/controller-5d688f5ffc-9d8qs"
Sep 30 12:35:05 crc kubenswrapper[4672]: I0930 12:35:05.383415 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/56c8e0ed-01f5-4d61-84ad-78ce7c5a66d1-metrics-certs\") pod \"controller-5d688f5ffc-9d8qs\" (UID: \"56c8e0ed-01f5-4d61-84ad-78ce7c5a66d1\") " pod="metallb-system/controller-5d688f5ffc-9d8qs"
Sep 30 12:35:05 crc kubenswrapper[4672]: I0930 12:35:05.383435 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/d6349248-bcbd-486b-8143-90b66a52f017-frr-startup\") pod \"frr-k8s-zchdp\" (UID: \"d6349248-bcbd-486b-8143-90b66a52f017\") " pod="metallb-system/frr-k8s-zchdp"
Sep 30 12:35:05 crc kubenswrapper[4672]: I0930 12:35:05.383454 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/07eaf50f-6d5e-4e3e-8c3d-1e28769bae68-memberlist\") pod \"speaker-v66pl\" (UID: \"07eaf50f-6d5e-4e3e-8c3d-1e28769bae68\") " pod="metallb-system/speaker-v66pl"
Sep 30 12:35:05 crc kubenswrapper[4672]: I0930 12:35:05.383472 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xkdgc\" (UniqueName: \"kubernetes.io/projected/56c8e0ed-01f5-4d61-84ad-78ce7c5a66d1-kube-api-access-xkdgc\") pod \"controller-5d688f5ffc-9d8qs\" (UID: \"56c8e0ed-01f5-4d61-84ad-78ce7c5a66d1\") " pod="metallb-system/controller-5d688f5ffc-9d8qs"
Sep 30 12:35:05 crc kubenswrapper[4672]: I0930 12:35:05.383494 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rllbv\" (UniqueName: \"kubernetes.io/projected/07eaf50f-6d5e-4e3e-8c3d-1e28769bae68-kube-api-access-rllbv\") pod \"speaker-v66pl\" (UID: \"07eaf50f-6d5e-4e3e-8c3d-1e28769bae68\") " pod="metallb-system/speaker-v66pl"
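The reconciler entries above show the kubelet's two-phase volume handling: VerifyControllerAttachedVolume records each declared volume in the desired state of the world, and a separate MountVolume operation performs the actual SetUp once the backing object (ConfigMap, Secret, or projected token) is available. A minimal client-go sketch, offered as a hypothetical diagnostic assuming in-cluster credentials and the pod name from the log, that lists the volumes a pod declares:

    package main

    import (
        "context"
        "fmt"

        metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
        "k8s.io/client-go/kubernetes"
        "k8s.io/client-go/rest"
    )

    func main() {
        cfg, err := rest.InClusterConfig() // assumes running inside the cluster
        if err != nil {
            panic(err)
        }
        cs := kubernetes.NewForConfigOrDie(cfg)
        // Pod name taken from the log above.
        pod, err := cs.CoreV1().Pods("metallb-system").Get(context.TODO(),
            "frr-k8s-zchdp", metav1.GetOptions{})
        if err != nil {
            panic(err)
        }
        for _, v := range pod.Spec.Volumes {
            // Secret/ConfigMap-backed volumes are the ones whose SetUp can
            // fail while the referenced object does not exist yet.
            fmt.Printf("volume=%s secret=%v configMap=%v\n",
                v.Name, v.Secret != nil, v.ConfigMap != nil)
        }
    }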
Sep 30 12:35:05 crc kubenswrapper[4672]: I0930 12:35:05.383516 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/d6349248-bcbd-486b-8143-90b66a52f017-metrics\") pod \"frr-k8s-zchdp\" (UID: \"d6349248-bcbd-486b-8143-90b66a52f017\") " pod="metallb-system/frr-k8s-zchdp"
Sep 30 12:35:05 crc kubenswrapper[4672]: I0930 12:35:05.383534 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/07eaf50f-6d5e-4e3e-8c3d-1e28769bae68-metallb-excludel2\") pod \"speaker-v66pl\" (UID: \"07eaf50f-6d5e-4e3e-8c3d-1e28769bae68\") " pod="metallb-system/speaker-v66pl"
Sep 30 12:35:05 crc kubenswrapper[4672]: I0930 12:35:05.383559 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hphx7\" (UniqueName: \"kubernetes.io/projected/d6349248-bcbd-486b-8143-90b66a52f017-kube-api-access-hphx7\") pod \"frr-k8s-zchdp\" (UID: \"d6349248-bcbd-486b-8143-90b66a52f017\") " pod="metallb-system/frr-k8s-zchdp"
Sep 30 12:35:05 crc kubenswrapper[4672]: E0930 12:35:05.383924 4672 secret.go:188] Couldn't get secret metallb-system/frr-k8s-webhook-server-cert: secret "frr-k8s-webhook-server-cert" not found
Sep 30 12:35:05 crc kubenswrapper[4672]: E0930 12:35:05.384011 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8a1bda96-fd76-4372-bb9f-ae56e6602caf-cert podName:8a1bda96-fd76-4372-bb9f-ae56e6602caf nodeName:}" failed. No retries permitted until 2025-09-30 12:35:05.88399007 +0000 UTC m=+797.153227716 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8a1bda96-fd76-4372-bb9f-ae56e6602caf-cert") pod "frr-k8s-webhook-server-5478bdb765-bpwqj" (UID: "8a1bda96-fd76-4372-bb9f-ae56e6602caf") : secret "frr-k8s-webhook-server-cert" not found
Sep 30 12:35:05 crc kubenswrapper[4672]: I0930 12:35:05.384224 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/d6349248-bcbd-486b-8143-90b66a52f017-frr-conf\") pod \"frr-k8s-zchdp\" (UID: \"d6349248-bcbd-486b-8143-90b66a52f017\") " pod="metallb-system/frr-k8s-zchdp"
Sep 30 12:35:05 crc kubenswrapper[4672]: I0930 12:35:05.384278 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/d6349248-bcbd-486b-8143-90b66a52f017-reloader\") pod \"frr-k8s-zchdp\" (UID: \"d6349248-bcbd-486b-8143-90b66a52f017\") " pod="metallb-system/frr-k8s-zchdp"
Sep 30 12:35:05 crc kubenswrapper[4672]: I0930 12:35:05.384627 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/d6349248-bcbd-486b-8143-90b66a52f017-frr-startup\") pod \"frr-k8s-zchdp\" (UID: \"d6349248-bcbd-486b-8143-90b66a52f017\") " pod="metallb-system/frr-k8s-zchdp"
Sep 30 12:35:05 crc kubenswrapper[4672]: I0930 12:35:05.385440 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/d6349248-bcbd-486b-8143-90b66a52f017-frr-sockets\") pod \"frr-k8s-zchdp\" (UID: \"d6349248-bcbd-486b-8143-90b66a52f017\") " pod="metallb-system/frr-k8s-zchdp"
Sep 30 12:35:05 crc kubenswrapper[4672]: I0930 12:35:05.385507 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/d6349248-bcbd-486b-8143-90b66a52f017-metrics\") pod \"frr-k8s-zchdp\" (UID: \"d6349248-bcbd-486b-8143-90b66a52f017\") " pod="metallb-system/frr-k8s-zchdp"
Sep 30 12:35:05 crc kubenswrapper[4672]: I0930 12:35:05.391100 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d6349248-bcbd-486b-8143-90b66a52f017-metrics-certs\") pod \"frr-k8s-zchdp\" (UID: \"d6349248-bcbd-486b-8143-90b66a52f017\") " pod="metallb-system/frr-k8s-zchdp"
Sep 30 12:35:05 crc kubenswrapper[4672]: I0930 12:35:05.401818 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hphx7\" (UniqueName: \"kubernetes.io/projected/d6349248-bcbd-486b-8143-90b66a52f017-kube-api-access-hphx7\") pod \"frr-k8s-zchdp\" (UID: \"d6349248-bcbd-486b-8143-90b66a52f017\") " pod="metallb-system/frr-k8s-zchdp"
Sep 30 12:35:05 crc kubenswrapper[4672]: I0930 12:35:05.402037 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6fr28\" (UniqueName: \"kubernetes.io/projected/8a1bda96-fd76-4372-bb9f-ae56e6602caf-kube-api-access-6fr28\") pod \"frr-k8s-webhook-server-5478bdb765-bpwqj\" (UID: \"8a1bda96-fd76-4372-bb9f-ae56e6602caf\") " pod="metallb-system/frr-k8s-webhook-server-5478bdb765-bpwqj"
Sep 30 12:35:05 crc kubenswrapper[4672]: I0930 12:35:05.485183 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rllbv\" (UniqueName: \"kubernetes.io/projected/07eaf50f-6d5e-4e3e-8c3d-1e28769bae68-kube-api-access-rllbv\") pod \"speaker-v66pl\" (UID: \"07eaf50f-6d5e-4e3e-8c3d-1e28769bae68\") " pod="metallb-system/speaker-v66pl"
Sep 30 12:35:05 crc kubenswrapper[4672]: I0930 12:35:05.485289 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/07eaf50f-6d5e-4e3e-8c3d-1e28769bae68-metallb-excludel2\") pod \"speaker-v66pl\" (UID: \"07eaf50f-6d5e-4e3e-8c3d-1e28769bae68\") " pod="metallb-system/speaker-v66pl"
Sep 30 12:35:05 crc kubenswrapper[4672]: I0930 12:35:05.485351 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/07eaf50f-6d5e-4e3e-8c3d-1e28769bae68-metrics-certs\") pod \"speaker-v66pl\" (UID: \"07eaf50f-6d5e-4e3e-8c3d-1e28769bae68\") " pod="metallb-system/speaker-v66pl"
Sep 30 12:35:05 crc kubenswrapper[4672]: I0930 12:35:05.485425 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/56c8e0ed-01f5-4d61-84ad-78ce7c5a66d1-cert\") pod \"controller-5d688f5ffc-9d8qs\" (UID: \"56c8e0ed-01f5-4d61-84ad-78ce7c5a66d1\") " pod="metallb-system/controller-5d688f5ffc-9d8qs"
Sep 30 12:35:05 crc kubenswrapper[4672]: I0930 12:35:05.485448 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/56c8e0ed-01f5-4d61-84ad-78ce7c5a66d1-metrics-certs\") pod \"controller-5d688f5ffc-9d8qs\" (UID: \"56c8e0ed-01f5-4d61-84ad-78ce7c5a66d1\") " pod="metallb-system/controller-5d688f5ffc-9d8qs"
Sep 30 12:35:05 crc kubenswrapper[4672]: I0930 12:35:05.485497 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/07eaf50f-6d5e-4e3e-8c3d-1e28769bae68-memberlist\") pod \"speaker-v66pl\" (UID: \"07eaf50f-6d5e-4e3e-8c3d-1e28769bae68\") " pod="metallb-system/speaker-v66pl"
Sep 30 12:35:05 crc kubenswrapper[4672]: I0930 12:35:05.485519 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xkdgc\" (UniqueName: \"kubernetes.io/projected/56c8e0ed-01f5-4d61-84ad-78ce7c5a66d1-kube-api-access-xkdgc\") pod \"controller-5d688f5ffc-9d8qs\" (UID: \"56c8e0ed-01f5-4d61-84ad-78ce7c5a66d1\") " pod="metallb-system/controller-5d688f5ffc-9d8qs"
Sep 30 12:35:05 crc kubenswrapper[4672]: E0930 12:35:05.485670 4672 secret.go:188] Couldn't get secret metallb-system/speaker-certs-secret: secret "speaker-certs-secret" not found
Sep 30 12:35:05 crc kubenswrapper[4672]: E0930 12:35:05.485759 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/07eaf50f-6d5e-4e3e-8c3d-1e28769bae68-metrics-certs podName:07eaf50f-6d5e-4e3e-8c3d-1e28769bae68 nodeName:}" failed. No retries permitted until 2025-09-30 12:35:05.985736067 +0000 UTC m=+797.254973803 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/07eaf50f-6d5e-4e3e-8c3d-1e28769bae68-metrics-certs") pod "speaker-v66pl" (UID: "07eaf50f-6d5e-4e3e-8c3d-1e28769bae68") : secret "speaker-certs-secret" not found
Sep 30 12:35:05 crc kubenswrapper[4672]: E0930 12:35:05.486215 4672 secret.go:188] Couldn't get secret metallb-system/controller-certs-secret: secret "controller-certs-secret" not found
Sep 30 12:35:05 crc kubenswrapper[4672]: E0930 12:35:05.486252 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/56c8e0ed-01f5-4d61-84ad-78ce7c5a66d1-metrics-certs podName:56c8e0ed-01f5-4d61-84ad-78ce7c5a66d1 nodeName:}" failed. No retries permitted until 2025-09-30 12:35:05.98624302 +0000 UTC m=+797.255480666 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/56c8e0ed-01f5-4d61-84ad-78ce7c5a66d1-metrics-certs") pod "controller-5d688f5ffc-9d8qs" (UID: "56c8e0ed-01f5-4d61-84ad-78ce7c5a66d1") : secret "controller-certs-secret" not found
Sep 30 12:35:05 crc kubenswrapper[4672]: E0930 12:35:05.486347 4672 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found
Sep 30 12:35:05 crc kubenswrapper[4672]: E0930 12:35:05.486417 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/07eaf50f-6d5e-4e3e-8c3d-1e28769bae68-memberlist podName:07eaf50f-6d5e-4e3e-8c3d-1e28769bae68 nodeName:}" failed. No retries permitted until 2025-09-30 12:35:05.986401294 +0000 UTC m=+797.255638940 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/07eaf50f-6d5e-4e3e-8c3d-1e28769bae68-memberlist") pod "speaker-v66pl" (UID: "07eaf50f-6d5e-4e3e-8c3d-1e28769bae68") : secret "metallb-memberlist" not found
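The MountVolume.SetUp failures above are benign ordering races: the speaker, controller, and webhook pods were scheduled before the MetalLB operator and certificate machinery had created the referenced secrets, so the kubelet requeues each mount with a doubling delay (durationBeforeRetry 500ms here, 1s on the next attempt further below). A standalone sketch of that retry shape using the wait package from k8s.io/apimachinery (illustrative only; the kubelet's actual nestedpendingoperations code differs):

    package main

    import (
        "fmt"
        "time"

        "k8s.io/apimachinery/pkg/util/wait"
    )

    func main() {
        attempt := 0
        backoff := wait.Backoff{
            Duration: 500 * time.Millisecond, // first durationBeforeRetry in the log
            Factor:   2.0,                    // 500ms -> 1s -> 2s ...
            Steps:    5,
        }
        // Pretend the secret appears on the third attempt.
        err := wait.ExponentialBackoff(backoff, func() (bool, error) {
            attempt++
            fmt.Printf("attempt %d\n", attempt)
            return attempt >= 3, nil // done=true once the mount would succeed
        })
        fmt.Println("result:", err)
    }

The cap on retries is what produces the "No retries permitted until <timestamp>" wording: the operation is parked, not failed permanently, and succeeds as soon as the secret exists.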
Sep 30 12:35:05 crc kubenswrapper[4672]: I0930 12:35:05.486781 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/07eaf50f-6d5e-4e3e-8c3d-1e28769bae68-metallb-excludel2\") pod \"speaker-v66pl\" (UID: \"07eaf50f-6d5e-4e3e-8c3d-1e28769bae68\") " pod="metallb-system/speaker-v66pl"
Sep 30 12:35:05 crc kubenswrapper[4672]: I0930 12:35:05.489121 4672 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert"
Sep 30 12:35:05 crc kubenswrapper[4672]: I0930 12:35:05.501880 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/56c8e0ed-01f5-4d61-84ad-78ce7c5a66d1-cert\") pod \"controller-5d688f5ffc-9d8qs\" (UID: \"56c8e0ed-01f5-4d61-84ad-78ce7c5a66d1\") " pod="metallb-system/controller-5d688f5ffc-9d8qs"
Sep 30 12:35:05 crc kubenswrapper[4672]: I0930 12:35:05.506923 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xkdgc\" (UniqueName: \"kubernetes.io/projected/56c8e0ed-01f5-4d61-84ad-78ce7c5a66d1-kube-api-access-xkdgc\") pod \"controller-5d688f5ffc-9d8qs\" (UID: \"56c8e0ed-01f5-4d61-84ad-78ce7c5a66d1\") " pod="metallb-system/controller-5d688f5ffc-9d8qs"
Sep 30 12:35:05 crc kubenswrapper[4672]: I0930 12:35:05.507732 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rllbv\" (UniqueName: \"kubernetes.io/projected/07eaf50f-6d5e-4e3e-8c3d-1e28769bae68-kube-api-access-rllbv\") pod \"speaker-v66pl\" (UID: \"07eaf50f-6d5e-4e3e-8c3d-1e28769bae68\") " pod="metallb-system/speaker-v66pl"
Sep 30 12:35:05 crc kubenswrapper[4672]: I0930 12:35:05.525978 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-zchdp"
Sep 30 12:35:05 crc kubenswrapper[4672]: I0930 12:35:05.573916 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-8t79b" podUID="477eb768-e69a-428f-970c-73f3e596a9db" containerName="registry-server" containerID="cri-o://6adecb4d331a89417c740b0b9e53dc20246885a1822715acd3f4ef58cf8b5d81" gracePeriod=2
Sep 30 12:35:05 crc kubenswrapper[4672]: I0930 12:35:05.891593 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8a1bda96-fd76-4372-bb9f-ae56e6602caf-cert\") pod \"frr-k8s-webhook-server-5478bdb765-bpwqj\" (UID: \"8a1bda96-fd76-4372-bb9f-ae56e6602caf\") " pod="metallb-system/frr-k8s-webhook-server-5478bdb765-bpwqj"
Sep 30 12:35:05 crc kubenswrapper[4672]: I0930 12:35:05.896944 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8a1bda96-fd76-4372-bb9f-ae56e6602caf-cert\") pod \"frr-k8s-webhook-server-5478bdb765-bpwqj\" (UID: \"8a1bda96-fd76-4372-bb9f-ae56e6602caf\") " pod="metallb-system/frr-k8s-webhook-server-5478bdb765-bpwqj"
Sep 30 12:35:05 crc kubenswrapper[4672]: I0930 12:35:05.993386 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/56c8e0ed-01f5-4d61-84ad-78ce7c5a66d1-metrics-certs\") pod \"controller-5d688f5ffc-9d8qs\" (UID: \"56c8e0ed-01f5-4d61-84ad-78ce7c5a66d1\") " pod="metallb-system/controller-5d688f5ffc-9d8qs"
Sep 30 12:35:05 crc kubenswrapper[4672]: I0930 12:35:05.993515 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/07eaf50f-6d5e-4e3e-8c3d-1e28769bae68-memberlist\") pod \"speaker-v66pl\" (UID: \"07eaf50f-6d5e-4e3e-8c3d-1e28769bae68\") " pod="metallb-system/speaker-v66pl"
Sep 30 12:35:05 crc kubenswrapper[4672]: I0930 12:35:05.993661 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/07eaf50f-6d5e-4e3e-8c3d-1e28769bae68-metrics-certs\") pod \"speaker-v66pl\" (UID: \"07eaf50f-6d5e-4e3e-8c3d-1e28769bae68\") " pod="metallb-system/speaker-v66pl"
Sep 30 12:35:05 crc kubenswrapper[4672]: E0930 12:35:05.993731 4672 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found
Sep 30 12:35:05 crc kubenswrapper[4672]: E0930 12:35:05.993836 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/07eaf50f-6d5e-4e3e-8c3d-1e28769bae68-memberlist podName:07eaf50f-6d5e-4e3e-8c3d-1e28769bae68 nodeName:}" failed. No retries permitted until 2025-09-30 12:35:06.993812695 +0000 UTC m=+798.263050361 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/07eaf50f-6d5e-4e3e-8c3d-1e28769bae68-memberlist") pod "speaker-v66pl" (UID: "07eaf50f-6d5e-4e3e-8c3d-1e28769bae68") : secret "metallb-memberlist" not found
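The "Killing container with a grace period" entry above is the kubelet tearing down the deleted redhat-operators-8t79b catalog pod: the runtime sends SIGTERM, then escalates to SIGKILL once the grace window expires. The gracePeriod=2 comes from the pod's own spec (catalog pods set a short terminationGracePeriodSeconds); the same knob can be supplied at deletion time. A client-go sketch of issuing such a deletion (hypothetical diagnostic, assuming a default kubeconfig path):

    package main

    import (
        "context"

        metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
        "k8s.io/client-go/kubernetes"
        "k8s.io/client-go/tools/clientcmd"
    )

    func main() {
        // Assumes a kubeconfig at the default home location; adjust as needed.
        cfg, err := clientcmd.BuildConfigFromFlags("", clientcmd.RecommendedHomeFile)
        if err != nil {
            panic(err)
        }
        cs := kubernetes.NewForConfigOrDie(cfg)
        grace := int64(2) // mirrors gracePeriod=2 in the log
        if err := cs.CoreV1().Pods("openshift-marketplace").Delete(context.TODO(),
            "redhat-operators-8t79b",
            metav1.DeleteOptions{GracePeriodSeconds: &grace}); err != nil {
            panic(err)
        }
    }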
Sep 30 12:35:05 crc kubenswrapper[4672]: I0930 12:35:05.997986 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/56c8e0ed-01f5-4d61-84ad-78ce7c5a66d1-metrics-certs\") pod \"controller-5d688f5ffc-9d8qs\" (UID: \"56c8e0ed-01f5-4d61-84ad-78ce7c5a66d1\") " pod="metallb-system/controller-5d688f5ffc-9d8qs"
Sep 30 12:35:05 crc kubenswrapper[4672]: I0930 12:35:05.998200 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/07eaf50f-6d5e-4e3e-8c3d-1e28769bae68-metrics-certs\") pod \"speaker-v66pl\" (UID: \"07eaf50f-6d5e-4e3e-8c3d-1e28769bae68\") " pod="metallb-system/speaker-v66pl"
Sep 30 12:35:06 crc kubenswrapper[4672]: I0930 12:35:06.133207 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-5478bdb765-bpwqj"
Sep 30 12:35:06 crc kubenswrapper[4672]: I0930 12:35:06.248496 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-5d688f5ffc-9d8qs"
Sep 30 12:35:06 crc kubenswrapper[4672]: I0930 12:35:06.583312 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-zchdp" event={"ID":"d6349248-bcbd-486b-8143-90b66a52f017","Type":"ContainerStarted","Data":"a3f16a596b969233e544646c7f3592b3213fe027fb2cf1222ffaf830ff1295f2"}
Sep 30 12:35:06 crc kubenswrapper[4672]: I0930 12:35:06.586638 4672 generic.go:334] "Generic (PLEG): container finished" podID="477eb768-e69a-428f-970c-73f3e596a9db" containerID="6adecb4d331a89417c740b0b9e53dc20246885a1822715acd3f4ef58cf8b5d81" exitCode=0
Sep 30 12:35:06 crc kubenswrapper[4672]: I0930 12:35:06.586680 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8t79b" event={"ID":"477eb768-e69a-428f-970c-73f3e596a9db","Type":"ContainerDied","Data":"6adecb4d331a89417c740b0b9e53dc20246885a1822715acd3f4ef58cf8b5d81"}
Sep 30 12:35:06 crc kubenswrapper[4672]: I0930 12:35:06.609475 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-5478bdb765-bpwqj"]
Sep 30 12:35:06 crc kubenswrapper[4672]: W0930 12:35:06.619688 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8a1bda96_fd76_4372_bb9f_ae56e6602caf.slice/crio-060063a3094e3f3385e23831ea644fbb1b0be0bc5284829c97c90b541ab4f2a4 WatchSource:0}: Error finding container 060063a3094e3f3385e23831ea644fbb1b0be0bc5284829c97c90b541ab4f2a4: Status 404 returned error can't find the container with id 060063a3094e3f3385e23831ea644fbb1b0be0bc5284829c97c90b541ab4f2a4
Sep 30 12:35:06 crc kubenswrapper[4672]: I0930 12:35:06.714348 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-5d688f5ffc-9d8qs"]
Sep 30 12:35:06 crc kubenswrapper[4672]: I0930 12:35:06.859255 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8t79b"
Sep 30 12:35:07 crc kubenswrapper[4672]: I0930 12:35:07.008233 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/477eb768-e69a-428f-970c-73f3e596a9db-catalog-content\") pod \"477eb768-e69a-428f-970c-73f3e596a9db\" (UID: \"477eb768-e69a-428f-970c-73f3e596a9db\") "
Sep 30 12:35:07 crc kubenswrapper[4672]: I0930 12:35:07.008388 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lcxpt\" (UniqueName: \"kubernetes.io/projected/477eb768-e69a-428f-970c-73f3e596a9db-kube-api-access-lcxpt\") pod \"477eb768-e69a-428f-970c-73f3e596a9db\" (UID: \"477eb768-e69a-428f-970c-73f3e596a9db\") "
Sep 30 12:35:07 crc kubenswrapper[4672]: I0930 12:35:07.008518 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/477eb768-e69a-428f-970c-73f3e596a9db-utilities\") pod \"477eb768-e69a-428f-970c-73f3e596a9db\" (UID: \"477eb768-e69a-428f-970c-73f3e596a9db\") "
Sep 30 12:35:07 crc kubenswrapper[4672]: I0930 12:35:07.008954 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/07eaf50f-6d5e-4e3e-8c3d-1e28769bae68-memberlist\") pod \"speaker-v66pl\" (UID: \"07eaf50f-6d5e-4e3e-8c3d-1e28769bae68\") " pod="metallb-system/speaker-v66pl"
Sep 30 12:35:07 crc kubenswrapper[4672]: I0930 12:35:07.009298 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/477eb768-e69a-428f-970c-73f3e596a9db-utilities" (OuterVolumeSpecName: "utilities") pod "477eb768-e69a-428f-970c-73f3e596a9db" (UID: "477eb768-e69a-428f-970c-73f3e596a9db"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 12:35:07 crc kubenswrapper[4672]: I0930 12:35:07.009637 4672 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/477eb768-e69a-428f-970c-73f3e596a9db-utilities\") on node \"crc\" DevicePath \"\""
Sep 30 12:35:07 crc kubenswrapper[4672]: I0930 12:35:07.014351 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/477eb768-e69a-428f-970c-73f3e596a9db-kube-api-access-lcxpt" (OuterVolumeSpecName: "kube-api-access-lcxpt") pod "477eb768-e69a-428f-970c-73f3e596a9db" (UID: "477eb768-e69a-428f-970c-73f3e596a9db"). InnerVolumeSpecName "kube-api-access-lcxpt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 12:35:07 crc kubenswrapper[4672]: I0930 12:35:07.014914 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/07eaf50f-6d5e-4e3e-8c3d-1e28769bae68-memberlist\") pod \"speaker-v66pl\" (UID: \"07eaf50f-6d5e-4e3e-8c3d-1e28769bae68\") " pod="metallb-system/speaker-v66pl"
Sep 30 12:35:07 crc kubenswrapper[4672]: I0930 12:35:07.093033 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/477eb768-e69a-428f-970c-73f3e596a9db-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "477eb768-e69a-428f-970c-73f3e596a9db" (UID: "477eb768-e69a-428f-970c-73f3e596a9db"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 12:35:07 crc kubenswrapper[4672]: I0930 12:35:07.111178 4672 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/477eb768-e69a-428f-970c-73f3e596a9db-catalog-content\") on node \"crc\" DevicePath \"\""
Sep 30 12:35:07 crc kubenswrapper[4672]: I0930 12:35:07.111246 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lcxpt\" (UniqueName: \"kubernetes.io/projected/477eb768-e69a-428f-970c-73f3e596a9db-kube-api-access-lcxpt\") on node \"crc\" DevicePath \"\""
Sep 30 12:35:07 crc kubenswrapper[4672]: I0930 12:35:07.140529 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-v66pl"
Sep 30 12:35:07 crc kubenswrapper[4672]: W0930 12:35:07.157850 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod07eaf50f_6d5e_4e3e_8c3d_1e28769bae68.slice/crio-d503d7c20d002ba5b5fbf39b47da7f4945832a665f70f6aced4bd8f02dd4c419 WatchSource:0}: Error finding container d503d7c20d002ba5b5fbf39b47da7f4945832a665f70f6aced4bd8f02dd4c419: Status 404 returned error can't find the container with id d503d7c20d002ba5b5fbf39b47da7f4945832a665f70f6aced4bd8f02dd4c419
Sep 30 12:35:07 crc kubenswrapper[4672]: I0930 12:35:07.597166 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-5d688f5ffc-9d8qs" event={"ID":"56c8e0ed-01f5-4d61-84ad-78ce7c5a66d1","Type":"ContainerStarted","Data":"b71affd925a4df8c7e6ac48159b3945f0fde01675323e5789077bf1b1d059de1"}
Sep 30 12:35:07 crc kubenswrapper[4672]: I0930 12:35:07.597235 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-5d688f5ffc-9d8qs" event={"ID":"56c8e0ed-01f5-4d61-84ad-78ce7c5a66d1","Type":"ContainerStarted","Data":"8d44868d26622f9a7cb7feaa18ac702a3336a1530025a133aa7175e42435d44d"}
Sep 30 12:35:07 crc kubenswrapper[4672]: I0930 12:35:07.597255 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-5d688f5ffc-9d8qs" event={"ID":"56c8e0ed-01f5-4d61-84ad-78ce7c5a66d1","Type":"ContainerStarted","Data":"490cb9548e95751beaa14e69ca0f5ee4fd72dc4a067fcffd6ebbaf6e7fc1116e"}
Sep 30 12:35:07 crc kubenswrapper[4672]: I0930 12:35:07.597515 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-5d688f5ffc-9d8qs"
Sep 30 12:35:07 crc kubenswrapper[4672]: I0930 12:35:07.600899 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8t79b" event={"ID":"477eb768-e69a-428f-970c-73f3e596a9db","Type":"ContainerDied","Data":"5c5d136b3e15655bd2ce1a28358c2112cafe62182552bee20fdfd15ce7e9209d"}
Sep 30 12:35:07 crc kubenswrapper[4672]: I0930 12:35:07.600958 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8t79b"
Sep 30 12:35:07 crc kubenswrapper[4672]: I0930 12:35:07.600965 4672 scope.go:117] "RemoveContainer" containerID="6adecb4d331a89417c740b0b9e53dc20246885a1822715acd3f4ef58cf8b5d81"
Sep 30 12:35:07 crc kubenswrapper[4672]: I0930 12:35:07.605609 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-v66pl" event={"ID":"07eaf50f-6d5e-4e3e-8c3d-1e28769bae68","Type":"ContainerStarted","Data":"5a2535ae47c1b25e7005b970e4f87f914b2785f73b7af285136b8b30a1f8e66c"}
Sep 30 12:35:07 crc kubenswrapper[4672]: I0930 12:35:07.605667 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-v66pl" event={"ID":"07eaf50f-6d5e-4e3e-8c3d-1e28769bae68","Type":"ContainerStarted","Data":"d503d7c20d002ba5b5fbf39b47da7f4945832a665f70f6aced4bd8f02dd4c419"}
Sep 30 12:35:07 crc kubenswrapper[4672]: I0930 12:35:07.612193 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-5478bdb765-bpwqj" event={"ID":"8a1bda96-fd76-4372-bb9f-ae56e6602caf","Type":"ContainerStarted","Data":"060063a3094e3f3385e23831ea644fbb1b0be0bc5284829c97c90b541ab4f2a4"}
Sep 30 12:35:07 crc kubenswrapper[4672]: I0930 12:35:07.623062 4672 scope.go:117] "RemoveContainer" containerID="da45ace129d6ed9fae02506b44ba7617340d42e8e817bec7b1b2de9771fbd15d"
Sep 30 12:35:07 crc kubenswrapper[4672]: I0930 12:35:07.627673 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-5d688f5ffc-9d8qs" podStartSLOduration=2.627649399 podStartE2EDuration="2.627649399s" podCreationTimestamp="2025-09-30 12:35:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 12:35:07.625369221 +0000 UTC m=+798.894606867" watchObservedRunningTime="2025-09-30 12:35:07.627649399 +0000 UTC m=+798.896887045"
Sep 30 12:35:07 crc kubenswrapper[4672]: I0930 12:35:07.644130 4672 scope.go:117] "RemoveContainer" containerID="7fc40a8fc35d8e1be9bf7c0915893f403396dd2221edc1d8befc53dfb1f9dfa6"
Sep 30 12:35:07 crc kubenswrapper[4672]: I0930 12:35:07.671535 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-8t79b"]
Sep 30 12:35:07 crc kubenswrapper[4672]: I0930 12:35:07.679722 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-8t79b"]
Sep 30 12:35:08 crc kubenswrapper[4672]: I0930 12:35:08.621722 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-v66pl" event={"ID":"07eaf50f-6d5e-4e3e-8c3d-1e28769bae68","Type":"ContainerStarted","Data":"434327d2c82a7be451b7c6dc8dc1b862444c0052f435280e0205fb20611d9a0b"}
Sep 30 12:35:08 crc kubenswrapper[4672]: I0930 12:35:08.623241 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-v66pl"
Sep 30 12:35:08 crc kubenswrapper[4672]: I0930 12:35:08.666301 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-v66pl" podStartSLOduration=3.6662539499999998 podStartE2EDuration="3.66625395s" podCreationTimestamp="2025-09-30 12:35:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 12:35:08.664742871 +0000 UTC m=+799.933980517" watchObservedRunningTime="2025-09-30 12:35:08.66625395 +0000 UTC m=+799.935491596"
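The redhat-operators-8t79b entries above trace a complete pod teardown: UnmountVolume/TearDown for each volume, "Volume detached" once the mount point is gone, RemoveContainer for the dead containers, then SyncLoop DELETE and REMOVE as the API object disappears. The "Cleaned up orphaned pod volumes dir" entry just below confirms the last on-disk remnant, /var/lib/kubelet/pods/<podUID>/volumes, is removed. A small sketch (hypothetical node-side check, run as root on the node) verifying that directory is gone:

    package main

    import (
        "fmt"
        "os"
        "path/filepath"
    )

    func main() {
        // Pod UID copied from the log entries above.
        dir := filepath.Join("/var/lib/kubelet/pods",
            "477eb768-e69a-428f-970c-73f3e596a9db", "volumes")
        if _, err := os.Stat(dir); os.IsNotExist(err) {
            fmt.Println("volumes dir cleaned up:", dir)
        } else {
            fmt.Println("still present (or stat error):", dir, err)
        }
    }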
Sep 30 12:35:09 crc kubenswrapper[4672]: I0930 12:35:09.428630 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="477eb768-e69a-428f-970c-73f3e596a9db" path="/var/lib/kubelet/pods/477eb768-e69a-428f-970c-73f3e596a9db/volumes"
Sep 30 12:35:14 crc kubenswrapper[4672]: I0930 12:35:14.683477 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-5478bdb765-bpwqj" event={"ID":"8a1bda96-fd76-4372-bb9f-ae56e6602caf","Type":"ContainerStarted","Data":"a0640d8305c74b91f32607229e13a58e2cd91b6a0ccf65455503e0fe0a8f143f"}
Sep 30 12:35:14 crc kubenswrapper[4672]: I0930 12:35:14.684008 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-5478bdb765-bpwqj"
Sep 30 12:35:14 crc kubenswrapper[4672]: I0930 12:35:14.687224 4672 generic.go:334] "Generic (PLEG): container finished" podID="d6349248-bcbd-486b-8143-90b66a52f017" containerID="8d0674292c37ba54c21337c19b768ffd3ddb30bc733632cb52e4b21a79fc1723" exitCode=0
Sep 30 12:35:14 crc kubenswrapper[4672]: I0930 12:35:14.687325 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-zchdp" event={"ID":"d6349248-bcbd-486b-8143-90b66a52f017","Type":"ContainerDied","Data":"8d0674292c37ba54c21337c19b768ffd3ddb30bc733632cb52e4b21a79fc1723"}
Sep 30 12:35:14 crc kubenswrapper[4672]: I0930 12:35:14.700062 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-5478bdb765-bpwqj" podStartSLOduration=2.727150649 podStartE2EDuration="9.700043964s" podCreationTimestamp="2025-09-30 12:35:05 +0000 UTC" firstStartedPulling="2025-09-30 12:35:06.622070572 +0000 UTC m=+797.891308218" lastFinishedPulling="2025-09-30 12:35:13.594963887 +0000 UTC m=+804.864201533" observedRunningTime="2025-09-30 12:35:14.697234562 +0000 UTC m=+805.966472208" watchObservedRunningTime="2025-09-30 12:35:14.700043964 +0000 UTC m=+805.969281610"
Sep 30 12:35:15 crc kubenswrapper[4672]: I0930 12:35:15.695112 4672 generic.go:334] "Generic (PLEG): container finished" podID="d6349248-bcbd-486b-8143-90b66a52f017" containerID="c1be2ba872591acd25d56eea04d901cfbb0a06cfb92a3bab04c6c44e520e3a29" exitCode=0
Sep 30 12:35:15 crc kubenswrapper[4672]: I0930 12:35:15.695243 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-zchdp" event={"ID":"d6349248-bcbd-486b-8143-90b66a52f017","Type":"ContainerDied","Data":"c1be2ba872591acd25d56eea04d901cfbb0a06cfb92a3bab04c6c44e520e3a29"}
Sep 30 12:35:15 crc kubenswrapper[4672]: I0930 12:35:15.981016 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-d7pqb"]
Sep 30 12:35:15 crc kubenswrapper[4672]: E0930 12:35:15.981651 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="477eb768-e69a-428f-970c-73f3e596a9db" containerName="extract-content"
Sep 30 12:35:15 crc kubenswrapper[4672]: I0930 12:35:15.981669 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="477eb768-e69a-428f-970c-73f3e596a9db" containerName="extract-content"
Sep 30 12:35:15 crc kubenswrapper[4672]: E0930 12:35:15.981685 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="477eb768-e69a-428f-970c-73f3e596a9db" containerName="extract-utilities"
Sep 30 12:35:15 crc kubenswrapper[4672]: I0930 12:35:15.981694 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="477eb768-e69a-428f-970c-73f3e596a9db" containerName="extract-utilities"
Sep 30 12:35:15 crc kubenswrapper[4672]: E0930 12:35:15.981718 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="477eb768-e69a-428f-970c-73f3e596a9db" containerName="registry-server"
Sep 30 12:35:15 crc kubenswrapper[4672]: I0930 12:35:15.981730 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="477eb768-e69a-428f-970c-73f3e596a9db" containerName="registry-server"
Sep 30 12:35:15 crc kubenswrapper[4672]: I0930 12:35:15.981875 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="477eb768-e69a-428f-970c-73f3e596a9db" containerName="registry-server"
Sep 30 12:35:15 crc kubenswrapper[4672]: I0930 12:35:15.982936 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-d7pqb"
Sep 30 12:35:15 crc kubenswrapper[4672]: I0930 12:35:15.991321 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-d7pqb"]
Sep 30 12:35:16 crc kubenswrapper[4672]: I0930 12:35:16.093340 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b996c085-e67e-46e5-8a95-e1f0593c95c9-catalog-content\") pod \"certified-operators-d7pqb\" (UID: \"b996c085-e67e-46e5-8a95-e1f0593c95c9\") " pod="openshift-marketplace/certified-operators-d7pqb"
Sep 30 12:35:16 crc kubenswrapper[4672]: I0930 12:35:16.093394 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b996c085-e67e-46e5-8a95-e1f0593c95c9-utilities\") pod \"certified-operators-d7pqb\" (UID: \"b996c085-e67e-46e5-8a95-e1f0593c95c9\") " pod="openshift-marketplace/certified-operators-d7pqb"
Sep 30 12:35:16 crc kubenswrapper[4672]: I0930 12:35:16.093973 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jq77c\" (UniqueName: \"kubernetes.io/projected/b996c085-e67e-46e5-8a95-e1f0593c95c9-kube-api-access-jq77c\") pod \"certified-operators-d7pqb\" (UID: \"b996c085-e67e-46e5-8a95-e1f0593c95c9\") " pod="openshift-marketplace/certified-operators-d7pqb"
Sep 30 12:35:16 crc kubenswrapper[4672]: I0930 12:35:16.194957 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jq77c\" (UniqueName: \"kubernetes.io/projected/b996c085-e67e-46e5-8a95-e1f0593c95c9-kube-api-access-jq77c\") pod \"certified-operators-d7pqb\" (UID: \"b996c085-e67e-46e5-8a95-e1f0593c95c9\") " pod="openshift-marketplace/certified-operators-d7pqb"
Sep 30 12:35:16 crc kubenswrapper[4672]: I0930 12:35:16.195037 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b996c085-e67e-46e5-8a95-e1f0593c95c9-catalog-content\") pod \"certified-operators-d7pqb\" (UID: \"b996c085-e67e-46e5-8a95-e1f0593c95c9\") " pod="openshift-marketplace/certified-operators-d7pqb"
Sep 30 12:35:16 crc kubenswrapper[4672]: I0930 12:35:16.195060 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b996c085-e67e-46e5-8a95-e1f0593c95c9-utilities\") pod \"certified-operators-d7pqb\" (UID: \"b996c085-e67e-46e5-8a95-e1f0593c95c9\") " pod="openshift-marketplace/certified-operators-d7pqb"
Sep 30 12:35:16 crc kubenswrapper[4672]: I0930 12:35:16.195512 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b996c085-e67e-46e5-8a95-e1f0593c95c9-catalog-content\") pod \"certified-operators-d7pqb\" (UID: \"b996c085-e67e-46e5-8a95-e1f0593c95c9\") " pod="openshift-marketplace/certified-operators-d7pqb"
Sep 30 12:35:16 crc kubenswrapper[4672]: I0930 12:35:16.195580 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b996c085-e67e-46e5-8a95-e1f0593c95c9-utilities\") pod \"certified-operators-d7pqb\" (UID: \"b996c085-e67e-46e5-8a95-e1f0593c95c9\") " pod="openshift-marketplace/certified-operators-d7pqb"
Sep 30 12:35:16 crc kubenswrapper[4672]: I0930 12:35:16.217085 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jq77c\" (UniqueName: \"kubernetes.io/projected/b996c085-e67e-46e5-8a95-e1f0593c95c9-kube-api-access-jq77c\") pod \"certified-operators-d7pqb\" (UID: \"b996c085-e67e-46e5-8a95-e1f0593c95c9\") " pod="openshift-marketplace/certified-operators-d7pqb"
Sep 30 12:35:16 crc kubenswrapper[4672]: I0930 12:35:16.252983 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-5d688f5ffc-9d8qs"
Sep 30 12:35:16 crc kubenswrapper[4672]: I0930 12:35:16.302720 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-d7pqb"
Sep 30 12:35:16 crc kubenswrapper[4672]: I0930 12:35:16.620100 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-d7pqb"]
Sep 30 12:35:16 crc kubenswrapper[4672]: I0930 12:35:16.703730 4672 generic.go:334] "Generic (PLEG): container finished" podID="d6349248-bcbd-486b-8143-90b66a52f017" containerID="da5452742f9f4ef297c358e7ba225e07445b6cc95b3845a2b4abb9a28c12616d" exitCode=0
Sep 30 12:35:16 crc kubenswrapper[4672]: I0930 12:35:16.703837 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-zchdp" event={"ID":"d6349248-bcbd-486b-8143-90b66a52f017","Type":"ContainerDied","Data":"da5452742f9f4ef297c358e7ba225e07445b6cc95b3845a2b4abb9a28c12616d"}
Sep 30 12:35:16 crc kubenswrapper[4672]: I0930 12:35:16.705937 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d7pqb" event={"ID":"b996c085-e67e-46e5-8a95-e1f0593c95c9","Type":"ContainerStarted","Data":"e530cb07c601387e746194c96f8cdcfe16c0ff527afcbc7d0a76b4ad86442e04"}
Sep 30 12:35:17 crc kubenswrapper[4672]: I0930 12:35:17.144814 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-v66pl"
Sep 30 12:35:17 crc kubenswrapper[4672]: I0930 12:35:17.722294 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-zchdp" event={"ID":"d6349248-bcbd-486b-8143-90b66a52f017","Type":"ContainerStarted","Data":"ac1c0c746fdb55c58ba9f9e6444ac9062699edee9b4b8d80fab3151c9d2c4f26"}
Sep 30 12:35:17 crc kubenswrapper[4672]: I0930 12:35:17.722696 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-zchdp" event={"ID":"d6349248-bcbd-486b-8143-90b66a52f017","Type":"ContainerStarted","Data":"1147561c0d7be5c4b96659fc0966b6615f2c75902bca34c901b730a0c968ddff"}
Sep 30 12:35:17 crc kubenswrapper[4672]: I0930 12:35:17.722717 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-zchdp" event={"ID":"d6349248-bcbd-486b-8143-90b66a52f017","Type":"ContainerStarted","Data":"ea2c7f93b43f32655a219b6375c7e86513fad82784438c24e66129782fa2dd04"}
Sep 30 12:35:17 crc kubenswrapper[4672]: I0930 12:35:17.724022 4672 generic.go:334] "Generic (PLEG): container finished" podID="b996c085-e67e-46e5-8a95-e1f0593c95c9" containerID="1450b132ec1edd45581b2bf9f7b680df69696a60cff00d1ed84c746593ee0998" exitCode=0
Sep 30 12:35:17 crc kubenswrapper[4672]: I0930 12:35:17.724060 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d7pqb" event={"ID":"b996c085-e67e-46e5-8a95-e1f0593c95c9","Type":"ContainerDied","Data":"1450b132ec1edd45581b2bf9f7b680df69696a60cff00d1ed84c746593ee0998"}
Sep 30 12:35:18 crc kubenswrapper[4672]: I0930 12:35:18.736705 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-zchdp" event={"ID":"d6349248-bcbd-486b-8143-90b66a52f017","Type":"ContainerStarted","Data":"3ed98a8d3df0841252ebda04144d1d4dc327a342b6a4b15c12b85cbef2be8517"}
Sep 30 12:35:18 crc kubenswrapper[4672]: I0930 12:35:18.737091 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-zchdp"
Sep 30 12:35:18 crc kubenswrapper[4672]: I0930 12:35:18.737107 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-zchdp" event={"ID":"d6349248-bcbd-486b-8143-90b66a52f017","Type":"ContainerStarted","Data":"06cf284e209435a71aff281e44c51b0653a4a613cfeef006637a855425e331b2"}
Sep 30 12:35:18 crc kubenswrapper[4672]: I0930 12:35:18.737120 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-zchdp" event={"ID":"d6349248-bcbd-486b-8143-90b66a52f017","Type":"ContainerStarted","Data":"c64d7d6147a3055b28a7154a26666073f838d2a5db5a324250dedb44233fd3fa"}
Sep 30 12:35:18 crc kubenswrapper[4672]: I0930 12:35:18.739126 4672 generic.go:334] "Generic (PLEG): container finished" podID="b996c085-e67e-46e5-8a95-e1f0593c95c9" containerID="815615471b897e6ab0b76a88e5838ebba2e671e37962f0e509573799f927f092" exitCode=0
Sep 30 12:35:18 crc kubenswrapper[4672]: I0930 12:35:18.739165 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d7pqb" event={"ID":"b996c085-e67e-46e5-8a95-e1f0593c95c9","Type":"ContainerDied","Data":"815615471b897e6ab0b76a88e5838ebba2e671e37962f0e509573799f927f092"}
Sep 30 12:35:18 crc kubenswrapper[4672]: I0930 12:35:18.764809 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-zchdp" podStartSLOduration=6.377384032 podStartE2EDuration="13.764771797s" podCreationTimestamp="2025-09-30 12:35:05 +0000 UTC" firstStartedPulling="2025-09-30 12:35:06.229593534 +0000 UTC m=+797.498831180" lastFinishedPulling="2025-09-30 12:35:13.616981299 +0000 UTC m=+804.886218945" observedRunningTime="2025-09-30 12:35:18.758020965 +0000 UTC m=+810.027258621" watchObservedRunningTime="2025-09-30 12:35:18.764771797 +0000 UTC m=+810.034009443"
Sep 30 12:35:19 crc kubenswrapper[4672]: I0930 12:35:19.758735 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d7pqb" event={"ID":"b996c085-e67e-46e5-8a95-e1f0593c95c9","Type":"ContainerStarted","Data":"14593801613d30e008432735410c14abde99cd5e5bac3c74fef5eb1c22d12424"}
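The ContainerDied (exitCode=0) / ContainerStarted interleaving above is init-style sequencing: frr-k8s-zchdp runs its setup containers (8d0674…, c1be2ba…, da54527…) to completion one at a time before its long-running containers come up, and certified-operators-d7pqb likewise runs its extract containers (1450b13…, 8156154…) before registry-server (14593801…). A minimal client-go watch sketch, offered as a hypothetical diagnostic, that surfaces the same transitions:

    package main

    import (
        "context"
        "fmt"

        corev1 "k8s.io/api/core/v1"
        metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
        "k8s.io/client-go/kubernetes"
        "k8s.io/client-go/tools/clientcmd"
    )

    func main() {
        cfg, err := clientcmd.BuildConfigFromFlags("", clientcmd.RecommendedHomeFile)
        if err != nil {
            panic(err)
        }
        cs := kubernetes.NewForConfigOrDie(cfg)
        w, err := cs.CoreV1().Pods("metallb-system").Watch(context.TODO(),
            metav1.ListOptions{FieldSelector: "metadata.name=frr-k8s-zchdp"})
        if err != nil {
            panic(err)
        }
        for ev := range w.ResultChan() {
            pod, ok := ev.Object.(*corev1.Pod)
            if !ok {
                continue
            }
            // Init containers terminate one by one (exit code 0) before the
            // main containers report Running.
            for _, st := range pod.Status.InitContainerStatuses {
                if st.State.Terminated != nil {
                    fmt.Printf("init %s exited %d\n", st.Name, st.State.Terminated.ExitCode)
                }
            }
        }
    }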
Sep 30 12:35:19 crc kubenswrapper[4672]: I0930 12:35:19.779968 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-d7pqb" podStartSLOduration=3.3206315809999998 podStartE2EDuration="4.7799475s" podCreationTimestamp="2025-09-30 12:35:15 +0000 UTC" firstStartedPulling="2025-09-30 12:35:17.726877575 +0000 UTC m=+808.996115221" lastFinishedPulling="2025-09-30 12:35:19.186193494 +0000 UTC m=+810.455431140" observedRunningTime="2025-09-30 12:35:19.775531947 +0000 UTC m=+811.044769603" watchObservedRunningTime="2025-09-30 12:35:19.7799475 +0000 UTC m=+811.049185146"
Sep 30 12:35:20 crc kubenswrapper[4672]: I0930 12:35:20.527016 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-zchdp"
Sep 30 12:35:20 crc kubenswrapper[4672]: I0930 12:35:20.566611 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-zchdp"
Sep 30 12:35:23 crc kubenswrapper[4672]: I0930 12:35:23.584926 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-6rjvm"]
Sep 30 12:35:23 crc kubenswrapper[4672]: I0930 12:35:23.586710 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-6rjvm"
Sep 30 12:35:23 crc kubenswrapper[4672]: I0930 12:35:23.589004 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt"
Sep 30 12:35:23 crc kubenswrapper[4672]: I0930 12:35:23.590899 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-fq7rk"
Sep 30 12:35:23 crc kubenswrapper[4672]: I0930 12:35:23.591165 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt"
Sep 30 12:35:23 crc kubenswrapper[4672]: I0930 12:35:23.598721 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-6rjvm"]
Sep 30 12:35:23 crc kubenswrapper[4672]: I0930 12:35:23.610967 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-26jcc\" (UniqueName: \"kubernetes.io/projected/63d18a29-8d30-437c-af1b-f9cd9fa99b6b-kube-api-access-26jcc\") pod \"openstack-operator-index-6rjvm\" (UID: \"63d18a29-8d30-437c-af1b-f9cd9fa99b6b\") " pod="openstack-operators/openstack-operator-index-6rjvm"
Sep 30 12:35:23 crc kubenswrapper[4672]: I0930 12:35:23.712008 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-26jcc\" (UniqueName: \"kubernetes.io/projected/63d18a29-8d30-437c-af1b-f9cd9fa99b6b-kube-api-access-26jcc\") pod \"openstack-operator-index-6rjvm\" (UID: \"63d18a29-8d30-437c-af1b-f9cd9fa99b6b\") " pod="openstack-operators/openstack-operator-index-6rjvm"
Sep 30 12:35:23 crc kubenswrapper[4672]: I0930 12:35:23.731966 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-26jcc\" (UniqueName: \"kubernetes.io/projected/63d18a29-8d30-437c-af1b-f9cd9fa99b6b-kube-api-access-26jcc\") pod \"openstack-operator-index-6rjvm\" (UID: \"63d18a29-8d30-437c-af1b-f9cd9fa99b6b\") " pod="openstack-operators/openstack-operator-index-6rjvm"
Sep 30 12:35:23 crc kubenswrapper[4672]: I0930 12:35:23.913095 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-6rjvm"
Sep 30 12:35:24 crc kubenswrapper[4672]: I0930 12:35:24.361167 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-6rjvm"]
Sep 30 12:35:24 crc kubenswrapper[4672]: I0930 12:35:24.739742 4672 patch_prober.go:28] interesting pod/machine-config-daemon-dpqrd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Sep 30 12:35:24 crc kubenswrapper[4672]: I0930 12:35:24.739827 4672 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Sep 30 12:35:24 crc kubenswrapper[4672]: I0930 12:35:24.739896 4672 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd"
Sep 30 12:35:24 crc kubenswrapper[4672]: I0930 12:35:24.740702 4672 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f277ec5275d8b860ce8dcb4c0f3ecdd12eaede7bb4ef094f520236f925a1f1a1"} pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Sep 30 12:35:24 crc kubenswrapper[4672]: I0930 12:35:24.740825 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" containerName="machine-config-daemon" containerID="cri-o://f277ec5275d8b860ce8dcb4c0f3ecdd12eaede7bb4ef094f520236f925a1f1a1" gracePeriod=600
Sep 30 12:35:24 crc kubenswrapper[4672]: I0930 12:35:24.798609 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-6rjvm" event={"ID":"63d18a29-8d30-437c-af1b-f9cd9fa99b6b","Type":"ContainerStarted","Data":"234e7746f1b797b9f5025b4865d740f17efa7527d44e8c472fa80e808f376b5b"}
Sep 30 12:35:25 crc kubenswrapper[4672]: I0930 12:35:25.828952 4672 generic.go:334] "Generic (PLEG): container finished" podID="95794952-d817-48f2-8956-f7a310f8d1d9" containerID="f277ec5275d8b860ce8dcb4c0f3ecdd12eaede7bb4ef094f520236f925a1f1a1" exitCode=0
Sep 30 12:35:25 crc kubenswrapper[4672]: I0930 12:35:25.829116 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" event={"ID":"95794952-d817-48f2-8956-f7a310f8d1d9","Type":"ContainerDied","Data":"f277ec5275d8b860ce8dcb4c0f3ecdd12eaede7bb4ef094f520236f925a1f1a1"}
Sep 30 12:35:25 crc kubenswrapper[4672]: I0930 12:35:25.829388 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" event={"ID":"95794952-d817-48f2-8956-f7a310f8d1d9","Type":"ContainerStarted","Data":"1712933e94420da648f449968b73ced3cfbd2790d2d92518ca79624030de9f70"}
Sep 30 12:35:25 crc kubenswrapper[4672]: I0930 12:35:25.829415 4672 scope.go:117] "RemoveContainer" containerID="24ffe413e2febc5a1c43c44675bb37e97907e267680c76a2a11205a06222f9b4"
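The machine-config-daemon block above is a complete liveness-restart cycle: the HTTP GET probe to 127.0.0.1:8798/health fails with connection refused, the kubelet marks the container unhealthy, kills it with the pod's 600s grace period, and PLEG then reports the replacement container (1712933e…) starting. A standalone sketch of the same style of HTTP health check (illustrative; the kubelet's prober also applies the probe spec's thresholds, headers, and timeout):

    package main

    import (
        "fmt"
        "net/http"
        "time"
    )

    func main() {
        client := &http.Client{Timeout: 1 * time.Second} // probes use short timeouts
        resp, err := client.Get("http://127.0.0.1:8798/health")
        if err != nil {
            // The "connect: connection refused" seen in the log surfaces here.
            fmt.Println("probe failure:", err)
            return
        }
        defer resp.Body.Close()
        // HTTP probes count any status in [200, 400) as success.
        fmt.Println("probe status:", resp.StatusCode)
    }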
Sep 30 12:35:26 crc kubenswrapper[4672]: I0930 12:35:26.141031 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-5478bdb765-bpwqj"
Sep 30 12:35:26 crc kubenswrapper[4672]: I0930 12:35:26.303852 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-d7pqb"
Sep 30 12:35:26 crc kubenswrapper[4672]: I0930 12:35:26.303985 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-d7pqb"
Sep 30 12:35:26 crc kubenswrapper[4672]: I0930 12:35:26.360249 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-d7pqb"
Sep 30 12:35:26 crc kubenswrapper[4672]: I0930 12:35:26.878092 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-d7pqb"
Sep 30 12:35:27 crc kubenswrapper[4672]: I0930 12:35:27.845985 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-6rjvm" event={"ID":"63d18a29-8d30-437c-af1b-f9cd9fa99b6b","Type":"ContainerStarted","Data":"586a2c1afe38c5219912a569b245fd5cb282912cc3b1b670ca8b623d58c353c2"}
Sep 30 12:35:27 crc kubenswrapper[4672]: I0930 12:35:27.862764 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-6rjvm" podStartSLOduration=2.534956699 podStartE2EDuration="4.862740166s" podCreationTimestamp="2025-09-30 12:35:23 +0000 UTC" firstStartedPulling="2025-09-30 12:35:24.369631213 +0000 UTC m=+815.638868879" lastFinishedPulling="2025-09-30 12:35:26.69741469 +0000 UTC m=+817.966652346" observedRunningTime="2025-09-30 12:35:27.860835067 +0000 UTC m=+819.130072723" watchObservedRunningTime="2025-09-30 12:35:27.862740166 +0000 UTC m=+819.131977822"
Sep 30 12:35:28 crc kubenswrapper[4672]: I0930 12:35:28.781840 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-cqjrd"]
Sep 30 12:35:28 crc kubenswrapper[4672]: I0930 12:35:28.784434 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cqjrd"
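The startup/readiness flips for certified-operators-d7pqb above are typical for OLM catalog pods: the registry-server container exposes a gRPC health endpoint (conventionally on port 50051, an assumption here) that the probes query, and "startup unhealthy" just means the first check fired before the registry finished loading its index. A minimal gRPC health-check sketch (hypothetical address and port, assuming the standard grpc.health.v1 service):

    package main

    import (
        "context"
        "fmt"
        "time"

        "google.golang.org/grpc"
        "google.golang.org/grpc/credentials/insecure"
        healthpb "google.golang.org/grpc/health/grpc_health_v1"
    )

    func main() {
        // Assumed endpoint: a port-forwarded registry-server pod.
        conn, err := grpc.Dial("localhost:50051",
            grpc.WithTransportCredentials(insecure.NewCredentials()))
        if err != nil {
            panic(err)
        }
        defer conn.Close()
        ctx, cancel := context.WithTimeout(context.Background(), time.Second)
        defer cancel()
        resp, err := healthpb.NewHealthClient(conn).Check(ctx, &healthpb.HealthCheckRequest{})
        if err != nil {
            fmt.Println("probe failure:", err)
            return
        }
        fmt.Println("status:", resp.Status) // SERVING once the index is loaded
    }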
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cqjrd" Sep 30 12:35:28 crc kubenswrapper[4672]: I0930 12:35:28.807724 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-cqjrd"] Sep 30 12:35:28 crc kubenswrapper[4672]: I0930 12:35:28.912284 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c17cb96a-dd42-49e2-9994-774451dabe25-catalog-content\") pod \"redhat-marketplace-cqjrd\" (UID: \"c17cb96a-dd42-49e2-9994-774451dabe25\") " pod="openshift-marketplace/redhat-marketplace-cqjrd" Sep 30 12:35:28 crc kubenswrapper[4672]: I0930 12:35:28.912620 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5kvt9\" (UniqueName: \"kubernetes.io/projected/c17cb96a-dd42-49e2-9994-774451dabe25-kube-api-access-5kvt9\") pod \"redhat-marketplace-cqjrd\" (UID: \"c17cb96a-dd42-49e2-9994-774451dabe25\") " pod="openshift-marketplace/redhat-marketplace-cqjrd" Sep 30 12:35:28 crc kubenswrapper[4672]: I0930 12:35:28.912862 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c17cb96a-dd42-49e2-9994-774451dabe25-utilities\") pod \"redhat-marketplace-cqjrd\" (UID: \"c17cb96a-dd42-49e2-9994-774451dabe25\") " pod="openshift-marketplace/redhat-marketplace-cqjrd" Sep 30 12:35:29 crc kubenswrapper[4672]: I0930 12:35:29.014857 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c17cb96a-dd42-49e2-9994-774451dabe25-catalog-content\") pod \"redhat-marketplace-cqjrd\" (UID: \"c17cb96a-dd42-49e2-9994-774451dabe25\") " pod="openshift-marketplace/redhat-marketplace-cqjrd" Sep 30 12:35:29 crc kubenswrapper[4672]: I0930 12:35:29.014963 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5kvt9\" (UniqueName: \"kubernetes.io/projected/c17cb96a-dd42-49e2-9994-774451dabe25-kube-api-access-5kvt9\") pod \"redhat-marketplace-cqjrd\" (UID: \"c17cb96a-dd42-49e2-9994-774451dabe25\") " pod="openshift-marketplace/redhat-marketplace-cqjrd" Sep 30 12:35:29 crc kubenswrapper[4672]: I0930 12:35:29.015014 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c17cb96a-dd42-49e2-9994-774451dabe25-utilities\") pod \"redhat-marketplace-cqjrd\" (UID: \"c17cb96a-dd42-49e2-9994-774451dabe25\") " pod="openshift-marketplace/redhat-marketplace-cqjrd" Sep 30 12:35:29 crc kubenswrapper[4672]: I0930 12:35:29.015616 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c17cb96a-dd42-49e2-9994-774451dabe25-catalog-content\") pod \"redhat-marketplace-cqjrd\" (UID: \"c17cb96a-dd42-49e2-9994-774451dabe25\") " pod="openshift-marketplace/redhat-marketplace-cqjrd" Sep 30 12:35:29 crc kubenswrapper[4672]: I0930 12:35:29.015665 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c17cb96a-dd42-49e2-9994-774451dabe25-utilities\") pod \"redhat-marketplace-cqjrd\" (UID: \"c17cb96a-dd42-49e2-9994-774451dabe25\") " pod="openshift-marketplace/redhat-marketplace-cqjrd" Sep 30 12:35:29 crc kubenswrapper[4672]: I0930 12:35:29.040683 4672 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-5kvt9\" (UniqueName: \"kubernetes.io/projected/c17cb96a-dd42-49e2-9994-774451dabe25-kube-api-access-5kvt9\") pod \"redhat-marketplace-cqjrd\" (UID: \"c17cb96a-dd42-49e2-9994-774451dabe25\") " pod="openshift-marketplace/redhat-marketplace-cqjrd" Sep 30 12:35:29 crc kubenswrapper[4672]: I0930 12:35:29.137139 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cqjrd" Sep 30 12:35:29 crc kubenswrapper[4672]: I0930 12:35:29.603410 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-cqjrd"] Sep 30 12:35:29 crc kubenswrapper[4672]: W0930 12:35:29.609647 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc17cb96a_dd42_49e2_9994_774451dabe25.slice/crio-298d240cb743f9e9542939a1baa62b54a0c01c454443aafbe1f6ea258efa64ed WatchSource:0}: Error finding container 298d240cb743f9e9542939a1baa62b54a0c01c454443aafbe1f6ea258efa64ed: Status 404 returned error can't find the container with id 298d240cb743f9e9542939a1baa62b54a0c01c454443aafbe1f6ea258efa64ed Sep 30 12:35:29 crc kubenswrapper[4672]: I0930 12:35:29.861441 4672 generic.go:334] "Generic (PLEG): container finished" podID="c17cb96a-dd42-49e2-9994-774451dabe25" containerID="96bd1a653181f48681ce7f60b80e638bdd8ba9a791a464b0beecc63990cee28d" exitCode=0 Sep 30 12:35:29 crc kubenswrapper[4672]: I0930 12:35:29.861484 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cqjrd" event={"ID":"c17cb96a-dd42-49e2-9994-774451dabe25","Type":"ContainerDied","Data":"96bd1a653181f48681ce7f60b80e638bdd8ba9a791a464b0beecc63990cee28d"} Sep 30 12:35:29 crc kubenswrapper[4672]: I0930 12:35:29.861510 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cqjrd" event={"ID":"c17cb96a-dd42-49e2-9994-774451dabe25","Type":"ContainerStarted","Data":"298d240cb743f9e9542939a1baa62b54a0c01c454443aafbe1f6ea258efa64ed"} Sep 30 12:35:30 crc kubenswrapper[4672]: I0930 12:35:30.870085 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cqjrd" event={"ID":"c17cb96a-dd42-49e2-9994-774451dabe25","Type":"ContainerStarted","Data":"e7c9e3304ce51b65bcffa17fdd8d4a34aec7f10d2b500969ee1209684b42b5e4"} Sep 30 12:35:31 crc kubenswrapper[4672]: I0930 12:35:31.880789 4672 generic.go:334] "Generic (PLEG): container finished" podID="c17cb96a-dd42-49e2-9994-774451dabe25" containerID="e7c9e3304ce51b65bcffa17fdd8d4a34aec7f10d2b500969ee1209684b42b5e4" exitCode=0 Sep 30 12:35:31 crc kubenswrapper[4672]: I0930 12:35:31.880865 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cqjrd" event={"ID":"c17cb96a-dd42-49e2-9994-774451dabe25","Type":"ContainerDied","Data":"e7c9e3304ce51b65bcffa17fdd8d4a34aec7f10d2b500969ee1209684b42b5e4"} Sep 30 12:35:32 crc kubenswrapper[4672]: I0930 12:35:32.376026 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-d7pqb"] Sep 30 12:35:32 crc kubenswrapper[4672]: I0930 12:35:32.376645 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-d7pqb" podUID="b996c085-e67e-46e5-8a95-e1f0593c95c9" containerName="registry-server" containerID="cri-o://14593801613d30e008432735410c14abde99cd5e5bac3c74fef5eb1c22d12424" gracePeriod=2 Sep 30 12:35:32 crc 
Sep 30 12:35:32 crc kubenswrapper[4672]: I0930 12:35:32.888558 4672 generic.go:334] "Generic (PLEG): container finished" podID="b996c085-e67e-46e5-8a95-e1f0593c95c9" containerID="14593801613d30e008432735410c14abde99cd5e5bac3c74fef5eb1c22d12424" exitCode=0
Sep 30 12:35:32 crc kubenswrapper[4672]: I0930 12:35:32.888617 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d7pqb" event={"ID":"b996c085-e67e-46e5-8a95-e1f0593c95c9","Type":"ContainerDied","Data":"14593801613d30e008432735410c14abde99cd5e5bac3c74fef5eb1c22d12424"}
Sep 30 12:35:33 crc kubenswrapper[4672]: I0930 12:35:33.446842 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-d7pqb"
Sep 30 12:35:33 crc kubenswrapper[4672]: I0930 12:35:33.616149 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b996c085-e67e-46e5-8a95-e1f0593c95c9-catalog-content\") pod \"b996c085-e67e-46e5-8a95-e1f0593c95c9\" (UID: \"b996c085-e67e-46e5-8a95-e1f0593c95c9\") "
Sep 30 12:35:33 crc kubenswrapper[4672]: I0930 12:35:33.616666 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jq77c\" (UniqueName: \"kubernetes.io/projected/b996c085-e67e-46e5-8a95-e1f0593c95c9-kube-api-access-jq77c\") pod \"b996c085-e67e-46e5-8a95-e1f0593c95c9\" (UID: \"b996c085-e67e-46e5-8a95-e1f0593c95c9\") "
Sep 30 12:35:33 crc kubenswrapper[4672]: I0930 12:35:33.616744 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b996c085-e67e-46e5-8a95-e1f0593c95c9-utilities\") pod \"b996c085-e67e-46e5-8a95-e1f0593c95c9\" (UID: \"b996c085-e67e-46e5-8a95-e1f0593c95c9\") "
Sep 30 12:35:33 crc kubenswrapper[4672]: I0930 12:35:33.617457 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b996c085-e67e-46e5-8a95-e1f0593c95c9-utilities" (OuterVolumeSpecName: "utilities") pod "b996c085-e67e-46e5-8a95-e1f0593c95c9" (UID: "b996c085-e67e-46e5-8a95-e1f0593c95c9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 12:35:33 crc kubenswrapper[4672]: I0930 12:35:33.623466 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b996c085-e67e-46e5-8a95-e1f0593c95c9-kube-api-access-jq77c" (OuterVolumeSpecName: "kube-api-access-jq77c") pod "b996c085-e67e-46e5-8a95-e1f0593c95c9" (UID: "b996c085-e67e-46e5-8a95-e1f0593c95c9"). InnerVolumeSpecName "kube-api-access-jq77c". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 12:35:33 crc kubenswrapper[4672]: I0930 12:35:33.661830 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b996c085-e67e-46e5-8a95-e1f0593c95c9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b996c085-e67e-46e5-8a95-e1f0593c95c9" (UID: "b996c085-e67e-46e5-8a95-e1f0593c95c9"). InnerVolumeSpecName "catalog-content".
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 12:35:33 crc kubenswrapper[4672]: I0930 12:35:33.718632 4672 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b996c085-e67e-46e5-8a95-e1f0593c95c9-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 12:35:33 crc kubenswrapper[4672]: I0930 12:35:33.718668 4672 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b996c085-e67e-46e5-8a95-e1f0593c95c9-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 12:35:33 crc kubenswrapper[4672]: I0930 12:35:33.718690 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jq77c\" (UniqueName: \"kubernetes.io/projected/b996c085-e67e-46e5-8a95-e1f0593c95c9-kube-api-access-jq77c\") on node \"crc\" DevicePath \"\"" Sep 30 12:35:33 crc kubenswrapper[4672]: I0930 12:35:33.896994 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cqjrd" event={"ID":"c17cb96a-dd42-49e2-9994-774451dabe25","Type":"ContainerStarted","Data":"d170a0f9d1ea84c09ab404997a455866dddd278263197d8ccc80427671dbcf69"} Sep 30 12:35:33 crc kubenswrapper[4672]: I0930 12:35:33.900353 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d7pqb" event={"ID":"b996c085-e67e-46e5-8a95-e1f0593c95c9","Type":"ContainerDied","Data":"e530cb07c601387e746194c96f8cdcfe16c0ff527afcbc7d0a76b4ad86442e04"} Sep 30 12:35:33 crc kubenswrapper[4672]: I0930 12:35:33.900421 4672 scope.go:117] "RemoveContainer" containerID="14593801613d30e008432735410c14abde99cd5e5bac3c74fef5eb1c22d12424" Sep 30 12:35:33 crc kubenswrapper[4672]: I0930 12:35:33.900591 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-d7pqb" Sep 30 12:35:33 crc kubenswrapper[4672]: I0930 12:35:33.915307 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-6rjvm" Sep 30 12:35:33 crc kubenswrapper[4672]: I0930 12:35:33.916070 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-6rjvm" Sep 30 12:35:33 crc kubenswrapper[4672]: I0930 12:35:33.924440 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-cqjrd" podStartSLOduration=2.623998948 podStartE2EDuration="5.924422572s" podCreationTimestamp="2025-09-30 12:35:28 +0000 UTC" firstStartedPulling="2025-09-30 12:35:29.867136598 +0000 UTC m=+821.136374244" lastFinishedPulling="2025-09-30 12:35:33.167560212 +0000 UTC m=+824.436797868" observedRunningTime="2025-09-30 12:35:33.920859691 +0000 UTC m=+825.190097337" watchObservedRunningTime="2025-09-30 12:35:33.924422572 +0000 UTC m=+825.193660218" Sep 30 12:35:33 crc kubenswrapper[4672]: I0930 12:35:33.925156 4672 scope.go:117] "RemoveContainer" containerID="815615471b897e6ab0b76a88e5838ebba2e671e37962f0e509573799f927f092" Sep 30 12:35:33 crc kubenswrapper[4672]: I0930 12:35:33.935072 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-d7pqb"] Sep 30 12:35:33 crc kubenswrapper[4672]: I0930 12:35:33.944103 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-d7pqb"] Sep 30 12:35:33 crc kubenswrapper[4672]: I0930 12:35:33.963950 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-6rjvm" Sep 30 12:35:33 crc kubenswrapper[4672]: I0930 12:35:33.966319 4672 scope.go:117] "RemoveContainer" containerID="1450b132ec1edd45581b2bf9f7b680df69696a60cff00d1ed84c746593ee0998" Sep 30 12:35:34 crc kubenswrapper[4672]: I0930 12:35:34.939989 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-6rjvm" Sep 30 12:35:35 crc kubenswrapper[4672]: I0930 12:35:35.428786 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b996c085-e67e-46e5-8a95-e1f0593c95c9" path="/var/lib/kubelet/pods/b996c085-e67e-46e5-8a95-e1f0593c95c9/volumes" Sep 30 12:35:35 crc kubenswrapper[4672]: I0930 12:35:35.531330 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-zchdp" Sep 30 12:35:38 crc kubenswrapper[4672]: I0930 12:35:38.417064 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/3ba63d1583e0abeae0ee70583752a42c45e81151d65ed21cfdbdcd61d5wnrn9"] Sep 30 12:35:38 crc kubenswrapper[4672]: E0930 12:35:38.418003 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b996c085-e67e-46e5-8a95-e1f0593c95c9" containerName="extract-utilities" Sep 30 12:35:38 crc kubenswrapper[4672]: I0930 12:35:38.418025 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="b996c085-e67e-46e5-8a95-e1f0593c95c9" containerName="extract-utilities" Sep 30 12:35:38 crc kubenswrapper[4672]: E0930 12:35:38.418062 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b996c085-e67e-46e5-8a95-e1f0593c95c9" containerName="extract-content" Sep 30 12:35:38 crc kubenswrapper[4672]: I0930 12:35:38.418071 4672 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="b996c085-e67e-46e5-8a95-e1f0593c95c9" containerName="extract-content" Sep 30 12:35:38 crc kubenswrapper[4672]: E0930 12:35:38.418088 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b996c085-e67e-46e5-8a95-e1f0593c95c9" containerName="registry-server" Sep 30 12:35:38 crc kubenswrapper[4672]: I0930 12:35:38.418097 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="b996c085-e67e-46e5-8a95-e1f0593c95c9" containerName="registry-server" Sep 30 12:35:38 crc kubenswrapper[4672]: I0930 12:35:38.418297 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="b996c085-e67e-46e5-8a95-e1f0593c95c9" containerName="registry-server" Sep 30 12:35:38 crc kubenswrapper[4672]: I0930 12:35:38.419645 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/3ba63d1583e0abeae0ee70583752a42c45e81151d65ed21cfdbdcd61d5wnrn9" Sep 30 12:35:38 crc kubenswrapper[4672]: I0930 12:35:38.422733 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-kg8k9" Sep 30 12:35:38 crc kubenswrapper[4672]: I0930 12:35:38.430235 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/3ba63d1583e0abeae0ee70583752a42c45e81151d65ed21cfdbdcd61d5wnrn9"] Sep 30 12:35:38 crc kubenswrapper[4672]: I0930 12:35:38.593282 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/47ed0e84-3ade-4314-b9fc-6e2f3e77c7b9-bundle\") pod \"3ba63d1583e0abeae0ee70583752a42c45e81151d65ed21cfdbdcd61d5wnrn9\" (UID: \"47ed0e84-3ade-4314-b9fc-6e2f3e77c7b9\") " pod="openstack-operators/3ba63d1583e0abeae0ee70583752a42c45e81151d65ed21cfdbdcd61d5wnrn9" Sep 30 12:35:38 crc kubenswrapper[4672]: I0930 12:35:38.593787 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/47ed0e84-3ade-4314-b9fc-6e2f3e77c7b9-util\") pod \"3ba63d1583e0abeae0ee70583752a42c45e81151d65ed21cfdbdcd61d5wnrn9\" (UID: \"47ed0e84-3ade-4314-b9fc-6e2f3e77c7b9\") " pod="openstack-operators/3ba63d1583e0abeae0ee70583752a42c45e81151d65ed21cfdbdcd61d5wnrn9" Sep 30 12:35:38 crc kubenswrapper[4672]: I0930 12:35:38.593996 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2xwfl\" (UniqueName: \"kubernetes.io/projected/47ed0e84-3ade-4314-b9fc-6e2f3e77c7b9-kube-api-access-2xwfl\") pod \"3ba63d1583e0abeae0ee70583752a42c45e81151d65ed21cfdbdcd61d5wnrn9\" (UID: \"47ed0e84-3ade-4314-b9fc-6e2f3e77c7b9\") " pod="openstack-operators/3ba63d1583e0abeae0ee70583752a42c45e81151d65ed21cfdbdcd61d5wnrn9" Sep 30 12:35:38 crc kubenswrapper[4672]: I0930 12:35:38.695762 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2xwfl\" (UniqueName: \"kubernetes.io/projected/47ed0e84-3ade-4314-b9fc-6e2f3e77c7b9-kube-api-access-2xwfl\") pod \"3ba63d1583e0abeae0ee70583752a42c45e81151d65ed21cfdbdcd61d5wnrn9\" (UID: \"47ed0e84-3ade-4314-b9fc-6e2f3e77c7b9\") " pod="openstack-operators/3ba63d1583e0abeae0ee70583752a42c45e81151d65ed21cfdbdcd61d5wnrn9" Sep 30 12:35:38 crc kubenswrapper[4672]: I0930 12:35:38.695824 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/47ed0e84-3ade-4314-b9fc-6e2f3e77c7b9-bundle\") pod \"3ba63d1583e0abeae0ee70583752a42c45e81151d65ed21cfdbdcd61d5wnrn9\" (UID: 
\"47ed0e84-3ade-4314-b9fc-6e2f3e77c7b9\") " pod="openstack-operators/3ba63d1583e0abeae0ee70583752a42c45e81151d65ed21cfdbdcd61d5wnrn9" Sep 30 12:35:38 crc kubenswrapper[4672]: I0930 12:35:38.695876 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/47ed0e84-3ade-4314-b9fc-6e2f3e77c7b9-util\") pod \"3ba63d1583e0abeae0ee70583752a42c45e81151d65ed21cfdbdcd61d5wnrn9\" (UID: \"47ed0e84-3ade-4314-b9fc-6e2f3e77c7b9\") " pod="openstack-operators/3ba63d1583e0abeae0ee70583752a42c45e81151d65ed21cfdbdcd61d5wnrn9" Sep 30 12:35:38 crc kubenswrapper[4672]: I0930 12:35:38.696369 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/47ed0e84-3ade-4314-b9fc-6e2f3e77c7b9-util\") pod \"3ba63d1583e0abeae0ee70583752a42c45e81151d65ed21cfdbdcd61d5wnrn9\" (UID: \"47ed0e84-3ade-4314-b9fc-6e2f3e77c7b9\") " pod="openstack-operators/3ba63d1583e0abeae0ee70583752a42c45e81151d65ed21cfdbdcd61d5wnrn9" Sep 30 12:35:38 crc kubenswrapper[4672]: I0930 12:35:38.696377 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/47ed0e84-3ade-4314-b9fc-6e2f3e77c7b9-bundle\") pod \"3ba63d1583e0abeae0ee70583752a42c45e81151d65ed21cfdbdcd61d5wnrn9\" (UID: \"47ed0e84-3ade-4314-b9fc-6e2f3e77c7b9\") " pod="openstack-operators/3ba63d1583e0abeae0ee70583752a42c45e81151d65ed21cfdbdcd61d5wnrn9" Sep 30 12:35:38 crc kubenswrapper[4672]: I0930 12:35:38.735200 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2xwfl\" (UniqueName: \"kubernetes.io/projected/47ed0e84-3ade-4314-b9fc-6e2f3e77c7b9-kube-api-access-2xwfl\") pod \"3ba63d1583e0abeae0ee70583752a42c45e81151d65ed21cfdbdcd61d5wnrn9\" (UID: \"47ed0e84-3ade-4314-b9fc-6e2f3e77c7b9\") " pod="openstack-operators/3ba63d1583e0abeae0ee70583752a42c45e81151d65ed21cfdbdcd61d5wnrn9" Sep 30 12:35:38 crc kubenswrapper[4672]: I0930 12:35:38.744247 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/3ba63d1583e0abeae0ee70583752a42c45e81151d65ed21cfdbdcd61d5wnrn9" Sep 30 12:35:39 crc kubenswrapper[4672]: I0930 12:35:39.137709 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-cqjrd" Sep 30 12:35:39 crc kubenswrapper[4672]: I0930 12:35:39.138074 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-cqjrd" Sep 30 12:35:39 crc kubenswrapper[4672]: I0930 12:35:39.178954 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-cqjrd" Sep 30 12:35:39 crc kubenswrapper[4672]: I0930 12:35:39.188484 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/3ba63d1583e0abeae0ee70583752a42c45e81151d65ed21cfdbdcd61d5wnrn9"] Sep 30 12:35:39 crc kubenswrapper[4672]: I0930 12:35:39.945189 4672 generic.go:334] "Generic (PLEG): container finished" podID="47ed0e84-3ade-4314-b9fc-6e2f3e77c7b9" containerID="7afe336fa1ab311b18e123c112305b1ec9b15416d8abc3c59006d0fb8061fab5" exitCode=0 Sep 30 12:35:39 crc kubenswrapper[4672]: I0930 12:35:39.945256 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/3ba63d1583e0abeae0ee70583752a42c45e81151d65ed21cfdbdcd61d5wnrn9" event={"ID":"47ed0e84-3ade-4314-b9fc-6e2f3e77c7b9","Type":"ContainerDied","Data":"7afe336fa1ab311b18e123c112305b1ec9b15416d8abc3c59006d0fb8061fab5"} Sep 30 12:35:39 crc kubenswrapper[4672]: I0930 12:35:39.945657 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/3ba63d1583e0abeae0ee70583752a42c45e81151d65ed21cfdbdcd61d5wnrn9" event={"ID":"47ed0e84-3ade-4314-b9fc-6e2f3e77c7b9","Type":"ContainerStarted","Data":"8b15bca05957275f03b7c42cd7b09902d393782156e531e3fa4bb61bd876747a"} Sep 30 12:35:39 crc kubenswrapper[4672]: I0930 12:35:39.996592 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-cqjrd" Sep 30 12:35:40 crc kubenswrapper[4672]: I0930 12:35:40.953443 4672 generic.go:334] "Generic (PLEG): container finished" podID="47ed0e84-3ade-4314-b9fc-6e2f3e77c7b9" containerID="c56ae65a944fa9f683bd4877a1afd41a9d85242bb5d6b3ca2f9fd3e838e13454" exitCode=0 Sep 30 12:35:40 crc kubenswrapper[4672]: I0930 12:35:40.953544 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/3ba63d1583e0abeae0ee70583752a42c45e81151d65ed21cfdbdcd61d5wnrn9" event={"ID":"47ed0e84-3ade-4314-b9fc-6e2f3e77c7b9","Type":"ContainerDied","Data":"c56ae65a944fa9f683bd4877a1afd41a9d85242bb5d6b3ca2f9fd3e838e13454"} Sep 30 12:35:41 crc kubenswrapper[4672]: I0930 12:35:41.961445 4672 generic.go:334] "Generic (PLEG): container finished" podID="47ed0e84-3ade-4314-b9fc-6e2f3e77c7b9" containerID="db536c55edb031d1942eb91a5271bf4586674a82d5f249ead4e827949ece97e6" exitCode=0 Sep 30 12:35:41 crc kubenswrapper[4672]: I0930 12:35:41.961531 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/3ba63d1583e0abeae0ee70583752a42c45e81151d65ed21cfdbdcd61d5wnrn9" event={"ID":"47ed0e84-3ade-4314-b9fc-6e2f3e77c7b9","Type":"ContainerDied","Data":"db536c55edb031d1942eb91a5271bf4586674a82d5f249ead4e827949ece97e6"} Sep 30 12:35:43 crc kubenswrapper[4672]: I0930 12:35:43.311975 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/3ba63d1583e0abeae0ee70583752a42c45e81151d65ed21cfdbdcd61d5wnrn9" Sep 30 12:35:43 crc kubenswrapper[4672]: I0930 12:35:43.475064 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/47ed0e84-3ade-4314-b9fc-6e2f3e77c7b9-bundle\") pod \"47ed0e84-3ade-4314-b9fc-6e2f3e77c7b9\" (UID: \"47ed0e84-3ade-4314-b9fc-6e2f3e77c7b9\") " Sep 30 12:35:43 crc kubenswrapper[4672]: I0930 12:35:43.475162 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/47ed0e84-3ade-4314-b9fc-6e2f3e77c7b9-util\") pod \"47ed0e84-3ade-4314-b9fc-6e2f3e77c7b9\" (UID: \"47ed0e84-3ade-4314-b9fc-6e2f3e77c7b9\") " Sep 30 12:35:43 crc kubenswrapper[4672]: I0930 12:35:43.475291 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2xwfl\" (UniqueName: \"kubernetes.io/projected/47ed0e84-3ade-4314-b9fc-6e2f3e77c7b9-kube-api-access-2xwfl\") pod \"47ed0e84-3ade-4314-b9fc-6e2f3e77c7b9\" (UID: \"47ed0e84-3ade-4314-b9fc-6e2f3e77c7b9\") " Sep 30 12:35:43 crc kubenswrapper[4672]: I0930 12:35:43.476507 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/47ed0e84-3ade-4314-b9fc-6e2f3e77c7b9-bundle" (OuterVolumeSpecName: "bundle") pod "47ed0e84-3ade-4314-b9fc-6e2f3e77c7b9" (UID: "47ed0e84-3ade-4314-b9fc-6e2f3e77c7b9"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 12:35:43 crc kubenswrapper[4672]: I0930 12:35:43.481023 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/47ed0e84-3ade-4314-b9fc-6e2f3e77c7b9-kube-api-access-2xwfl" (OuterVolumeSpecName: "kube-api-access-2xwfl") pod "47ed0e84-3ade-4314-b9fc-6e2f3e77c7b9" (UID: "47ed0e84-3ade-4314-b9fc-6e2f3e77c7b9"). InnerVolumeSpecName "kube-api-access-2xwfl". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 12:35:43 crc kubenswrapper[4672]: I0930 12:35:43.489248 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/47ed0e84-3ade-4314-b9fc-6e2f3e77c7b9-util" (OuterVolumeSpecName: "util") pod "47ed0e84-3ade-4314-b9fc-6e2f3e77c7b9" (UID: "47ed0e84-3ade-4314-b9fc-6e2f3e77c7b9"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 12:35:43 crc kubenswrapper[4672]: I0930 12:35:43.576113 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-p6jvp"] Sep 30 12:35:43 crc kubenswrapper[4672]: E0930 12:35:43.576479 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47ed0e84-3ade-4314-b9fc-6e2f3e77c7b9" containerName="util" Sep 30 12:35:43 crc kubenswrapper[4672]: I0930 12:35:43.576495 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="47ed0e84-3ade-4314-b9fc-6e2f3e77c7b9" containerName="util" Sep 30 12:35:43 crc kubenswrapper[4672]: E0930 12:35:43.576505 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47ed0e84-3ade-4314-b9fc-6e2f3e77c7b9" containerName="extract" Sep 30 12:35:43 crc kubenswrapper[4672]: I0930 12:35:43.576512 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="47ed0e84-3ade-4314-b9fc-6e2f3e77c7b9" containerName="extract" Sep 30 12:35:43 crc kubenswrapper[4672]: E0930 12:35:43.576523 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47ed0e84-3ade-4314-b9fc-6e2f3e77c7b9" containerName="pull" Sep 30 12:35:43 crc kubenswrapper[4672]: I0930 12:35:43.576531 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="47ed0e84-3ade-4314-b9fc-6e2f3e77c7b9" containerName="pull" Sep 30 12:35:43 crc kubenswrapper[4672]: I0930 12:35:43.576678 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="47ed0e84-3ade-4314-b9fc-6e2f3e77c7b9" containerName="extract" Sep 30 12:35:43 crc kubenswrapper[4672]: I0930 12:35:43.577569 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2xwfl\" (UniqueName: \"kubernetes.io/projected/47ed0e84-3ade-4314-b9fc-6e2f3e77c7b9-kube-api-access-2xwfl\") on node \"crc\" DevicePath \"\"" Sep 30 12:35:43 crc kubenswrapper[4672]: I0930 12:35:43.577597 4672 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/47ed0e84-3ade-4314-b9fc-6e2f3e77c7b9-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 12:35:43 crc kubenswrapper[4672]: I0930 12:35:43.577609 4672 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/47ed0e84-3ade-4314-b9fc-6e2f3e77c7b9-util\") on node \"crc\" DevicePath \"\"" Sep 30 12:35:43 crc kubenswrapper[4672]: I0930 12:35:43.577787 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-p6jvp" Sep 30 12:35:43 crc kubenswrapper[4672]: I0930 12:35:43.586763 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-p6jvp"] Sep 30 12:35:43 crc kubenswrapper[4672]: I0930 12:35:43.678760 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a412860a-b174-46a8-82a8-9e067b45f055-catalog-content\") pod \"community-operators-p6jvp\" (UID: \"a412860a-b174-46a8-82a8-9e067b45f055\") " pod="openshift-marketplace/community-operators-p6jvp" Sep 30 12:35:43 crc kubenswrapper[4672]: I0930 12:35:43.678837 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a412860a-b174-46a8-82a8-9e067b45f055-utilities\") pod \"community-operators-p6jvp\" (UID: \"a412860a-b174-46a8-82a8-9e067b45f055\") " pod="openshift-marketplace/community-operators-p6jvp" Sep 30 12:35:43 crc kubenswrapper[4672]: I0930 12:35:43.678891 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-njwnj\" (UniqueName: \"kubernetes.io/projected/a412860a-b174-46a8-82a8-9e067b45f055-kube-api-access-njwnj\") pod \"community-operators-p6jvp\" (UID: \"a412860a-b174-46a8-82a8-9e067b45f055\") " pod="openshift-marketplace/community-operators-p6jvp" Sep 30 12:35:43 crc kubenswrapper[4672]: I0930 12:35:43.781538 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a412860a-b174-46a8-82a8-9e067b45f055-catalog-content\") pod \"community-operators-p6jvp\" (UID: \"a412860a-b174-46a8-82a8-9e067b45f055\") " pod="openshift-marketplace/community-operators-p6jvp" Sep 30 12:35:43 crc kubenswrapper[4672]: I0930 12:35:43.781885 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a412860a-b174-46a8-82a8-9e067b45f055-utilities\") pod \"community-operators-p6jvp\" (UID: \"a412860a-b174-46a8-82a8-9e067b45f055\") " pod="openshift-marketplace/community-operators-p6jvp" Sep 30 12:35:43 crc kubenswrapper[4672]: I0930 12:35:43.781988 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-njwnj\" (UniqueName: \"kubernetes.io/projected/a412860a-b174-46a8-82a8-9e067b45f055-kube-api-access-njwnj\") pod \"community-operators-p6jvp\" (UID: \"a412860a-b174-46a8-82a8-9e067b45f055\") " pod="openshift-marketplace/community-operators-p6jvp" Sep 30 12:35:43 crc kubenswrapper[4672]: I0930 12:35:43.782298 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a412860a-b174-46a8-82a8-9e067b45f055-catalog-content\") pod \"community-operators-p6jvp\" (UID: \"a412860a-b174-46a8-82a8-9e067b45f055\") " pod="openshift-marketplace/community-operators-p6jvp" Sep 30 12:35:43 crc kubenswrapper[4672]: I0930 12:35:43.782559 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a412860a-b174-46a8-82a8-9e067b45f055-utilities\") pod \"community-operators-p6jvp\" (UID: \"a412860a-b174-46a8-82a8-9e067b45f055\") " pod="openshift-marketplace/community-operators-p6jvp" Sep 30 12:35:43 crc kubenswrapper[4672]: I0930 12:35:43.801880 4672 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-njwnj\" (UniqueName: \"kubernetes.io/projected/a412860a-b174-46a8-82a8-9e067b45f055-kube-api-access-njwnj\") pod \"community-operators-p6jvp\" (UID: \"a412860a-b174-46a8-82a8-9e067b45f055\") " pod="openshift-marketplace/community-operators-p6jvp" Sep 30 12:35:43 crc kubenswrapper[4672]: I0930 12:35:43.909557 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-p6jvp" Sep 30 12:35:43 crc kubenswrapper[4672]: I0930 12:35:43.987314 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/3ba63d1583e0abeae0ee70583752a42c45e81151d65ed21cfdbdcd61d5wnrn9" event={"ID":"47ed0e84-3ade-4314-b9fc-6e2f3e77c7b9","Type":"ContainerDied","Data":"8b15bca05957275f03b7c42cd7b09902d393782156e531e3fa4bb61bd876747a"} Sep 30 12:35:43 crc kubenswrapper[4672]: I0930 12:35:43.987362 4672 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8b15bca05957275f03b7c42cd7b09902d393782156e531e3fa4bb61bd876747a" Sep 30 12:35:43 crc kubenswrapper[4672]: I0930 12:35:43.987407 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/3ba63d1583e0abeae0ee70583752a42c45e81151d65ed21cfdbdcd61d5wnrn9" Sep 30 12:35:44 crc kubenswrapper[4672]: W0930 12:35:44.391154 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda412860a_b174_46a8_82a8_9e067b45f055.slice/crio-0556b7be248949c0ee1be0553e74a660f6c9e77f4d1dce3633fcf9e5d94d4193 WatchSource:0}: Error finding container 0556b7be248949c0ee1be0553e74a660f6c9e77f4d1dce3633fcf9e5d94d4193: Status 404 returned error can't find the container with id 0556b7be248949c0ee1be0553e74a660f6c9e77f4d1dce3633fcf9e5d94d4193 Sep 30 12:35:44 crc kubenswrapper[4672]: I0930 12:35:44.391521 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-p6jvp"] Sep 30 12:35:44 crc kubenswrapper[4672]: I0930 12:35:44.565739 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-cqjrd"] Sep 30 12:35:44 crc kubenswrapper[4672]: I0930 12:35:44.566327 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-cqjrd" podUID="c17cb96a-dd42-49e2-9994-774451dabe25" containerName="registry-server" containerID="cri-o://d170a0f9d1ea84c09ab404997a455866dddd278263197d8ccc80427671dbcf69" gracePeriod=2 Sep 30 12:35:44 crc kubenswrapper[4672]: I0930 12:35:44.954674 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cqjrd" Sep 30 12:35:44 crc kubenswrapper[4672]: I0930 12:35:44.995133 4672 generic.go:334] "Generic (PLEG): container finished" podID="a412860a-b174-46a8-82a8-9e067b45f055" containerID="1b7dee691a421da457573a2900f29cba14e4e824e76d772508355ee4e869d917" exitCode=0 Sep 30 12:35:44 crc kubenswrapper[4672]: I0930 12:35:44.995223 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p6jvp" event={"ID":"a412860a-b174-46a8-82a8-9e067b45f055","Type":"ContainerDied","Data":"1b7dee691a421da457573a2900f29cba14e4e824e76d772508355ee4e869d917"} Sep 30 12:35:44 crc kubenswrapper[4672]: I0930 12:35:44.995279 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p6jvp" event={"ID":"a412860a-b174-46a8-82a8-9e067b45f055","Type":"ContainerStarted","Data":"0556b7be248949c0ee1be0553e74a660f6c9e77f4d1dce3633fcf9e5d94d4193"} Sep 30 12:35:45 crc kubenswrapper[4672]: I0930 12:35:44.999878 4672 generic.go:334] "Generic (PLEG): container finished" podID="c17cb96a-dd42-49e2-9994-774451dabe25" containerID="d170a0f9d1ea84c09ab404997a455866dddd278263197d8ccc80427671dbcf69" exitCode=0 Sep 30 12:35:45 crc kubenswrapper[4672]: I0930 12:35:44.999926 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cqjrd" event={"ID":"c17cb96a-dd42-49e2-9994-774451dabe25","Type":"ContainerDied","Data":"d170a0f9d1ea84c09ab404997a455866dddd278263197d8ccc80427671dbcf69"} Sep 30 12:35:45 crc kubenswrapper[4672]: I0930 12:35:44.999959 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cqjrd" event={"ID":"c17cb96a-dd42-49e2-9994-774451dabe25","Type":"ContainerDied","Data":"298d240cb743f9e9542939a1baa62b54a0c01c454443aafbe1f6ea258efa64ed"} Sep 30 12:35:45 crc kubenswrapper[4672]: I0930 12:35:44.999988 4672 scope.go:117] "RemoveContainer" containerID="d170a0f9d1ea84c09ab404997a455866dddd278263197d8ccc80427671dbcf69" Sep 30 12:35:45 crc kubenswrapper[4672]: I0930 12:35:45.000154 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cqjrd" Sep 30 12:35:45 crc kubenswrapper[4672]: I0930 12:35:45.020751 4672 scope.go:117] "RemoveContainer" containerID="e7c9e3304ce51b65bcffa17fdd8d4a34aec7f10d2b500969ee1209684b42b5e4" Sep 30 12:35:45 crc kubenswrapper[4672]: I0930 12:35:45.041543 4672 scope.go:117] "RemoveContainer" containerID="96bd1a653181f48681ce7f60b80e638bdd8ba9a791a464b0beecc63990cee28d" Sep 30 12:35:45 crc kubenswrapper[4672]: I0930 12:35:45.060112 4672 scope.go:117] "RemoveContainer" containerID="d170a0f9d1ea84c09ab404997a455866dddd278263197d8ccc80427671dbcf69" Sep 30 12:35:45 crc kubenswrapper[4672]: E0930 12:35:45.060529 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d170a0f9d1ea84c09ab404997a455866dddd278263197d8ccc80427671dbcf69\": container with ID starting with d170a0f9d1ea84c09ab404997a455866dddd278263197d8ccc80427671dbcf69 not found: ID does not exist" containerID="d170a0f9d1ea84c09ab404997a455866dddd278263197d8ccc80427671dbcf69" Sep 30 12:35:45 crc kubenswrapper[4672]: I0930 12:35:45.060560 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d170a0f9d1ea84c09ab404997a455866dddd278263197d8ccc80427671dbcf69"} err="failed to get container status \"d170a0f9d1ea84c09ab404997a455866dddd278263197d8ccc80427671dbcf69\": rpc error: code = NotFound desc = could not find container \"d170a0f9d1ea84c09ab404997a455866dddd278263197d8ccc80427671dbcf69\": container with ID starting with d170a0f9d1ea84c09ab404997a455866dddd278263197d8ccc80427671dbcf69 not found: ID does not exist" Sep 30 12:35:45 crc kubenswrapper[4672]: I0930 12:35:45.060583 4672 scope.go:117] "RemoveContainer" containerID="e7c9e3304ce51b65bcffa17fdd8d4a34aec7f10d2b500969ee1209684b42b5e4" Sep 30 12:35:45 crc kubenswrapper[4672]: E0930 12:35:45.061012 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e7c9e3304ce51b65bcffa17fdd8d4a34aec7f10d2b500969ee1209684b42b5e4\": container with ID starting with e7c9e3304ce51b65bcffa17fdd8d4a34aec7f10d2b500969ee1209684b42b5e4 not found: ID does not exist" containerID="e7c9e3304ce51b65bcffa17fdd8d4a34aec7f10d2b500969ee1209684b42b5e4" Sep 30 12:35:45 crc kubenswrapper[4672]: I0930 12:35:45.061034 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e7c9e3304ce51b65bcffa17fdd8d4a34aec7f10d2b500969ee1209684b42b5e4"} err="failed to get container status \"e7c9e3304ce51b65bcffa17fdd8d4a34aec7f10d2b500969ee1209684b42b5e4\": rpc error: code = NotFound desc = could not find container \"e7c9e3304ce51b65bcffa17fdd8d4a34aec7f10d2b500969ee1209684b42b5e4\": container with ID starting with e7c9e3304ce51b65bcffa17fdd8d4a34aec7f10d2b500969ee1209684b42b5e4 not found: ID does not exist" Sep 30 12:35:45 crc kubenswrapper[4672]: I0930 12:35:45.061047 4672 scope.go:117] "RemoveContainer" containerID="96bd1a653181f48681ce7f60b80e638bdd8ba9a791a464b0beecc63990cee28d" Sep 30 12:35:45 crc kubenswrapper[4672]: E0930 12:35:45.061434 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"96bd1a653181f48681ce7f60b80e638bdd8ba9a791a464b0beecc63990cee28d\": container with ID starting with 96bd1a653181f48681ce7f60b80e638bdd8ba9a791a464b0beecc63990cee28d not found: ID does not exist" containerID="96bd1a653181f48681ce7f60b80e638bdd8ba9a791a464b0beecc63990cee28d" 
Sep 30 12:35:45 crc kubenswrapper[4672]: I0930 12:35:45.061501 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"96bd1a653181f48681ce7f60b80e638bdd8ba9a791a464b0beecc63990cee28d"} err="failed to get container status \"96bd1a653181f48681ce7f60b80e638bdd8ba9a791a464b0beecc63990cee28d\": rpc error: code = NotFound desc = could not find container \"96bd1a653181f48681ce7f60b80e638bdd8ba9a791a464b0beecc63990cee28d\": container with ID starting with 96bd1a653181f48681ce7f60b80e638bdd8ba9a791a464b0beecc63990cee28d not found: ID does not exist" Sep 30 12:35:45 crc kubenswrapper[4672]: I0930 12:35:45.101350 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5kvt9\" (UniqueName: \"kubernetes.io/projected/c17cb96a-dd42-49e2-9994-774451dabe25-kube-api-access-5kvt9\") pod \"c17cb96a-dd42-49e2-9994-774451dabe25\" (UID: \"c17cb96a-dd42-49e2-9994-774451dabe25\") " Sep 30 12:35:45 crc kubenswrapper[4672]: I0930 12:35:45.101441 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c17cb96a-dd42-49e2-9994-774451dabe25-catalog-content\") pod \"c17cb96a-dd42-49e2-9994-774451dabe25\" (UID: \"c17cb96a-dd42-49e2-9994-774451dabe25\") " Sep 30 12:35:45 crc kubenswrapper[4672]: I0930 12:35:45.101575 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c17cb96a-dd42-49e2-9994-774451dabe25-utilities\") pod \"c17cb96a-dd42-49e2-9994-774451dabe25\" (UID: \"c17cb96a-dd42-49e2-9994-774451dabe25\") " Sep 30 12:35:45 crc kubenswrapper[4672]: I0930 12:35:45.103173 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c17cb96a-dd42-49e2-9994-774451dabe25-utilities" (OuterVolumeSpecName: "utilities") pod "c17cb96a-dd42-49e2-9994-774451dabe25" (UID: "c17cb96a-dd42-49e2-9994-774451dabe25"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 12:35:45 crc kubenswrapper[4672]: I0930 12:35:45.107166 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c17cb96a-dd42-49e2-9994-774451dabe25-kube-api-access-5kvt9" (OuterVolumeSpecName: "kube-api-access-5kvt9") pod "c17cb96a-dd42-49e2-9994-774451dabe25" (UID: "c17cb96a-dd42-49e2-9994-774451dabe25"). InnerVolumeSpecName "kube-api-access-5kvt9". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 12:35:45 crc kubenswrapper[4672]: I0930 12:35:45.114242 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c17cb96a-dd42-49e2-9994-774451dabe25-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c17cb96a-dd42-49e2-9994-774451dabe25" (UID: "c17cb96a-dd42-49e2-9994-774451dabe25"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 12:35:45 crc kubenswrapper[4672]: I0930 12:35:45.203104 4672 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c17cb96a-dd42-49e2-9994-774451dabe25-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 12:35:45 crc kubenswrapper[4672]: I0930 12:35:45.203162 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5kvt9\" (UniqueName: \"kubernetes.io/projected/c17cb96a-dd42-49e2-9994-774451dabe25-kube-api-access-5kvt9\") on node \"crc\" DevicePath \"\"" Sep 30 12:35:45 crc kubenswrapper[4672]: I0930 12:35:45.203185 4672 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c17cb96a-dd42-49e2-9994-774451dabe25-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 12:35:45 crc kubenswrapper[4672]: I0930 12:35:45.342990 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-cqjrd"] Sep 30 12:35:45 crc kubenswrapper[4672]: I0930 12:35:45.347145 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-cqjrd"] Sep 30 12:35:45 crc kubenswrapper[4672]: I0930 12:35:45.431142 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c17cb96a-dd42-49e2-9994-774451dabe25" path="/var/lib/kubelet/pods/c17cb96a-dd42-49e2-9994-774451dabe25/volumes" Sep 30 12:35:47 crc kubenswrapper[4672]: I0930 12:35:47.023621 4672 generic.go:334] "Generic (PLEG): container finished" podID="a412860a-b174-46a8-82a8-9e067b45f055" containerID="695cc2116d7e50e2b6fa37b889c1d1f06f474fb7fc24c864ac5b40a37e49ca3f" exitCode=0 Sep 30 12:35:47 crc kubenswrapper[4672]: I0930 12:35:47.023728 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p6jvp" event={"ID":"a412860a-b174-46a8-82a8-9e067b45f055","Type":"ContainerDied","Data":"695cc2116d7e50e2b6fa37b889c1d1f06f474fb7fc24c864ac5b40a37e49ca3f"} Sep 30 12:35:48 crc kubenswrapper[4672]: I0930 12:35:48.032576 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p6jvp" event={"ID":"a412860a-b174-46a8-82a8-9e067b45f055","Type":"ContainerStarted","Data":"faf1f44e10fbfe6e880ddfd2391bcaef83d43dc0da2255ae3b5fc132e2edfa8c"} Sep 30 12:35:48 crc kubenswrapper[4672]: I0930 12:35:48.051802 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-p6jvp" podStartSLOduration=2.519021726 podStartE2EDuration="5.051785286s" podCreationTimestamp="2025-09-30 12:35:43 +0000 UTC" firstStartedPulling="2025-09-30 12:35:44.997052193 +0000 UTC m=+836.266289849" lastFinishedPulling="2025-09-30 12:35:47.529815763 +0000 UTC m=+838.799053409" observedRunningTime="2025-09-30 12:35:48.049151279 +0000 UTC m=+839.318388925" watchObservedRunningTime="2025-09-30 12:35:48.051785286 +0000 UTC m=+839.321022932" Sep 30 12:35:48 crc kubenswrapper[4672]: I0930 12:35:48.197912 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-operator-b6f46bc96-v9ff6"] Sep 30 12:35:48 crc kubenswrapper[4672]: E0930 12:35:48.198478 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c17cb96a-dd42-49e2-9994-774451dabe25" containerName="extract-content" Sep 30 12:35:48 crc kubenswrapper[4672]: I0930 12:35:48.198624 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="c17cb96a-dd42-49e2-9994-774451dabe25" 
containerName="extract-content" Sep 30 12:35:48 crc kubenswrapper[4672]: E0930 12:35:48.198722 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c17cb96a-dd42-49e2-9994-774451dabe25" containerName="extract-utilities" Sep 30 12:35:48 crc kubenswrapper[4672]: I0930 12:35:48.198792 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="c17cb96a-dd42-49e2-9994-774451dabe25" containerName="extract-utilities" Sep 30 12:35:48 crc kubenswrapper[4672]: E0930 12:35:48.198865 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c17cb96a-dd42-49e2-9994-774451dabe25" containerName="registry-server" Sep 30 12:35:48 crc kubenswrapper[4672]: I0930 12:35:48.198946 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="c17cb96a-dd42-49e2-9994-774451dabe25" containerName="registry-server" Sep 30 12:35:48 crc kubenswrapper[4672]: I0930 12:35:48.199158 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="c17cb96a-dd42-49e2-9994-774451dabe25" containerName="registry-server" Sep 30 12:35:48 crc kubenswrapper[4672]: I0930 12:35:48.200028 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-b6f46bc96-v9ff6" Sep 30 12:35:48 crc kubenswrapper[4672]: I0930 12:35:48.203899 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-operator-dockercfg-r5hs4" Sep 30 12:35:48 crc kubenswrapper[4672]: I0930 12:35:48.292231 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-b6f46bc96-v9ff6"] Sep 30 12:35:48 crc kubenswrapper[4672]: I0930 12:35:48.350503 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qcjv2\" (UniqueName: \"kubernetes.io/projected/fc7ec117-7036-452f-9b2d-894e0dd29a8f-kube-api-access-qcjv2\") pod \"openstack-operator-controller-operator-b6f46bc96-v9ff6\" (UID: \"fc7ec117-7036-452f-9b2d-894e0dd29a8f\") " pod="openstack-operators/openstack-operator-controller-operator-b6f46bc96-v9ff6" Sep 30 12:35:48 crc kubenswrapper[4672]: I0930 12:35:48.452248 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qcjv2\" (UniqueName: \"kubernetes.io/projected/fc7ec117-7036-452f-9b2d-894e0dd29a8f-kube-api-access-qcjv2\") pod \"openstack-operator-controller-operator-b6f46bc96-v9ff6\" (UID: \"fc7ec117-7036-452f-9b2d-894e0dd29a8f\") " pod="openstack-operators/openstack-operator-controller-operator-b6f46bc96-v9ff6" Sep 30 12:35:48 crc kubenswrapper[4672]: I0930 12:35:48.479811 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qcjv2\" (UniqueName: \"kubernetes.io/projected/fc7ec117-7036-452f-9b2d-894e0dd29a8f-kube-api-access-qcjv2\") pod \"openstack-operator-controller-operator-b6f46bc96-v9ff6\" (UID: \"fc7ec117-7036-452f-9b2d-894e0dd29a8f\") " pod="openstack-operators/openstack-operator-controller-operator-b6f46bc96-v9ff6" Sep 30 12:35:48 crc kubenswrapper[4672]: I0930 12:35:48.518488 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-b6f46bc96-v9ff6" Sep 30 12:35:48 crc kubenswrapper[4672]: I0930 12:35:48.991430 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-b6f46bc96-v9ff6"] Sep 30 12:35:49 crc kubenswrapper[4672]: I0930 12:35:49.040982 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-b6f46bc96-v9ff6" event={"ID":"fc7ec117-7036-452f-9b2d-894e0dd29a8f","Type":"ContainerStarted","Data":"866f436bcd25003bbb29906dd0dd4759ea08d641035a3be0856e0f3c6f460ce7"} Sep 30 12:35:53 crc kubenswrapper[4672]: I0930 12:35:53.910280 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-p6jvp" Sep 30 12:35:53 crc kubenswrapper[4672]: I0930 12:35:53.910749 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-p6jvp" Sep 30 12:35:53 crc kubenswrapper[4672]: I0930 12:35:53.973653 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-p6jvp" Sep 30 12:35:54 crc kubenswrapper[4672]: I0930 12:35:54.079710 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-b6f46bc96-v9ff6" event={"ID":"fc7ec117-7036-452f-9b2d-894e0dd29a8f","Type":"ContainerStarted","Data":"7e6994381d42f7b2a470cf6876194c8539b70eaeab2ece00404ccdea1e5424ad"} Sep 30 12:35:54 crc kubenswrapper[4672]: I0930 12:35:54.118488 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-p6jvp" Sep 30 12:35:55 crc kubenswrapper[4672]: I0930 12:35:55.568333 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-p6jvp"] Sep 30 12:35:56 crc kubenswrapper[4672]: I0930 12:35:56.112022 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-b6f46bc96-v9ff6" event={"ID":"fc7ec117-7036-452f-9b2d-894e0dd29a8f","Type":"ContainerStarted","Data":"d60491739b1943ce7f474eb7a034c9e98117986a004459c083b556639a590023"} Sep 30 12:35:56 crc kubenswrapper[4672]: I0930 12:35:56.112123 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-operator-b6f46bc96-v9ff6" Sep 30 12:35:56 crc kubenswrapper[4672]: I0930 12:35:56.112551 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-p6jvp" podUID="a412860a-b174-46a8-82a8-9e067b45f055" containerName="registry-server" containerID="cri-o://faf1f44e10fbfe6e880ddfd2391bcaef83d43dc0da2255ae3b5fc132e2edfa8c" gracePeriod=2 Sep 30 12:35:56 crc kubenswrapper[4672]: I0930 12:35:56.179794 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-operator-b6f46bc96-v9ff6" podStartSLOduration=1.300411556 podStartE2EDuration="8.17976954s" podCreationTimestamp="2025-09-30 12:35:48 +0000 UTC" firstStartedPulling="2025-09-30 12:35:48.998851771 +0000 UTC m=+840.268089417" lastFinishedPulling="2025-09-30 12:35:55.878209745 +0000 UTC m=+847.147447401" observedRunningTime="2025-09-30 12:35:56.166398752 +0000 UTC m=+847.435636408" watchObservedRunningTime="2025-09-30 12:35:56.17976954 +0000 UTC m=+847.449007216" Sep 30 12:35:56 crc 
kubenswrapper[4672]: I0930 12:35:56.615109 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-p6jvp" Sep 30 12:35:56 crc kubenswrapper[4672]: I0930 12:35:56.786242 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a412860a-b174-46a8-82a8-9e067b45f055-utilities\") pod \"a412860a-b174-46a8-82a8-9e067b45f055\" (UID: \"a412860a-b174-46a8-82a8-9e067b45f055\") " Sep 30 12:35:56 crc kubenswrapper[4672]: I0930 12:35:56.786350 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a412860a-b174-46a8-82a8-9e067b45f055-catalog-content\") pod \"a412860a-b174-46a8-82a8-9e067b45f055\" (UID: \"a412860a-b174-46a8-82a8-9e067b45f055\") " Sep 30 12:35:56 crc kubenswrapper[4672]: I0930 12:35:56.786406 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-njwnj\" (UniqueName: \"kubernetes.io/projected/a412860a-b174-46a8-82a8-9e067b45f055-kube-api-access-njwnj\") pod \"a412860a-b174-46a8-82a8-9e067b45f055\" (UID: \"a412860a-b174-46a8-82a8-9e067b45f055\") " Sep 30 12:35:56 crc kubenswrapper[4672]: I0930 12:35:56.787387 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a412860a-b174-46a8-82a8-9e067b45f055-utilities" (OuterVolumeSpecName: "utilities") pod "a412860a-b174-46a8-82a8-9e067b45f055" (UID: "a412860a-b174-46a8-82a8-9e067b45f055"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 12:35:56 crc kubenswrapper[4672]: I0930 12:35:56.798511 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a412860a-b174-46a8-82a8-9e067b45f055-kube-api-access-njwnj" (OuterVolumeSpecName: "kube-api-access-njwnj") pod "a412860a-b174-46a8-82a8-9e067b45f055" (UID: "a412860a-b174-46a8-82a8-9e067b45f055"). InnerVolumeSpecName "kube-api-access-njwnj". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 12:35:56 crc kubenswrapper[4672]: I0930 12:35:56.839809 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a412860a-b174-46a8-82a8-9e067b45f055-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a412860a-b174-46a8-82a8-9e067b45f055" (UID: "a412860a-b174-46a8-82a8-9e067b45f055"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 12:35:56 crc kubenswrapper[4672]: I0930 12:35:56.888510 4672 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a412860a-b174-46a8-82a8-9e067b45f055-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 12:35:56 crc kubenswrapper[4672]: I0930 12:35:56.888560 4672 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a412860a-b174-46a8-82a8-9e067b45f055-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 12:35:56 crc kubenswrapper[4672]: I0930 12:35:56.888574 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-njwnj\" (UniqueName: \"kubernetes.io/projected/a412860a-b174-46a8-82a8-9e067b45f055-kube-api-access-njwnj\") on node \"crc\" DevicePath \"\"" Sep 30 12:35:57 crc kubenswrapper[4672]: I0930 12:35:57.120007 4672 generic.go:334] "Generic (PLEG): container finished" podID="a412860a-b174-46a8-82a8-9e067b45f055" containerID="faf1f44e10fbfe6e880ddfd2391bcaef83d43dc0da2255ae3b5fc132e2edfa8c" exitCode=0 Sep 30 12:35:57 crc kubenswrapper[4672]: I0930 12:35:57.120047 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p6jvp" event={"ID":"a412860a-b174-46a8-82a8-9e067b45f055","Type":"ContainerDied","Data":"faf1f44e10fbfe6e880ddfd2391bcaef83d43dc0da2255ae3b5fc132e2edfa8c"} Sep 30 12:35:57 crc kubenswrapper[4672]: I0930 12:35:57.120088 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-p6jvp" Sep 30 12:35:57 crc kubenswrapper[4672]: I0930 12:35:57.120114 4672 scope.go:117] "RemoveContainer" containerID="faf1f44e10fbfe6e880ddfd2391bcaef83d43dc0da2255ae3b5fc132e2edfa8c" Sep 30 12:35:57 crc kubenswrapper[4672]: I0930 12:35:57.120097 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p6jvp" event={"ID":"a412860a-b174-46a8-82a8-9e067b45f055","Type":"ContainerDied","Data":"0556b7be248949c0ee1be0553e74a660f6c9e77f4d1dce3633fcf9e5d94d4193"} Sep 30 12:35:57 crc kubenswrapper[4672]: I0930 12:35:57.141504 4672 scope.go:117] "RemoveContainer" containerID="695cc2116d7e50e2b6fa37b889c1d1f06f474fb7fc24c864ac5b40a37e49ca3f" Sep 30 12:35:57 crc kubenswrapper[4672]: I0930 12:35:57.159663 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-p6jvp"] Sep 30 12:35:57 crc kubenswrapper[4672]: I0930 12:35:57.166352 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-p6jvp"] Sep 30 12:35:57 crc kubenswrapper[4672]: I0930 12:35:57.173684 4672 scope.go:117] "RemoveContainer" containerID="1b7dee691a421da457573a2900f29cba14e4e824e76d772508355ee4e869d917" Sep 30 12:35:57 crc kubenswrapper[4672]: I0930 12:35:57.194975 4672 scope.go:117] "RemoveContainer" containerID="faf1f44e10fbfe6e880ddfd2391bcaef83d43dc0da2255ae3b5fc132e2edfa8c" Sep 30 12:35:57 crc kubenswrapper[4672]: E0930 12:35:57.196093 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"faf1f44e10fbfe6e880ddfd2391bcaef83d43dc0da2255ae3b5fc132e2edfa8c\": container with ID starting with faf1f44e10fbfe6e880ddfd2391bcaef83d43dc0da2255ae3b5fc132e2edfa8c not found: ID does not exist" containerID="faf1f44e10fbfe6e880ddfd2391bcaef83d43dc0da2255ae3b5fc132e2edfa8c" Sep 30 12:35:57 crc kubenswrapper[4672]: I0930 12:35:57.196131 
4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"faf1f44e10fbfe6e880ddfd2391bcaef83d43dc0da2255ae3b5fc132e2edfa8c"} err="failed to get container status \"faf1f44e10fbfe6e880ddfd2391bcaef83d43dc0da2255ae3b5fc132e2edfa8c\": rpc error: code = NotFound desc = could not find container \"faf1f44e10fbfe6e880ddfd2391bcaef83d43dc0da2255ae3b5fc132e2edfa8c\": container with ID starting with faf1f44e10fbfe6e880ddfd2391bcaef83d43dc0da2255ae3b5fc132e2edfa8c not found: ID does not exist" Sep 30 12:35:57 crc kubenswrapper[4672]: I0930 12:35:57.196164 4672 scope.go:117] "RemoveContainer" containerID="695cc2116d7e50e2b6fa37b889c1d1f06f474fb7fc24c864ac5b40a37e49ca3f" Sep 30 12:35:57 crc kubenswrapper[4672]: E0930 12:35:57.196431 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"695cc2116d7e50e2b6fa37b889c1d1f06f474fb7fc24c864ac5b40a37e49ca3f\": container with ID starting with 695cc2116d7e50e2b6fa37b889c1d1f06f474fb7fc24c864ac5b40a37e49ca3f not found: ID does not exist" containerID="695cc2116d7e50e2b6fa37b889c1d1f06f474fb7fc24c864ac5b40a37e49ca3f" Sep 30 12:35:57 crc kubenswrapper[4672]: I0930 12:35:57.196458 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"695cc2116d7e50e2b6fa37b889c1d1f06f474fb7fc24c864ac5b40a37e49ca3f"} err="failed to get container status \"695cc2116d7e50e2b6fa37b889c1d1f06f474fb7fc24c864ac5b40a37e49ca3f\": rpc error: code = NotFound desc = could not find container \"695cc2116d7e50e2b6fa37b889c1d1f06f474fb7fc24c864ac5b40a37e49ca3f\": container with ID starting with 695cc2116d7e50e2b6fa37b889c1d1f06f474fb7fc24c864ac5b40a37e49ca3f not found: ID does not exist" Sep 30 12:35:57 crc kubenswrapper[4672]: I0930 12:35:57.196478 4672 scope.go:117] "RemoveContainer" containerID="1b7dee691a421da457573a2900f29cba14e4e824e76d772508355ee4e869d917" Sep 30 12:35:57 crc kubenswrapper[4672]: E0930 12:35:57.196705 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1b7dee691a421da457573a2900f29cba14e4e824e76d772508355ee4e869d917\": container with ID starting with 1b7dee691a421da457573a2900f29cba14e4e824e76d772508355ee4e869d917 not found: ID does not exist" containerID="1b7dee691a421da457573a2900f29cba14e4e824e76d772508355ee4e869d917" Sep 30 12:35:57 crc kubenswrapper[4672]: I0930 12:35:57.196729 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b7dee691a421da457573a2900f29cba14e4e824e76d772508355ee4e869d917"} err="failed to get container status \"1b7dee691a421da457573a2900f29cba14e4e824e76d772508355ee4e869d917\": rpc error: code = NotFound desc = could not find container \"1b7dee691a421da457573a2900f29cba14e4e824e76d772508355ee4e869d917\": container with ID starting with 1b7dee691a421da457573a2900f29cba14e4e824e76d772508355ee4e869d917 not found: ID does not exist" Sep 30 12:35:57 crc kubenswrapper[4672]: I0930 12:35:57.428297 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a412860a-b174-46a8-82a8-9e067b45f055" path="/var/lib/kubelet/pods/a412860a-b174-46a8-82a8-9e067b45f055/volumes" Sep 30 12:35:58 crc kubenswrapper[4672]: I0930 12:35:58.522655 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-operator-b6f46bc96-v9ff6" Sep 30 12:36:14 crc kubenswrapper[4672]: I0930 12:36:14.903957 4672 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-6ff8b75857-df89s"] Sep 30 12:36:14 crc kubenswrapper[4672]: E0930 12:36:14.905435 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a412860a-b174-46a8-82a8-9e067b45f055" containerName="extract-content" Sep 30 12:36:14 crc kubenswrapper[4672]: I0930 12:36:14.905453 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="a412860a-b174-46a8-82a8-9e067b45f055" containerName="extract-content" Sep 30 12:36:14 crc kubenswrapper[4672]: E0930 12:36:14.905467 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a412860a-b174-46a8-82a8-9e067b45f055" containerName="extract-utilities" Sep 30 12:36:14 crc kubenswrapper[4672]: I0930 12:36:14.905476 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="a412860a-b174-46a8-82a8-9e067b45f055" containerName="extract-utilities" Sep 30 12:36:14 crc kubenswrapper[4672]: E0930 12:36:14.905491 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a412860a-b174-46a8-82a8-9e067b45f055" containerName="registry-server" Sep 30 12:36:14 crc kubenswrapper[4672]: I0930 12:36:14.905499 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="a412860a-b174-46a8-82a8-9e067b45f055" containerName="registry-server" Sep 30 12:36:14 crc kubenswrapper[4672]: I0930 12:36:14.905669 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="a412860a-b174-46a8-82a8-9e067b45f055" containerName="registry-server" Sep 30 12:36:14 crc kubenswrapper[4672]: I0930 12:36:14.911355 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-644bddb6d8-jmkcr"] Sep 30 12:36:14 crc kubenswrapper[4672]: I0930 12:36:14.913087 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-df89s" Sep 30 12:36:14 crc kubenswrapper[4672]: I0930 12:36:14.915405 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-jmkcr" Sep 30 12:36:14 crc kubenswrapper[4672]: I0930 12:36:14.917789 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-84f4f7b77b-gthb7"] Sep 30 12:36:14 crc kubenswrapper[4672]: I0930 12:36:14.919258 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-gthb7" Sep 30 12:36:14 crc kubenswrapper[4672]: I0930 12:36:14.919790 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-96xzr" Sep 30 12:36:14 crc kubenswrapper[4672]: I0930 12:36:14.923808 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-hjz8q" Sep 30 12:36:14 crc kubenswrapper[4672]: I0930 12:36:14.924217 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-v9g25" Sep 30 12:36:14 crc kubenswrapper[4672]: I0930 12:36:14.943299 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-6ff8b75857-df89s"] Sep 30 12:36:14 crc kubenswrapper[4672]: I0930 12:36:14.959955 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-644bddb6d8-jmkcr"] Sep 30 12:36:14 crc kubenswrapper[4672]: I0930 12:36:14.975094 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-84f4f7b77b-gthb7"] Sep 30 12:36:14 crc kubenswrapper[4672]: I0930 12:36:14.983494 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-84958c4d49-fn2cb"] Sep 30 12:36:14 crc kubenswrapper[4672]: I0930 12:36:14.985572 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-84958c4d49-fn2cb" Sep 30 12:36:14 crc kubenswrapper[4672]: I0930 12:36:14.987881 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-n2mwk" Sep 30 12:36:14 crc kubenswrapper[4672]: I0930 12:36:14.993713 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-5d889d78cf-vs4mr"] Sep 30 12:36:14 crc kubenswrapper[4672]: I0930 12:36:14.995145 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-vs4mr" Sep 30 12:36:14 crc kubenswrapper[4672]: I0930 12:36:14.998245 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-qb79b" Sep 30 12:36:15 crc kubenswrapper[4672]: I0930 12:36:15.019129 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-84958c4d49-fn2cb"] Sep 30 12:36:15 crc kubenswrapper[4672]: I0930 12:36:15.023698 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-9f4696d94-z9tmm"] Sep 30 12:36:15 crc kubenswrapper[4672]: I0930 12:36:15.025238 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-z9tmm" Sep 30 12:36:15 crc kubenswrapper[4672]: I0930 12:36:15.031972 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-7vvcf" Sep 30 12:36:15 crc kubenswrapper[4672]: I0930 12:36:15.056556 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-9f4696d94-z9tmm"] Sep 30 12:36:15 crc kubenswrapper[4672]: I0930 12:36:15.071305 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p28x4\" (UniqueName: \"kubernetes.io/projected/6cc4cd4e-abd0-4318-bfd4-e2df45940139-kube-api-access-p28x4\") pod \"designate-operator-controller-manager-84f4f7b77b-gthb7\" (UID: \"6cc4cd4e-abd0-4318-bfd4-e2df45940139\") " pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-gthb7" Sep 30 12:36:15 crc kubenswrapper[4672]: I0930 12:36:15.071395 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rrlhg\" (UniqueName: \"kubernetes.io/projected/892875f4-bce3-47cb-8478-9d6bbc819bb1-kube-api-access-rrlhg\") pod \"cinder-operator-controller-manager-644bddb6d8-jmkcr\" (UID: \"892875f4-bce3-47cb-8478-9d6bbc819bb1\") " pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-jmkcr" Sep 30 12:36:15 crc kubenswrapper[4672]: I0930 12:36:15.071444 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q2gcx\" (UniqueName: \"kubernetes.io/projected/c70c8b28-1f45-4c79-af69-3197c7f66fa0-kube-api-access-q2gcx\") pod \"barbican-operator-controller-manager-6ff8b75857-df89s\" (UID: \"c70c8b28-1f45-4c79-af69-3197c7f66fa0\") " pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-df89s" Sep 30 12:36:15 crc kubenswrapper[4672]: I0930 12:36:15.086632 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-5d889d78cf-vs4mr"] Sep 30 12:36:15 crc kubenswrapper[4672]: I0930 12:36:15.105007 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-7d857cc749-8qftk"] Sep 30 12:36:15 crc kubenswrapper[4672]: I0930 12:36:15.106254 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-7d857cc749-8qftk" Sep 30 12:36:15 crc kubenswrapper[4672]: I0930 12:36:15.113945 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Sep 30 12:36:15 crc kubenswrapper[4672]: I0930 12:36:15.114236 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-jmwc5" Sep 30 12:36:15 crc kubenswrapper[4672]: I0930 12:36:15.154804 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-7975b88857-g8lll"] Sep 30 12:36:15 crc kubenswrapper[4672]: I0930 12:36:15.156206 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-7975b88857-g8lll" Sep 30 12:36:15 crc kubenswrapper[4672]: I0930 12:36:15.168859 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-x64tc" Sep 30 12:36:15 crc kubenswrapper[4672]: I0930 12:36:15.173352 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dn7mr\" (UniqueName: \"kubernetes.io/projected/b12c1847-2238-4a91-a2a0-4de492556fe7-kube-api-access-dn7mr\") pod \"heat-operator-controller-manager-5d889d78cf-vs4mr\" (UID: \"b12c1847-2238-4a91-a2a0-4de492556fe7\") " pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-vs4mr" Sep 30 12:36:15 crc kubenswrapper[4672]: I0930 12:36:15.173416 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p28x4\" (UniqueName: \"kubernetes.io/projected/6cc4cd4e-abd0-4318-bfd4-e2df45940139-kube-api-access-p28x4\") pod \"designate-operator-controller-manager-84f4f7b77b-gthb7\" (UID: \"6cc4cd4e-abd0-4318-bfd4-e2df45940139\") " pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-gthb7" Sep 30 12:36:15 crc kubenswrapper[4672]: I0930 12:36:15.173461 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xk59l\" (UniqueName: \"kubernetes.io/projected/0e2c3398-4a1f-4a82-a95c-89e73d9a4485-kube-api-access-xk59l\") pod \"glance-operator-controller-manager-84958c4d49-fn2cb\" (UID: \"0e2c3398-4a1f-4a82-a95c-89e73d9a4485\") " pod="openstack-operators/glance-operator-controller-manager-84958c4d49-fn2cb" Sep 30 12:36:15 crc kubenswrapper[4672]: I0930 12:36:15.173500 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rrlhg\" (UniqueName: \"kubernetes.io/projected/892875f4-bce3-47cb-8478-9d6bbc819bb1-kube-api-access-rrlhg\") pod \"cinder-operator-controller-manager-644bddb6d8-jmkcr\" (UID: \"892875f4-bce3-47cb-8478-9d6bbc819bb1\") " pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-jmkcr" Sep 30 12:36:15 crc kubenswrapper[4672]: I0930 12:36:15.173533 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q2gcx\" (UniqueName: \"kubernetes.io/projected/c70c8b28-1f45-4c79-af69-3197c7f66fa0-kube-api-access-q2gcx\") pod \"barbican-operator-controller-manager-6ff8b75857-df89s\" (UID: \"c70c8b28-1f45-4c79-af69-3197c7f66fa0\") " pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-df89s" Sep 30 12:36:15 crc kubenswrapper[4672]: I0930 12:36:15.173559 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2xfzt\" (UniqueName: \"kubernetes.io/projected/ed25b409-1ca7-4fc4-95b5-55b4239233f3-kube-api-access-2xfzt\") pod \"horizon-operator-controller-manager-9f4696d94-z9tmm\" (UID: \"ed25b409-1ca7-4fc4-95b5-55b4239233f3\") " pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-z9tmm" Sep 30 12:36:15 crc kubenswrapper[4672]: I0930 12:36:15.175324 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-7d857cc749-8qftk"] Sep 30 12:36:15 crc kubenswrapper[4672]: I0930 12:36:15.184728 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-5bd55b4bff-vqxhh"] Sep 30 12:36:15 crc 
kubenswrapper[4672]: I0930 12:36:15.185932 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-vqxhh" Sep 30 12:36:15 crc kubenswrapper[4672]: I0930 12:36:15.192461 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-t569n" Sep 30 12:36:15 crc kubenswrapper[4672]: I0930 12:36:15.196386 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-7975b88857-g8lll"] Sep 30 12:36:15 crc kubenswrapper[4672]: I0930 12:36:15.213832 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p28x4\" (UniqueName: \"kubernetes.io/projected/6cc4cd4e-abd0-4318-bfd4-e2df45940139-kube-api-access-p28x4\") pod \"designate-operator-controller-manager-84f4f7b77b-gthb7\" (UID: \"6cc4cd4e-abd0-4318-bfd4-e2df45940139\") " pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-gthb7" Sep 30 12:36:15 crc kubenswrapper[4672]: I0930 12:36:15.218734 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q2gcx\" (UniqueName: \"kubernetes.io/projected/c70c8b28-1f45-4c79-af69-3197c7f66fa0-kube-api-access-q2gcx\") pod \"barbican-operator-controller-manager-6ff8b75857-df89s\" (UID: \"c70c8b28-1f45-4c79-af69-3197c7f66fa0\") " pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-df89s" Sep 30 12:36:15 crc kubenswrapper[4672]: I0930 12:36:15.225407 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-5bd55b4bff-vqxhh"] Sep 30 12:36:15 crc kubenswrapper[4672]: I0930 12:36:15.230773 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rrlhg\" (UniqueName: \"kubernetes.io/projected/892875f4-bce3-47cb-8478-9d6bbc819bb1-kube-api-access-rrlhg\") pod \"cinder-operator-controller-manager-644bddb6d8-jmkcr\" (UID: \"892875f4-bce3-47cb-8478-9d6bbc819bb1\") " pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-jmkcr" Sep 30 12:36:15 crc kubenswrapper[4672]: I0930 12:36:15.230773 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-6d68dbc695-2hvxs"] Sep 30 12:36:15 crc kubenswrapper[4672]: I0930 12:36:15.253483 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-2hvxs" Sep 30 12:36:15 crc kubenswrapper[4672]: I0930 12:36:15.258531 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-88c7-xq89z"] Sep 30 12:36:15 crc kubenswrapper[4672]: I0930 12:36:15.258543 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-24t92" Sep 30 12:36:15 crc kubenswrapper[4672]: I0930 12:36:15.263229 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-88c7-xq89z" Sep 30 12:36:15 crc kubenswrapper[4672]: I0930 12:36:15.268602 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-mkdpf" Sep 30 12:36:15 crc kubenswrapper[4672]: I0930 12:36:15.269540 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-df89s" Sep 30 12:36:15 crc kubenswrapper[4672]: I0930 12:36:15.277081 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mhg8q\" (UniqueName: \"kubernetes.io/projected/904a2d6e-693a-4c5e-926e-2c5fd47d6bea-kube-api-access-mhg8q\") pod \"ironic-operator-controller-manager-7975b88857-g8lll\" (UID: \"904a2d6e-693a-4c5e-926e-2c5fd47d6bea\") " pod="openstack-operators/ironic-operator-controller-manager-7975b88857-g8lll" Sep 30 12:36:15 crc kubenswrapper[4672]: I0930 12:36:15.277261 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6fa26cab-ae65-4e21-af16-2628c86be254-cert\") pod \"infra-operator-controller-manager-7d857cc749-8qftk\" (UID: \"6fa26cab-ae65-4e21-af16-2628c86be254\") " pod="openstack-operators/infra-operator-controller-manager-7d857cc749-8qftk" Sep 30 12:36:15 crc kubenswrapper[4672]: I0930 12:36:15.277419 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2xfzt\" (UniqueName: \"kubernetes.io/projected/ed25b409-1ca7-4fc4-95b5-55b4239233f3-kube-api-access-2xfzt\") pod \"horizon-operator-controller-manager-9f4696d94-z9tmm\" (UID: \"ed25b409-1ca7-4fc4-95b5-55b4239233f3\") " pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-z9tmm" Sep 30 12:36:15 crc kubenswrapper[4672]: I0930 12:36:15.277514 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-96gqz\" (UniqueName: \"kubernetes.io/projected/6fa26cab-ae65-4e21-af16-2628c86be254-kube-api-access-96gqz\") pod \"infra-operator-controller-manager-7d857cc749-8qftk\" (UID: \"6fa26cab-ae65-4e21-af16-2628c86be254\") " pod="openstack-operators/infra-operator-controller-manager-7d857cc749-8qftk" Sep 30 12:36:15 crc kubenswrapper[4672]: I0930 12:36:15.277619 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dn7mr\" (UniqueName: \"kubernetes.io/projected/b12c1847-2238-4a91-a2a0-4de492556fe7-kube-api-access-dn7mr\") pod \"heat-operator-controller-manager-5d889d78cf-vs4mr\" (UID: \"b12c1847-2238-4a91-a2a0-4de492556fe7\") " pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-vs4mr" Sep 30 12:36:15 crc kubenswrapper[4672]: I0930 12:36:15.277717 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xk59l\" (UniqueName: \"kubernetes.io/projected/0e2c3398-4a1f-4a82-a95c-89e73d9a4485-kube-api-access-xk59l\") pod \"glance-operator-controller-manager-84958c4d49-fn2cb\" (UID: \"0e2c3398-4a1f-4a82-a95c-89e73d9a4485\") " pod="openstack-operators/glance-operator-controller-manager-84958c4d49-fn2cb" Sep 30 12:36:15 crc kubenswrapper[4672]: I0930 12:36:15.278611 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-6d68dbc695-2hvxs"] Sep 30 12:36:15 crc kubenswrapper[4672]: I0930 12:36:15.284835 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-jmkcr" Sep 30 12:36:15 crc kubenswrapper[4672]: I0930 12:36:15.290638 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-64d7b59854-lpv98"] Sep 30 12:36:15 crc kubenswrapper[4672]: I0930 12:36:15.292898 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-64d7b59854-lpv98" Sep 30 12:36:15 crc kubenswrapper[4672]: I0930 12:36:15.294896 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-7cnrn" Sep 30 12:36:15 crc kubenswrapper[4672]: I0930 12:36:15.304366 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-64d7b59854-lpv98"] Sep 30 12:36:15 crc kubenswrapper[4672]: I0930 12:36:15.311056 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2xfzt\" (UniqueName: \"kubernetes.io/projected/ed25b409-1ca7-4fc4-95b5-55b4239233f3-kube-api-access-2xfzt\") pod \"horizon-operator-controller-manager-9f4696d94-z9tmm\" (UID: \"ed25b409-1ca7-4fc4-95b5-55b4239233f3\") " pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-z9tmm" Sep 30 12:36:15 crc kubenswrapper[4672]: I0930 12:36:15.316824 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-gthb7" Sep 30 12:36:15 crc kubenswrapper[4672]: I0930 12:36:15.317645 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dn7mr\" (UniqueName: \"kubernetes.io/projected/b12c1847-2238-4a91-a2a0-4de492556fe7-kube-api-access-dn7mr\") pod \"heat-operator-controller-manager-5d889d78cf-vs4mr\" (UID: \"b12c1847-2238-4a91-a2a0-4de492556fe7\") " pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-vs4mr" Sep 30 12:36:15 crc kubenswrapper[4672]: I0930 12:36:15.319112 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-88c7-xq89z"] Sep 30 12:36:15 crc kubenswrapper[4672]: I0930 12:36:15.331410 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-c7c776c96-2kvn7"] Sep 30 12:36:15 crc kubenswrapper[4672]: I0930 12:36:15.333386 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-c7c776c96-2kvn7" Sep 30 12:36:15 crc kubenswrapper[4672]: I0930 12:36:15.347842 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-p6x7s" Sep 30 12:36:15 crc kubenswrapper[4672]: I0930 12:36:15.348171 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xk59l\" (UniqueName: \"kubernetes.io/projected/0e2c3398-4a1f-4a82-a95c-89e73d9a4485-kube-api-access-xk59l\") pod \"glance-operator-controller-manager-84958c4d49-fn2cb\" (UID: \"0e2c3398-4a1f-4a82-a95c-89e73d9a4485\") " pod="openstack-operators/glance-operator-controller-manager-84958c4d49-fn2cb" Sep 30 12:36:15 crc kubenswrapper[4672]: I0930 12:36:15.355204 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-76fcc6dc7c-hjn2l"] Sep 30 12:36:15 crc kubenswrapper[4672]: I0930 12:36:15.356725 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-76fcc6dc7c-hjn2l" Sep 30 12:36:15 crc kubenswrapper[4672]: I0930 12:36:15.357551 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-vs4mr" Sep 30 12:36:15 crc kubenswrapper[4672]: I0930 12:36:15.359306 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-ch2gt" Sep 30 12:36:15 crc kubenswrapper[4672]: I0930 12:36:15.378824 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-z9tmm" Sep 30 12:36:15 crc kubenswrapper[4672]: I0930 12:36:15.380440 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mhg8q\" (UniqueName: \"kubernetes.io/projected/904a2d6e-693a-4c5e-926e-2c5fd47d6bea-kube-api-access-mhg8q\") pod \"ironic-operator-controller-manager-7975b88857-g8lll\" (UID: \"904a2d6e-693a-4c5e-926e-2c5fd47d6bea\") " pod="openstack-operators/ironic-operator-controller-manager-7975b88857-g8lll" Sep 30 12:36:15 crc kubenswrapper[4672]: I0930 12:36:15.380472 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6fa26cab-ae65-4e21-af16-2628c86be254-cert\") pod \"infra-operator-controller-manager-7d857cc749-8qftk\" (UID: \"6fa26cab-ae65-4e21-af16-2628c86be254\") " pod="openstack-operators/infra-operator-controller-manager-7d857cc749-8qftk" Sep 30 12:36:15 crc kubenswrapper[4672]: I0930 12:36:15.380512 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-96gqz\" (UniqueName: \"kubernetes.io/projected/6fa26cab-ae65-4e21-af16-2628c86be254-kube-api-access-96gqz\") pod \"infra-operator-controller-manager-7d857cc749-8qftk\" (UID: \"6fa26cab-ae65-4e21-af16-2628c86be254\") " pod="openstack-operators/infra-operator-controller-manager-7d857cc749-8qftk" Sep 30 12:36:15 crc kubenswrapper[4672]: I0930 12:36:15.380540 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8kggs\" (UniqueName: \"kubernetes.io/projected/3f0b65a1-c4dd-4ca1-a2cf-feea808b1f06-kube-api-access-8kggs\") pod \"mariadb-operator-controller-manager-88c7-xq89z\" (UID: \"3f0b65a1-c4dd-4ca1-a2cf-feea808b1f06\") " 
pod="openstack-operators/mariadb-operator-controller-manager-88c7-xq89z" Sep 30 12:36:15 crc kubenswrapper[4672]: I0930 12:36:15.380571 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b6vbt\" (UniqueName: \"kubernetes.io/projected/ed04bd4c-39dd-45fa-a2e0-bde94ff9deb0-kube-api-access-b6vbt\") pod \"keystone-operator-controller-manager-5bd55b4bff-vqxhh\" (UID: \"ed04bd4c-39dd-45fa-a2e0-bde94ff9deb0\") " pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-vqxhh" Sep 30 12:36:15 crc kubenswrapper[4672]: I0930 12:36:15.380591 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f8hm2\" (UniqueName: \"kubernetes.io/projected/32830807-0fb2-4545-a629-af52b20e0b0f-kube-api-access-f8hm2\") pod \"manila-operator-controller-manager-6d68dbc695-2hvxs\" (UID: \"32830807-0fb2-4545-a629-af52b20e0b0f\") " pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-2hvxs" Sep 30 12:36:15 crc kubenswrapper[4672]: E0930 12:36:15.381224 4672 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Sep 30 12:36:15 crc kubenswrapper[4672]: E0930 12:36:15.381284 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6fa26cab-ae65-4e21-af16-2628c86be254-cert podName:6fa26cab-ae65-4e21-af16-2628c86be254 nodeName:}" failed. No retries permitted until 2025-09-30 12:36:15.881254027 +0000 UTC m=+867.150491673 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6fa26cab-ae65-4e21-af16-2628c86be254-cert") pod "infra-operator-controller-manager-7d857cc749-8qftk" (UID: "6fa26cab-ae65-4e21-af16-2628c86be254") : secret "infra-operator-webhook-server-cert" not found Sep 30 12:36:15 crc kubenswrapper[4672]: I0930 12:36:15.386431 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-76fcc6dc7c-hjn2l"] Sep 30 12:36:15 crc kubenswrapper[4672]: I0930 12:36:15.433871 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-96gqz\" (UniqueName: \"kubernetes.io/projected/6fa26cab-ae65-4e21-af16-2628c86be254-kube-api-access-96gqz\") pod \"infra-operator-controller-manager-7d857cc749-8qftk\" (UID: \"6fa26cab-ae65-4e21-af16-2628c86be254\") " pod="openstack-operators/infra-operator-controller-manager-7d857cc749-8qftk" Sep 30 12:36:15 crc kubenswrapper[4672]: I0930 12:36:15.457045 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mhg8q\" (UniqueName: \"kubernetes.io/projected/904a2d6e-693a-4c5e-926e-2c5fd47d6bea-kube-api-access-mhg8q\") pod \"ironic-operator-controller-manager-7975b88857-g8lll\" (UID: \"904a2d6e-693a-4c5e-926e-2c5fd47d6bea\") " pod="openstack-operators/ironic-operator-controller-manager-7975b88857-g8lll" Sep 30 12:36:15 crc kubenswrapper[4672]: I0930 12:36:15.484093 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xd4lb\" (UniqueName: \"kubernetes.io/projected/601fcd4a-dc2f-468d-9ad6-6b173320c317-kube-api-access-xd4lb\") pod \"octavia-operator-controller-manager-76fcc6dc7c-hjn2l\" (UID: \"601fcd4a-dc2f-468d-9ad6-6b173320c317\") " pod="openstack-operators/octavia-operator-controller-manager-76fcc6dc7c-hjn2l" Sep 30 12:36:15 crc kubenswrapper[4672]: I0930 12:36:15.484149 4672 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8kggs\" (UniqueName: \"kubernetes.io/projected/3f0b65a1-c4dd-4ca1-a2cf-feea808b1f06-kube-api-access-8kggs\") pod \"mariadb-operator-controller-manager-88c7-xq89z\" (UID: \"3f0b65a1-c4dd-4ca1-a2cf-feea808b1f06\") " pod="openstack-operators/mariadb-operator-controller-manager-88c7-xq89z" Sep 30 12:36:15 crc kubenswrapper[4672]: I0930 12:36:15.484181 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b6vbt\" (UniqueName: \"kubernetes.io/projected/ed04bd4c-39dd-45fa-a2e0-bde94ff9deb0-kube-api-access-b6vbt\") pod \"keystone-operator-controller-manager-5bd55b4bff-vqxhh\" (UID: \"ed04bd4c-39dd-45fa-a2e0-bde94ff9deb0\") " pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-vqxhh" Sep 30 12:36:15 crc kubenswrapper[4672]: I0930 12:36:15.484199 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f8hm2\" (UniqueName: \"kubernetes.io/projected/32830807-0fb2-4545-a629-af52b20e0b0f-kube-api-access-f8hm2\") pod \"manila-operator-controller-manager-6d68dbc695-2hvxs\" (UID: \"32830807-0fb2-4545-a629-af52b20e0b0f\") " pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-2hvxs" Sep 30 12:36:15 crc kubenswrapper[4672]: I0930 12:36:15.484250 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-27wws\" (UniqueName: \"kubernetes.io/projected/9aef7bd6-dab2-4333-b248-a40c44bc3743-kube-api-access-27wws\") pod \"neutron-operator-controller-manager-64d7b59854-lpv98\" (UID: \"9aef7bd6-dab2-4333-b248-a40c44bc3743\") " pod="openstack-operators/neutron-operator-controller-manager-64d7b59854-lpv98" Sep 30 12:36:15 crc kubenswrapper[4672]: I0930 12:36:15.484284 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vltgb\" (UniqueName: \"kubernetes.io/projected/d4c88a65-e12f-4872-baf2-f210ee1b0c9a-kube-api-access-vltgb\") pod \"nova-operator-controller-manager-c7c776c96-2kvn7\" (UID: \"d4c88a65-e12f-4872-baf2-f210ee1b0c9a\") " pod="openstack-operators/nova-operator-controller-manager-c7c776c96-2kvn7" Sep 30 12:36:15 crc kubenswrapper[4672]: I0930 12:36:15.500696 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-7975b88857-g8lll" Sep 30 12:36:15 crc kubenswrapper[4672]: I0930 12:36:15.507088 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8kggs\" (UniqueName: \"kubernetes.io/projected/3f0b65a1-c4dd-4ca1-a2cf-feea808b1f06-kube-api-access-8kggs\") pod \"mariadb-operator-controller-manager-88c7-xq89z\" (UID: \"3f0b65a1-c4dd-4ca1-a2cf-feea808b1f06\") " pod="openstack-operators/mariadb-operator-controller-manager-88c7-xq89z" Sep 30 12:36:15 crc kubenswrapper[4672]: I0930 12:36:15.511532 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b6vbt\" (UniqueName: \"kubernetes.io/projected/ed04bd4c-39dd-45fa-a2e0-bde94ff9deb0-kube-api-access-b6vbt\") pod \"keystone-operator-controller-manager-5bd55b4bff-vqxhh\" (UID: \"ed04bd4c-39dd-45fa-a2e0-bde94ff9deb0\") " pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-vqxhh" Sep 30 12:36:15 crc kubenswrapper[4672]: I0930 12:36:15.555670 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f8hm2\" (UniqueName: \"kubernetes.io/projected/32830807-0fb2-4545-a629-af52b20e0b0f-kube-api-access-f8hm2\") pod \"manila-operator-controller-manager-6d68dbc695-2hvxs\" (UID: \"32830807-0fb2-4545-a629-af52b20e0b0f\") " pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-2hvxs" Sep 30 12:36:15 crc kubenswrapper[4672]: I0930 12:36:15.576003 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-c7c776c96-2kvn7"] Sep 30 12:36:15 crc kubenswrapper[4672]: I0930 12:36:15.576063 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-x6x67"] Sep 30 12:36:15 crc kubenswrapper[4672]: I0930 12:36:15.578706 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-vqxhh" Sep 30 12:36:15 crc kubenswrapper[4672]: I0930 12:36:15.585537 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xd4lb\" (UniqueName: \"kubernetes.io/projected/601fcd4a-dc2f-468d-9ad6-6b173320c317-kube-api-access-xd4lb\") pod \"octavia-operator-controller-manager-76fcc6dc7c-hjn2l\" (UID: \"601fcd4a-dc2f-468d-9ad6-6b173320c317\") " pod="openstack-operators/octavia-operator-controller-manager-76fcc6dc7c-hjn2l" Sep 30 12:36:15 crc kubenswrapper[4672]: I0930 12:36:15.585706 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-27wws\" (UniqueName: \"kubernetes.io/projected/9aef7bd6-dab2-4333-b248-a40c44bc3743-kube-api-access-27wws\") pod \"neutron-operator-controller-manager-64d7b59854-lpv98\" (UID: \"9aef7bd6-dab2-4333-b248-a40c44bc3743\") " pod="openstack-operators/neutron-operator-controller-manager-64d7b59854-lpv98" Sep 30 12:36:15 crc kubenswrapper[4672]: I0930 12:36:15.585730 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vltgb\" (UniqueName: \"kubernetes.io/projected/d4c88a65-e12f-4872-baf2-f210ee1b0c9a-kube-api-access-vltgb\") pod \"nova-operator-controller-manager-c7c776c96-2kvn7\" (UID: \"d4c88a65-e12f-4872-baf2-f210ee1b0c9a\") " pod="openstack-operators/nova-operator-controller-manager-c7c776c96-2kvn7" Sep 30 12:36:15 crc kubenswrapper[4672]: I0930 12:36:15.591795 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-9976ff44c-zxb4j"] Sep 30 12:36:15 crc kubenswrapper[4672]: I0930 12:36:15.596469 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-x6x67" Sep 30 12:36:15 crc kubenswrapper[4672]: I0930 12:36:15.606687 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-x6x67"] Sep 30 12:36:15 crc kubenswrapper[4672]: I0930 12:36:15.606830 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-zxb4j" Sep 30 12:36:15 crc kubenswrapper[4672]: I0930 12:36:15.622201 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Sep 30 12:36:15 crc kubenswrapper[4672]: I0930 12:36:15.622802 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-lx7hj" Sep 30 12:36:15 crc kubenswrapper[4672]: I0930 12:36:15.629591 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-589c58c6c-r9sqd"] Sep 30 12:36:15 crc kubenswrapper[4672]: I0930 12:36:15.631192 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-589c58c6c-r9sqd" Sep 30 12:36:15 crc kubenswrapper[4672]: I0930 12:36:15.635711 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-84958c4d49-fn2cb" Sep 30 12:36:15 crc kubenswrapper[4672]: I0930 12:36:15.638204 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-7twxp" Sep 30 12:36:15 crc kubenswrapper[4672]: I0930 12:36:15.638376 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-27wws\" (UniqueName: \"kubernetes.io/projected/9aef7bd6-dab2-4333-b248-a40c44bc3743-kube-api-access-27wws\") pod \"neutron-operator-controller-manager-64d7b59854-lpv98\" (UID: \"9aef7bd6-dab2-4333-b248-a40c44bc3743\") " pod="openstack-operators/neutron-operator-controller-manager-64d7b59854-lpv98" Sep 30 12:36:15 crc kubenswrapper[4672]: I0930 12:36:15.638421 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-t5w2s" Sep 30 12:36:15 crc kubenswrapper[4672]: I0930 12:36:15.640337 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xd4lb\" (UniqueName: \"kubernetes.io/projected/601fcd4a-dc2f-468d-9ad6-6b173320c317-kube-api-access-xd4lb\") pod \"octavia-operator-controller-manager-76fcc6dc7c-hjn2l\" (UID: \"601fcd4a-dc2f-468d-9ad6-6b173320c317\") " pod="openstack-operators/octavia-operator-controller-manager-76fcc6dc7c-hjn2l" Sep 30 12:36:15 crc kubenswrapper[4672]: I0930 12:36:15.663336 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-9976ff44c-zxb4j"] Sep 30 12:36:15 crc kubenswrapper[4672]: I0930 12:36:15.666772 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-2hvxs" Sep 30 12:36:15 crc kubenswrapper[4672]: I0930 12:36:15.674700 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vltgb\" (UniqueName: \"kubernetes.io/projected/d4c88a65-e12f-4872-baf2-f210ee1b0c9a-kube-api-access-vltgb\") pod \"nova-operator-controller-manager-c7c776c96-2kvn7\" (UID: \"d4c88a65-e12f-4872-baf2-f210ee1b0c9a\") " pod="openstack-operators/nova-operator-controller-manager-c7c776c96-2kvn7" Sep 30 12:36:15 crc kubenswrapper[4672]: I0930 12:36:15.676883 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-589c58c6c-r9sqd"] Sep 30 12:36:15 crc kubenswrapper[4672]: I0930 12:36:15.685962 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-88c7-xq89z" Sep 30 12:36:15 crc kubenswrapper[4672]: I0930 12:36:15.687149 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1b6d7bf0-00ff-41df-8873-cab7f6e5eeea-cert\") pod \"openstack-baremetal-operator-controller-manager-6d776955-x6x67\" (UID: \"1b6d7bf0-00ff-41df-8873-cab7f6e5eeea\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-x6x67" Sep 30 12:36:15 crc kubenswrapper[4672]: I0930 12:36:15.687199 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t6wfs\" (UniqueName: \"kubernetes.io/projected/1b6d7bf0-00ff-41df-8873-cab7f6e5eeea-kube-api-access-t6wfs\") pod \"openstack-baremetal-operator-controller-manager-6d776955-x6x67\" (UID: \"1b6d7bf0-00ff-41df-8873-cab7f6e5eeea\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-x6x67" Sep 30 12:36:15 crc kubenswrapper[4672]: I0930 12:36:15.704422 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-bc7dc7bd9-8mf7d"] Sep 30 12:36:15 crc kubenswrapper[4672]: I0930 12:36:15.713461 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-64d7b59854-lpv98" Sep 30 12:36:15 crc kubenswrapper[4672]: I0930 12:36:15.720217 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-bc7dc7bd9-8mf7d"] Sep 30 12:36:15 crc kubenswrapper[4672]: I0930 12:36:15.720346 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-bc7dc7bd9-8mf7d" Sep 30 12:36:15 crc kubenswrapper[4672]: I0930 12:36:15.725760 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-fzm2t" Sep 30 12:36:15 crc kubenswrapper[4672]: I0930 12:36:15.725927 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-qxq7h"] Sep 30 12:36:15 crc kubenswrapper[4672]: I0930 12:36:15.728714 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-qxq7h" Sep 30 12:36:15 crc kubenswrapper[4672]: I0930 12:36:15.735319 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-fhcgx" Sep 30 12:36:15 crc kubenswrapper[4672]: I0930 12:36:15.736087 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-qxq7h"] Sep 30 12:36:15 crc kubenswrapper[4672]: I0930 12:36:15.764604 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-f66b554c6-d7gbf"] Sep 30 12:36:15 crc kubenswrapper[4672]: I0930 12:36:15.767588 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-f66b554c6-d7gbf" Sep 30 12:36:15 crc kubenswrapper[4672]: I0930 12:36:15.782060 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-bd469" Sep 30 12:36:15 crc kubenswrapper[4672]: I0930 12:36:15.787891 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-f66b554c6-d7gbf"] Sep 30 12:36:15 crc kubenswrapper[4672]: I0930 12:36:15.790736 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-c7c776c96-2kvn7" Sep 30 12:36:15 crc kubenswrapper[4672]: I0930 12:36:15.791716 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1b6d7bf0-00ff-41df-8873-cab7f6e5eeea-cert\") pod \"openstack-baremetal-operator-controller-manager-6d776955-x6x67\" (UID: \"1b6d7bf0-00ff-41df-8873-cab7f6e5eeea\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-x6x67" Sep 30 12:36:15 crc kubenswrapper[4672]: I0930 12:36:15.791766 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t6wfs\" (UniqueName: \"kubernetes.io/projected/1b6d7bf0-00ff-41df-8873-cab7f6e5eeea-kube-api-access-t6wfs\") pod \"openstack-baremetal-operator-controller-manager-6d776955-x6x67\" (UID: \"1b6d7bf0-00ff-41df-8873-cab7f6e5eeea\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-x6x67" Sep 30 12:36:15 crc kubenswrapper[4672]: I0930 12:36:15.791809 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vxls2\" (UniqueName: \"kubernetes.io/projected/42bf0afd-961a-4353-9499-a185b16b8a02-kube-api-access-vxls2\") pod \"placement-operator-controller-manager-589c58c6c-r9sqd\" (UID: \"42bf0afd-961a-4353-9499-a185b16b8a02\") " pod="openstack-operators/placement-operator-controller-manager-589c58c6c-r9sqd" Sep 30 12:36:15 crc kubenswrapper[4672]: I0930 12:36:15.791835 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x2d5v\" (UniqueName: \"kubernetes.io/projected/5a756e54-c5bf-480b-aa89-57ca440d1ddc-kube-api-access-x2d5v\") pod \"ovn-operator-controller-manager-9976ff44c-zxb4j\" (UID: \"5a756e54-c5bf-480b-aa89-57ca440d1ddc\") " pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-zxb4j" Sep 30 12:36:15 crc kubenswrapper[4672]: E0930 12:36:15.792011 4672 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Sep 30 12:36:15 crc kubenswrapper[4672]: E0930 12:36:15.792053 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1b6d7bf0-00ff-41df-8873-cab7f6e5eeea-cert podName:1b6d7bf0-00ff-41df-8873-cab7f6e5eeea nodeName:}" failed. No retries permitted until 2025-09-30 12:36:16.29203694 +0000 UTC m=+867.561274586 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/1b6d7bf0-00ff-41df-8873-cab7f6e5eeea-cert") pod "openstack-baremetal-operator-controller-manager-6d776955-x6x67" (UID: "1b6d7bf0-00ff-41df-8873-cab7f6e5eeea") : secret "openstack-baremetal-operator-webhook-server-cert" not found Sep 30 12:36:15 crc kubenswrapper[4672]: I0930 12:36:15.795664 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-58675bf858-qg9s4"] Sep 30 12:36:15 crc kubenswrapper[4672]: I0930 12:36:15.797488 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-58675bf858-qg9s4" Sep 30 12:36:15 crc kubenswrapper[4672]: I0930 12:36:15.800378 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-58675bf858-qg9s4"] Sep 30 12:36:15 crc kubenswrapper[4672]: I0930 12:36:15.801122 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-6krm6" Sep 30 12:36:15 crc kubenswrapper[4672]: I0930 12:36:15.816621 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-6bd96d9cc5-cf4sl"] Sep 30 12:36:15 crc kubenswrapper[4672]: I0930 12:36:15.820416 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-76fcc6dc7c-hjn2l" Sep 30 12:36:15 crc kubenswrapper[4672]: I0930 12:36:15.825067 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t6wfs\" (UniqueName: \"kubernetes.io/projected/1b6d7bf0-00ff-41df-8873-cab7f6e5eeea-kube-api-access-t6wfs\") pod \"openstack-baremetal-operator-controller-manager-6d776955-x6x67\" (UID: \"1b6d7bf0-00ff-41df-8873-cab7f6e5eeea\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-x6x67" Sep 30 12:36:15 crc kubenswrapper[4672]: I0930 12:36:15.827475 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-6bd96d9cc5-cf4sl"] Sep 30 12:36:15 crc kubenswrapper[4672]: I0930 12:36:15.827602 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-6bd96d9cc5-cf4sl" Sep 30 12:36:15 crc kubenswrapper[4672]: I0930 12:36:15.832937 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-zrhhh" Sep 30 12:36:15 crc kubenswrapper[4672]: I0930 12:36:15.833172 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Sep 30 12:36:15 crc kubenswrapper[4672]: I0930 12:36:15.850510 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-79d8469568-qxpkk"] Sep 30 12:36:15 crc kubenswrapper[4672]: I0930 12:36:15.851757 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-79d8469568-qxpkk" Sep 30 12:36:15 crc kubenswrapper[4672]: I0930 12:36:15.857632 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-jr7dp" Sep 30 12:36:15 crc kubenswrapper[4672]: I0930 12:36:15.871179 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-79d8469568-qxpkk"] Sep 30 12:36:15 crc kubenswrapper[4672]: I0930 12:36:15.895152 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x2d5v\" (UniqueName: \"kubernetes.io/projected/5a756e54-c5bf-480b-aa89-57ca440d1ddc-kube-api-access-x2d5v\") pod \"ovn-operator-controller-manager-9976ff44c-zxb4j\" (UID: \"5a756e54-c5bf-480b-aa89-57ca440d1ddc\") " pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-zxb4j" Sep 30 12:36:15 crc kubenswrapper[4672]: I0930 12:36:15.895204 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f8qv2\" (UniqueName: \"kubernetes.io/projected/4d10ceb0-c730-4fb8-b81c-a87e33890f84-kube-api-access-f8qv2\") pod \"watcher-operator-controller-manager-58675bf858-qg9s4\" (UID: \"4d10ceb0-c730-4fb8-b81c-a87e33890f84\") " pod="openstack-operators/watcher-operator-controller-manager-58675bf858-qg9s4" Sep 30 12:36:15 crc kubenswrapper[4672]: I0930 12:36:15.895241 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6fa26cab-ae65-4e21-af16-2628c86be254-cert\") pod \"infra-operator-controller-manager-7d857cc749-8qftk\" (UID: \"6fa26cab-ae65-4e21-af16-2628c86be254\") " pod="openstack-operators/infra-operator-controller-manager-7d857cc749-8qftk" Sep 30 12:36:15 crc kubenswrapper[4672]: I0930 12:36:15.895270 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k8lbw\" (UniqueName: \"kubernetes.io/projected/cf21b89f-fcd8-4854-954b-06927bc7c6ea-kube-api-access-k8lbw\") pod \"swift-operator-controller-manager-bc7dc7bd9-8mf7d\" (UID: \"cf21b89f-fcd8-4854-954b-06927bc7c6ea\") " pod="openstack-operators/swift-operator-controller-manager-bc7dc7bd9-8mf7d" Sep 30 12:36:15 crc kubenswrapper[4672]: I0930 12:36:15.895347 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wt8jp\" (UniqueName: \"kubernetes.io/projected/74b0a0e9-a7e6-43e6-a15d-86e6a20c5aa7-kube-api-access-wt8jp\") pod \"test-operator-controller-manager-f66b554c6-d7gbf\" (UID: \"74b0a0e9-a7e6-43e6-a15d-86e6a20c5aa7\") " pod="openstack-operators/test-operator-controller-manager-f66b554c6-d7gbf" Sep 30 12:36:15 crc kubenswrapper[4672]: I0930 12:36:15.895393 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kjxxb\" (UniqueName: \"kubernetes.io/projected/e6b8eb11-36d8-45c1-b600-76ffff076b78-kube-api-access-kjxxb\") pod \"telemetry-operator-controller-manager-b8d54b5d7-qxq7h\" (UID: \"e6b8eb11-36d8-45c1-b600-76ffff076b78\") " pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-qxq7h" Sep 30 12:36:15 crc kubenswrapper[4672]: I0930 12:36:15.895459 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vxls2\" (UniqueName: 
\"kubernetes.io/projected/42bf0afd-961a-4353-9499-a185b16b8a02-kube-api-access-vxls2\") pod \"placement-operator-controller-manager-589c58c6c-r9sqd\" (UID: \"42bf0afd-961a-4353-9499-a185b16b8a02\") " pod="openstack-operators/placement-operator-controller-manager-589c58c6c-r9sqd" Sep 30 12:36:15 crc kubenswrapper[4672]: I0930 12:36:15.905689 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6fa26cab-ae65-4e21-af16-2628c86be254-cert\") pod \"infra-operator-controller-manager-7d857cc749-8qftk\" (UID: \"6fa26cab-ae65-4e21-af16-2628c86be254\") " pod="openstack-operators/infra-operator-controller-manager-7d857cc749-8qftk" Sep 30 12:36:15 crc kubenswrapper[4672]: I0930 12:36:15.921356 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vxls2\" (UniqueName: \"kubernetes.io/projected/42bf0afd-961a-4353-9499-a185b16b8a02-kube-api-access-vxls2\") pod \"placement-operator-controller-manager-589c58c6c-r9sqd\" (UID: \"42bf0afd-961a-4353-9499-a185b16b8a02\") " pod="openstack-operators/placement-operator-controller-manager-589c58c6c-r9sqd" Sep 30 12:36:15 crc kubenswrapper[4672]: I0930 12:36:15.931693 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x2d5v\" (UniqueName: \"kubernetes.io/projected/5a756e54-c5bf-480b-aa89-57ca440d1ddc-kube-api-access-x2d5v\") pod \"ovn-operator-controller-manager-9976ff44c-zxb4j\" (UID: \"5a756e54-c5bf-480b-aa89-57ca440d1ddc\") " pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-zxb4j" Sep 30 12:36:16 crc kubenswrapper[4672]: I0930 12:36:16.000040 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f8qv2\" (UniqueName: \"kubernetes.io/projected/4d10ceb0-c730-4fb8-b81c-a87e33890f84-kube-api-access-f8qv2\") pod \"watcher-operator-controller-manager-58675bf858-qg9s4\" (UID: \"4d10ceb0-c730-4fb8-b81c-a87e33890f84\") " pod="openstack-operators/watcher-operator-controller-manager-58675bf858-qg9s4" Sep 30 12:36:16 crc kubenswrapper[4672]: I0930 12:36:16.000468 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k8lbw\" (UniqueName: \"kubernetes.io/projected/cf21b89f-fcd8-4854-954b-06927bc7c6ea-kube-api-access-k8lbw\") pod \"swift-operator-controller-manager-bc7dc7bd9-8mf7d\" (UID: \"cf21b89f-fcd8-4854-954b-06927bc7c6ea\") " pod="openstack-operators/swift-operator-controller-manager-bc7dc7bd9-8mf7d" Sep 30 12:36:16 crc kubenswrapper[4672]: I0930 12:36:16.000530 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sggdh\" (UniqueName: \"kubernetes.io/projected/4f1099dc-e100-44d5-8d17-255dbe0edf63-kube-api-access-sggdh\") pod \"rabbitmq-cluster-operator-manager-79d8469568-qxpkk\" (UID: \"4f1099dc-e100-44d5-8d17-255dbe0edf63\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-79d8469568-qxpkk" Sep 30 12:36:16 crc kubenswrapper[4672]: I0930 12:36:16.000636 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7qkhn\" (UniqueName: \"kubernetes.io/projected/711b218e-6a25-4d4f-b657-e621e9d1d658-kube-api-access-7qkhn\") pod \"openstack-operator-controller-manager-6bd96d9cc5-cf4sl\" (UID: \"711b218e-6a25-4d4f-b657-e621e9d1d658\") " pod="openstack-operators/openstack-operator-controller-manager-6bd96d9cc5-cf4sl" Sep 30 12:36:16 crc kubenswrapper[4672]: I0930 12:36:16.000699 4672 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wt8jp\" (UniqueName: \"kubernetes.io/projected/74b0a0e9-a7e6-43e6-a15d-86e6a20c5aa7-kube-api-access-wt8jp\") pod \"test-operator-controller-manager-f66b554c6-d7gbf\" (UID: \"74b0a0e9-a7e6-43e6-a15d-86e6a20c5aa7\") " pod="openstack-operators/test-operator-controller-manager-f66b554c6-d7gbf" Sep 30 12:36:16 crc kubenswrapper[4672]: I0930 12:36:16.000799 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kjxxb\" (UniqueName: \"kubernetes.io/projected/e6b8eb11-36d8-45c1-b600-76ffff076b78-kube-api-access-kjxxb\") pod \"telemetry-operator-controller-manager-b8d54b5d7-qxq7h\" (UID: \"e6b8eb11-36d8-45c1-b600-76ffff076b78\") " pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-qxq7h" Sep 30 12:36:16 crc kubenswrapper[4672]: I0930 12:36:16.000888 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/711b218e-6a25-4d4f-b657-e621e9d1d658-cert\") pod \"openstack-operator-controller-manager-6bd96d9cc5-cf4sl\" (UID: \"711b218e-6a25-4d4f-b657-e621e9d1d658\") " pod="openstack-operators/openstack-operator-controller-manager-6bd96d9cc5-cf4sl" Sep 30 12:36:16 crc kubenswrapper[4672]: I0930 12:36:16.025033 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kjxxb\" (UniqueName: \"kubernetes.io/projected/e6b8eb11-36d8-45c1-b600-76ffff076b78-kube-api-access-kjxxb\") pod \"telemetry-operator-controller-manager-b8d54b5d7-qxq7h\" (UID: \"e6b8eb11-36d8-45c1-b600-76ffff076b78\") " pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-qxq7h" Sep 30 12:36:16 crc kubenswrapper[4672]: I0930 12:36:16.026811 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f8qv2\" (UniqueName: \"kubernetes.io/projected/4d10ceb0-c730-4fb8-b81c-a87e33890f84-kube-api-access-f8qv2\") pod \"watcher-operator-controller-manager-58675bf858-qg9s4\" (UID: \"4d10ceb0-c730-4fb8-b81c-a87e33890f84\") " pod="openstack-operators/watcher-operator-controller-manager-58675bf858-qg9s4" Sep 30 12:36:16 crc kubenswrapper[4672]: I0930 12:36:16.037680 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wt8jp\" (UniqueName: \"kubernetes.io/projected/74b0a0e9-a7e6-43e6-a15d-86e6a20c5aa7-kube-api-access-wt8jp\") pod \"test-operator-controller-manager-f66b554c6-d7gbf\" (UID: \"74b0a0e9-a7e6-43e6-a15d-86e6a20c5aa7\") " pod="openstack-operators/test-operator-controller-manager-f66b554c6-d7gbf" Sep 30 12:36:16 crc kubenswrapper[4672]: I0930 12:36:16.071144 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-7d857cc749-8qftk" Sep 30 12:36:16 crc kubenswrapper[4672]: I0930 12:36:16.075080 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k8lbw\" (UniqueName: \"kubernetes.io/projected/cf21b89f-fcd8-4854-954b-06927bc7c6ea-kube-api-access-k8lbw\") pod \"swift-operator-controller-manager-bc7dc7bd9-8mf7d\" (UID: \"cf21b89f-fcd8-4854-954b-06927bc7c6ea\") " pod="openstack-operators/swift-operator-controller-manager-bc7dc7bd9-8mf7d" Sep 30 12:36:16 crc kubenswrapper[4672]: I0930 12:36:16.102692 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sggdh\" (UniqueName: \"kubernetes.io/projected/4f1099dc-e100-44d5-8d17-255dbe0edf63-kube-api-access-sggdh\") pod \"rabbitmq-cluster-operator-manager-79d8469568-qxpkk\" (UID: \"4f1099dc-e100-44d5-8d17-255dbe0edf63\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-79d8469568-qxpkk" Sep 30 12:36:16 crc kubenswrapper[4672]: I0930 12:36:16.102802 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7qkhn\" (UniqueName: \"kubernetes.io/projected/711b218e-6a25-4d4f-b657-e621e9d1d658-kube-api-access-7qkhn\") pod \"openstack-operator-controller-manager-6bd96d9cc5-cf4sl\" (UID: \"711b218e-6a25-4d4f-b657-e621e9d1d658\") " pod="openstack-operators/openstack-operator-controller-manager-6bd96d9cc5-cf4sl" Sep 30 12:36:16 crc kubenswrapper[4672]: I0930 12:36:16.102930 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/711b218e-6a25-4d4f-b657-e621e9d1d658-cert\") pod \"openstack-operator-controller-manager-6bd96d9cc5-cf4sl\" (UID: \"711b218e-6a25-4d4f-b657-e621e9d1d658\") " pod="openstack-operators/openstack-operator-controller-manager-6bd96d9cc5-cf4sl" Sep 30 12:36:16 crc kubenswrapper[4672]: E0930 12:36:16.103128 4672 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Sep 30 12:36:16 crc kubenswrapper[4672]: E0930 12:36:16.103205 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/711b218e-6a25-4d4f-b657-e621e9d1d658-cert podName:711b218e-6a25-4d4f-b657-e621e9d1d658 nodeName:}" failed. No retries permitted until 2025-09-30 12:36:16.603181298 +0000 UTC m=+867.872418944 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/711b218e-6a25-4d4f-b657-e621e9d1d658-cert") pod "openstack-operator-controller-manager-6bd96d9cc5-cf4sl" (UID: "711b218e-6a25-4d4f-b657-e621e9d1d658") : secret "webhook-server-cert" not found Sep 30 12:36:16 crc kubenswrapper[4672]: I0930 12:36:16.161858 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-zxb4j" Sep 30 12:36:16 crc kubenswrapper[4672]: I0930 12:36:16.176457 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sggdh\" (UniqueName: \"kubernetes.io/projected/4f1099dc-e100-44d5-8d17-255dbe0edf63-kube-api-access-sggdh\") pod \"rabbitmq-cluster-operator-manager-79d8469568-qxpkk\" (UID: \"4f1099dc-e100-44d5-8d17-255dbe0edf63\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-79d8469568-qxpkk" Sep 30 12:36:16 crc kubenswrapper[4672]: I0930 12:36:16.179750 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7qkhn\" (UniqueName: \"kubernetes.io/projected/711b218e-6a25-4d4f-b657-e621e9d1d658-kube-api-access-7qkhn\") pod \"openstack-operator-controller-manager-6bd96d9cc5-cf4sl\" (UID: \"711b218e-6a25-4d4f-b657-e621e9d1d658\") " pod="openstack-operators/openstack-operator-controller-manager-6bd96d9cc5-cf4sl" Sep 30 12:36:16 crc kubenswrapper[4672]: I0930 12:36:16.187792 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-589c58c6c-r9sqd" Sep 30 12:36:16 crc kubenswrapper[4672]: I0930 12:36:16.222176 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-bc7dc7bd9-8mf7d" Sep 30 12:36:16 crc kubenswrapper[4672]: I0930 12:36:16.243115 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-qxq7h" Sep 30 12:36:16 crc kubenswrapper[4672]: I0930 12:36:16.266923 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-f66b554c6-d7gbf" Sep 30 12:36:16 crc kubenswrapper[4672]: I0930 12:36:16.272344 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-644bddb6d8-jmkcr"] Sep 30 12:36:16 crc kubenswrapper[4672]: I0930 12:36:16.276203 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-6ff8b75857-df89s"] Sep 30 12:36:16 crc kubenswrapper[4672]: I0930 12:36:16.301562 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-58675bf858-qg9s4" Sep 30 12:36:16 crc kubenswrapper[4672]: I0930 12:36:16.306808 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1b6d7bf0-00ff-41df-8873-cab7f6e5eeea-cert\") pod \"openstack-baremetal-operator-controller-manager-6d776955-x6x67\" (UID: \"1b6d7bf0-00ff-41df-8873-cab7f6e5eeea\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-x6x67" Sep 30 12:36:16 crc kubenswrapper[4672]: E0930 12:36:16.307009 4672 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Sep 30 12:36:16 crc kubenswrapper[4672]: E0930 12:36:16.307061 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1b6d7bf0-00ff-41df-8873-cab7f6e5eeea-cert podName:1b6d7bf0-00ff-41df-8873-cab7f6e5eeea nodeName:}" failed. No retries permitted until 2025-09-30 12:36:17.307043326 +0000 UTC m=+868.576280972 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/1b6d7bf0-00ff-41df-8873-cab7f6e5eeea-cert") pod "openstack-baremetal-operator-controller-manager-6d776955-x6x67" (UID: "1b6d7bf0-00ff-41df-8873-cab7f6e5eeea") : secret "openstack-baremetal-operator-webhook-server-cert" not found Sep 30 12:36:16 crc kubenswrapper[4672]: I0930 12:36:16.380169 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-79d8469568-qxpkk" Sep 30 12:36:16 crc kubenswrapper[4672]: W0930 12:36:16.505466 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc70c8b28_1f45_4c79_af69_3197c7f66fa0.slice/crio-fb56f9dbb3fa53555b86d9e86c073c079a046b7057902a0964bf1566e614bc18 WatchSource:0}: Error finding container fb56f9dbb3fa53555b86d9e86c073c079a046b7057902a0964bf1566e614bc18: Status 404 returned error can't find the container with id fb56f9dbb3fa53555b86d9e86c073c079a046b7057902a0964bf1566e614bc18 Sep 30 12:36:16 crc kubenswrapper[4672]: I0930 12:36:16.614347 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/711b218e-6a25-4d4f-b657-e621e9d1d658-cert\") pod \"openstack-operator-controller-manager-6bd96d9cc5-cf4sl\" (UID: \"711b218e-6a25-4d4f-b657-e621e9d1d658\") " pod="openstack-operators/openstack-operator-controller-manager-6bd96d9cc5-cf4sl" Sep 30 12:36:16 crc kubenswrapper[4672]: I0930 12:36:16.620991 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/711b218e-6a25-4d4f-b657-e621e9d1d658-cert\") pod \"openstack-operator-controller-manager-6bd96d9cc5-cf4sl\" (UID: \"711b218e-6a25-4d4f-b657-e621e9d1d658\") " pod="openstack-operators/openstack-operator-controller-manager-6bd96d9cc5-cf4sl" Sep 30 12:36:16 crc kubenswrapper[4672]: I0930 12:36:16.879053 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-9f4696d94-z9tmm"] Sep 30 12:36:16 crc kubenswrapper[4672]: W0930 12:36:16.886909 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poded25b409_1ca7_4fc4_95b5_55b4239233f3.slice/crio-fc51b38a39aebf34914b649a0ee9fb8c5d78bfebf5dd952a8567c59e1ef9c470 WatchSource:0}: Error finding container fc51b38a39aebf34914b649a0ee9fb8c5d78bfebf5dd952a8567c59e1ef9c470: Status 404 returned error can't find the container with id fc51b38a39aebf34914b649a0ee9fb8c5d78bfebf5dd952a8567c59e1ef9c470 Sep 30 12:36:16 crc kubenswrapper[4672]: I0930 12:36:16.920090 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-6bd96d9cc5-cf4sl" Sep 30 12:36:17 crc kubenswrapper[4672]: I0930 12:36:17.325036 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1b6d7bf0-00ff-41df-8873-cab7f6e5eeea-cert\") pod \"openstack-baremetal-operator-controller-manager-6d776955-x6x67\" (UID: \"1b6d7bf0-00ff-41df-8873-cab7f6e5eeea\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-x6x67" Sep 30 12:36:17 crc kubenswrapper[4672]: I0930 12:36:17.344136 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1b6d7bf0-00ff-41df-8873-cab7f6e5eeea-cert\") pod \"openstack-baremetal-operator-controller-manager-6d776955-x6x67\" (UID: \"1b6d7bf0-00ff-41df-8873-cab7f6e5eeea\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-x6x67" Sep 30 12:36:17 crc kubenswrapper[4672]: I0930 12:36:17.344566 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-df89s" event={"ID":"c70c8b28-1f45-4c79-af69-3197c7f66fa0","Type":"ContainerStarted","Data":"fb56f9dbb3fa53555b86d9e86c073c079a046b7057902a0964bf1566e614bc18"} Sep 30 12:36:17 crc kubenswrapper[4672]: I0930 12:36:17.347208 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-z9tmm" event={"ID":"ed25b409-1ca7-4fc4-95b5-55b4239233f3","Type":"ContainerStarted","Data":"fc51b38a39aebf34914b649a0ee9fb8c5d78bfebf5dd952a8567c59e1ef9c470"} Sep 30 12:36:17 crc kubenswrapper[4672]: I0930 12:36:17.348010 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-jmkcr" event={"ID":"892875f4-bce3-47cb-8478-9d6bbc819bb1","Type":"ContainerStarted","Data":"363cf5606163d53ca124a949424bc16af2a31eab99df41de6ac0dd333180e9ff"} Sep 30 12:36:17 crc kubenswrapper[4672]: I0930 12:36:17.530571 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-5bd55b4bff-vqxhh"] Sep 30 12:36:17 crc kubenswrapper[4672]: I0930 12:36:17.556411 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-88c7-xq89z"] Sep 30 12:36:17 crc kubenswrapper[4672]: I0930 12:36:17.566716 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-7975b88857-g8lll"] Sep 30 12:36:17 crc kubenswrapper[4672]: I0930 12:36:17.575009 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-64d7b59854-lpv98"] Sep 30 12:36:17 crc kubenswrapper[4672]: I0930 12:36:17.600573 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-84f4f7b77b-gthb7"] Sep 30 12:36:17 crc kubenswrapper[4672]: W0930 12:36:17.602382 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9aef7bd6_dab2_4333_b248_a40c44bc3743.slice/crio-5b9f7c0ea8184ece6f0da5f7a229014098d7a0b77889162f317cd503b358d441 WatchSource:0}: Error finding container 5b9f7c0ea8184ece6f0da5f7a229014098d7a0b77889162f317cd503b358d441: Status 404 returned error can't find the container with id 
5b9f7c0ea8184ece6f0da5f7a229014098d7a0b77889162f317cd503b358d441 Sep 30 12:36:17 crc kubenswrapper[4672]: I0930 12:36:17.611632 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-84958c4d49-fn2cb"] Sep 30 12:36:17 crc kubenswrapper[4672]: W0930 12:36:17.616960 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb12c1847_2238_4a91_a2a0_4de492556fe7.slice/crio-eb1a32ac301ecd9837dcff29cffb4209f5a0cf03c3ded707fdcd3d586b0f8f9e WatchSource:0}: Error finding container eb1a32ac301ecd9837dcff29cffb4209f5a0cf03c3ded707fdcd3d586b0f8f9e: Status 404 returned error can't find the container with id eb1a32ac301ecd9837dcff29cffb4209f5a0cf03c3ded707fdcd3d586b0f8f9e Sep 30 12:36:17 crc kubenswrapper[4672]: W0930 12:36:17.618606 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod32830807_0fb2_4545_a629_af52b20e0b0f.slice/crio-5ee2110fac1e254063fc54d821b8379eb34db992fa90d50e37cfa9e61faf30b3 WatchSource:0}: Error finding container 5ee2110fac1e254063fc54d821b8379eb34db992fa90d50e37cfa9e61faf30b3: Status 404 returned error can't find the container with id 5ee2110fac1e254063fc54d821b8379eb34db992fa90d50e37cfa9e61faf30b3 Sep 30 12:36:17 crc kubenswrapper[4672]: I0930 12:36:17.624157 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-5d889d78cf-vs4mr"] Sep 30 12:36:17 crc kubenswrapper[4672]: I0930 12:36:17.626324 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-x6x67" Sep 30 12:36:17 crc kubenswrapper[4672]: I0930 12:36:17.630324 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-6d68dbc695-2hvxs"] Sep 30 12:36:17 crc kubenswrapper[4672]: I0930 12:36:17.637219 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-c7c776c96-2kvn7"] Sep 30 12:36:17 crc kubenswrapper[4672]: I0930 12:36:17.650522 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-76fcc6dc7c-hjn2l"] Sep 30 12:36:17 crc kubenswrapper[4672]: I0930 12:36:17.714477 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-9976ff44c-zxb4j"] Sep 30 12:36:17 crc kubenswrapper[4672]: I0930 12:36:17.729234 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-589c58c6c-r9sqd"] Sep 30 12:36:17 crc kubenswrapper[4672]: I0930 12:36:17.746576 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-79d8469568-qxpkk"] Sep 30 12:36:17 crc kubenswrapper[4672]: I0930 12:36:17.759118 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-58675bf858-qg9s4"] Sep 30 12:36:17 crc kubenswrapper[4672]: E0930 12:36:17.774575 4672 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:38.102.83.83:5001/openstack-k8s-operators/watcher-operator:3f7eac74f76f24bd9da5f9316305cdeb8d1ba29c,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 
--leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-f8qv2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-58675bf858-qg9s4_openstack-operators(4d10ceb0-c730-4fb8-b81c-a87e33890f84): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Sep 30 12:36:17 crc kubenswrapper[4672]: W0930 12:36:17.775611 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4f1099dc_e100_44d5_8d17_255dbe0edf63.slice/crio-e4afaf1250c063127f6f582417a7f1133bd8c97f51a116faee0ac84b4aa5292d WatchSource:0}: Error finding container e4afaf1250c063127f6f582417a7f1133bd8c97f51a116faee0ac84b4aa5292d: Status 404 returned error can't find the container with id e4afaf1250c063127f6f582417a7f1133bd8c97f51a116faee0ac84b4aa5292d Sep 30 12:36:17 crc kubenswrapper[4672]: E0930 12:36:17.784875 4672 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:a6b3408d79df6b6d4a467e49defaa4a9d9c088c94d0605a4fee0030c9ccc84d2,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-vxls2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-589c58c6c-r9sqd_openstack-operators(42bf0afd-961a-4353-9499-a185b16b8a02): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Sep 30 12:36:17 crc kubenswrapper[4672]: W0930 12:36:17.787726 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5a756e54_c5bf_480b_aa89_57ca440d1ddc.slice/crio-51ecb77a15977c39fe720f07dcbb54a550f8757bd8d6031aa9767ed2404b2083 WatchSource:0}: Error finding container 51ecb77a15977c39fe720f07dcbb54a550f8757bd8d6031aa9767ed2404b2083: Status 404 returned error can't find the container with id 51ecb77a15977c39fe720f07dcbb54a550f8757bd8d6031aa9767ed2404b2083 Sep 30 12:36:17 crc kubenswrapper[4672]: E0930 12:36:17.823300 4672 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:1051afc168038fb814f75e7a5f07c588b295a83ebd143dcd8b46d799e31ad302,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-x2d5v,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-9976ff44c-zxb4j_openstack-operators(5a756e54-c5bf-480b-aa89-57ca440d1ddc): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Sep 30 12:36:18 crc kubenswrapper[4672]: I0930 12:36:18.051375 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-x6x67"] Sep 30 12:36:18 crc kubenswrapper[4672]: I0930 12:36:18.055048 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-7d857cc749-8qftk"] Sep 30 12:36:18 crc kubenswrapper[4672]: I0930 12:36:18.071866 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-f66b554c6-d7gbf"] Sep 30 12:36:18 crc kubenswrapper[4672]: W0930 12:36:18.132322 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6fa26cab_ae65_4e21_af16_2628c86be254.slice/crio-b8641489b4f1a80abe85ec2e3bfd4e62e5d2a3c4dfb4b5a94d606c7a4300a406 WatchSource:0}: Error finding container b8641489b4f1a80abe85ec2e3bfd4e62e5d2a3c4dfb4b5a94d606c7a4300a406: Status 404 returned error can't find the container with id b8641489b4f1a80abe85ec2e3bfd4e62e5d2a3c4dfb4b5a94d606c7a4300a406 Sep 30 12:36:18 crc kubenswrapper[4672]: W0930 12:36:18.144136 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1b6d7bf0_00ff_41df_8873_cab7f6e5eeea.slice/crio-a07a660190cfd6bbac2837c42f27a95785599d8a0688d9724854c46d2f060346 WatchSource:0}: Error finding container a07a660190cfd6bbac2837c42f27a95785599d8a0688d9724854c46d2f060346: Status 404 returned error can't find the container with id a07a660190cfd6bbac2837c42f27a95785599d8a0688d9724854c46d2f060346 Sep 30 12:36:18 crc kubenswrapper[4672]: I0930 12:36:18.144341 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-bc7dc7bd9-8mf7d"] Sep 30 12:36:18 crc kubenswrapper[4672]: I0930 12:36:18.148334 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-6bd96d9cc5-cf4sl"] Sep 30 12:36:18 crc kubenswrapper[4672]: I0930 12:36:18.171736 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-qxq7h"] 
Sep 30 12:36:18 crc kubenswrapper[4672]: E0930 12:36:18.172217 4672 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:a303e460aec09217f90043b8ff19c01061af003b614833b33a593df9c00ddf80,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-wt8jp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-f66b554c6-d7gbf_openstack-operators(74b0a0e9-a7e6-43e6-a15d-86e6a20c5aa7): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Sep 30 12:36:18 crc kubenswrapper[4672]: E0930 12:36:18.185112 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/watcher-operator-controller-manager-58675bf858-qg9s4" podUID="4d10ceb0-c730-4fb8-b81c-a87e33890f84" Sep 30 12:36:18 crc kubenswrapper[4672]: E0930 12:36:18.184460 4672 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:e3f947e9034a951620a76eaf41ceec95eefcef0eacb251b10993d6820d5e1af6,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 
--leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:true,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-baremetal-operator-agent:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_ANSIBLEEE_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-ansibleee-runner:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_EVALUATOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-evaluator:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_LISTENER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-listener:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_NOTIFIER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-notifier:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_APACHE_IMAGE_URL_DEFAULT,Value:registry.redhat.io/ubi9/httpd-24:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BARBICAN_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BARBICAN_KEYSTONE_LISTENER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-barbican-keystone-listener:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BARBICAN_WORKER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-barbican-worker:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_CENTRAL_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_COMPUTE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_IPMI_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-ipmi:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_MYSQLD_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/prometheus/mysqld-exporter:v0.15.1,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_NOTIFICATION_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-notification:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_SGCORE_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/sg-core:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_BACKUP_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-backup:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_SCHEDULER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-scheduler:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_VOLUME_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-volume:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_BACKENDBIND9_IMAGE_URL_DEFAULT,Val
ue:quay.io/podified-antelope-centos9/openstack-designate-backend-bind9:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_CENTRAL_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-central:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_MDNS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-mdns:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_PRODUCER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-producer:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_UNBOUND_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-unbound:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_WORKER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-worker:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_FRR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-frr:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_ISCSID_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-iscsid:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_KEPLER_IMAGE_URL_DEFAULT,Value:quay.io/sustainable_computing_io/kepler:release-0.7.12,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_LOGROTATE_CROND_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cron:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_MULTIPATHD_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-multipathd:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_DHCP_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_METADATA_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_OVN_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-ovn-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_SRIOV_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NODE_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/prometheus/node-exporter:v1.5.0,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_OVN_BGP_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-bgp-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_PODMAN_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/navidys/prometheus-podman-exporter:v1.10.1,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_GLANCE_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-glance-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HEAT_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-heat-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HEAT_CFNAPI_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-heat-api-cfn:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HEAT_ENGINE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-heat-engine:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HORIZON_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_INFRA_MEMCACHED_IMAGE_URL_DEFAULT,Value:quay.
io/podified-antelope-centos9/openstack-memcached:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_INFRA_REDIS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-redis:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_CONDUCTOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-conductor:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_INSPECTOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-inspector:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_NEUTRON_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-neutron-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_PXE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-pxe:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_PYTHON_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/ironic-python-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_KEYSTONE_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-keystone:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_KSM_IMAGE_URL_DEFAULT,Value:registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MANILA_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-manila-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MANILA_SCHEDULER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-manila-scheduler:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MANILA_SHARE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-manila-share:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MARIADB_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-mariadb:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NET_UTILS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-netutils:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NEUTRON_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_COMPUTE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_CONDUCTOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-conductor:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_NOVNC_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-novncproxy:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_SCHEDULER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-scheduler:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_HEALTHMANAGER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-health-manager:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_HOUSEKEEPING_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/o
penstack-octavia-housekeeping:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_RSYSLOG_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-rsyslog:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_WORKER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-worker:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_CLIENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-openstackclient:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_LIGHTSPEED_IMAGE_URL_DEFAULT,Value:quay.io/openstack-lightspeed/rag-content:os-docs-2024.2,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_MUST_GATHER_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-must-gather:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_NETWORK_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OS_CONTAINER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/edpm-hardened-uefi:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_CONTROLLER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_CONTROLLER_OVS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-base:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_NB_DBCLUSTER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-nb-db-server:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_NORTHD_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-northd:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_SB_DBCLUSTER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-sb-db-server:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_PLACEMENT_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-placement-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_RABBITMQ_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_ACCOUNT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-account:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_CONTAINER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-container:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_OBJECT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-object:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_PROXY_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-proxy-server:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_TEST_TEMPEST_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_WATCHER_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-master-centos9/openstack-watcher-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_WATCHER_APPLIER_IMAGE_URL_DEFAULT,Value:quay.io/podified-master-centos9/openstack-watcher-applier:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_WATCHER_DECISION_ENGINE_IMAGE_URL_DEFAULT,Value:quay.io/podified-master-centos9/openstack-watcher-decision-engine:current-podified,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m 
DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cert,ReadOnly:true,MountPath:/tmp/k8s-webhook-server/serving-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-t6wfs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openstack-baremetal-operator-controller-manager-6d776955-x6x67_openstack-operators(1b6d7bf0-00ff-41df-8873-cab7f6e5eeea): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Sep 30 12:36:18 crc kubenswrapper[4672]: E0930 12:36:18.187403 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/placement-operator-controller-manager-589c58c6c-r9sqd" podUID="42bf0afd-961a-4353-9499-a185b16b8a02" Sep 30 12:36:18 crc kubenswrapper[4672]: W0930 12:36:18.202433 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcf21b89f_fcd8_4854_954b_06927bc7c6ea.slice/crio-3c34b8bf21a657be61ee1b78fb745f28b58d74fdaf9077ab831ac50ea2545fa4 WatchSource:0}: Error finding container 3c34b8bf21a657be61ee1b78fb745f28b58d74fdaf9077ab831ac50ea2545fa4: Status 404 returned error can't find the container with id 3c34b8bf21a657be61ee1b78fb745f28b58d74fdaf9077ab831ac50ea2545fa4 Sep 30 12:36:18 crc kubenswrapper[4672]: E0930 12:36:18.249185 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-zxb4j" podUID="5a756e54-c5bf-480b-aa89-57ca440d1ddc" Sep 30 12:36:18 crc kubenswrapper[4672]: E0930 12:36:18.308084 4672 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:8fdf377daf05e2fa7346505017078fa81981dd945bf635a64c8022633c68118f,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 
--leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-kjxxb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-b8d54b5d7-qxq7h_openstack-operators(e6b8eb11-36d8-45c1-b600-76ffff076b78): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Sep 30 12:36:18 crc kubenswrapper[4672]: I0930 12:36:18.421037 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-bc7dc7bd9-8mf7d" event={"ID":"cf21b89f-fcd8-4854-954b-06927bc7c6ea","Type":"ContainerStarted","Data":"3c34b8bf21a657be61ee1b78fb745f28b58d74fdaf9077ab831ac50ea2545fa4"} Sep 30 12:36:18 crc kubenswrapper[4672]: I0930 12:36:18.458457 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-64d7b59854-lpv98" event={"ID":"9aef7bd6-dab2-4333-b248-a40c44bc3743","Type":"ContainerStarted","Data":"5b9f7c0ea8184ece6f0da5f7a229014098d7a0b77889162f317cd503b358d441"} Sep 30 12:36:18 crc kubenswrapper[4672]: I0930 12:36:18.496683 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-79d8469568-qxpkk" event={"ID":"4f1099dc-e100-44d5-8d17-255dbe0edf63","Type":"ContainerStarted","Data":"e4afaf1250c063127f6f582417a7f1133bd8c97f51a116faee0ac84b4aa5292d"} Sep 30 12:36:18 crc kubenswrapper[4672]: I0930 12:36:18.501790 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-6bd96d9cc5-cf4sl" event={"ID":"711b218e-6a25-4d4f-b657-e621e9d1d658","Type":"ContainerStarted","Data":"cbc3a0ab20f177b3433eef82baba1a2424ce3b48c6f716bac30eb92329e472a7"} Sep 30 12:36:18 crc kubenswrapper[4672]: 
Sep 30 12:36:18 crc kubenswrapper[4672]: I0930 12:36:18.502957 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-2hvxs" event={"ID":"32830807-0fb2-4545-a629-af52b20e0b0f","Type":"ContainerStarted","Data":"5ee2110fac1e254063fc54d821b8379eb34db992fa90d50e37cfa9e61faf30b3"}
Sep 30 12:36:18 crc kubenswrapper[4672]: I0930 12:36:18.504640 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-zxb4j" event={"ID":"5a756e54-c5bf-480b-aa89-57ca440d1ddc","Type":"ContainerStarted","Data":"da6cf882163f2665844563576d3143df71d9b828261ebda567c554ab4038bc59"}
Sep 30 12:36:18 crc kubenswrapper[4672]: I0930 12:36:18.504677 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-zxb4j" event={"ID":"5a756e54-c5bf-480b-aa89-57ca440d1ddc","Type":"ContainerStarted","Data":"51ecb77a15977c39fe720f07dcbb54a550f8757bd8d6031aa9767ed2404b2083"}
Sep 30 12:36:18 crc kubenswrapper[4672]: I0930 12:36:18.506359 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-c7c776c96-2kvn7" event={"ID":"d4c88a65-e12f-4872-baf2-f210ee1b0c9a","Type":"ContainerStarted","Data":"f80c04379268f848903cef289af94cbc590c33c8fbbb7c6cb2d0785d4961107c"}
Sep 30 12:36:18 crc kubenswrapper[4672]: E0930 12:36:18.511544 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:1051afc168038fb814f75e7a5f07c588b295a83ebd143dcd8b46d799e31ad302\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-zxb4j" podUID="5a756e54-c5bf-480b-aa89-57ca440d1ddc"
Sep 30 12:36:18 crc kubenswrapper[4672]: I0930 12:36:18.513833 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-7d857cc749-8qftk" event={"ID":"6fa26cab-ae65-4e21-af16-2628c86be254","Type":"ContainerStarted","Data":"b8641489b4f1a80abe85ec2e3bfd4e62e5d2a3c4dfb4b5a94d606c7a4300a406"}
Sep 30 12:36:18 crc kubenswrapper[4672]: I0930 12:36:18.516559 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-58675bf858-qg9s4" event={"ID":"4d10ceb0-c730-4fb8-b81c-a87e33890f84","Type":"ContainerStarted","Data":"51cab8da38c7302e0afb8275fc0b3ce65ffa04bb2fab4212e9cd0fc3dc27f777"}
Sep 30 12:36:18 crc kubenswrapper[4672]: I0930 12:36:18.516592 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-58675bf858-qg9s4" event={"ID":"4d10ceb0-c730-4fb8-b81c-a87e33890f84","Type":"ContainerStarted","Data":"d14c80414331114a7315da6fd67203d1e601f1d5daa22486982ba5b4308398db"}
Sep 30 12:36:18 crc kubenswrapper[4672]: E0930 12:36:18.519997 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.83:5001/openstack-k8s-operators/watcher-operator:3f7eac74f76f24bd9da5f9316305cdeb8d1ba29c\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-58675bf858-qg9s4" podUID="4d10ceb0-c730-4fb8-b81c-a87e33890f84"
Sep 30 12:36:18 crc kubenswrapper[4672]: I0930 12:36:18.521961 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-88c7-xq89z" event={"ID":"3f0b65a1-c4dd-4ca1-a2cf-feea808b1f06","Type":"ContainerStarted","Data":"de4da94f60372c46893021bd6f3ecd321ad654ccdcf6c29d1478768b1a532276"}
Sep 30 12:36:18 crc kubenswrapper[4672]: I0930 12:36:18.525527 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-7975b88857-g8lll" event={"ID":"904a2d6e-693a-4c5e-926e-2c5fd47d6bea","Type":"ContainerStarted","Data":"c9ebc45e52481f671480a241b58fd3de2e9d8a92a08df7eea28e9b291955f891"}
Sep 30 12:36:18 crc kubenswrapper[4672]: I0930 12:36:18.528492 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-f66b554c6-d7gbf" event={"ID":"74b0a0e9-a7e6-43e6-a15d-86e6a20c5aa7","Type":"ContainerStarted","Data":"5915b1fe67854ce371d3af5a9950997567dfc5e67bec3802dadfa1cc70d99456"}
Sep 30 12:36:18 crc kubenswrapper[4672]: I0930 12:36:18.544730 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-gthb7" event={"ID":"6cc4cd4e-abd0-4318-bfd4-e2df45940139","Type":"ContainerStarted","Data":"2399273db6d46760e2e3f921b49702f551f3b041b931baf9ef7e33b3bc600340"}
Sep 30 12:36:18 crc kubenswrapper[4672]: I0930 12:36:18.579389 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-vs4mr" event={"ID":"b12c1847-2238-4a91-a2a0-4de492556fe7","Type":"ContainerStarted","Data":"eb1a32ac301ecd9837dcff29cffb4209f5a0cf03c3ded707fdcd3d586b0f8f9e"}
Sep 30 12:36:18 crc kubenswrapper[4672]: E0930 12:36:18.579525 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/test-operator-controller-manager-f66b554c6-d7gbf" podUID="74b0a0e9-a7e6-43e6-a15d-86e6a20c5aa7"
Sep 30 12:36:18 crc kubenswrapper[4672]: I0930 12:36:18.582191 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-84958c4d49-fn2cb" event={"ID":"0e2c3398-4a1f-4a82-a95c-89e73d9a4485","Type":"ContainerStarted","Data":"38c04dd4f081beb42384ec9e9ffcefffba5c60eb4a456bf6802b446b168bc476"}
Sep 30 12:36:18 crc kubenswrapper[4672]: I0930 12:36:18.593087 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-589c58c6c-r9sqd" event={"ID":"42bf0afd-961a-4353-9499-a185b16b8a02","Type":"ContainerStarted","Data":"02ba64eb2755c8335458d7314f21cccf4016f9caf021cc6c9868c3bccd3d16a9"}
Sep 30 12:36:18 crc kubenswrapper[4672]: I0930 12:36:18.593137 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-589c58c6c-r9sqd" event={"ID":"42bf0afd-961a-4353-9499-a185b16b8a02","Type":"ContainerStarted","Data":"70aeb2632a6b7f1d10495661709c67d869ae6fc755a7abf6da572e70a0e7c8e8"}
Sep 30 12:36:18 crc kubenswrapper[4672]: E0930 12:36:18.594855 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:a6b3408d79df6b6d4a467e49defaa4a9d9c088c94d0605a4fee0030c9ccc84d2\\\"\"" pod="openstack-operators/placement-operator-controller-manager-589c58c6c-r9sqd" podUID="42bf0afd-961a-4353-9499-a185b16b8a02"
Sep 30 12:36:18 crc kubenswrapper[4672]: I0930 12:36:18.595585 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-x6x67" event={"ID":"1b6d7bf0-00ff-41df-8873-cab7f6e5eeea","Type":"ContainerStarted","Data":"a07a660190cfd6bbac2837c42f27a95785599d8a0688d9724854c46d2f060346"}
Sep 30 12:36:18 crc kubenswrapper[4672]: I0930 12:36:18.611514 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-qxq7h" event={"ID":"e6b8eb11-36d8-45c1-b600-76ffff076b78","Type":"ContainerStarted","Data":"1e6fdb1193ebe3ef6cf888d7205b3047f64b68d12537f46ac81178f92df76d8d"}
Sep 30 12:36:18 crc kubenswrapper[4672]: I0930 12:36:18.624597 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-76fcc6dc7c-hjn2l" event={"ID":"601fcd4a-dc2f-468d-9ad6-6b173320c317","Type":"ContainerStarted","Data":"326091025a5e509cbd7a224c0d2ab26836e9422eae3d48bdd19ed7eeb3965572"}
Sep 30 12:36:18 crc kubenswrapper[4672]: I0930 12:36:18.631757 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-vqxhh" event={"ID":"ed04bd4c-39dd-45fa-a2e0-bde94ff9deb0","Type":"ContainerStarted","Data":"112e546c23820f51c22d4f6f0d417442666f6326e3d2e5a740abcb7ae8eaf3a2"}
Sep 30 12:36:18 crc kubenswrapper[4672]: E0930 12:36:18.650711 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-x6x67" podUID="1b6d7bf0-00ff-41df-8873-cab7f6e5eeea"
Sep 30 12:36:18 crc kubenswrapper[4672]: E0930 12:36:18.738110 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-qxq7h" podUID="e6b8eb11-36d8-45c1-b600-76ffff076b78"
Sep 30 12:36:19 crc kubenswrapper[4672]: I0930 12:36:19.658960 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-6bd96d9cc5-cf4sl" event={"ID":"711b218e-6a25-4d4f-b657-e621e9d1d658","Type":"ContainerStarted","Data":"5675299cda3c58890f0e94f125dc3315ccf98e64df02d5d3f26f43b8b325f233"}
Sep 30 12:36:19 crc kubenswrapper[4672]: I0930 12:36:19.659032 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-6bd96d9cc5-cf4sl" event={"ID":"711b218e-6a25-4d4f-b657-e621e9d1d658","Type":"ContainerStarted","Data":"14b40314539d94d2f8a11e34bb9675a5ca3f21588945250b637f0221dabbced4"}
Sep 30 12:36:19 crc kubenswrapper[4672]: I0930 12:36:19.659828 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-6bd96d9cc5-cf4sl"
Sep 30 12:36:19 crc kubenswrapper[4672]: I0930 12:36:19.677815 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-f66b554c6-d7gbf" event={"ID":"74b0a0e9-a7e6-43e6-a15d-86e6a20c5aa7","Type":"ContainerStarted","Data":"ea6d847b42e633c2aa3a66a1992b91d98a0fdb53ef4c4350cd3c1aab8c26a66a"}
\\\"quay.io/openstack-k8s-operators/test-operator@sha256:a303e460aec09217f90043b8ff19c01061af003b614833b33a593df9c00ddf80\\\"\"" pod="openstack-operators/test-operator-controller-manager-f66b554c6-d7gbf" podUID="74b0a0e9-a7e6-43e6-a15d-86e6a20c5aa7" Sep 30 12:36:19 crc kubenswrapper[4672]: I0930 12:36:19.682524 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-x6x67" event={"ID":"1b6d7bf0-00ff-41df-8873-cab7f6e5eeea","Type":"ContainerStarted","Data":"d2b8d95cbd27cb81f843eb9850074daa6417297f05b554d241f942b2e4e17d4a"} Sep 30 12:36:19 crc kubenswrapper[4672]: E0930 12:36:19.683792 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:e3f947e9034a951620a76eaf41ceec95eefcef0eacb251b10993d6820d5e1af6\\\"\"" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-x6x67" podUID="1b6d7bf0-00ff-41df-8873-cab7f6e5eeea" Sep 30 12:36:19 crc kubenswrapper[4672]: I0930 12:36:19.685806 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-qxq7h" event={"ID":"e6b8eb11-36d8-45c1-b600-76ffff076b78","Type":"ContainerStarted","Data":"e4b97dda323f8dc38c7516f51eb0a6ac52bfa551f5954d598f886b2ca61b19fe"} Sep 30 12:36:19 crc kubenswrapper[4672]: E0930 12:36:19.696102 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:8fdf377daf05e2fa7346505017078fa81981dd945bf635a64c8022633c68118f\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-qxq7h" podUID="e6b8eb11-36d8-45c1-b600-76ffff076b78" Sep 30 12:36:19 crc kubenswrapper[4672]: E0930 12:36:19.697102 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:a6b3408d79df6b6d4a467e49defaa4a9d9c088c94d0605a4fee0030c9ccc84d2\\\"\"" pod="openstack-operators/placement-operator-controller-manager-589c58c6c-r9sqd" podUID="42bf0afd-961a-4353-9499-a185b16b8a02" Sep 30 12:36:19 crc kubenswrapper[4672]: E0930 12:36:19.697526 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.83:5001/openstack-k8s-operators/watcher-operator:3f7eac74f76f24bd9da5f9316305cdeb8d1ba29c\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-58675bf858-qg9s4" podUID="4d10ceb0-c730-4fb8-b81c-a87e33890f84" Sep 30 12:36:19 crc kubenswrapper[4672]: E0930 12:36:19.698842 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:1051afc168038fb814f75e7a5f07c588b295a83ebd143dcd8b46d799e31ad302\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-zxb4j" podUID="5a756e54-c5bf-480b-aa89-57ca440d1ddc" Sep 30 12:36:20 crc kubenswrapper[4672]: I0930 12:36:20.126710 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-6bd96d9cc5-cf4sl" 
podStartSLOduration=5.126688132 podStartE2EDuration="5.126688132s" podCreationTimestamp="2025-09-30 12:36:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 12:36:20.121653555 +0000 UTC m=+871.390891221" watchObservedRunningTime="2025-09-30 12:36:20.126688132 +0000 UTC m=+871.395925778" Sep 30 12:36:20 crc kubenswrapper[4672]: E0930 12:36:20.702504 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:e3f947e9034a951620a76eaf41ceec95eefcef0eacb251b10993d6820d5e1af6\\\"\"" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-x6x67" podUID="1b6d7bf0-00ff-41df-8873-cab7f6e5eeea" Sep 30 12:36:20 crc kubenswrapper[4672]: E0930 12:36:20.703979 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:a303e460aec09217f90043b8ff19c01061af003b614833b33a593df9c00ddf80\\\"\"" pod="openstack-operators/test-operator-controller-manager-f66b554c6-d7gbf" podUID="74b0a0e9-a7e6-43e6-a15d-86e6a20c5aa7" Sep 30 12:36:20 crc kubenswrapper[4672]: E0930 12:36:20.705384 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:8fdf377daf05e2fa7346505017078fa81981dd945bf635a64c8022633c68118f\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-qxq7h" podUID="e6b8eb11-36d8-45c1-b600-76ffff076b78" Sep 30 12:36:26 crc kubenswrapper[4672]: I0930 12:36:26.929421 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-6bd96d9cc5-cf4sl" Sep 30 12:36:30 crc kubenswrapper[4672]: I0930 12:36:30.808997 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-7975b88857-g8lll" event={"ID":"904a2d6e-693a-4c5e-926e-2c5fd47d6bea","Type":"ContainerStarted","Data":"1c59dd93a67ffeab5bbe0b73aaf265e364ef46363ea2a64fabb061dd2ebe1ffc"} Sep 30 12:36:30 crc kubenswrapper[4672]: I0930 12:36:30.818376 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-df89s" event={"ID":"c70c8b28-1f45-4c79-af69-3197c7f66fa0","Type":"ContainerStarted","Data":"0b2b0dc8271ba8267a0c6422e877c8cb68a69153f5d8b5a09bfbed154579f972"} Sep 30 12:36:30 crc kubenswrapper[4672]: I0930 12:36:30.826063 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-64d7b59854-lpv98" event={"ID":"9aef7bd6-dab2-4333-b248-a40c44bc3743","Type":"ContainerStarted","Data":"4c88a0934ce027700fa648a2e397dcbcf8e129a510be89b81f0a0460896c48d0"} Sep 30 12:36:30 crc kubenswrapper[4672]: I0930 12:36:30.835637 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-76fcc6dc7c-hjn2l" event={"ID":"601fcd4a-dc2f-468d-9ad6-6b173320c317","Type":"ContainerStarted","Data":"2668bb5bcc42d41c24a381a27150c9f53afab70ac785ddc19b6c9ffd2e5c07d7"} Sep 30 12:36:30 crc kubenswrapper[4672]: I0930 12:36:30.871777 4672 kubelet.go:2453] "SyncLoop (PLEG): event for 
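The pod_startup_latency_tracker records are worth decoding: podStartSLOduration is podStartE2EDuration minus the image-pull window (lastFinishedPulling minus firstStartedPulling), computed on the monotonic clock (the m=+ offsets), which is consistent with the startup SLI excluding image-pull time. In the openstack-operator record above no pull was recorded (the zero 0001-01-01 timestamps), so the SLO duration equals the E2E duration. The keystone-operator record further down checks out exactly:

```go
// Checking the tracker's arithmetic against the keystone-operator record
// below, using the monotonic m=+ offsets it logs.
package main

import "fmt"

func main() {
	const (
		e2e              = 17.010334826  // podStartE2EDuration, seconds
		firstStartedPull = 868.856983553 // m=+ offset of firstStartedPulling
		lastFinishedPull = 881.473629467 // m=+ offset of lastFinishedPulling
	)
	slo := e2e - (lastFinishedPull - firstStartedPull)
	fmt.Printf("podStartSLOduration = %.9f s\n", slo) // 4.393688912, matching the log
}
```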
pod" pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-vqxhh" event={"ID":"ed04bd4c-39dd-45fa-a2e0-bde94ff9deb0","Type":"ContainerStarted","Data":"44e7dfa946571888441a341f591657769f71011e2f242ff09b1c00abe78fc99c"} Sep 30 12:36:30 crc kubenswrapper[4672]: I0930 12:36:30.892491 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-c7c776c96-2kvn7" event={"ID":"d4c88a65-e12f-4872-baf2-f210ee1b0c9a","Type":"ContainerStarted","Data":"30741af1b5babe6c67d86524cd306c73433037bcb4d91e03b96d3fdfd8e73bce"} Sep 30 12:36:30 crc kubenswrapper[4672]: I0930 12:36:30.916137 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-z9tmm" event={"ID":"ed25b409-1ca7-4fc4-95b5-55b4239233f3","Type":"ContainerStarted","Data":"81021043143ff81b55bf794666b6718ac28036e41afe4c2204dc38f504b2d6a5"} Sep 30 12:36:30 crc kubenswrapper[4672]: I0930 12:36:30.926010 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-bc7dc7bd9-8mf7d" event={"ID":"cf21b89f-fcd8-4854-954b-06927bc7c6ea","Type":"ContainerStarted","Data":"94d342b0a36fa0d306f12fc03d2cfa27b167715e9ac96eb43b78bb1421c54475"} Sep 30 12:36:30 crc kubenswrapper[4672]: I0930 12:36:30.932983 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-vs4mr" event={"ID":"b12c1847-2238-4a91-a2a0-4de492556fe7","Type":"ContainerStarted","Data":"357040a358d9040d73dace78c93907c076333da680830f56228e52c3c2ce0c71"} Sep 30 12:36:30 crc kubenswrapper[4672]: I0930 12:36:30.945534 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-gthb7" event={"ID":"6cc4cd4e-abd0-4318-bfd4-e2df45940139","Type":"ContainerStarted","Data":"f0e3925c8ee04bc42488b8d87ea83d558adf98f9e80de3125d923d60adb79325"} Sep 30 12:36:31 crc kubenswrapper[4672]: I0930 12:36:31.955665 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-bc7dc7bd9-8mf7d" event={"ID":"cf21b89f-fcd8-4854-954b-06927bc7c6ea","Type":"ContainerStarted","Data":"0bbc73cc360149a8628d9dbe7da00de15136a92ed3ab725baf521a3edc5d38dc"} Sep 30 12:36:31 crc kubenswrapper[4672]: I0930 12:36:31.955841 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-bc7dc7bd9-8mf7d" Sep 30 12:36:31 crc kubenswrapper[4672]: I0930 12:36:31.958081 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-vs4mr" event={"ID":"b12c1847-2238-4a91-a2a0-4de492556fe7","Type":"ContainerStarted","Data":"8055d90259c8d228351a7c27a18fa1814fa4dc82f9b31ae3963f0f891c398e51"} Sep 30 12:36:31 crc kubenswrapper[4672]: I0930 12:36:31.958167 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-vs4mr" Sep 30 12:36:31 crc kubenswrapper[4672]: I0930 12:36:31.960112 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-76fcc6dc7c-hjn2l" event={"ID":"601fcd4a-dc2f-468d-9ad6-6b173320c317","Type":"ContainerStarted","Data":"657f40c993cda0c4579fd33f1bf489959416f71a6544596e23d7654ba9cbf8c5"} Sep 30 12:36:31 crc kubenswrapper[4672]: I0930 12:36:31.960546 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openstack-operators/octavia-operator-controller-manager-76fcc6dc7c-hjn2l" Sep 30 12:36:31 crc kubenswrapper[4672]: I0930 12:36:31.965379 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-vqxhh" event={"ID":"ed04bd4c-39dd-45fa-a2e0-bde94ff9deb0","Type":"ContainerStarted","Data":"275a1adafb723aa14c009b753f764ba4eee4e3734908ec08b8a959ca52a51aac"} Sep 30 12:36:31 crc kubenswrapper[4672]: I0930 12:36:31.965910 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-vqxhh" Sep 30 12:36:31 crc kubenswrapper[4672]: I0930 12:36:31.974802 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-88c7-xq89z" event={"ID":"3f0b65a1-c4dd-4ca1-a2cf-feea808b1f06","Type":"ContainerStarted","Data":"6c1f8179ebc26010f3a69c8b0c6fe4c997836a34cc339258254cdee54672ce01"} Sep 30 12:36:31 crc kubenswrapper[4672]: I0930 12:36:31.996700 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-79d8469568-qxpkk" event={"ID":"4f1099dc-e100-44d5-8d17-255dbe0edf63","Type":"ContainerStarted","Data":"594346344dab8e6eecd194d8dc434070493c9987adc95c9e136b2674939ffa97"} Sep 30 12:36:32 crc kubenswrapper[4672]: I0930 12:36:32.005110 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-7975b88857-g8lll" event={"ID":"904a2d6e-693a-4c5e-926e-2c5fd47d6bea","Type":"ContainerStarted","Data":"372882acc26fdeab4d679da00ee83cd199c4858fe8a1610506bcd80cb48b5807"} Sep 30 12:36:32 crc kubenswrapper[4672]: I0930 12:36:32.005743 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-7975b88857-g8lll" Sep 30 12:36:32 crc kubenswrapper[4672]: I0930 12:36:32.010348 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-vqxhh" podStartSLOduration=4.393688912 podStartE2EDuration="17.010334826s" podCreationTimestamp="2025-09-30 12:36:15 +0000 UTC" firstStartedPulling="2025-09-30 12:36:17.587745907 +0000 UTC m=+868.856983553" lastFinishedPulling="2025-09-30 12:36:30.204391801 +0000 UTC m=+881.473629467" observedRunningTime="2025-09-30 12:36:32.009970177 +0000 UTC m=+883.279207843" watchObservedRunningTime="2025-09-30 12:36:32.010334826 +0000 UTC m=+883.279572472" Sep 30 12:36:32 crc kubenswrapper[4672]: I0930 12:36:32.012467 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-7d857cc749-8qftk" event={"ID":"6fa26cab-ae65-4e21-af16-2628c86be254","Type":"ContainerStarted","Data":"8df69f8906e5883da08952fb33f8c1ef2325faabd6e902c8d5f30e296d221e00"} Sep 30 12:36:32 crc kubenswrapper[4672]: I0930 12:36:32.013128 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-bc7dc7bd9-8mf7d" podStartSLOduration=5.018990333 podStartE2EDuration="17.013119937s" podCreationTimestamp="2025-09-30 12:36:15 +0000 UTC" firstStartedPulling="2025-09-30 12:36:18.208826091 +0000 UTC m=+869.478063737" lastFinishedPulling="2025-09-30 12:36:30.202955695 +0000 UTC m=+881.472193341" observedRunningTime="2025-09-30 12:36:31.984093034 +0000 UTC m=+883.253330680" watchObservedRunningTime="2025-09-30 12:36:32.013119937 +0000 UTC 
m=+883.282357583" Sep 30 12:36:32 crc kubenswrapper[4672]: I0930 12:36:32.025759 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-c7c776c96-2kvn7" event={"ID":"d4c88a65-e12f-4872-baf2-f210ee1b0c9a","Type":"ContainerStarted","Data":"336d84bd9ce901608b161a555bb10e4926d806a1d7cad7749a9fe7d8d711ca9e"} Sep 30 12:36:32 crc kubenswrapper[4672]: I0930 12:36:32.026518 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-c7c776c96-2kvn7" Sep 30 12:36:32 crc kubenswrapper[4672]: I0930 12:36:32.033606 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-2hvxs" event={"ID":"32830807-0fb2-4545-a629-af52b20e0b0f","Type":"ContainerStarted","Data":"33501e7149d1def35a98ea02ee17895757ada74c569f317fddc13c6595fd1d76"} Sep 30 12:36:32 crc kubenswrapper[4672]: I0930 12:36:32.045440 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-vs4mr" podStartSLOduration=5.456651912 podStartE2EDuration="18.045419222s" podCreationTimestamp="2025-09-30 12:36:14 +0000 UTC" firstStartedPulling="2025-09-30 12:36:17.619770056 +0000 UTC m=+868.889007702" lastFinishedPulling="2025-09-30 12:36:30.208537366 +0000 UTC m=+881.477775012" observedRunningTime="2025-09-30 12:36:32.041917874 +0000 UTC m=+883.311155520" watchObservedRunningTime="2025-09-30 12:36:32.045419222 +0000 UTC m=+883.314656868" Sep 30 12:36:32 crc kubenswrapper[4672]: I0930 12:36:32.050744 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-jmkcr" event={"ID":"892875f4-bce3-47cb-8478-9d6bbc819bb1","Type":"ContainerStarted","Data":"ba84c1cd88b7bc7f4c24bd82c95a81735ee6d0a9cae39aae61f764e4d68f6451"} Sep 30 12:36:32 crc kubenswrapper[4672]: I0930 12:36:32.056888 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-84958c4d49-fn2cb" event={"ID":"0e2c3398-4a1f-4a82-a95c-89e73d9a4485","Type":"ContainerStarted","Data":"480362e4a3a309721eff4b549cedb3ffe44ac29ca5e9ec7991f5cf7fb45e5eb3"} Sep 30 12:36:32 crc kubenswrapper[4672]: I0930 12:36:32.058747 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-gthb7" event={"ID":"6cc4cd4e-abd0-4318-bfd4-e2df45940139","Type":"ContainerStarted","Data":"d627618152bfb4fa97fbabc5cae5d0cd040ee8ab34db540d936fcd188de99f1d"} Sep 30 12:36:32 crc kubenswrapper[4672]: I0930 12:36:32.059551 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-gthb7" Sep 30 12:36:32 crc kubenswrapper[4672]: I0930 12:36:32.062597 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-64d7b59854-lpv98" event={"ID":"9aef7bd6-dab2-4333-b248-a40c44bc3743","Type":"ContainerStarted","Data":"66a7b6fd4539b7b8be2169904244eb64914b705eb6ff3b48ddccb7fce6fd4bcc"} Sep 30 12:36:32 crc kubenswrapper[4672]: I0930 12:36:32.063128 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-64d7b59854-lpv98" Sep 30 12:36:32 crc kubenswrapper[4672]: I0930 12:36:32.071804 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/octavia-operator-controller-manager-76fcc6dc7c-hjn2l" podStartSLOduration=4.528709092 podStartE2EDuration="17.071782048s" podCreationTimestamp="2025-09-30 12:36:15 +0000 UTC" firstStartedPulling="2025-09-30 12:36:17.664901326 +0000 UTC m=+868.934138972" lastFinishedPulling="2025-09-30 12:36:30.207974282 +0000 UTC m=+881.477211928" observedRunningTime="2025-09-30 12:36:32.066174846 +0000 UTC m=+883.335412492" watchObservedRunningTime="2025-09-30 12:36:32.071782048 +0000 UTC m=+883.341019704" Sep 30 12:36:32 crc kubenswrapper[4672]: I0930 12:36:32.106032 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-gthb7" podStartSLOduration=5.558919466 podStartE2EDuration="18.106013463s" podCreationTimestamp="2025-09-30 12:36:14 +0000 UTC" firstStartedPulling="2025-09-30 12:36:17.665339657 +0000 UTC m=+868.934577303" lastFinishedPulling="2025-09-30 12:36:30.212433624 +0000 UTC m=+881.481671300" observedRunningTime="2025-09-30 12:36:32.100727289 +0000 UTC m=+883.369964935" watchObservedRunningTime="2025-09-30 12:36:32.106013463 +0000 UTC m=+883.375251109" Sep 30 12:36:32 crc kubenswrapper[4672]: I0930 12:36:32.131737 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-c7c776c96-2kvn7" podStartSLOduration=4.608933248 podStartE2EDuration="17.131715902s" podCreationTimestamp="2025-09-30 12:36:15 +0000 UTC" firstStartedPulling="2025-09-30 12:36:17.673712758 +0000 UTC m=+868.942950404" lastFinishedPulling="2025-09-30 12:36:30.196495412 +0000 UTC m=+881.465733058" observedRunningTime="2025-09-30 12:36:32.129776563 +0000 UTC m=+883.399014209" watchObservedRunningTime="2025-09-30 12:36:32.131715902 +0000 UTC m=+883.400953548" Sep 30 12:36:32 crc kubenswrapper[4672]: I0930 12:36:32.207285 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-64d7b59854-lpv98" podStartSLOduration=4.614892858 podStartE2EDuration="17.207234119s" podCreationTimestamp="2025-09-30 12:36:15 +0000 UTC" firstStartedPulling="2025-09-30 12:36:17.617336654 +0000 UTC m=+868.886574300" lastFinishedPulling="2025-09-30 12:36:30.209677915 +0000 UTC m=+881.478915561" observedRunningTime="2025-09-30 12:36:32.160995971 +0000 UTC m=+883.430233637" watchObservedRunningTime="2025-09-30 12:36:32.207234119 +0000 UTC m=+883.476471765" Sep 30 12:36:32 crc kubenswrapper[4672]: I0930 12:36:32.212756 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-79d8469568-qxpkk" podStartSLOduration=4.754777171 podStartE2EDuration="17.212736398s" podCreationTimestamp="2025-09-30 12:36:15 +0000 UTC" firstStartedPulling="2025-09-30 12:36:17.788438215 +0000 UTC m=+869.057675861" lastFinishedPulling="2025-09-30 12:36:30.246397452 +0000 UTC m=+881.515635088" observedRunningTime="2025-09-30 12:36:32.198301123 +0000 UTC m=+883.467538769" watchObservedRunningTime="2025-09-30 12:36:32.212736398 +0000 UTC m=+883.481974044" Sep 30 12:36:32 crc kubenswrapper[4672]: I0930 12:36:32.238083 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-7975b88857-g8lll" podStartSLOduration=5.703154437 podStartE2EDuration="18.238065657s" podCreationTimestamp="2025-09-30 12:36:14 +0000 UTC" firstStartedPulling="2025-09-30 12:36:17.673536754 +0000 UTC 
m=+868.942774400" lastFinishedPulling="2025-09-30 12:36:30.208447964 +0000 UTC m=+881.477685620" observedRunningTime="2025-09-30 12:36:32.237466012 +0000 UTC m=+883.506703658" watchObservedRunningTime="2025-09-30 12:36:32.238065657 +0000 UTC m=+883.507303303" Sep 30 12:36:33 crc kubenswrapper[4672]: I0930 12:36:33.074102 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-df89s" event={"ID":"c70c8b28-1f45-4c79-af69-3197c7f66fa0","Type":"ContainerStarted","Data":"091e7e5a0ce5a1bc20d829926fe9f14b149aeffb7c2a3d3b45575454fa77a45c"} Sep 30 12:36:33 crc kubenswrapper[4672]: I0930 12:36:33.074211 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-df89s" Sep 30 12:36:33 crc kubenswrapper[4672]: I0930 12:36:33.077004 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-7d857cc749-8qftk" event={"ID":"6fa26cab-ae65-4e21-af16-2628c86be254","Type":"ContainerStarted","Data":"ad53faf29445fa0eefd4c4ba71fe46ef4387b691ef453de8a7f8794f68fa46d0"} Sep 30 12:36:33 crc kubenswrapper[4672]: I0930 12:36:33.077127 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-7d857cc749-8qftk" Sep 30 12:36:33 crc kubenswrapper[4672]: I0930 12:36:33.087887 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-z9tmm" event={"ID":"ed25b409-1ca7-4fc4-95b5-55b4239233f3","Type":"ContainerStarted","Data":"87c250768bbcb5f6b621f4c61fb3048ff90b8bffe589e7ffeaf8b6d74d452ea4"} Sep 30 12:36:33 crc kubenswrapper[4672]: I0930 12:36:33.088124 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-z9tmm" Sep 30 12:36:33 crc kubenswrapper[4672]: I0930 12:36:33.092408 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-84958c4d49-fn2cb" event={"ID":"0e2c3398-4a1f-4a82-a95c-89e73d9a4485","Type":"ContainerStarted","Data":"aa68a896c71594748f516c75b30bd5adb7650a96f0b789558ea728fdc163de22"} Sep 30 12:36:33 crc kubenswrapper[4672]: I0930 12:36:33.092986 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-84958c4d49-fn2cb" Sep 30 12:36:33 crc kubenswrapper[4672]: I0930 12:36:33.097155 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-jmkcr" event={"ID":"892875f4-bce3-47cb-8478-9d6bbc819bb1","Type":"ContainerStarted","Data":"ada72e465b9ccd85a45e398efa9a41543b6fb7bfea6f2680ba7c88f67f5ff934"} Sep 30 12:36:33 crc kubenswrapper[4672]: I0930 12:36:33.097950 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-jmkcr" Sep 30 12:36:33 crc kubenswrapper[4672]: I0930 12:36:33.104734 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-88c7-xq89z" event={"ID":"3f0b65a1-c4dd-4ca1-a2cf-feea808b1f06","Type":"ContainerStarted","Data":"ba9fc26645a7d01516a71c7d44d3f992b9115baf22b1927e0f9802bd0159eafc"} Sep 30 12:36:33 crc kubenswrapper[4672]: I0930 12:36:33.105353 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/mariadb-operator-controller-manager-88c7-xq89z" Sep 30 12:36:33 crc kubenswrapper[4672]: I0930 12:36:33.106991 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-df89s" podStartSLOduration=5.413756469 podStartE2EDuration="19.106974459s" podCreationTimestamp="2025-09-30 12:36:14 +0000 UTC" firstStartedPulling="2025-09-30 12:36:16.518553687 +0000 UTC m=+867.787791333" lastFinishedPulling="2025-09-30 12:36:30.211771677 +0000 UTC m=+881.481009323" observedRunningTime="2025-09-30 12:36:33.091687953 +0000 UTC m=+884.360925599" watchObservedRunningTime="2025-09-30 12:36:33.106974459 +0000 UTC m=+884.376212095" Sep 30 12:36:33 crc kubenswrapper[4672]: I0930 12:36:33.112101 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-7d857cc749-8qftk" podStartSLOduration=7.054234296 podStartE2EDuration="19.112083338s" podCreationTimestamp="2025-09-30 12:36:14 +0000 UTC" firstStartedPulling="2025-09-30 12:36:18.159303271 +0000 UTC m=+869.428540917" lastFinishedPulling="2025-09-30 12:36:30.217152313 +0000 UTC m=+881.486389959" observedRunningTime="2025-09-30 12:36:33.111054762 +0000 UTC m=+884.380292418" watchObservedRunningTime="2025-09-30 12:36:33.112083338 +0000 UTC m=+884.381320984" Sep 30 12:36:33 crc kubenswrapper[4672]: I0930 12:36:33.112689 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-2hvxs" event={"ID":"32830807-0fb2-4545-a629-af52b20e0b0f","Type":"ContainerStarted","Data":"70d46dfe65cbf1ebbf5407dce27799a3f3a75741920f8fb1ba298acd3995425a"} Sep 30 12:36:33 crc kubenswrapper[4672]: I0930 12:36:33.112747 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-2hvxs" Sep 30 12:36:33 crc kubenswrapper[4672]: I0930 12:36:33.134311 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-z9tmm" podStartSLOduration=5.8101331080000005 podStartE2EDuration="19.134238427s" podCreationTimestamp="2025-09-30 12:36:14 +0000 UTC" firstStartedPulling="2025-09-30 12:36:16.888614722 +0000 UTC m=+868.157852368" lastFinishedPulling="2025-09-30 12:36:30.212720051 +0000 UTC m=+881.481957687" observedRunningTime="2025-09-30 12:36:33.128381409 +0000 UTC m=+884.397619055" watchObservedRunningTime="2025-09-30 12:36:33.134238427 +0000 UTC m=+884.403476073" Sep 30 12:36:33 crc kubenswrapper[4672]: I0930 12:36:33.150992 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-84958c4d49-fn2cb" podStartSLOduration=6.611776152 podStartE2EDuration="19.15096751s" podCreationTimestamp="2025-09-30 12:36:14 +0000 UTC" firstStartedPulling="2025-09-30 12:36:17.673714778 +0000 UTC m=+868.942952434" lastFinishedPulling="2025-09-30 12:36:30.212906146 +0000 UTC m=+881.482143792" observedRunningTime="2025-09-30 12:36:33.143939512 +0000 UTC m=+884.413177168" watchObservedRunningTime="2025-09-30 12:36:33.15096751 +0000 UTC m=+884.420205156" Sep 30 12:36:33 crc kubenswrapper[4672]: I0930 12:36:33.168053 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-jmkcr" podStartSLOduration=5.47835428 podStartE2EDuration="19.168030921s" 
podCreationTimestamp="2025-09-30 12:36:14 +0000 UTC" firstStartedPulling="2025-09-30 12:36:16.518838464 +0000 UTC m=+867.788076110" lastFinishedPulling="2025-09-30 12:36:30.208515105 +0000 UTC m=+881.477752751" observedRunningTime="2025-09-30 12:36:33.159750061 +0000 UTC m=+884.428987727" watchObservedRunningTime="2025-09-30 12:36:33.168030921 +0000 UTC m=+884.437268567" Sep 30 12:36:33 crc kubenswrapper[4672]: I0930 12:36:33.186061 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-88c7-xq89z" podStartSLOduration=5.647321639 podStartE2EDuration="18.186032185s" podCreationTimestamp="2025-09-30 12:36:15 +0000 UTC" firstStartedPulling="2025-09-30 12:36:17.673542624 +0000 UTC m=+868.942780270" lastFinishedPulling="2025-09-30 12:36:30.21225317 +0000 UTC m=+881.481490816" observedRunningTime="2025-09-30 12:36:33.177204292 +0000 UTC m=+884.446441938" watchObservedRunningTime="2025-09-30 12:36:33.186032185 +0000 UTC m=+884.455269841" Sep 30 12:36:35 crc kubenswrapper[4672]: I0930 12:36:35.129240 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-x6x67" event={"ID":"1b6d7bf0-00ff-41df-8873-cab7f6e5eeea","Type":"ContainerStarted","Data":"5092bf8dacaa0e454a8f0a13fff4888031915552df7db2334e5e956d4537a033"} Sep 30 12:36:35 crc kubenswrapper[4672]: I0930 12:36:35.130944 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-f66b554c6-d7gbf" event={"ID":"74b0a0e9-a7e6-43e6-a15d-86e6a20c5aa7","Type":"ContainerStarted","Data":"a1d6d08966197243b7a136bcffbb281d5eda35f36e23c871b7268f3ef5f4e118"} Sep 30 12:36:35 crc kubenswrapper[4672]: I0930 12:36:35.131571 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-f66b554c6-d7gbf" Sep 30 12:36:35 crc kubenswrapper[4672]: I0930 12:36:35.135316 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-589c58c6c-r9sqd" event={"ID":"42bf0afd-961a-4353-9499-a185b16b8a02","Type":"ContainerStarted","Data":"de20202ae929d634781eb19811c0ff02699acf3d702310c91245bf9b24a39b2d"} Sep 30 12:36:35 crc kubenswrapper[4672]: I0930 12:36:35.136771 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-589c58c6c-r9sqd" Sep 30 12:36:35 crc kubenswrapper[4672]: I0930 12:36:35.144851 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-84958c4d49-fn2cb" Sep 30 12:36:35 crc kubenswrapper[4672]: I0930 12:36:35.149581 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-88c7-xq89z" Sep 30 12:36:35 crc kubenswrapper[4672]: I0930 12:36:35.154812 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-2hvxs" podStartSLOduration=7.585201717 podStartE2EDuration="20.154794282s" podCreationTimestamp="2025-09-30 12:36:15 +0000 UTC" firstStartedPulling="2025-09-30 12:36:17.63337856 +0000 UTC m=+868.902616206" lastFinishedPulling="2025-09-30 12:36:30.202971105 +0000 UTC m=+881.472208771" observedRunningTime="2025-09-30 12:36:33.19415458 +0000 UTC m=+884.463392226" watchObservedRunningTime="2025-09-30 12:36:35.154794282 +0000 
UTC m=+886.424031928" Sep 30 12:36:35 crc kubenswrapper[4672]: I0930 12:36:35.156105 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-f66b554c6-d7gbf" podStartSLOduration=3.785206046 podStartE2EDuration="20.156097735s" podCreationTimestamp="2025-09-30 12:36:15 +0000 UTC" firstStartedPulling="2025-09-30 12:36:18.172039312 +0000 UTC m=+869.441276958" lastFinishedPulling="2025-09-30 12:36:34.542931001 +0000 UTC m=+885.812168647" observedRunningTime="2025-09-30 12:36:35.150834762 +0000 UTC m=+886.420072398" watchObservedRunningTime="2025-09-30 12:36:35.156097735 +0000 UTC m=+886.425335381" Sep 30 12:36:35 crc kubenswrapper[4672]: I0930 12:36:35.191395 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-589c58c6c-r9sqd" podStartSLOduration=3.415908981 podStartE2EDuration="20.191380106s" podCreationTimestamp="2025-09-30 12:36:15 +0000 UTC" firstStartedPulling="2025-09-30 12:36:17.784696651 +0000 UTC m=+869.053934297" lastFinishedPulling="2025-09-30 12:36:34.560167766 +0000 UTC m=+885.829405422" observedRunningTime="2025-09-30 12:36:35.189860248 +0000 UTC m=+886.459097894" watchObservedRunningTime="2025-09-30 12:36:35.191380106 +0000 UTC m=+886.460617742" Sep 30 12:36:35 crc kubenswrapper[4672]: I0930 12:36:35.267686 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-x6x67" podStartSLOduration=3.900645902 podStartE2EDuration="20.267667443s" podCreationTimestamp="2025-09-30 12:36:15 +0000 UTC" firstStartedPulling="2025-09-30 12:36:18.172293129 +0000 UTC m=+869.441530775" lastFinishedPulling="2025-09-30 12:36:34.53931467 +0000 UTC m=+885.808552316" observedRunningTime="2025-09-30 12:36:35.255114666 +0000 UTC m=+886.524352312" watchObservedRunningTime="2025-09-30 12:36:35.267667443 +0000 UTC m=+886.536905089" Sep 30 12:36:35 crc kubenswrapper[4672]: I0930 12:36:35.301892 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-jmkcr" Sep 30 12:36:35 crc kubenswrapper[4672]: I0930 12:36:35.302047 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-df89s" Sep 30 12:36:35 crc kubenswrapper[4672]: I0930 12:36:35.346565 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-gthb7" Sep 30 12:36:35 crc kubenswrapper[4672]: I0930 12:36:35.372968 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-vs4mr" Sep 30 12:36:35 crc kubenswrapper[4672]: I0930 12:36:35.407513 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-z9tmm" Sep 30 12:36:35 crc kubenswrapper[4672]: I0930 12:36:35.506255 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-7975b88857-g8lll" Sep 30 12:36:35 crc kubenswrapper[4672]: I0930 12:36:35.582331 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-vqxhh" Sep 30 12:36:35 crc kubenswrapper[4672]: 
I0930 12:36:35.670677 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-2hvxs" Sep 30 12:36:35 crc kubenswrapper[4672]: I0930 12:36:35.720049 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-64d7b59854-lpv98" Sep 30 12:36:35 crc kubenswrapper[4672]: I0930 12:36:35.795699 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-c7c776c96-2kvn7" Sep 30 12:36:35 crc kubenswrapper[4672]: I0930 12:36:35.830614 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-76fcc6dc7c-hjn2l" Sep 30 12:36:36 crc kubenswrapper[4672]: I0930 12:36:36.080627 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-7d857cc749-8qftk" Sep 30 12:36:36 crc kubenswrapper[4672]: I0930 12:36:36.226912 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-bc7dc7bd9-8mf7d" Sep 30 12:36:37 crc kubenswrapper[4672]: I0930 12:36:37.155417 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-zxb4j" event={"ID":"5a756e54-c5bf-480b-aa89-57ca440d1ddc","Type":"ContainerStarted","Data":"0d8cbeda666d51c122a8b1792a043f4aa92844b62f45b812df58f579f73b63d4"} Sep 30 12:36:37 crc kubenswrapper[4672]: I0930 12:36:37.155647 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-zxb4j" Sep 30 12:36:37 crc kubenswrapper[4672]: I0930 12:36:37.160842 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-58675bf858-qg9s4" event={"ID":"4d10ceb0-c730-4fb8-b81c-a87e33890f84","Type":"ContainerStarted","Data":"6ab7c14cda632b9b98edbae006beca05123fa8ea91c4ea0d908866012e8d1d3b"} Sep 30 12:36:37 crc kubenswrapper[4672]: I0930 12:36:37.161101 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-58675bf858-qg9s4" Sep 30 12:36:37 crc kubenswrapper[4672]: I0930 12:36:37.175184 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-zxb4j" podStartSLOduration=3.324580805 podStartE2EDuration="22.175170432s" podCreationTimestamp="2025-09-30 12:36:15 +0000 UTC" firstStartedPulling="2025-09-30 12:36:17.822958177 +0000 UTC m=+869.092195823" lastFinishedPulling="2025-09-30 12:36:36.673547804 +0000 UTC m=+887.942785450" observedRunningTime="2025-09-30 12:36:37.169847277 +0000 UTC m=+888.439084943" watchObservedRunningTime="2025-09-30 12:36:37.175170432 +0000 UTC m=+888.444408078" Sep 30 12:36:37 crc kubenswrapper[4672]: I0930 12:36:37.198997 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-58675bf858-qg9s4" podStartSLOduration=3.293687385 podStartE2EDuration="22.198979083s" podCreationTimestamp="2025-09-30 12:36:15 +0000 UTC" firstStartedPulling="2025-09-30 12:36:17.774395021 +0000 UTC m=+869.043632677" lastFinishedPulling="2025-09-30 12:36:36.679686719 +0000 UTC m=+887.948924375" observedRunningTime="2025-09-30 12:36:37.197425934 
+0000 UTC m=+888.466663580" watchObservedRunningTime="2025-09-30 12:36:37.198979083 +0000 UTC m=+888.468216729" Sep 30 12:36:37 crc kubenswrapper[4672]: I0930 12:36:37.626462 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-x6x67" Sep 30 12:36:39 crc kubenswrapper[4672]: I0930 12:36:39.188696 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-qxq7h" event={"ID":"e6b8eb11-36d8-45c1-b600-76ffff076b78","Type":"ContainerStarted","Data":"ee0b5d6438afb2e9f7b44a800d0b67d0041cbc78d8f6bc349377d8a80ae14b7e"} Sep 30 12:36:39 crc kubenswrapper[4672]: I0930 12:36:39.189395 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-qxq7h" Sep 30 12:36:46 crc kubenswrapper[4672]: I0930 12:36:46.165438 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-zxb4j" Sep 30 12:36:46 crc kubenswrapper[4672]: I0930 12:36:46.191201 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-589c58c6c-r9sqd" Sep 30 12:36:46 crc kubenswrapper[4672]: I0930 12:36:46.197835 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-qxq7h" podStartSLOduration=11.506810558 podStartE2EDuration="31.197808728s" podCreationTimestamp="2025-09-30 12:36:15 +0000 UTC" firstStartedPulling="2025-09-30 12:36:18.307960585 +0000 UTC m=+869.577198231" lastFinishedPulling="2025-09-30 12:36:37.998958755 +0000 UTC m=+889.268196401" observedRunningTime="2025-09-30 12:36:39.211829633 +0000 UTC m=+890.481067299" watchObservedRunningTime="2025-09-30 12:36:46.197808728 +0000 UTC m=+897.467046414" Sep 30 12:36:46 crc kubenswrapper[4672]: I0930 12:36:46.247385 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-qxq7h" Sep 30 12:36:46 crc kubenswrapper[4672]: I0930 12:36:46.270648 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-f66b554c6-d7gbf" Sep 30 12:36:46 crc kubenswrapper[4672]: I0930 12:36:46.307562 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-58675bf858-qg9s4" Sep 30 12:36:47 crc kubenswrapper[4672]: I0930 12:36:47.636832 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-x6x67" Sep 30 12:37:07 crc kubenswrapper[4672]: I0930 12:37:07.075354 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-698c778d7-shzg2"] Sep 30 12:37:07 crc kubenswrapper[4672]: I0930 12:37:07.077589 4672 util.go:30] "No sandbox for pod can be found. 
Sep 30 12:37:07 crc kubenswrapper[4672]: I0930 12:37:07.077589 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698c778d7-shzg2"
Sep 30 12:37:07 crc kubenswrapper[4672]: I0930 12:37:07.081531 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt"
Sep 30 12:37:07 crc kubenswrapper[4672]: I0930 12:37:07.081599 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt"
Sep 30 12:37:07 crc kubenswrapper[4672]: I0930 12:37:07.081700 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-bwrdn"
Sep 30 12:37:07 crc kubenswrapper[4672]: I0930 12:37:07.081826 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns"
Sep 30 12:37:07 crc kubenswrapper[4672]: I0930 12:37:07.090667 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698c778d7-shzg2"]
Sep 30 12:37:07 crc kubenswrapper[4672]: I0930 12:37:07.098584 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/83f7f949-c461-4cd4-9aec-2cfd4507475e-config\") pod \"dnsmasq-dns-698c778d7-shzg2\" (UID: \"83f7f949-c461-4cd4-9aec-2cfd4507475e\") " pod="openstack/dnsmasq-dns-698c778d7-shzg2"
Sep 30 12:37:07 crc kubenswrapper[4672]: I0930 12:37:07.098905 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-67kmb\" (UniqueName: \"kubernetes.io/projected/83f7f949-c461-4cd4-9aec-2cfd4507475e-kube-api-access-67kmb\") pod \"dnsmasq-dns-698c778d7-shzg2\" (UID: \"83f7f949-c461-4cd4-9aec-2cfd4507475e\") " pod="openstack/dnsmasq-dns-698c778d7-shzg2"
Sep 30 12:37:07 crc kubenswrapper[4672]: I0930 12:37:07.151188 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6dcdf6f545-4hkf4"]
Sep 30 12:37:07 crc kubenswrapper[4672]: I0930 12:37:07.152842 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6dcdf6f545-4hkf4"
Sep 30 12:37:07 crc kubenswrapper[4672]: I0930 12:37:07.158452 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc"
Sep 30 12:37:07 crc kubenswrapper[4672]: I0930 12:37:07.166759 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6dcdf6f545-4hkf4"]
Sep 30 12:37:07 crc kubenswrapper[4672]: I0930 12:37:07.199778 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-67kmb\" (UniqueName: \"kubernetes.io/projected/83f7f949-c461-4cd4-9aec-2cfd4507475e-kube-api-access-67kmb\") pod \"dnsmasq-dns-698c778d7-shzg2\" (UID: \"83f7f949-c461-4cd4-9aec-2cfd4507475e\") " pod="openstack/dnsmasq-dns-698c778d7-shzg2"
Sep 30 12:37:07 crc kubenswrapper[4672]: I0930 12:37:07.199850 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/976b5844-b276-466e-a961-99fd518cf5f4-dns-svc\") pod \"dnsmasq-dns-6dcdf6f545-4hkf4\" (UID: \"976b5844-b276-466e-a961-99fd518cf5f4\") " pod="openstack/dnsmasq-dns-6dcdf6f545-4hkf4"
Sep 30 12:37:07 crc kubenswrapper[4672]: I0930 12:37:07.199886 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/976b5844-b276-466e-a961-99fd518cf5f4-config\") pod \"dnsmasq-dns-6dcdf6f545-4hkf4\" (UID: \"976b5844-b276-466e-a961-99fd518cf5f4\") " pod="openstack/dnsmasq-dns-6dcdf6f545-4hkf4"
Sep 30 12:37:07 crc kubenswrapper[4672]: I0930 12:37:07.199911 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/83f7f949-c461-4cd4-9aec-2cfd4507475e-config\") pod \"dnsmasq-dns-698c778d7-shzg2\" (UID: \"83f7f949-c461-4cd4-9aec-2cfd4507475e\") " pod="openstack/dnsmasq-dns-698c778d7-shzg2"
Sep 30 12:37:07 crc kubenswrapper[4672]: I0930 12:37:07.199930 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z4dzg\" (UniqueName: \"kubernetes.io/projected/976b5844-b276-466e-a961-99fd518cf5f4-kube-api-access-z4dzg\") pod \"dnsmasq-dns-6dcdf6f545-4hkf4\" (UID: \"976b5844-b276-466e-a961-99fd518cf5f4\") " pod="openstack/dnsmasq-dns-6dcdf6f545-4hkf4"
Sep 30 12:37:07 crc kubenswrapper[4672]: I0930 12:37:07.200909 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/83f7f949-c461-4cd4-9aec-2cfd4507475e-config\") pod \"dnsmasq-dns-698c778d7-shzg2\" (UID: \"83f7f949-c461-4cd4-9aec-2cfd4507475e\") " pod="openstack/dnsmasq-dns-698c778d7-shzg2"
Sep 30 12:37:07 crc kubenswrapper[4672]: I0930 12:37:07.222109 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-67kmb\" (UniqueName: \"kubernetes.io/projected/83f7f949-c461-4cd4-9aec-2cfd4507475e-kube-api-access-67kmb\") pod \"dnsmasq-dns-698c778d7-shzg2\" (UID: \"83f7f949-c461-4cd4-9aec-2cfd4507475e\") " pod="openstack/dnsmasq-dns-698c778d7-shzg2"
Sep 30 12:37:07 crc kubenswrapper[4672]: I0930 12:37:07.301346 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/976b5844-b276-466e-a961-99fd518cf5f4-config\") pod \"dnsmasq-dns-6dcdf6f545-4hkf4\" (UID: \"976b5844-b276-466e-a961-99fd518cf5f4\") " pod="openstack/dnsmasq-dns-6dcdf6f545-4hkf4"
Sep 30 12:37:07 crc kubenswrapper[4672]: I0930 12:37:07.301401 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z4dzg\" (UniqueName: \"kubernetes.io/projected/976b5844-b276-466e-a961-99fd518cf5f4-kube-api-access-z4dzg\") pod \"dnsmasq-dns-6dcdf6f545-4hkf4\" (UID: \"976b5844-b276-466e-a961-99fd518cf5f4\") " pod="openstack/dnsmasq-dns-6dcdf6f545-4hkf4"
Sep 30 12:37:07 crc kubenswrapper[4672]: I0930 12:37:07.301496 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/976b5844-b276-466e-a961-99fd518cf5f4-dns-svc\") pod \"dnsmasq-dns-6dcdf6f545-4hkf4\" (UID: \"976b5844-b276-466e-a961-99fd518cf5f4\") " pod="openstack/dnsmasq-dns-6dcdf6f545-4hkf4"
Sep 30 12:37:07 crc kubenswrapper[4672]: I0930 12:37:07.302370 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/976b5844-b276-466e-a961-99fd518cf5f4-dns-svc\") pod \"dnsmasq-dns-6dcdf6f545-4hkf4\" (UID: \"976b5844-b276-466e-a961-99fd518cf5f4\") " pod="openstack/dnsmasq-dns-6dcdf6f545-4hkf4"
Sep 30 12:37:07 crc kubenswrapper[4672]: I0930 12:37:07.302541 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/976b5844-b276-466e-a961-99fd518cf5f4-config\") pod \"dnsmasq-dns-6dcdf6f545-4hkf4\" (UID: \"976b5844-b276-466e-a961-99fd518cf5f4\") " pod="openstack/dnsmasq-dns-6dcdf6f545-4hkf4"
Sep 30 12:37:07 crc kubenswrapper[4672]: I0930 12:37:07.320020 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z4dzg\" (UniqueName: \"kubernetes.io/projected/976b5844-b276-466e-a961-99fd518cf5f4-kube-api-access-z4dzg\") pod \"dnsmasq-dns-6dcdf6f545-4hkf4\" (UID: \"976b5844-b276-466e-a961-99fd518cf5f4\") " pod="openstack/dnsmasq-dns-6dcdf6f545-4hkf4"
Sep 30 12:37:07 crc kubenswrapper[4672]: I0930 12:37:07.402206 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698c778d7-shzg2"
Need to start a new one" pod="openstack/dnsmasq-dns-6dcdf6f545-4hkf4" Sep 30 12:37:07 crc kubenswrapper[4672]: I0930 12:37:07.863483 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698c778d7-shzg2"] Sep 30 12:37:07 crc kubenswrapper[4672]: W0930 12:37:07.865599 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod83f7f949_c461_4cd4_9aec_2cfd4507475e.slice/crio-67766e95130f93f049eda10f0cfcc67bceba5db4f76236083412e556549f7f14 WatchSource:0}: Error finding container 67766e95130f93f049eda10f0cfcc67bceba5db4f76236083412e556549f7f14: Status 404 returned error can't find the container with id 67766e95130f93f049eda10f0cfcc67bceba5db4f76236083412e556549f7f14 Sep 30 12:37:07 crc kubenswrapper[4672]: I0930 12:37:07.867392 4672 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Sep 30 12:37:08 crc kubenswrapper[4672]: I0930 12:37:08.020822 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6dcdf6f545-4hkf4"] Sep 30 12:37:08 crc kubenswrapper[4672]: W0930 12:37:08.033466 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod976b5844_b276_466e_a961_99fd518cf5f4.slice/crio-ce57317930d9dee76925981bd9f80bc2efed304a438f8ab87a7b76a06ee9b6b7 WatchSource:0}: Error finding container ce57317930d9dee76925981bd9f80bc2efed304a438f8ab87a7b76a06ee9b6b7: Status 404 returned error can't find the container with id ce57317930d9dee76925981bd9f80bc2efed304a438f8ab87a7b76a06ee9b6b7 Sep 30 12:37:08 crc kubenswrapper[4672]: I0930 12:37:08.436957 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6dcdf6f545-4hkf4" event={"ID":"976b5844-b276-466e-a961-99fd518cf5f4","Type":"ContainerStarted","Data":"ce57317930d9dee76925981bd9f80bc2efed304a438f8ab87a7b76a06ee9b6b7"} Sep 30 12:37:08 crc kubenswrapper[4672]: I0930 12:37:08.452099 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698c778d7-shzg2" event={"ID":"83f7f949-c461-4cd4-9aec-2cfd4507475e","Type":"ContainerStarted","Data":"67766e95130f93f049eda10f0cfcc67bceba5db4f76236083412e556549f7f14"} Sep 30 12:37:10 crc kubenswrapper[4672]: I0930 12:37:10.928448 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698c778d7-shzg2"] Sep 30 12:37:10 crc kubenswrapper[4672]: I0930 12:37:10.965683 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-54d74754f7-6zpjc"] Sep 30 12:37:10 crc kubenswrapper[4672]: I0930 12:37:10.966902 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-54d74754f7-6zpjc" Sep 30 12:37:10 crc kubenswrapper[4672]: I0930 12:37:10.996224 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-54d74754f7-6zpjc"] Sep 30 12:37:11 crc kubenswrapper[4672]: I0930 12:37:11.166032 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fe558b41-786d-42ef-9de2-d0865c8afb44-dns-svc\") pod \"dnsmasq-dns-54d74754f7-6zpjc\" (UID: \"fe558b41-786d-42ef-9de2-d0865c8afb44\") " pod="openstack/dnsmasq-dns-54d74754f7-6zpjc" Sep 30 12:37:11 crc kubenswrapper[4672]: I0930 12:37:11.166101 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fe558b41-786d-42ef-9de2-d0865c8afb44-config\") pod \"dnsmasq-dns-54d74754f7-6zpjc\" (UID: \"fe558b41-786d-42ef-9de2-d0865c8afb44\") " pod="openstack/dnsmasq-dns-54d74754f7-6zpjc" Sep 30 12:37:11 crc kubenswrapper[4672]: I0930 12:37:11.166131 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-llqhv\" (UniqueName: \"kubernetes.io/projected/fe558b41-786d-42ef-9de2-d0865c8afb44-kube-api-access-llqhv\") pod \"dnsmasq-dns-54d74754f7-6zpjc\" (UID: \"fe558b41-786d-42ef-9de2-d0865c8afb44\") " pod="openstack/dnsmasq-dns-54d74754f7-6zpjc" Sep 30 12:37:11 crc kubenswrapper[4672]: I0930 12:37:11.267237 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fe558b41-786d-42ef-9de2-d0865c8afb44-dns-svc\") pod \"dnsmasq-dns-54d74754f7-6zpjc\" (UID: \"fe558b41-786d-42ef-9de2-d0865c8afb44\") " pod="openstack/dnsmasq-dns-54d74754f7-6zpjc" Sep 30 12:37:11 crc kubenswrapper[4672]: I0930 12:37:11.267390 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fe558b41-786d-42ef-9de2-d0865c8afb44-config\") pod \"dnsmasq-dns-54d74754f7-6zpjc\" (UID: \"fe558b41-786d-42ef-9de2-d0865c8afb44\") " pod="openstack/dnsmasq-dns-54d74754f7-6zpjc" Sep 30 12:37:11 crc kubenswrapper[4672]: I0930 12:37:11.267423 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-llqhv\" (UniqueName: \"kubernetes.io/projected/fe558b41-786d-42ef-9de2-d0865c8afb44-kube-api-access-llqhv\") pod \"dnsmasq-dns-54d74754f7-6zpjc\" (UID: \"fe558b41-786d-42ef-9de2-d0865c8afb44\") " pod="openstack/dnsmasq-dns-54d74754f7-6zpjc" Sep 30 12:37:11 crc kubenswrapper[4672]: I0930 12:37:11.268728 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fe558b41-786d-42ef-9de2-d0865c8afb44-dns-svc\") pod \"dnsmasq-dns-54d74754f7-6zpjc\" (UID: \"fe558b41-786d-42ef-9de2-d0865c8afb44\") " pod="openstack/dnsmasq-dns-54d74754f7-6zpjc" Sep 30 12:37:11 crc kubenswrapper[4672]: I0930 12:37:11.268745 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fe558b41-786d-42ef-9de2-d0865c8afb44-config\") pod \"dnsmasq-dns-54d74754f7-6zpjc\" (UID: \"fe558b41-786d-42ef-9de2-d0865c8afb44\") " pod="openstack/dnsmasq-dns-54d74754f7-6zpjc" Sep 30 12:37:11 crc kubenswrapper[4672]: I0930 12:37:11.297319 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-llqhv\" (UniqueName: 
\"kubernetes.io/projected/fe558b41-786d-42ef-9de2-d0865c8afb44-kube-api-access-llqhv\") pod \"dnsmasq-dns-54d74754f7-6zpjc\" (UID: \"fe558b41-786d-42ef-9de2-d0865c8afb44\") " pod="openstack/dnsmasq-dns-54d74754f7-6zpjc" Sep 30 12:37:11 crc kubenswrapper[4672]: I0930 12:37:11.430516 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6dcdf6f545-4hkf4"] Sep 30 12:37:11 crc kubenswrapper[4672]: I0930 12:37:11.443371 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5fffdb5d5-mx5lk"] Sep 30 12:37:11 crc kubenswrapper[4672]: I0930 12:37:11.445111 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5fffdb5d5-mx5lk" Sep 30 12:37:11 crc kubenswrapper[4672]: I0930 12:37:11.524730 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5fffdb5d5-mx5lk"] Sep 30 12:37:11 crc kubenswrapper[4672]: I0930 12:37:11.573486 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n27lk\" (UniqueName: \"kubernetes.io/projected/148b676d-4484-44e1-8cb1-7136fdc07313-kube-api-access-n27lk\") pod \"dnsmasq-dns-5fffdb5d5-mx5lk\" (UID: \"148b676d-4484-44e1-8cb1-7136fdc07313\") " pod="openstack/dnsmasq-dns-5fffdb5d5-mx5lk" Sep 30 12:37:11 crc kubenswrapper[4672]: I0930 12:37:11.573581 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/148b676d-4484-44e1-8cb1-7136fdc07313-dns-svc\") pod \"dnsmasq-dns-5fffdb5d5-mx5lk\" (UID: \"148b676d-4484-44e1-8cb1-7136fdc07313\") " pod="openstack/dnsmasq-dns-5fffdb5d5-mx5lk" Sep 30 12:37:11 crc kubenswrapper[4672]: I0930 12:37:11.573604 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/148b676d-4484-44e1-8cb1-7136fdc07313-config\") pod \"dnsmasq-dns-5fffdb5d5-mx5lk\" (UID: \"148b676d-4484-44e1-8cb1-7136fdc07313\") " pod="openstack/dnsmasq-dns-5fffdb5d5-mx5lk" Sep 30 12:37:11 crc kubenswrapper[4672]: I0930 12:37:11.585616 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-54d74754f7-6zpjc" Sep 30 12:37:11 crc kubenswrapper[4672]: I0930 12:37:11.674899 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n27lk\" (UniqueName: \"kubernetes.io/projected/148b676d-4484-44e1-8cb1-7136fdc07313-kube-api-access-n27lk\") pod \"dnsmasq-dns-5fffdb5d5-mx5lk\" (UID: \"148b676d-4484-44e1-8cb1-7136fdc07313\") " pod="openstack/dnsmasq-dns-5fffdb5d5-mx5lk" Sep 30 12:37:11 crc kubenswrapper[4672]: I0930 12:37:11.675002 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/148b676d-4484-44e1-8cb1-7136fdc07313-dns-svc\") pod \"dnsmasq-dns-5fffdb5d5-mx5lk\" (UID: \"148b676d-4484-44e1-8cb1-7136fdc07313\") " pod="openstack/dnsmasq-dns-5fffdb5d5-mx5lk" Sep 30 12:37:11 crc kubenswrapper[4672]: I0930 12:37:11.675034 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/148b676d-4484-44e1-8cb1-7136fdc07313-config\") pod \"dnsmasq-dns-5fffdb5d5-mx5lk\" (UID: \"148b676d-4484-44e1-8cb1-7136fdc07313\") " pod="openstack/dnsmasq-dns-5fffdb5d5-mx5lk" Sep 30 12:37:11 crc kubenswrapper[4672]: I0930 12:37:11.676356 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/148b676d-4484-44e1-8cb1-7136fdc07313-config\") pod \"dnsmasq-dns-5fffdb5d5-mx5lk\" (UID: \"148b676d-4484-44e1-8cb1-7136fdc07313\") " pod="openstack/dnsmasq-dns-5fffdb5d5-mx5lk" Sep 30 12:37:11 crc kubenswrapper[4672]: I0930 12:37:11.677114 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/148b676d-4484-44e1-8cb1-7136fdc07313-dns-svc\") pod \"dnsmasq-dns-5fffdb5d5-mx5lk\" (UID: \"148b676d-4484-44e1-8cb1-7136fdc07313\") " pod="openstack/dnsmasq-dns-5fffdb5d5-mx5lk" Sep 30 12:37:11 crc kubenswrapper[4672]: I0930 12:37:11.716187 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n27lk\" (UniqueName: \"kubernetes.io/projected/148b676d-4484-44e1-8cb1-7136fdc07313-kube-api-access-n27lk\") pod \"dnsmasq-dns-5fffdb5d5-mx5lk\" (UID: \"148b676d-4484-44e1-8cb1-7136fdc07313\") " pod="openstack/dnsmasq-dns-5fffdb5d5-mx5lk" Sep 30 12:37:11 crc kubenswrapper[4672]: I0930 12:37:11.761309 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-54d74754f7-6zpjc"] Sep 30 12:37:11 crc kubenswrapper[4672]: I0930 12:37:11.776609 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5fffdb5d5-mx5lk" Sep 30 12:37:11 crc kubenswrapper[4672]: I0930 12:37:11.777201 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5dff579849-cdrgm"] Sep 30 12:37:11 crc kubenswrapper[4672]: I0930 12:37:11.779858 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5dff579849-cdrgm" Sep 30 12:37:11 crc kubenswrapper[4672]: I0930 12:37:11.793205 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5dff579849-cdrgm"] Sep 30 12:37:11 crc kubenswrapper[4672]: I0930 12:37:11.878063 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9aae022e-aded-4f55-bfa2-9fa792516aac-config\") pod \"dnsmasq-dns-5dff579849-cdrgm\" (UID: \"9aae022e-aded-4f55-bfa2-9fa792516aac\") " pod="openstack/dnsmasq-dns-5dff579849-cdrgm" Sep 30 12:37:11 crc kubenswrapper[4672]: I0930 12:37:11.878151 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9aae022e-aded-4f55-bfa2-9fa792516aac-dns-svc\") pod \"dnsmasq-dns-5dff579849-cdrgm\" (UID: \"9aae022e-aded-4f55-bfa2-9fa792516aac\") " pod="openstack/dnsmasq-dns-5dff579849-cdrgm" Sep 30 12:37:11 crc kubenswrapper[4672]: I0930 12:37:11.878264 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gzxqj\" (UniqueName: \"kubernetes.io/projected/9aae022e-aded-4f55-bfa2-9fa792516aac-kube-api-access-gzxqj\") pod \"dnsmasq-dns-5dff579849-cdrgm\" (UID: \"9aae022e-aded-4f55-bfa2-9fa792516aac\") " pod="openstack/dnsmasq-dns-5dff579849-cdrgm" Sep 30 12:37:11 crc kubenswrapper[4672]: I0930 12:37:11.983079 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9aae022e-aded-4f55-bfa2-9fa792516aac-config\") pod \"dnsmasq-dns-5dff579849-cdrgm\" (UID: \"9aae022e-aded-4f55-bfa2-9fa792516aac\") " pod="openstack/dnsmasq-dns-5dff579849-cdrgm" Sep 30 12:37:11 crc kubenswrapper[4672]: I0930 12:37:11.983916 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9aae022e-aded-4f55-bfa2-9fa792516aac-dns-svc\") pod \"dnsmasq-dns-5dff579849-cdrgm\" (UID: \"9aae022e-aded-4f55-bfa2-9fa792516aac\") " pod="openstack/dnsmasq-dns-5dff579849-cdrgm" Sep 30 12:37:11 crc kubenswrapper[4672]: I0930 12:37:11.983947 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gzxqj\" (UniqueName: \"kubernetes.io/projected/9aae022e-aded-4f55-bfa2-9fa792516aac-kube-api-access-gzxqj\") pod \"dnsmasq-dns-5dff579849-cdrgm\" (UID: \"9aae022e-aded-4f55-bfa2-9fa792516aac\") " pod="openstack/dnsmasq-dns-5dff579849-cdrgm" Sep 30 12:37:11 crc kubenswrapper[4672]: I0930 12:37:11.985113 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9aae022e-aded-4f55-bfa2-9fa792516aac-config\") pod \"dnsmasq-dns-5dff579849-cdrgm\" (UID: \"9aae022e-aded-4f55-bfa2-9fa792516aac\") " pod="openstack/dnsmasq-dns-5dff579849-cdrgm" Sep 30 12:37:11 crc kubenswrapper[4672]: I0930 12:37:11.985843 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9aae022e-aded-4f55-bfa2-9fa792516aac-dns-svc\") pod \"dnsmasq-dns-5dff579849-cdrgm\" (UID: \"9aae022e-aded-4f55-bfa2-9fa792516aac\") " pod="openstack/dnsmasq-dns-5dff579849-cdrgm" Sep 30 12:37:12 crc kubenswrapper[4672]: I0930 12:37:12.010315 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gzxqj\" (UniqueName: 
\"kubernetes.io/projected/9aae022e-aded-4f55-bfa2-9fa792516aac-kube-api-access-gzxqj\") pod \"dnsmasq-dns-5dff579849-cdrgm\" (UID: \"9aae022e-aded-4f55-bfa2-9fa792516aac\") " pod="openstack/dnsmasq-dns-5dff579849-cdrgm" Sep 30 12:37:12 crc kubenswrapper[4672]: I0930 12:37:12.122572 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5dff579849-cdrgm" Sep 30 12:37:12 crc kubenswrapper[4672]: I0930 12:37:12.185325 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Sep 30 12:37:12 crc kubenswrapper[4672]: I0930 12:37:12.186790 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Sep 30 12:37:12 crc kubenswrapper[4672]: I0930 12:37:12.189674 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-wqzrm" Sep 30 12:37:12 crc kubenswrapper[4672]: I0930 12:37:12.189892 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Sep 30 12:37:12 crc kubenswrapper[4672]: I0930 12:37:12.190062 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Sep 30 12:37:12 crc kubenswrapper[4672]: I0930 12:37:12.190240 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Sep 30 12:37:12 crc kubenswrapper[4672]: I0930 12:37:12.190623 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Sep 30 12:37:12 crc kubenswrapper[4672]: I0930 12:37:12.191234 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Sep 30 12:37:12 crc kubenswrapper[4672]: I0930 12:37:12.192081 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Sep 30 12:37:12 crc kubenswrapper[4672]: I0930 12:37:12.193748 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Sep 30 12:37:12 crc kubenswrapper[4672]: I0930 12:37:12.216426 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-54d74754f7-6zpjc"] Sep 30 12:37:12 crc kubenswrapper[4672]: W0930 12:37:12.220861 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfe558b41_786d_42ef_9de2_d0865c8afb44.slice/crio-07ab8c6b5d4d1f2b22e4a19f0aa7ebe0b280f1bee0b537703872bd932dbf3c8c WatchSource:0}: Error finding container 07ab8c6b5d4d1f2b22e4a19f0aa7ebe0b280f1bee0b537703872bd932dbf3c8c: Status 404 returned error can't find the container with id 07ab8c6b5d4d1f2b22e4a19f0aa7ebe0b280f1bee0b537703872bd932dbf3c8c Sep 30 12:37:12 crc kubenswrapper[4672]: I0930 12:37:12.288555 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-scd4z\" (UniqueName: \"kubernetes.io/projected/2d795c24-8697-461f-9322-2c23bf7cb49b-kube-api-access-scd4z\") pod \"rabbitmq-cell1-server-0\" (UID: \"2d795c24-8697-461f-9322-2c23bf7cb49b\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 12:37:12 crc kubenswrapper[4672]: I0930 12:37:12.288601 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/2d795c24-8697-461f-9322-2c23bf7cb49b-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"2d795c24-8697-461f-9322-2c23bf7cb49b\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 12:37:12 crc kubenswrapper[4672]: I0930 12:37:12.288628 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/2d795c24-8697-461f-9322-2c23bf7cb49b-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"2d795c24-8697-461f-9322-2c23bf7cb49b\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 12:37:12 crc kubenswrapper[4672]: I0930 12:37:12.288653 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/2d795c24-8697-461f-9322-2c23bf7cb49b-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"2d795c24-8697-461f-9322-2c23bf7cb49b\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 12:37:12 crc kubenswrapper[4672]: I0930 12:37:12.288693 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/2d795c24-8697-461f-9322-2c23bf7cb49b-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"2d795c24-8697-461f-9322-2c23bf7cb49b\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 12:37:12 crc kubenswrapper[4672]: I0930 12:37:12.288727 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2d795c24-8697-461f-9322-2c23bf7cb49b-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"2d795c24-8697-461f-9322-2c23bf7cb49b\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 12:37:12 crc kubenswrapper[4672]: I0930 12:37:12.288741 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/2d795c24-8697-461f-9322-2c23bf7cb49b-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"2d795c24-8697-461f-9322-2c23bf7cb49b\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 12:37:12 crc kubenswrapper[4672]: I0930 12:37:12.288760 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/2d795c24-8697-461f-9322-2c23bf7cb49b-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"2d795c24-8697-461f-9322-2c23bf7cb49b\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 12:37:12 crc kubenswrapper[4672]: I0930 12:37:12.288779 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/2d795c24-8697-461f-9322-2c23bf7cb49b-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"2d795c24-8697-461f-9322-2c23bf7cb49b\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 12:37:12 crc kubenswrapper[4672]: I0930 12:37:12.288792 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/2d795c24-8697-461f-9322-2c23bf7cb49b-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"2d795c24-8697-461f-9322-2c23bf7cb49b\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 12:37:12 crc kubenswrapper[4672]: I0930 12:37:12.288811 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-cell1-server-0\" 
(UID: \"2d795c24-8697-461f-9322-2c23bf7cb49b\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 12:37:12 crc kubenswrapper[4672]: I0930 12:37:12.352304 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5fffdb5d5-mx5lk"] Sep 30 12:37:12 crc kubenswrapper[4672]: I0930 12:37:12.391201 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/2d795c24-8697-461f-9322-2c23bf7cb49b-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"2d795c24-8697-461f-9322-2c23bf7cb49b\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 12:37:12 crc kubenswrapper[4672]: I0930 12:37:12.391415 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2d795c24-8697-461f-9322-2c23bf7cb49b-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"2d795c24-8697-461f-9322-2c23bf7cb49b\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 12:37:12 crc kubenswrapper[4672]: I0930 12:37:12.391444 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/2d795c24-8697-461f-9322-2c23bf7cb49b-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"2d795c24-8697-461f-9322-2c23bf7cb49b\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 12:37:12 crc kubenswrapper[4672]: I0930 12:37:12.391473 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/2d795c24-8697-461f-9322-2c23bf7cb49b-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"2d795c24-8697-461f-9322-2c23bf7cb49b\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 12:37:12 crc kubenswrapper[4672]: I0930 12:37:12.391494 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/2d795c24-8697-461f-9322-2c23bf7cb49b-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"2d795c24-8697-461f-9322-2c23bf7cb49b\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 12:37:12 crc kubenswrapper[4672]: I0930 12:37:12.391513 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/2d795c24-8697-461f-9322-2c23bf7cb49b-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"2d795c24-8697-461f-9322-2c23bf7cb49b\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 12:37:12 crc kubenswrapper[4672]: I0930 12:37:12.391535 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"2d795c24-8697-461f-9322-2c23bf7cb49b\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 12:37:12 crc kubenswrapper[4672]: I0930 12:37:12.391578 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-scd4z\" (UniqueName: \"kubernetes.io/projected/2d795c24-8697-461f-9322-2c23bf7cb49b-kube-api-access-scd4z\") pod \"rabbitmq-cell1-server-0\" (UID: \"2d795c24-8697-461f-9322-2c23bf7cb49b\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 12:37:12 crc kubenswrapper[4672]: I0930 12:37:12.391613 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/2d795c24-8697-461f-9322-2c23bf7cb49b-rabbitmq-confd\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"2d795c24-8697-461f-9322-2c23bf7cb49b\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 12:37:12 crc kubenswrapper[4672]: I0930 12:37:12.391639 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/2d795c24-8697-461f-9322-2c23bf7cb49b-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"2d795c24-8697-461f-9322-2c23bf7cb49b\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 12:37:12 crc kubenswrapper[4672]: I0930 12:37:12.391666 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/2d795c24-8697-461f-9322-2c23bf7cb49b-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"2d795c24-8697-461f-9322-2c23bf7cb49b\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 12:37:12 crc kubenswrapper[4672]: I0930 12:37:12.394269 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/2d795c24-8697-461f-9322-2c23bf7cb49b-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"2d795c24-8697-461f-9322-2c23bf7cb49b\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 12:37:12 crc kubenswrapper[4672]: I0930 12:37:12.394725 4672 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"2d795c24-8697-461f-9322-2c23bf7cb49b\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/rabbitmq-cell1-server-0" Sep 30 12:37:12 crc kubenswrapper[4672]: I0930 12:37:12.394837 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2d795c24-8697-461f-9322-2c23bf7cb49b-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"2d795c24-8697-461f-9322-2c23bf7cb49b\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 12:37:12 crc kubenswrapper[4672]: I0930 12:37:12.395319 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/2d795c24-8697-461f-9322-2c23bf7cb49b-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"2d795c24-8697-461f-9322-2c23bf7cb49b\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 12:37:12 crc kubenswrapper[4672]: I0930 12:37:12.395877 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/2d795c24-8697-461f-9322-2c23bf7cb49b-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"2d795c24-8697-461f-9322-2c23bf7cb49b\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 12:37:12 crc kubenswrapper[4672]: I0930 12:37:12.396555 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/2d795c24-8697-461f-9322-2c23bf7cb49b-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"2d795c24-8697-461f-9322-2c23bf7cb49b\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 12:37:12 crc kubenswrapper[4672]: I0930 12:37:12.399826 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/2d795c24-8697-461f-9322-2c23bf7cb49b-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"2d795c24-8697-461f-9322-2c23bf7cb49b\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 12:37:12 crc kubenswrapper[4672]: I0930 
12:37:12.404048 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/2d795c24-8697-461f-9322-2c23bf7cb49b-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"2d795c24-8697-461f-9322-2c23bf7cb49b\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 12:37:12 crc kubenswrapper[4672]: I0930 12:37:12.410213 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/2d795c24-8697-461f-9322-2c23bf7cb49b-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"2d795c24-8697-461f-9322-2c23bf7cb49b\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 12:37:12 crc kubenswrapper[4672]: I0930 12:37:12.413364 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-scd4z\" (UniqueName: \"kubernetes.io/projected/2d795c24-8697-461f-9322-2c23bf7cb49b-kube-api-access-scd4z\") pod \"rabbitmq-cell1-server-0\" (UID: \"2d795c24-8697-461f-9322-2c23bf7cb49b\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 12:37:12 crc kubenswrapper[4672]: I0930 12:37:12.413930 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/2d795c24-8697-461f-9322-2c23bf7cb49b-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"2d795c24-8697-461f-9322-2c23bf7cb49b\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 12:37:12 crc kubenswrapper[4672]: I0930 12:37:12.422300 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"2d795c24-8697-461f-9322-2c23bf7cb49b\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 12:37:12 crc kubenswrapper[4672]: I0930 12:37:12.513344 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Sep 30 12:37:12 crc kubenswrapper[4672]: I0930 12:37:12.532087 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54d74754f7-6zpjc" event={"ID":"fe558b41-786d-42ef-9de2-d0865c8afb44","Type":"ContainerStarted","Data":"07ab8c6b5d4d1f2b22e4a19f0aa7ebe0b280f1bee0b537703872bd932dbf3c8c"} Sep 30 12:37:12 crc kubenswrapper[4672]: I0930 12:37:12.577743 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-notifications-server-0"] Sep 30 12:37:12 crc kubenswrapper[4672]: I0930 12:37:12.582040 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-notifications-server-0" Sep 30 12:37:12 crc kubenswrapper[4672]: I0930 12:37:12.586805 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-notifications-server-0"] Sep 30 12:37:12 crc kubenswrapper[4672]: I0930 12:37:12.588733 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-notifications-config-data" Sep 30 12:37:12 crc kubenswrapper[4672]: I0930 12:37:12.588785 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-notifications-server-dockercfg-4ltb9" Sep 30 12:37:12 crc kubenswrapper[4672]: I0930 12:37:12.588753 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-notifications-erlang-cookie" Sep 30 12:37:12 crc kubenswrapper[4672]: I0930 12:37:12.588964 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-notifications-plugins-conf" Sep 30 12:37:12 crc kubenswrapper[4672]: I0930 12:37:12.588976 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-notifications-server-conf" Sep 30 12:37:12 crc kubenswrapper[4672]: I0930 12:37:12.589149 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-notifications-svc" Sep 30 12:37:12 crc kubenswrapper[4672]: I0930 12:37:12.589359 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-notifications-default-user" Sep 30 12:37:12 crc kubenswrapper[4672]: I0930 12:37:12.596411 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5dff579849-cdrgm"] Sep 30 12:37:12 crc kubenswrapper[4672]: I0930 12:37:12.695588 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d165a3a8-6809-46e5-bd35-895200ab5bfc-rabbitmq-confd\") pod \"rabbitmq-notifications-server-0\" (UID: \"d165a3a8-6809-46e5-bd35-895200ab5bfc\") " pod="openstack/rabbitmq-notifications-server-0" Sep 30 12:37:12 crc kubenswrapper[4672]: I0930 12:37:12.695668 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7xc2s\" (UniqueName: \"kubernetes.io/projected/d165a3a8-6809-46e5-bd35-895200ab5bfc-kube-api-access-7xc2s\") pod \"rabbitmq-notifications-server-0\" (UID: \"d165a3a8-6809-46e5-bd35-895200ab5bfc\") " pod="openstack/rabbitmq-notifications-server-0" Sep 30 12:37:12 crc kubenswrapper[4672]: I0930 12:37:12.695694 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/d165a3a8-6809-46e5-bd35-895200ab5bfc-rabbitmq-tls\") pod \"rabbitmq-notifications-server-0\" (UID: \"d165a3a8-6809-46e5-bd35-895200ab5bfc\") " pod="openstack/rabbitmq-notifications-server-0" Sep 30 12:37:12 crc kubenswrapper[4672]: I0930 12:37:12.695732 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-notifications-server-0\" (UID: \"d165a3a8-6809-46e5-bd35-895200ab5bfc\") " pod="openstack/rabbitmq-notifications-server-0" Sep 30 12:37:12 crc kubenswrapper[4672]: I0930 12:37:12.695760 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: 
\"kubernetes.io/configmap/d165a3a8-6809-46e5-bd35-895200ab5bfc-plugins-conf\") pod \"rabbitmq-notifications-server-0\" (UID: \"d165a3a8-6809-46e5-bd35-895200ab5bfc\") " pod="openstack/rabbitmq-notifications-server-0" Sep 30 12:37:12 crc kubenswrapper[4672]: I0930 12:37:12.695806 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d165a3a8-6809-46e5-bd35-895200ab5bfc-pod-info\") pod \"rabbitmq-notifications-server-0\" (UID: \"d165a3a8-6809-46e5-bd35-895200ab5bfc\") " pod="openstack/rabbitmq-notifications-server-0" Sep 30 12:37:12 crc kubenswrapper[4672]: I0930 12:37:12.695829 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d165a3a8-6809-46e5-bd35-895200ab5bfc-config-data\") pod \"rabbitmq-notifications-server-0\" (UID: \"d165a3a8-6809-46e5-bd35-895200ab5bfc\") " pod="openstack/rabbitmq-notifications-server-0" Sep 30 12:37:12 crc kubenswrapper[4672]: I0930 12:37:12.695859 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d165a3a8-6809-46e5-bd35-895200ab5bfc-rabbitmq-erlang-cookie\") pod \"rabbitmq-notifications-server-0\" (UID: \"d165a3a8-6809-46e5-bd35-895200ab5bfc\") " pod="openstack/rabbitmq-notifications-server-0" Sep 30 12:37:12 crc kubenswrapper[4672]: I0930 12:37:12.695877 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/d165a3a8-6809-46e5-bd35-895200ab5bfc-server-conf\") pod \"rabbitmq-notifications-server-0\" (UID: \"d165a3a8-6809-46e5-bd35-895200ab5bfc\") " pod="openstack/rabbitmq-notifications-server-0" Sep 30 12:37:12 crc kubenswrapper[4672]: I0930 12:37:12.695898 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d165a3a8-6809-46e5-bd35-895200ab5bfc-erlang-cookie-secret\") pod \"rabbitmq-notifications-server-0\" (UID: \"d165a3a8-6809-46e5-bd35-895200ab5bfc\") " pod="openstack/rabbitmq-notifications-server-0" Sep 30 12:37:12 crc kubenswrapper[4672]: I0930 12:37:12.695915 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d165a3a8-6809-46e5-bd35-895200ab5bfc-rabbitmq-plugins\") pod \"rabbitmq-notifications-server-0\" (UID: \"d165a3a8-6809-46e5-bd35-895200ab5bfc\") " pod="openstack/rabbitmq-notifications-server-0" Sep 30 12:37:12 crc kubenswrapper[4672]: I0930 12:37:12.797468 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d165a3a8-6809-46e5-bd35-895200ab5bfc-erlang-cookie-secret\") pod \"rabbitmq-notifications-server-0\" (UID: \"d165a3a8-6809-46e5-bd35-895200ab5bfc\") " pod="openstack/rabbitmq-notifications-server-0" Sep 30 12:37:12 crc kubenswrapper[4672]: I0930 12:37:12.797518 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d165a3a8-6809-46e5-bd35-895200ab5bfc-rabbitmq-plugins\") pod \"rabbitmq-notifications-server-0\" (UID: \"d165a3a8-6809-46e5-bd35-895200ab5bfc\") " pod="openstack/rabbitmq-notifications-server-0" Sep 30 12:37:12 crc kubenswrapper[4672]: 
I0930 12:37:12.797564 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d165a3a8-6809-46e5-bd35-895200ab5bfc-rabbitmq-confd\") pod \"rabbitmq-notifications-server-0\" (UID: \"d165a3a8-6809-46e5-bd35-895200ab5bfc\") " pod="openstack/rabbitmq-notifications-server-0" Sep 30 12:37:12 crc kubenswrapper[4672]: I0930 12:37:12.797579 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7xc2s\" (UniqueName: \"kubernetes.io/projected/d165a3a8-6809-46e5-bd35-895200ab5bfc-kube-api-access-7xc2s\") pod \"rabbitmq-notifications-server-0\" (UID: \"d165a3a8-6809-46e5-bd35-895200ab5bfc\") " pod="openstack/rabbitmq-notifications-server-0" Sep 30 12:37:12 crc kubenswrapper[4672]: I0930 12:37:12.797602 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/d165a3a8-6809-46e5-bd35-895200ab5bfc-rabbitmq-tls\") pod \"rabbitmq-notifications-server-0\" (UID: \"d165a3a8-6809-46e5-bd35-895200ab5bfc\") " pod="openstack/rabbitmq-notifications-server-0" Sep 30 12:37:12 crc kubenswrapper[4672]: I0930 12:37:12.797634 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-notifications-server-0\" (UID: \"d165a3a8-6809-46e5-bd35-895200ab5bfc\") " pod="openstack/rabbitmq-notifications-server-0" Sep 30 12:37:12 crc kubenswrapper[4672]: I0930 12:37:12.797654 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d165a3a8-6809-46e5-bd35-895200ab5bfc-plugins-conf\") pod \"rabbitmq-notifications-server-0\" (UID: \"d165a3a8-6809-46e5-bd35-895200ab5bfc\") " pod="openstack/rabbitmq-notifications-server-0" Sep 30 12:37:12 crc kubenswrapper[4672]: I0930 12:37:12.797688 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d165a3a8-6809-46e5-bd35-895200ab5bfc-pod-info\") pod \"rabbitmq-notifications-server-0\" (UID: \"d165a3a8-6809-46e5-bd35-895200ab5bfc\") " pod="openstack/rabbitmq-notifications-server-0" Sep 30 12:37:12 crc kubenswrapper[4672]: I0930 12:37:12.797711 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d165a3a8-6809-46e5-bd35-895200ab5bfc-config-data\") pod \"rabbitmq-notifications-server-0\" (UID: \"d165a3a8-6809-46e5-bd35-895200ab5bfc\") " pod="openstack/rabbitmq-notifications-server-0" Sep 30 12:37:12 crc kubenswrapper[4672]: I0930 12:37:12.797737 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d165a3a8-6809-46e5-bd35-895200ab5bfc-rabbitmq-erlang-cookie\") pod \"rabbitmq-notifications-server-0\" (UID: \"d165a3a8-6809-46e5-bd35-895200ab5bfc\") " pod="openstack/rabbitmq-notifications-server-0" Sep 30 12:37:12 crc kubenswrapper[4672]: I0930 12:37:12.797757 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/d165a3a8-6809-46e5-bd35-895200ab5bfc-server-conf\") pod \"rabbitmq-notifications-server-0\" (UID: \"d165a3a8-6809-46e5-bd35-895200ab5bfc\") " pod="openstack/rabbitmq-notifications-server-0" Sep 30 12:37:12 crc kubenswrapper[4672]: I0930 12:37:12.798135 
4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d165a3a8-6809-46e5-bd35-895200ab5bfc-rabbitmq-plugins\") pod \"rabbitmq-notifications-server-0\" (UID: \"d165a3a8-6809-46e5-bd35-895200ab5bfc\") " pod="openstack/rabbitmq-notifications-server-0" Sep 30 12:37:12 crc kubenswrapper[4672]: I0930 12:37:12.798980 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d165a3a8-6809-46e5-bd35-895200ab5bfc-plugins-conf\") pod \"rabbitmq-notifications-server-0\" (UID: \"d165a3a8-6809-46e5-bd35-895200ab5bfc\") " pod="openstack/rabbitmq-notifications-server-0" Sep 30 12:37:12 crc kubenswrapper[4672]: I0930 12:37:12.799117 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/d165a3a8-6809-46e5-bd35-895200ab5bfc-server-conf\") pod \"rabbitmq-notifications-server-0\" (UID: \"d165a3a8-6809-46e5-bd35-895200ab5bfc\") " pod="openstack/rabbitmq-notifications-server-0" Sep 30 12:37:12 crc kubenswrapper[4672]: I0930 12:37:12.800031 4672 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-notifications-server-0\" (UID: \"d165a3a8-6809-46e5-bd35-895200ab5bfc\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/rabbitmq-notifications-server-0" Sep 30 12:37:12 crc kubenswrapper[4672]: I0930 12:37:12.800403 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d165a3a8-6809-46e5-bd35-895200ab5bfc-rabbitmq-erlang-cookie\") pod \"rabbitmq-notifications-server-0\" (UID: \"d165a3a8-6809-46e5-bd35-895200ab5bfc\") " pod="openstack/rabbitmq-notifications-server-0" Sep 30 12:37:12 crc kubenswrapper[4672]: I0930 12:37:12.801342 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d165a3a8-6809-46e5-bd35-895200ab5bfc-erlang-cookie-secret\") pod \"rabbitmq-notifications-server-0\" (UID: \"d165a3a8-6809-46e5-bd35-895200ab5bfc\") " pod="openstack/rabbitmq-notifications-server-0" Sep 30 12:37:12 crc kubenswrapper[4672]: I0930 12:37:12.802996 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d165a3a8-6809-46e5-bd35-895200ab5bfc-rabbitmq-confd\") pod \"rabbitmq-notifications-server-0\" (UID: \"d165a3a8-6809-46e5-bd35-895200ab5bfc\") " pod="openstack/rabbitmq-notifications-server-0" Sep 30 12:37:12 crc kubenswrapper[4672]: I0930 12:37:12.803079 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d165a3a8-6809-46e5-bd35-895200ab5bfc-config-data\") pod \"rabbitmq-notifications-server-0\" (UID: \"d165a3a8-6809-46e5-bd35-895200ab5bfc\") " pod="openstack/rabbitmq-notifications-server-0" Sep 30 12:37:12 crc kubenswrapper[4672]: I0930 12:37:12.804781 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/d165a3a8-6809-46e5-bd35-895200ab5bfc-rabbitmq-tls\") pod \"rabbitmq-notifications-server-0\" (UID: \"d165a3a8-6809-46e5-bd35-895200ab5bfc\") " pod="openstack/rabbitmq-notifications-server-0" Sep 30 12:37:12 crc kubenswrapper[4672]: I0930 12:37:12.812017 4672 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d165a3a8-6809-46e5-bd35-895200ab5bfc-pod-info\") pod \"rabbitmq-notifications-server-0\" (UID: \"d165a3a8-6809-46e5-bd35-895200ab5bfc\") " pod="openstack/rabbitmq-notifications-server-0" Sep 30 12:37:12 crc kubenswrapper[4672]: I0930 12:37:12.820162 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7xc2s\" (UniqueName: \"kubernetes.io/projected/d165a3a8-6809-46e5-bd35-895200ab5bfc-kube-api-access-7xc2s\") pod \"rabbitmq-notifications-server-0\" (UID: \"d165a3a8-6809-46e5-bd35-895200ab5bfc\") " pod="openstack/rabbitmq-notifications-server-0" Sep 30 12:37:12 crc kubenswrapper[4672]: I0930 12:37:12.827573 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-notifications-server-0\" (UID: \"d165a3a8-6809-46e5-bd35-895200ab5bfc\") " pod="openstack/rabbitmq-notifications-server-0" Sep 30 12:37:12 crc kubenswrapper[4672]: I0930 12:37:12.890930 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Sep 30 12:37:12 crc kubenswrapper[4672]: I0930 12:37:12.892873 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Sep 30 12:37:12 crc kubenswrapper[4672]: I0930 12:37:12.902589 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Sep 30 12:37:12 crc kubenswrapper[4672]: I0930 12:37:12.902752 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Sep 30 12:37:12 crc kubenswrapper[4672]: I0930 12:37:12.903008 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Sep 30 12:37:12 crc kubenswrapper[4672]: I0930 12:37:12.903175 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Sep 30 12:37:12 crc kubenswrapper[4672]: I0930 12:37:12.903404 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Sep 30 12:37:12 crc kubenswrapper[4672]: I0930 12:37:12.907680 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-6fkml" Sep 30 12:37:12 crc kubenswrapper[4672]: I0930 12:37:12.907908 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Sep 30 12:37:12 crc kubenswrapper[4672]: I0930 12:37:12.916483 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Sep 30 12:37:12 crc kubenswrapper[4672]: I0930 12:37:12.917171 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-notifications-server-0" Sep 30 12:37:13 crc kubenswrapper[4672]: I0930 12:37:13.000913 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/9aea18e8-190e-470a-9330-a30621c96afd-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"9aea18e8-190e-470a-9330-a30621c96afd\") " pod="openstack/rabbitmq-server-0" Sep 30 12:37:13 crc kubenswrapper[4672]: I0930 12:37:13.000978 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jzb8p\" (UniqueName: \"kubernetes.io/projected/9aea18e8-190e-470a-9330-a30621c96afd-kube-api-access-jzb8p\") pod \"rabbitmq-server-0\" (UID: \"9aea18e8-190e-470a-9330-a30621c96afd\") " pod="openstack/rabbitmq-server-0" Sep 30 12:37:13 crc kubenswrapper[4672]: I0930 12:37:13.001083 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/9aea18e8-190e-470a-9330-a30621c96afd-server-conf\") pod \"rabbitmq-server-0\" (UID: \"9aea18e8-190e-470a-9330-a30621c96afd\") " pod="openstack/rabbitmq-server-0" Sep 30 12:37:13 crc kubenswrapper[4672]: I0930 12:37:13.001177 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/9aea18e8-190e-470a-9330-a30621c96afd-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"9aea18e8-190e-470a-9330-a30621c96afd\") " pod="openstack/rabbitmq-server-0" Sep 30 12:37:13 crc kubenswrapper[4672]: I0930 12:37:13.001239 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/9aea18e8-190e-470a-9330-a30621c96afd-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"9aea18e8-190e-470a-9330-a30621c96afd\") " pod="openstack/rabbitmq-server-0" Sep 30 12:37:13 crc kubenswrapper[4672]: I0930 12:37:13.001317 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/9aea18e8-190e-470a-9330-a30621c96afd-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"9aea18e8-190e-470a-9330-a30621c96afd\") " pod="openstack/rabbitmq-server-0" Sep 30 12:37:13 crc kubenswrapper[4672]: I0930 12:37:13.001352 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/9aea18e8-190e-470a-9330-a30621c96afd-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"9aea18e8-190e-470a-9330-a30621c96afd\") " pod="openstack/rabbitmq-server-0" Sep 30 12:37:13 crc kubenswrapper[4672]: I0930 12:37:13.001409 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9aea18e8-190e-470a-9330-a30621c96afd-config-data\") pod \"rabbitmq-server-0\" (UID: \"9aea18e8-190e-470a-9330-a30621c96afd\") " pod="openstack/rabbitmq-server-0" Sep 30 12:37:13 crc kubenswrapper[4672]: I0930 12:37:13.001452 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"9aea18e8-190e-470a-9330-a30621c96afd\") " pod="openstack/rabbitmq-server-0" Sep 30 
12:37:13 crc kubenswrapper[4672]: I0930 12:37:13.001516 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/9aea18e8-190e-470a-9330-a30621c96afd-pod-info\") pod \"rabbitmq-server-0\" (UID: \"9aea18e8-190e-470a-9330-a30621c96afd\") " pod="openstack/rabbitmq-server-0" Sep 30 12:37:13 crc kubenswrapper[4672]: I0930 12:37:13.001537 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/9aea18e8-190e-470a-9330-a30621c96afd-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"9aea18e8-190e-470a-9330-a30621c96afd\") " pod="openstack/rabbitmq-server-0" Sep 30 12:37:13 crc kubenswrapper[4672]: I0930 12:37:13.102375 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/9aea18e8-190e-470a-9330-a30621c96afd-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"9aea18e8-190e-470a-9330-a30621c96afd\") " pod="openstack/rabbitmq-server-0" Sep 30 12:37:13 crc kubenswrapper[4672]: I0930 12:37:13.102425 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/9aea18e8-190e-470a-9330-a30621c96afd-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"9aea18e8-190e-470a-9330-a30621c96afd\") " pod="openstack/rabbitmq-server-0" Sep 30 12:37:13 crc kubenswrapper[4672]: I0930 12:37:13.102462 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9aea18e8-190e-470a-9330-a30621c96afd-config-data\") pod \"rabbitmq-server-0\" (UID: \"9aea18e8-190e-470a-9330-a30621c96afd\") " pod="openstack/rabbitmq-server-0" Sep 30 12:37:13 crc kubenswrapper[4672]: I0930 12:37:13.102498 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"9aea18e8-190e-470a-9330-a30621c96afd\") " pod="openstack/rabbitmq-server-0" Sep 30 12:37:13 crc kubenswrapper[4672]: I0930 12:37:13.102528 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/9aea18e8-190e-470a-9330-a30621c96afd-pod-info\") pod \"rabbitmq-server-0\" (UID: \"9aea18e8-190e-470a-9330-a30621c96afd\") " pod="openstack/rabbitmq-server-0" Sep 30 12:37:13 crc kubenswrapper[4672]: I0930 12:37:13.102543 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/9aea18e8-190e-470a-9330-a30621c96afd-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"9aea18e8-190e-470a-9330-a30621c96afd\") " pod="openstack/rabbitmq-server-0" Sep 30 12:37:13 crc kubenswrapper[4672]: I0930 12:37:13.102559 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/9aea18e8-190e-470a-9330-a30621c96afd-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"9aea18e8-190e-470a-9330-a30621c96afd\") " pod="openstack/rabbitmq-server-0" Sep 30 12:37:13 crc kubenswrapper[4672]: I0930 12:37:13.102583 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jzb8p\" (UniqueName: 
\"kubernetes.io/projected/9aea18e8-190e-470a-9330-a30621c96afd-kube-api-access-jzb8p\") pod \"rabbitmq-server-0\" (UID: \"9aea18e8-190e-470a-9330-a30621c96afd\") " pod="openstack/rabbitmq-server-0" Sep 30 12:37:13 crc kubenswrapper[4672]: I0930 12:37:13.102602 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/9aea18e8-190e-470a-9330-a30621c96afd-server-conf\") pod \"rabbitmq-server-0\" (UID: \"9aea18e8-190e-470a-9330-a30621c96afd\") " pod="openstack/rabbitmq-server-0" Sep 30 12:37:13 crc kubenswrapper[4672]: I0930 12:37:13.102631 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/9aea18e8-190e-470a-9330-a30621c96afd-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"9aea18e8-190e-470a-9330-a30621c96afd\") " pod="openstack/rabbitmq-server-0" Sep 30 12:37:13 crc kubenswrapper[4672]: I0930 12:37:13.102657 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/9aea18e8-190e-470a-9330-a30621c96afd-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"9aea18e8-190e-470a-9330-a30621c96afd\") " pod="openstack/rabbitmq-server-0" Sep 30 12:37:13 crc kubenswrapper[4672]: I0930 12:37:13.104713 4672 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"9aea18e8-190e-470a-9330-a30621c96afd\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/rabbitmq-server-0" Sep 30 12:37:13 crc kubenswrapper[4672]: I0930 12:37:13.105445 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/9aea18e8-190e-470a-9330-a30621c96afd-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"9aea18e8-190e-470a-9330-a30621c96afd\") " pod="openstack/rabbitmq-server-0" Sep 30 12:37:13 crc kubenswrapper[4672]: I0930 12:37:13.105529 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/9aea18e8-190e-470a-9330-a30621c96afd-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"9aea18e8-190e-470a-9330-a30621c96afd\") " pod="openstack/rabbitmq-server-0" Sep 30 12:37:13 crc kubenswrapper[4672]: I0930 12:37:13.105584 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/9aea18e8-190e-470a-9330-a30621c96afd-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"9aea18e8-190e-470a-9330-a30621c96afd\") " pod="openstack/rabbitmq-server-0" Sep 30 12:37:13 crc kubenswrapper[4672]: I0930 12:37:13.105947 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9aea18e8-190e-470a-9330-a30621c96afd-config-data\") pod \"rabbitmq-server-0\" (UID: \"9aea18e8-190e-470a-9330-a30621c96afd\") " pod="openstack/rabbitmq-server-0" Sep 30 12:37:13 crc kubenswrapper[4672]: I0930 12:37:13.106205 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/9aea18e8-190e-470a-9330-a30621c96afd-server-conf\") pod \"rabbitmq-server-0\" (UID: \"9aea18e8-190e-470a-9330-a30621c96afd\") " pod="openstack/rabbitmq-server-0" Sep 30 12:37:13 crc kubenswrapper[4672]: I0930 12:37:13.106551 4672 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/9aea18e8-190e-470a-9330-a30621c96afd-pod-info\") pod \"rabbitmq-server-0\" (UID: \"9aea18e8-190e-470a-9330-a30621c96afd\") " pod="openstack/rabbitmq-server-0" Sep 30 12:37:13 crc kubenswrapper[4672]: I0930 12:37:13.106946 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/9aea18e8-190e-470a-9330-a30621c96afd-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"9aea18e8-190e-470a-9330-a30621c96afd\") " pod="openstack/rabbitmq-server-0" Sep 30 12:37:13 crc kubenswrapper[4672]: I0930 12:37:13.108903 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/9aea18e8-190e-470a-9330-a30621c96afd-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"9aea18e8-190e-470a-9330-a30621c96afd\") " pod="openstack/rabbitmq-server-0" Sep 30 12:37:13 crc kubenswrapper[4672]: I0930 12:37:13.112304 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/9aea18e8-190e-470a-9330-a30621c96afd-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"9aea18e8-190e-470a-9330-a30621c96afd\") " pod="openstack/rabbitmq-server-0" Sep 30 12:37:13 crc kubenswrapper[4672]: I0930 12:37:13.122381 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jzb8p\" (UniqueName: \"kubernetes.io/projected/9aea18e8-190e-470a-9330-a30621c96afd-kube-api-access-jzb8p\") pod \"rabbitmq-server-0\" (UID: \"9aea18e8-190e-470a-9330-a30621c96afd\") " pod="openstack/rabbitmq-server-0" Sep 30 12:37:13 crc kubenswrapper[4672]: I0930 12:37:13.128115 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"9aea18e8-190e-470a-9330-a30621c96afd\") " pod="openstack/rabbitmq-server-0" Sep 30 12:37:13 crc kubenswrapper[4672]: I0930 12:37:13.265742 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Sep 30 12:37:14 crc kubenswrapper[4672]: I0930 12:37:14.897018 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Sep 30 12:37:14 crc kubenswrapper[4672]: I0930 12:37:14.900246 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Sep 30 12:37:14 crc kubenswrapper[4672]: I0930 12:37:14.904771 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Sep 30 12:37:14 crc kubenswrapper[4672]: I0930 12:37:14.905757 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Sep 30 12:37:14 crc kubenswrapper[4672]: I0930 12:37:14.905885 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-t5r75" Sep 30 12:37:14 crc kubenswrapper[4672]: I0930 12:37:14.905992 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Sep 30 12:37:14 crc kubenswrapper[4672]: I0930 12:37:14.906089 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Sep 30 12:37:14 crc kubenswrapper[4672]: I0930 12:37:14.914003 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Sep 30 12:37:14 crc kubenswrapper[4672]: I0930 12:37:14.917944 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Sep 30 12:37:14 crc kubenswrapper[4672]: I0930 12:37:14.955865 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/056e0424-1faf-4d5a-8aea-e351214b3394-secrets\") pod \"openstack-galera-0\" (UID: \"056e0424-1faf-4d5a-8aea-e351214b3394\") " pod="openstack/openstack-galera-0" Sep 30 12:37:14 crc kubenswrapper[4672]: I0930 12:37:14.955965 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-62q2k\" (UniqueName: \"kubernetes.io/projected/056e0424-1faf-4d5a-8aea-e351214b3394-kube-api-access-62q2k\") pod \"openstack-galera-0\" (UID: \"056e0424-1faf-4d5a-8aea-e351214b3394\") " pod="openstack/openstack-galera-0" Sep 30 12:37:14 crc kubenswrapper[4672]: I0930 12:37:14.955998 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/056e0424-1faf-4d5a-8aea-e351214b3394-operator-scripts\") pod \"openstack-galera-0\" (UID: \"056e0424-1faf-4d5a-8aea-e351214b3394\") " pod="openstack/openstack-galera-0" Sep 30 12:37:14 crc kubenswrapper[4672]: I0930 12:37:14.956020 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/056e0424-1faf-4d5a-8aea-e351214b3394-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"056e0424-1faf-4d5a-8aea-e351214b3394\") " pod="openstack/openstack-galera-0" Sep 30 12:37:14 crc kubenswrapper[4672]: I0930 12:37:14.956042 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/056e0424-1faf-4d5a-8aea-e351214b3394-config-data-generated\") pod \"openstack-galera-0\" (UID: \"056e0424-1faf-4d5a-8aea-e351214b3394\") " pod="openstack/openstack-galera-0" Sep 30 12:37:14 crc kubenswrapper[4672]: I0930 12:37:14.956076 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-galera-0\" (UID: \"056e0424-1faf-4d5a-8aea-e351214b3394\") " pod="openstack/openstack-galera-0" Sep 30 12:37:14 
crc kubenswrapper[4672]: I0930 12:37:14.956097 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/056e0424-1faf-4d5a-8aea-e351214b3394-config-data-default\") pod \"openstack-galera-0\" (UID: \"056e0424-1faf-4d5a-8aea-e351214b3394\") " pod="openstack/openstack-galera-0" Sep 30 12:37:14 crc kubenswrapper[4672]: I0930 12:37:14.956118 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/056e0424-1faf-4d5a-8aea-e351214b3394-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"056e0424-1faf-4d5a-8aea-e351214b3394\") " pod="openstack/openstack-galera-0" Sep 30 12:37:14 crc kubenswrapper[4672]: I0930 12:37:14.956156 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/056e0424-1faf-4d5a-8aea-e351214b3394-kolla-config\") pod \"openstack-galera-0\" (UID: \"056e0424-1faf-4d5a-8aea-e351214b3394\") " pod="openstack/openstack-galera-0" Sep 30 12:37:15 crc kubenswrapper[4672]: I0930 12:37:15.058356 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/056e0424-1faf-4d5a-8aea-e351214b3394-kolla-config\") pod \"openstack-galera-0\" (UID: \"056e0424-1faf-4d5a-8aea-e351214b3394\") " pod="openstack/openstack-galera-0" Sep 30 12:37:15 crc kubenswrapper[4672]: I0930 12:37:15.058980 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/056e0424-1faf-4d5a-8aea-e351214b3394-kolla-config\") pod \"openstack-galera-0\" (UID: \"056e0424-1faf-4d5a-8aea-e351214b3394\") " pod="openstack/openstack-galera-0" Sep 30 12:37:15 crc kubenswrapper[4672]: I0930 12:37:15.059713 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/056e0424-1faf-4d5a-8aea-e351214b3394-secrets\") pod \"openstack-galera-0\" (UID: \"056e0424-1faf-4d5a-8aea-e351214b3394\") " pod="openstack/openstack-galera-0" Sep 30 12:37:15 crc kubenswrapper[4672]: I0930 12:37:15.059787 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-62q2k\" (UniqueName: \"kubernetes.io/projected/056e0424-1faf-4d5a-8aea-e351214b3394-kube-api-access-62q2k\") pod \"openstack-galera-0\" (UID: \"056e0424-1faf-4d5a-8aea-e351214b3394\") " pod="openstack/openstack-galera-0" Sep 30 12:37:15 crc kubenswrapper[4672]: I0930 12:37:15.059813 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/056e0424-1faf-4d5a-8aea-e351214b3394-operator-scripts\") pod \"openstack-galera-0\" (UID: \"056e0424-1faf-4d5a-8aea-e351214b3394\") " pod="openstack/openstack-galera-0" Sep 30 12:37:15 crc kubenswrapper[4672]: I0930 12:37:15.059839 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/056e0424-1faf-4d5a-8aea-e351214b3394-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"056e0424-1faf-4d5a-8aea-e351214b3394\") " pod="openstack/openstack-galera-0" Sep 30 12:37:15 crc kubenswrapper[4672]: I0930 12:37:15.059875 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: 
\"kubernetes.io/empty-dir/056e0424-1faf-4d5a-8aea-e351214b3394-config-data-generated\") pod \"openstack-galera-0\" (UID: \"056e0424-1faf-4d5a-8aea-e351214b3394\") " pod="openstack/openstack-galera-0" Sep 30 12:37:15 crc kubenswrapper[4672]: I0930 12:37:15.059932 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-galera-0\" (UID: \"056e0424-1faf-4d5a-8aea-e351214b3394\") " pod="openstack/openstack-galera-0" Sep 30 12:37:15 crc kubenswrapper[4672]: I0930 12:37:15.059964 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/056e0424-1faf-4d5a-8aea-e351214b3394-config-data-default\") pod \"openstack-galera-0\" (UID: \"056e0424-1faf-4d5a-8aea-e351214b3394\") " pod="openstack/openstack-galera-0" Sep 30 12:37:15 crc kubenswrapper[4672]: I0930 12:37:15.059978 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/056e0424-1faf-4d5a-8aea-e351214b3394-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"056e0424-1faf-4d5a-8aea-e351214b3394\") " pod="openstack/openstack-galera-0" Sep 30 12:37:15 crc kubenswrapper[4672]: I0930 12:37:15.060988 4672 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-galera-0\" (UID: \"056e0424-1faf-4d5a-8aea-e351214b3394\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/openstack-galera-0" Sep 30 12:37:15 crc kubenswrapper[4672]: I0930 12:37:15.061455 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/056e0424-1faf-4d5a-8aea-e351214b3394-config-data-default\") pod \"openstack-galera-0\" (UID: \"056e0424-1faf-4d5a-8aea-e351214b3394\") " pod="openstack/openstack-galera-0" Sep 30 12:37:15 crc kubenswrapper[4672]: I0930 12:37:15.061799 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/056e0424-1faf-4d5a-8aea-e351214b3394-operator-scripts\") pod \"openstack-galera-0\" (UID: \"056e0424-1faf-4d5a-8aea-e351214b3394\") " pod="openstack/openstack-galera-0" Sep 30 12:37:15 crc kubenswrapper[4672]: I0930 12:37:15.064101 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/056e0424-1faf-4d5a-8aea-e351214b3394-config-data-generated\") pod \"openstack-galera-0\" (UID: \"056e0424-1faf-4d5a-8aea-e351214b3394\") " pod="openstack/openstack-galera-0" Sep 30 12:37:15 crc kubenswrapper[4672]: I0930 12:37:15.066408 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/056e0424-1faf-4d5a-8aea-e351214b3394-secrets\") pod \"openstack-galera-0\" (UID: \"056e0424-1faf-4d5a-8aea-e351214b3394\") " pod="openstack/openstack-galera-0" Sep 30 12:37:15 crc kubenswrapper[4672]: I0930 12:37:15.071468 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/056e0424-1faf-4d5a-8aea-e351214b3394-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"056e0424-1faf-4d5a-8aea-e351214b3394\") " pod="openstack/openstack-galera-0" Sep 30 12:37:15 crc kubenswrapper[4672]: I0930 
12:37:15.082718 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-62q2k\" (UniqueName: \"kubernetes.io/projected/056e0424-1faf-4d5a-8aea-e351214b3394-kube-api-access-62q2k\") pod \"openstack-galera-0\" (UID: \"056e0424-1faf-4d5a-8aea-e351214b3394\") " pod="openstack/openstack-galera-0" Sep 30 12:37:15 crc kubenswrapper[4672]: I0930 12:37:15.089048 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/056e0424-1faf-4d5a-8aea-e351214b3394-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"056e0424-1faf-4d5a-8aea-e351214b3394\") " pod="openstack/openstack-galera-0" Sep 30 12:37:15 crc kubenswrapper[4672]: I0930 12:37:15.091532 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-galera-0\" (UID: \"056e0424-1faf-4d5a-8aea-e351214b3394\") " pod="openstack/openstack-galera-0" Sep 30 12:37:15 crc kubenswrapper[4672]: I0930 12:37:15.245917 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Sep 30 12:37:16 crc kubenswrapper[4672]: I0930 12:37:16.312401 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Sep 30 12:37:16 crc kubenswrapper[4672]: I0930 12:37:16.314872 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Sep 30 12:37:16 crc kubenswrapper[4672]: I0930 12:37:16.320583 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-fssh5" Sep 30 12:37:16 crc kubenswrapper[4672]: I0930 12:37:16.321330 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Sep 30 12:37:16 crc kubenswrapper[4672]: I0930 12:37:16.322745 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Sep 30 12:37:16 crc kubenswrapper[4672]: I0930 12:37:16.323924 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Sep 30 12:37:16 crc kubenswrapper[4672]: I0930 12:37:16.328976 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Sep 30 12:37:16 crc kubenswrapper[4672]: I0930 12:37:16.388900 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-cell1-galera-0\" (UID: \"9159d76a-52b7-4262-a56a-ed28caec7f97\") " pod="openstack/openstack-cell1-galera-0" Sep 30 12:37:16 crc kubenswrapper[4672]: I0930 12:37:16.388955 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lcrxk\" (UniqueName: \"kubernetes.io/projected/9159d76a-52b7-4262-a56a-ed28caec7f97-kube-api-access-lcrxk\") pod \"openstack-cell1-galera-0\" (UID: \"9159d76a-52b7-4262-a56a-ed28caec7f97\") " pod="openstack/openstack-cell1-galera-0" Sep 30 12:37:16 crc kubenswrapper[4672]: I0930 12:37:16.388990 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9159d76a-52b7-4262-a56a-ed28caec7f97-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: 
\"9159d76a-52b7-4262-a56a-ed28caec7f97\") " pod="openstack/openstack-cell1-galera-0" Sep 30 12:37:16 crc kubenswrapper[4672]: I0930 12:37:16.389027 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9159d76a-52b7-4262-a56a-ed28caec7f97-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"9159d76a-52b7-4262-a56a-ed28caec7f97\") " pod="openstack/openstack-cell1-galera-0" Sep 30 12:37:16 crc kubenswrapper[4672]: I0930 12:37:16.389054 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/9159d76a-52b7-4262-a56a-ed28caec7f97-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"9159d76a-52b7-4262-a56a-ed28caec7f97\") " pod="openstack/openstack-cell1-galera-0" Sep 30 12:37:16 crc kubenswrapper[4672]: I0930 12:37:16.389083 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/9159d76a-52b7-4262-a56a-ed28caec7f97-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"9159d76a-52b7-4262-a56a-ed28caec7f97\") " pod="openstack/openstack-cell1-galera-0" Sep 30 12:37:16 crc kubenswrapper[4672]: I0930 12:37:16.389118 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/9159d76a-52b7-4262-a56a-ed28caec7f97-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"9159d76a-52b7-4262-a56a-ed28caec7f97\") " pod="openstack/openstack-cell1-galera-0" Sep 30 12:37:16 crc kubenswrapper[4672]: I0930 12:37:16.389147 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/9159d76a-52b7-4262-a56a-ed28caec7f97-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"9159d76a-52b7-4262-a56a-ed28caec7f97\") " pod="openstack/openstack-cell1-galera-0" Sep 30 12:37:16 crc kubenswrapper[4672]: I0930 12:37:16.389170 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/9159d76a-52b7-4262-a56a-ed28caec7f97-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"9159d76a-52b7-4262-a56a-ed28caec7f97\") " pod="openstack/openstack-cell1-galera-0" Sep 30 12:37:16 crc kubenswrapper[4672]: I0930 12:37:16.492565 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9159d76a-52b7-4262-a56a-ed28caec7f97-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"9159d76a-52b7-4262-a56a-ed28caec7f97\") " pod="openstack/openstack-cell1-galera-0" Sep 30 12:37:16 crc kubenswrapper[4672]: I0930 12:37:16.492644 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/9159d76a-52b7-4262-a56a-ed28caec7f97-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"9159d76a-52b7-4262-a56a-ed28caec7f97\") " pod="openstack/openstack-cell1-galera-0" Sep 30 12:37:16 crc kubenswrapper[4672]: I0930 12:37:16.492689 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/9159d76a-52b7-4262-a56a-ed28caec7f97-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: 
\"9159d76a-52b7-4262-a56a-ed28caec7f97\") " pod="openstack/openstack-cell1-galera-0" Sep 30 12:37:16 crc kubenswrapper[4672]: I0930 12:37:16.492750 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/9159d76a-52b7-4262-a56a-ed28caec7f97-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"9159d76a-52b7-4262-a56a-ed28caec7f97\") " pod="openstack/openstack-cell1-galera-0" Sep 30 12:37:16 crc kubenswrapper[4672]: I0930 12:37:16.492783 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/9159d76a-52b7-4262-a56a-ed28caec7f97-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"9159d76a-52b7-4262-a56a-ed28caec7f97\") " pod="openstack/openstack-cell1-galera-0" Sep 30 12:37:16 crc kubenswrapper[4672]: I0930 12:37:16.492806 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/9159d76a-52b7-4262-a56a-ed28caec7f97-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"9159d76a-52b7-4262-a56a-ed28caec7f97\") " pod="openstack/openstack-cell1-galera-0" Sep 30 12:37:16 crc kubenswrapper[4672]: I0930 12:37:16.492852 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-cell1-galera-0\" (UID: \"9159d76a-52b7-4262-a56a-ed28caec7f97\") " pod="openstack/openstack-cell1-galera-0" Sep 30 12:37:16 crc kubenswrapper[4672]: I0930 12:37:16.492877 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lcrxk\" (UniqueName: \"kubernetes.io/projected/9159d76a-52b7-4262-a56a-ed28caec7f97-kube-api-access-lcrxk\") pod \"openstack-cell1-galera-0\" (UID: \"9159d76a-52b7-4262-a56a-ed28caec7f97\") " pod="openstack/openstack-cell1-galera-0" Sep 30 12:37:16 crc kubenswrapper[4672]: I0930 12:37:16.492908 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9159d76a-52b7-4262-a56a-ed28caec7f97-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"9159d76a-52b7-4262-a56a-ed28caec7f97\") " pod="openstack/openstack-cell1-galera-0" Sep 30 12:37:16 crc kubenswrapper[4672]: I0930 12:37:16.494015 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9159d76a-52b7-4262-a56a-ed28caec7f97-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"9159d76a-52b7-4262-a56a-ed28caec7f97\") " pod="openstack/openstack-cell1-galera-0" Sep 30 12:37:16 crc kubenswrapper[4672]: I0930 12:37:16.494108 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/9159d76a-52b7-4262-a56a-ed28caec7f97-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"9159d76a-52b7-4262-a56a-ed28caec7f97\") " pod="openstack/openstack-cell1-galera-0" Sep 30 12:37:16 crc kubenswrapper[4672]: I0930 12:37:16.494365 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/9159d76a-52b7-4262-a56a-ed28caec7f97-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"9159d76a-52b7-4262-a56a-ed28caec7f97\") " pod="openstack/openstack-cell1-galera-0" Sep 30 12:37:16 crc kubenswrapper[4672]: I0930 
12:37:16.495023 4672 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-cell1-galera-0\" (UID: \"9159d76a-52b7-4262-a56a-ed28caec7f97\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/openstack-cell1-galera-0" Sep 30 12:37:16 crc kubenswrapper[4672]: I0930 12:37:16.495050 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/9159d76a-52b7-4262-a56a-ed28caec7f97-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"9159d76a-52b7-4262-a56a-ed28caec7f97\") " pod="openstack/openstack-cell1-galera-0" Sep 30 12:37:16 crc kubenswrapper[4672]: I0930 12:37:16.500744 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/9159d76a-52b7-4262-a56a-ed28caec7f97-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"9159d76a-52b7-4262-a56a-ed28caec7f97\") " pod="openstack/openstack-cell1-galera-0" Sep 30 12:37:16 crc kubenswrapper[4672]: I0930 12:37:16.501119 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9159d76a-52b7-4262-a56a-ed28caec7f97-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"9159d76a-52b7-4262-a56a-ed28caec7f97\") " pod="openstack/openstack-cell1-galera-0" Sep 30 12:37:16 crc kubenswrapper[4672]: I0930 12:37:16.524451 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/9159d76a-52b7-4262-a56a-ed28caec7f97-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"9159d76a-52b7-4262-a56a-ed28caec7f97\") " pod="openstack/openstack-cell1-galera-0" Sep 30 12:37:16 crc kubenswrapper[4672]: I0930 12:37:16.532702 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lcrxk\" (UniqueName: \"kubernetes.io/projected/9159d76a-52b7-4262-a56a-ed28caec7f97-kube-api-access-lcrxk\") pod \"openstack-cell1-galera-0\" (UID: \"9159d76a-52b7-4262-a56a-ed28caec7f97\") " pod="openstack/openstack-cell1-galera-0" Sep 30 12:37:16 crc kubenswrapper[4672]: I0930 12:37:16.546531 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-cell1-galera-0\" (UID: \"9159d76a-52b7-4262-a56a-ed28caec7f97\") " pod="openstack/openstack-cell1-galera-0" Sep 30 12:37:16 crc kubenswrapper[4672]: I0930 12:37:16.641402 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Sep 30 12:37:16 crc kubenswrapper[4672]: I0930 12:37:16.679694 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Sep 30 12:37:16 crc kubenswrapper[4672]: I0930 12:37:16.681294 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Sep 30 12:37:16 crc kubenswrapper[4672]: I0930 12:37:16.687761 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Sep 30 12:37:16 crc kubenswrapper[4672]: I0930 12:37:16.688037 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-pbcbs" Sep 30 12:37:16 crc kubenswrapper[4672]: I0930 12:37:16.688170 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Sep 30 12:37:16 crc kubenswrapper[4672]: I0930 12:37:16.713558 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Sep 30 12:37:16 crc kubenswrapper[4672]: I0930 12:37:16.799125 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34e59a30-14b8-4736-87b1-9d9581094598-combined-ca-bundle\") pod \"memcached-0\" (UID: \"34e59a30-14b8-4736-87b1-9d9581094598\") " pod="openstack/memcached-0" Sep 30 12:37:16 crc kubenswrapper[4672]: I0930 12:37:16.799172 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/34e59a30-14b8-4736-87b1-9d9581094598-kolla-config\") pod \"memcached-0\" (UID: \"34e59a30-14b8-4736-87b1-9d9581094598\") " pod="openstack/memcached-0" Sep 30 12:37:16 crc kubenswrapper[4672]: I0930 12:37:16.799189 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nrnzj\" (UniqueName: \"kubernetes.io/projected/34e59a30-14b8-4736-87b1-9d9581094598-kube-api-access-nrnzj\") pod \"memcached-0\" (UID: \"34e59a30-14b8-4736-87b1-9d9581094598\") " pod="openstack/memcached-0" Sep 30 12:37:16 crc kubenswrapper[4672]: I0930 12:37:16.799334 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/34e59a30-14b8-4736-87b1-9d9581094598-memcached-tls-certs\") pod \"memcached-0\" (UID: \"34e59a30-14b8-4736-87b1-9d9581094598\") " pod="openstack/memcached-0" Sep 30 12:37:16 crc kubenswrapper[4672]: I0930 12:37:16.799407 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/34e59a30-14b8-4736-87b1-9d9581094598-config-data\") pod \"memcached-0\" (UID: \"34e59a30-14b8-4736-87b1-9d9581094598\") " pod="openstack/memcached-0" Sep 30 12:37:16 crc kubenswrapper[4672]: I0930 12:37:16.900887 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/34e59a30-14b8-4736-87b1-9d9581094598-config-data\") pod \"memcached-0\" (UID: \"34e59a30-14b8-4736-87b1-9d9581094598\") " pod="openstack/memcached-0" Sep 30 12:37:16 crc kubenswrapper[4672]: I0930 12:37:16.900950 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34e59a30-14b8-4736-87b1-9d9581094598-combined-ca-bundle\") pod \"memcached-0\" (UID: \"34e59a30-14b8-4736-87b1-9d9581094598\") " pod="openstack/memcached-0" Sep 30 12:37:16 crc kubenswrapper[4672]: I0930 12:37:16.900971 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: 
\"kubernetes.io/configmap/34e59a30-14b8-4736-87b1-9d9581094598-kolla-config\") pod \"memcached-0\" (UID: \"34e59a30-14b8-4736-87b1-9d9581094598\") " pod="openstack/memcached-0" Sep 30 12:37:16 crc kubenswrapper[4672]: I0930 12:37:16.900991 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nrnzj\" (UniqueName: \"kubernetes.io/projected/34e59a30-14b8-4736-87b1-9d9581094598-kube-api-access-nrnzj\") pod \"memcached-0\" (UID: \"34e59a30-14b8-4736-87b1-9d9581094598\") " pod="openstack/memcached-0" Sep 30 12:37:16 crc kubenswrapper[4672]: I0930 12:37:16.901070 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/34e59a30-14b8-4736-87b1-9d9581094598-memcached-tls-certs\") pod \"memcached-0\" (UID: \"34e59a30-14b8-4736-87b1-9d9581094598\") " pod="openstack/memcached-0" Sep 30 12:37:16 crc kubenswrapper[4672]: I0930 12:37:16.904208 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/34e59a30-14b8-4736-87b1-9d9581094598-config-data\") pod \"memcached-0\" (UID: \"34e59a30-14b8-4736-87b1-9d9581094598\") " pod="openstack/memcached-0" Sep 30 12:37:16 crc kubenswrapper[4672]: I0930 12:37:16.904760 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/34e59a30-14b8-4736-87b1-9d9581094598-kolla-config\") pod \"memcached-0\" (UID: \"34e59a30-14b8-4736-87b1-9d9581094598\") " pod="openstack/memcached-0" Sep 30 12:37:16 crc kubenswrapper[4672]: I0930 12:37:16.907583 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/34e59a30-14b8-4736-87b1-9d9581094598-memcached-tls-certs\") pod \"memcached-0\" (UID: \"34e59a30-14b8-4736-87b1-9d9581094598\") " pod="openstack/memcached-0" Sep 30 12:37:16 crc kubenswrapper[4672]: I0930 12:37:16.926439 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nrnzj\" (UniqueName: \"kubernetes.io/projected/34e59a30-14b8-4736-87b1-9d9581094598-kube-api-access-nrnzj\") pod \"memcached-0\" (UID: \"34e59a30-14b8-4736-87b1-9d9581094598\") " pod="openstack/memcached-0" Sep 30 12:37:16 crc kubenswrapper[4672]: I0930 12:37:16.927953 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34e59a30-14b8-4736-87b1-9d9581094598-combined-ca-bundle\") pod \"memcached-0\" (UID: \"34e59a30-14b8-4736-87b1-9d9581094598\") " pod="openstack/memcached-0" Sep 30 12:37:17 crc kubenswrapper[4672]: I0930 12:37:17.052485 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Sep 30 12:37:18 crc kubenswrapper[4672]: I0930 12:37:18.305757 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Sep 30 12:37:18 crc kubenswrapper[4672]: I0930 12:37:18.307007 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Sep 30 12:37:18 crc kubenswrapper[4672]: I0930 12:37:18.309450 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-wpt6t" Sep 30 12:37:18 crc kubenswrapper[4672]: I0930 12:37:18.323802 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lt6cg\" (UniqueName: \"kubernetes.io/projected/8227da12-ad04-4956-bc6e-8bc6b49475a4-kube-api-access-lt6cg\") pod \"kube-state-metrics-0\" (UID: \"8227da12-ad04-4956-bc6e-8bc6b49475a4\") " pod="openstack/kube-state-metrics-0" Sep 30 12:37:18 crc kubenswrapper[4672]: I0930 12:37:18.324692 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Sep 30 12:37:18 crc kubenswrapper[4672]: I0930 12:37:18.425978 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lt6cg\" (UniqueName: \"kubernetes.io/projected/8227da12-ad04-4956-bc6e-8bc6b49475a4-kube-api-access-lt6cg\") pod \"kube-state-metrics-0\" (UID: \"8227da12-ad04-4956-bc6e-8bc6b49475a4\") " pod="openstack/kube-state-metrics-0" Sep 30 12:37:18 crc kubenswrapper[4672]: I0930 12:37:18.460131 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lt6cg\" (UniqueName: \"kubernetes.io/projected/8227da12-ad04-4956-bc6e-8bc6b49475a4-kube-api-access-lt6cg\") pod \"kube-state-metrics-0\" (UID: \"8227da12-ad04-4956-bc6e-8bc6b49475a4\") " pod="openstack/kube-state-metrics-0" Sep 30 12:37:18 crc kubenswrapper[4672]: I0930 12:37:18.620930 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5fffdb5d5-mx5lk" event={"ID":"148b676d-4484-44e1-8cb1-7136fdc07313","Type":"ContainerStarted","Data":"408dfe96632d473a62e39c01c512fd4c5bccc94cc2bd8232ba65720e39d74bab"} Sep 30 12:37:18 crc kubenswrapper[4672]: I0930 12:37:18.626857 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5dff579849-cdrgm" event={"ID":"9aae022e-aded-4f55-bfa2-9fa792516aac","Type":"ContainerStarted","Data":"420716377d921dfb9a49bb721ebd9d28486cd87830d8263676c46201f24f315c"} Sep 30 12:37:18 crc kubenswrapper[4672]: I0930 12:37:18.627890 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Sep 30 12:37:19 crc kubenswrapper[4672]: I0930 12:37:19.612510 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Sep 30 12:37:19 crc kubenswrapper[4672]: I0930 12:37:19.614968 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Sep 30 12:37:19 crc kubenswrapper[4672]: I0930 12:37:19.617914 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Sep 30 12:37:19 crc kubenswrapper[4672]: I0930 12:37:19.618127 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Sep 30 12:37:19 crc kubenswrapper[4672]: I0930 12:37:19.630855 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Sep 30 12:37:19 crc kubenswrapper[4672]: I0930 12:37:19.630884 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-l8qzv" Sep 30 12:37:19 crc kubenswrapper[4672]: I0930 12:37:19.633750 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Sep 30 12:37:19 crc kubenswrapper[4672]: I0930 12:37:19.669054 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Sep 30 12:37:19 crc kubenswrapper[4672]: I0930 12:37:19.673998 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Sep 30 12:37:19 crc kubenswrapper[4672]: I0930 12:37:19.753033 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rq72c\" (UniqueName: \"kubernetes.io/projected/854a642c-e6c7-4859-8667-b64f9b54a872-kube-api-access-rq72c\") pod \"prometheus-metric-storage-0\" (UID: \"854a642c-e6c7-4859-8667-b64f9b54a872\") " pod="openstack/prometheus-metric-storage-0" Sep 30 12:37:19 crc kubenswrapper[4672]: I0930 12:37:19.753115 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/854a642c-e6c7-4859-8667-b64f9b54a872-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"854a642c-e6c7-4859-8667-b64f9b54a872\") " pod="openstack/prometheus-metric-storage-0" Sep 30 12:37:19 crc kubenswrapper[4672]: I0930 12:37:19.753322 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/854a642c-e6c7-4859-8667-b64f9b54a872-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"854a642c-e6c7-4859-8667-b64f9b54a872\") " pod="openstack/prometheus-metric-storage-0" Sep 30 12:37:19 crc kubenswrapper[4672]: I0930 12:37:19.754071 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/854a642c-e6c7-4859-8667-b64f9b54a872-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"854a642c-e6c7-4859-8667-b64f9b54a872\") " pod="openstack/prometheus-metric-storage-0" Sep 30 12:37:19 crc kubenswrapper[4672]: I0930 12:37:19.754610 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/854a642c-e6c7-4859-8667-b64f9b54a872-config\") pod \"prometheus-metric-storage-0\" (UID: \"854a642c-e6c7-4859-8667-b64f9b54a872\") " pod="openstack/prometheus-metric-storage-0" Sep 30 12:37:19 crc kubenswrapper[4672]: I0930 12:37:19.754671 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"pvc-590f2ab7-9519-4427-af7c-904176dbd8cb\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-590f2ab7-9519-4427-af7c-904176dbd8cb\") pod \"prometheus-metric-storage-0\" (UID: \"854a642c-e6c7-4859-8667-b64f9b54a872\") " pod="openstack/prometheus-metric-storage-0" Sep 30 12:37:19 crc kubenswrapper[4672]: I0930 12:37:19.754740 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/854a642c-e6c7-4859-8667-b64f9b54a872-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"854a642c-e6c7-4859-8667-b64f9b54a872\") " pod="openstack/prometheus-metric-storage-0" Sep 30 12:37:19 crc kubenswrapper[4672]: I0930 12:37:19.754768 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/854a642c-e6c7-4859-8667-b64f9b54a872-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"854a642c-e6c7-4859-8667-b64f9b54a872\") " pod="openstack/prometheus-metric-storage-0" Sep 30 12:37:19 crc kubenswrapper[4672]: I0930 12:37:19.856181 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/854a642c-e6c7-4859-8667-b64f9b54a872-config\") pod \"prometheus-metric-storage-0\" (UID: \"854a642c-e6c7-4859-8667-b64f9b54a872\") " pod="openstack/prometheus-metric-storage-0" Sep 30 12:37:19 crc kubenswrapper[4672]: I0930 12:37:19.856228 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-590f2ab7-9519-4427-af7c-904176dbd8cb\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-590f2ab7-9519-4427-af7c-904176dbd8cb\") pod \"prometheus-metric-storage-0\" (UID: \"854a642c-e6c7-4859-8667-b64f9b54a872\") " pod="openstack/prometheus-metric-storage-0" Sep 30 12:37:19 crc kubenswrapper[4672]: I0930 12:37:19.856255 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/854a642c-e6c7-4859-8667-b64f9b54a872-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"854a642c-e6c7-4859-8667-b64f9b54a872\") " pod="openstack/prometheus-metric-storage-0" Sep 30 12:37:19 crc kubenswrapper[4672]: I0930 12:37:19.856298 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/854a642c-e6c7-4859-8667-b64f9b54a872-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"854a642c-e6c7-4859-8667-b64f9b54a872\") " pod="openstack/prometheus-metric-storage-0" Sep 30 12:37:19 crc kubenswrapper[4672]: I0930 12:37:19.856332 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rq72c\" (UniqueName: \"kubernetes.io/projected/854a642c-e6c7-4859-8667-b64f9b54a872-kube-api-access-rq72c\") pod \"prometheus-metric-storage-0\" (UID: \"854a642c-e6c7-4859-8667-b64f9b54a872\") " pod="openstack/prometheus-metric-storage-0" Sep 30 12:37:19 crc kubenswrapper[4672]: I0930 12:37:19.856355 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/854a642c-e6c7-4859-8667-b64f9b54a872-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"854a642c-e6c7-4859-8667-b64f9b54a872\") " 
pod="openstack/prometheus-metric-storage-0" Sep 30 12:37:19 crc kubenswrapper[4672]: I0930 12:37:19.856391 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/854a642c-e6c7-4859-8667-b64f9b54a872-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"854a642c-e6c7-4859-8667-b64f9b54a872\") " pod="openstack/prometheus-metric-storage-0" Sep 30 12:37:19 crc kubenswrapper[4672]: I0930 12:37:19.856409 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/854a642c-e6c7-4859-8667-b64f9b54a872-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"854a642c-e6c7-4859-8667-b64f9b54a872\") " pod="openstack/prometheus-metric-storage-0" Sep 30 12:37:19 crc kubenswrapper[4672]: I0930 12:37:19.862387 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/854a642c-e6c7-4859-8667-b64f9b54a872-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"854a642c-e6c7-4859-8667-b64f9b54a872\") " pod="openstack/prometheus-metric-storage-0" Sep 30 12:37:19 crc kubenswrapper[4672]: I0930 12:37:19.863157 4672 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Sep 30 12:37:19 crc kubenswrapper[4672]: I0930 12:37:19.863190 4672 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-590f2ab7-9519-4427-af7c-904176dbd8cb\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-590f2ab7-9519-4427-af7c-904176dbd8cb\") pod \"prometheus-metric-storage-0\" (UID: \"854a642c-e6c7-4859-8667-b64f9b54a872\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/13f5ba172743275de48f8b63cc56ba623f099037d0437073c4c58e3661633e39/globalmount\"" pod="openstack/prometheus-metric-storage-0" Sep 30 12:37:19 crc kubenswrapper[4672]: I0930 12:37:19.864712 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/854a642c-e6c7-4859-8667-b64f9b54a872-config\") pod \"prometheus-metric-storage-0\" (UID: \"854a642c-e6c7-4859-8667-b64f9b54a872\") " pod="openstack/prometheus-metric-storage-0" Sep 30 12:37:19 crc kubenswrapper[4672]: I0930 12:37:19.865257 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/854a642c-e6c7-4859-8667-b64f9b54a872-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"854a642c-e6c7-4859-8667-b64f9b54a872\") " pod="openstack/prometheus-metric-storage-0" Sep 30 12:37:19 crc kubenswrapper[4672]: I0930 12:37:19.866859 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/854a642c-e6c7-4859-8667-b64f9b54a872-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"854a642c-e6c7-4859-8667-b64f9b54a872\") " pod="openstack/prometheus-metric-storage-0" Sep 30 12:37:19 crc kubenswrapper[4672]: I0930 12:37:19.867227 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/854a642c-e6c7-4859-8667-b64f9b54a872-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"854a642c-e6c7-4859-8667-b64f9b54a872\") " pod="openstack/prometheus-metric-storage-0" Sep 30 
12:37:19 crc kubenswrapper[4672]: I0930 12:37:19.872981 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/854a642c-e6c7-4859-8667-b64f9b54a872-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"854a642c-e6c7-4859-8667-b64f9b54a872\") " pod="openstack/prometheus-metric-storage-0" Sep 30 12:37:19 crc kubenswrapper[4672]: I0930 12:37:19.880182 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rq72c\" (UniqueName: \"kubernetes.io/projected/854a642c-e6c7-4859-8667-b64f9b54a872-kube-api-access-rq72c\") pod \"prometheus-metric-storage-0\" (UID: \"854a642c-e6c7-4859-8667-b64f9b54a872\") " pod="openstack/prometheus-metric-storage-0" Sep 30 12:37:19 crc kubenswrapper[4672]: I0930 12:37:19.921566 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-590f2ab7-9519-4427-af7c-904176dbd8cb\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-590f2ab7-9519-4427-af7c-904176dbd8cb\") pod \"prometheus-metric-storage-0\" (UID: \"854a642c-e6c7-4859-8667-b64f9b54a872\") " pod="openstack/prometheus-metric-storage-0" Sep 30 12:37:19 crc kubenswrapper[4672]: I0930 12:37:19.953847 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Sep 30 12:37:22 crc kubenswrapper[4672]: I0930 12:37:22.254452 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-vhs7r"] Sep 30 12:37:22 crc kubenswrapper[4672]: I0930 12:37:22.256739 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-vhs7r" Sep 30 12:37:22 crc kubenswrapper[4672]: I0930 12:37:22.260901 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Sep 30 12:37:22 crc kubenswrapper[4672]: I0930 12:37:22.261081 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-gjgrm" Sep 30 12:37:22 crc kubenswrapper[4672]: I0930 12:37:22.261242 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Sep 30 12:37:22 crc kubenswrapper[4672]: I0930 12:37:22.270356 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-gb7pq"] Sep 30 12:37:22 crc kubenswrapper[4672]: I0930 12:37:22.272374 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-gb7pq" Sep 30 12:37:22 crc kubenswrapper[4672]: I0930 12:37:22.307880 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-vhs7r"] Sep 30 12:37:22 crc kubenswrapper[4672]: I0930 12:37:22.332899 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Sep 30 12:37:22 crc kubenswrapper[4672]: I0930 12:37:22.336625 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Sep 30 12:37:22 crc kubenswrapper[4672]: I0930 12:37:22.341909 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-fddmt" Sep 30 12:37:22 crc kubenswrapper[4672]: I0930 12:37:22.342059 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Sep 30 12:37:22 crc kubenswrapper[4672]: I0930 12:37:22.342593 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Sep 30 12:37:22 crc kubenswrapper[4672]: I0930 12:37:22.343002 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Sep 30 12:37:22 crc kubenswrapper[4672]: I0930 12:37:22.343106 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Sep 30 12:37:22 crc kubenswrapper[4672]: I0930 12:37:22.373963 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-gb7pq"] Sep 30 12:37:22 crc kubenswrapper[4672]: I0930 12:37:22.396678 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Sep 30 12:37:22 crc kubenswrapper[4672]: I0930 12:37:22.407749 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9f35be26-490e-49db-bd31-32ce35c84fab-scripts\") pod \"ovn-controller-vhs7r\" (UID: \"9f35be26-490e-49db-bd31-32ce35c84fab\") " pod="openstack/ovn-controller-vhs7r" Sep 30 12:37:22 crc kubenswrapper[4672]: I0930 12:37:22.407822 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ltmvj\" (UniqueName: \"kubernetes.io/projected/71bceb54-c562-417a-8897-525930836f44-kube-api-access-ltmvj\") pod \"ovn-controller-ovs-gb7pq\" (UID: \"71bceb54-c562-417a-8897-525930836f44\") " pod="openstack/ovn-controller-ovs-gb7pq" Sep 30 12:37:22 crc kubenswrapper[4672]: I0930 12:37:22.407867 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/9f35be26-490e-49db-bd31-32ce35c84fab-ovn-controller-tls-certs\") pod \"ovn-controller-vhs7r\" (UID: \"9f35be26-490e-49db-bd31-32ce35c84fab\") " pod="openstack/ovn-controller-vhs7r" Sep 30 12:37:22 crc kubenswrapper[4672]: I0930 12:37:22.407899 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f35be26-490e-49db-bd31-32ce35c84fab-combined-ca-bundle\") pod \"ovn-controller-vhs7r\" (UID: \"9f35be26-490e-49db-bd31-32ce35c84fab\") " pod="openstack/ovn-controller-vhs7r" Sep 30 12:37:22 crc kubenswrapper[4672]: I0930 12:37:22.407943 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/71bceb54-c562-417a-8897-525930836f44-etc-ovs\") pod \"ovn-controller-ovs-gb7pq\" (UID: \"71bceb54-c562-417a-8897-525930836f44\") " pod="openstack/ovn-controller-ovs-gb7pq" Sep 30 12:37:22 crc kubenswrapper[4672]: I0930 12:37:22.407966 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/9f35be26-490e-49db-bd31-32ce35c84fab-var-run\") pod \"ovn-controller-vhs7r\" (UID: \"9f35be26-490e-49db-bd31-32ce35c84fab\") 
" pod="openstack/ovn-controller-vhs7r" Sep 30 12:37:22 crc kubenswrapper[4672]: I0930 12:37:22.408014 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/9f35be26-490e-49db-bd31-32ce35c84fab-var-run-ovn\") pod \"ovn-controller-vhs7r\" (UID: \"9f35be26-490e-49db-bd31-32ce35c84fab\") " pod="openstack/ovn-controller-vhs7r" Sep 30 12:37:22 crc kubenswrapper[4672]: I0930 12:37:22.408047 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-spvbz\" (UniqueName: \"kubernetes.io/projected/9f35be26-490e-49db-bd31-32ce35c84fab-kube-api-access-spvbz\") pod \"ovn-controller-vhs7r\" (UID: \"9f35be26-490e-49db-bd31-32ce35c84fab\") " pod="openstack/ovn-controller-vhs7r" Sep 30 12:37:22 crc kubenswrapper[4672]: I0930 12:37:22.408074 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/9f35be26-490e-49db-bd31-32ce35c84fab-var-log-ovn\") pod \"ovn-controller-vhs7r\" (UID: \"9f35be26-490e-49db-bd31-32ce35c84fab\") " pod="openstack/ovn-controller-vhs7r" Sep 30 12:37:22 crc kubenswrapper[4672]: I0930 12:37:22.408115 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/71bceb54-c562-417a-8897-525930836f44-var-log\") pod \"ovn-controller-ovs-gb7pq\" (UID: \"71bceb54-c562-417a-8897-525930836f44\") " pod="openstack/ovn-controller-ovs-gb7pq" Sep 30 12:37:22 crc kubenswrapper[4672]: I0930 12:37:22.408163 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/71bceb54-c562-417a-8897-525930836f44-scripts\") pod \"ovn-controller-ovs-gb7pq\" (UID: \"71bceb54-c562-417a-8897-525930836f44\") " pod="openstack/ovn-controller-ovs-gb7pq" Sep 30 12:37:22 crc kubenswrapper[4672]: I0930 12:37:22.408205 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/71bceb54-c562-417a-8897-525930836f44-var-run\") pod \"ovn-controller-ovs-gb7pq\" (UID: \"71bceb54-c562-417a-8897-525930836f44\") " pod="openstack/ovn-controller-ovs-gb7pq" Sep 30 12:37:22 crc kubenswrapper[4672]: I0930 12:37:22.408235 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/71bceb54-c562-417a-8897-525930836f44-var-lib\") pod \"ovn-controller-ovs-gb7pq\" (UID: \"71bceb54-c562-417a-8897-525930836f44\") " pod="openstack/ovn-controller-ovs-gb7pq" Sep 30 12:37:22 crc kubenswrapper[4672]: I0930 12:37:22.509423 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/9f35be26-490e-49db-bd31-32ce35c84fab-var-run-ovn\") pod \"ovn-controller-vhs7r\" (UID: \"9f35be26-490e-49db-bd31-32ce35c84fab\") " pod="openstack/ovn-controller-vhs7r" Sep 30 12:37:22 crc kubenswrapper[4672]: I0930 12:37:22.509480 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-spvbz\" (UniqueName: \"kubernetes.io/projected/9f35be26-490e-49db-bd31-32ce35c84fab-kube-api-access-spvbz\") pod \"ovn-controller-vhs7r\" (UID: \"9f35be26-490e-49db-bd31-32ce35c84fab\") " pod="openstack/ovn-controller-vhs7r" Sep 30 12:37:22 crc 
kubenswrapper[4672]: I0930 12:37:22.509499 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/9f35be26-490e-49db-bd31-32ce35c84fab-var-log-ovn\") pod \"ovn-controller-vhs7r\" (UID: \"9f35be26-490e-49db-bd31-32ce35c84fab\") " pod="openstack/ovn-controller-vhs7r" Sep 30 12:37:22 crc kubenswrapper[4672]: I0930 12:37:22.509540 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/71bceb54-c562-417a-8897-525930836f44-var-log\") pod \"ovn-controller-ovs-gb7pq\" (UID: \"71bceb54-c562-417a-8897-525930836f44\") " pod="openstack/ovn-controller-ovs-gb7pq" Sep 30 12:37:22 crc kubenswrapper[4672]: I0930 12:37:22.509566 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d69b4d7c-be99-4405-aae9-8a11b85632b8-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"d69b4d7c-be99-4405-aae9-8a11b85632b8\") " pod="openstack/ovsdbserver-nb-0" Sep 30 12:37:22 crc kubenswrapper[4672]: I0930 12:37:22.509598 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/71bceb54-c562-417a-8897-525930836f44-scripts\") pod \"ovn-controller-ovs-gb7pq\" (UID: \"71bceb54-c562-417a-8897-525930836f44\") " pod="openstack/ovn-controller-ovs-gb7pq" Sep 30 12:37:22 crc kubenswrapper[4672]: I0930 12:37:22.509617 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dfp2f\" (UniqueName: \"kubernetes.io/projected/d69b4d7c-be99-4405-aae9-8a11b85632b8-kube-api-access-dfp2f\") pod \"ovsdbserver-nb-0\" (UID: \"d69b4d7c-be99-4405-aae9-8a11b85632b8\") " pod="openstack/ovsdbserver-nb-0" Sep 30 12:37:22 crc kubenswrapper[4672]: I0930 12:37:22.509645 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/71bceb54-c562-417a-8897-525930836f44-var-run\") pod \"ovn-controller-ovs-gb7pq\" (UID: \"71bceb54-c562-417a-8897-525930836f44\") " pod="openstack/ovn-controller-ovs-gb7pq" Sep 30 12:37:22 crc kubenswrapper[4672]: I0930 12:37:22.509660 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"ovsdbserver-nb-0\" (UID: \"d69b4d7c-be99-4405-aae9-8a11b85632b8\") " pod="openstack/ovsdbserver-nb-0" Sep 30 12:37:22 crc kubenswrapper[4672]: I0930 12:37:22.509680 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/71bceb54-c562-417a-8897-525930836f44-var-lib\") pod \"ovn-controller-ovs-gb7pq\" (UID: \"71bceb54-c562-417a-8897-525930836f44\") " pod="openstack/ovn-controller-ovs-gb7pq" Sep 30 12:37:22 crc kubenswrapper[4672]: I0930 12:37:22.509700 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9f35be26-490e-49db-bd31-32ce35c84fab-scripts\") pod \"ovn-controller-vhs7r\" (UID: \"9f35be26-490e-49db-bd31-32ce35c84fab\") " pod="openstack/ovn-controller-vhs7r" Sep 30 12:37:22 crc kubenswrapper[4672]: I0930 12:37:22.509719 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ltmvj\" (UniqueName: 
\"kubernetes.io/projected/71bceb54-c562-417a-8897-525930836f44-kube-api-access-ltmvj\") pod \"ovn-controller-ovs-gb7pq\" (UID: \"71bceb54-c562-417a-8897-525930836f44\") " pod="openstack/ovn-controller-ovs-gb7pq" Sep 30 12:37:22 crc kubenswrapper[4672]: I0930 12:37:22.509738 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d69b4d7c-be99-4405-aae9-8a11b85632b8-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"d69b4d7c-be99-4405-aae9-8a11b85632b8\") " pod="openstack/ovsdbserver-nb-0" Sep 30 12:37:22 crc kubenswrapper[4672]: I0930 12:37:22.509769 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/9f35be26-490e-49db-bd31-32ce35c84fab-ovn-controller-tls-certs\") pod \"ovn-controller-vhs7r\" (UID: \"9f35be26-490e-49db-bd31-32ce35c84fab\") " pod="openstack/ovn-controller-vhs7r" Sep 30 12:37:22 crc kubenswrapper[4672]: I0930 12:37:22.509788 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d69b4d7c-be99-4405-aae9-8a11b85632b8-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"d69b4d7c-be99-4405-aae9-8a11b85632b8\") " pod="openstack/ovsdbserver-nb-0" Sep 30 12:37:22 crc kubenswrapper[4672]: I0930 12:37:22.509804 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f35be26-490e-49db-bd31-32ce35c84fab-combined-ca-bundle\") pod \"ovn-controller-vhs7r\" (UID: \"9f35be26-490e-49db-bd31-32ce35c84fab\") " pod="openstack/ovn-controller-vhs7r" Sep 30 12:37:22 crc kubenswrapper[4672]: I0930 12:37:22.509818 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d69b4d7c-be99-4405-aae9-8a11b85632b8-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"d69b4d7c-be99-4405-aae9-8a11b85632b8\") " pod="openstack/ovsdbserver-nb-0" Sep 30 12:37:22 crc kubenswrapper[4672]: I0930 12:37:22.509844 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d69b4d7c-be99-4405-aae9-8a11b85632b8-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"d69b4d7c-be99-4405-aae9-8a11b85632b8\") " pod="openstack/ovsdbserver-nb-0" Sep 30 12:37:22 crc kubenswrapper[4672]: I0930 12:37:22.509869 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/71bceb54-c562-417a-8897-525930836f44-etc-ovs\") pod \"ovn-controller-ovs-gb7pq\" (UID: \"71bceb54-c562-417a-8897-525930836f44\") " pod="openstack/ovn-controller-ovs-gb7pq" Sep 30 12:37:22 crc kubenswrapper[4672]: I0930 12:37:22.509886 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d69b4d7c-be99-4405-aae9-8a11b85632b8-config\") pod \"ovsdbserver-nb-0\" (UID: \"d69b4d7c-be99-4405-aae9-8a11b85632b8\") " pod="openstack/ovsdbserver-nb-0" Sep 30 12:37:22 crc kubenswrapper[4672]: I0930 12:37:22.509902 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/9f35be26-490e-49db-bd31-32ce35c84fab-var-run\") pod \"ovn-controller-vhs7r\" (UID: 
\"9f35be26-490e-49db-bd31-32ce35c84fab\") " pod="openstack/ovn-controller-vhs7r" Sep 30 12:37:22 crc kubenswrapper[4672]: I0930 12:37:22.509954 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/9f35be26-490e-49db-bd31-32ce35c84fab-var-run-ovn\") pod \"ovn-controller-vhs7r\" (UID: \"9f35be26-490e-49db-bd31-32ce35c84fab\") " pod="openstack/ovn-controller-vhs7r" Sep 30 12:37:22 crc kubenswrapper[4672]: I0930 12:37:22.509990 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/9f35be26-490e-49db-bd31-32ce35c84fab-var-log-ovn\") pod \"ovn-controller-vhs7r\" (UID: \"9f35be26-490e-49db-bd31-32ce35c84fab\") " pod="openstack/ovn-controller-vhs7r" Sep 30 12:37:22 crc kubenswrapper[4672]: I0930 12:37:22.510065 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/71bceb54-c562-417a-8897-525930836f44-var-log\") pod \"ovn-controller-ovs-gb7pq\" (UID: \"71bceb54-c562-417a-8897-525930836f44\") " pod="openstack/ovn-controller-ovs-gb7pq" Sep 30 12:37:22 crc kubenswrapper[4672]: I0930 12:37:22.510105 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/9f35be26-490e-49db-bd31-32ce35c84fab-var-run\") pod \"ovn-controller-vhs7r\" (UID: \"9f35be26-490e-49db-bd31-32ce35c84fab\") " pod="openstack/ovn-controller-vhs7r" Sep 30 12:37:22 crc kubenswrapper[4672]: I0930 12:37:22.510721 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/71bceb54-c562-417a-8897-525930836f44-var-lib\") pod \"ovn-controller-ovs-gb7pq\" (UID: \"71bceb54-c562-417a-8897-525930836f44\") " pod="openstack/ovn-controller-ovs-gb7pq" Sep 30 12:37:22 crc kubenswrapper[4672]: I0930 12:37:22.510791 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/71bceb54-c562-417a-8897-525930836f44-var-run\") pod \"ovn-controller-ovs-gb7pq\" (UID: \"71bceb54-c562-417a-8897-525930836f44\") " pod="openstack/ovn-controller-ovs-gb7pq" Sep 30 12:37:22 crc kubenswrapper[4672]: I0930 12:37:22.511145 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/71bceb54-c562-417a-8897-525930836f44-etc-ovs\") pod \"ovn-controller-ovs-gb7pq\" (UID: \"71bceb54-c562-417a-8897-525930836f44\") " pod="openstack/ovn-controller-ovs-gb7pq" Sep 30 12:37:22 crc kubenswrapper[4672]: I0930 12:37:22.512839 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/71bceb54-c562-417a-8897-525930836f44-scripts\") pod \"ovn-controller-ovs-gb7pq\" (UID: \"71bceb54-c562-417a-8897-525930836f44\") " pod="openstack/ovn-controller-ovs-gb7pq" Sep 30 12:37:22 crc kubenswrapper[4672]: I0930 12:37:22.513000 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9f35be26-490e-49db-bd31-32ce35c84fab-scripts\") pod \"ovn-controller-vhs7r\" (UID: \"9f35be26-490e-49db-bd31-32ce35c84fab\") " pod="openstack/ovn-controller-vhs7r" Sep 30 12:37:22 crc kubenswrapper[4672]: I0930 12:37:22.514712 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f35be26-490e-49db-bd31-32ce35c84fab-combined-ca-bundle\") pod 
\"ovn-controller-vhs7r\" (UID: \"9f35be26-490e-49db-bd31-32ce35c84fab\") " pod="openstack/ovn-controller-vhs7r" Sep 30 12:37:22 crc kubenswrapper[4672]: I0930 12:37:22.514804 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/9f35be26-490e-49db-bd31-32ce35c84fab-ovn-controller-tls-certs\") pod \"ovn-controller-vhs7r\" (UID: \"9f35be26-490e-49db-bd31-32ce35c84fab\") " pod="openstack/ovn-controller-vhs7r" Sep 30 12:37:22 crc kubenswrapper[4672]: I0930 12:37:22.530069 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ltmvj\" (UniqueName: \"kubernetes.io/projected/71bceb54-c562-417a-8897-525930836f44-kube-api-access-ltmvj\") pod \"ovn-controller-ovs-gb7pq\" (UID: \"71bceb54-c562-417a-8897-525930836f44\") " pod="openstack/ovn-controller-ovs-gb7pq" Sep 30 12:37:22 crc kubenswrapper[4672]: I0930 12:37:22.534202 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-spvbz\" (UniqueName: \"kubernetes.io/projected/9f35be26-490e-49db-bd31-32ce35c84fab-kube-api-access-spvbz\") pod \"ovn-controller-vhs7r\" (UID: \"9f35be26-490e-49db-bd31-32ce35c84fab\") " pod="openstack/ovn-controller-vhs7r" Sep 30 12:37:22 crc kubenswrapper[4672]: I0930 12:37:22.579630 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-vhs7r" Sep 30 12:37:22 crc kubenswrapper[4672]: I0930 12:37:22.602200 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-gb7pq" Sep 30 12:37:22 crc kubenswrapper[4672]: I0930 12:37:22.612539 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d69b4d7c-be99-4405-aae9-8a11b85632b8-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"d69b4d7c-be99-4405-aae9-8a11b85632b8\") " pod="openstack/ovsdbserver-nb-0" Sep 30 12:37:22 crc kubenswrapper[4672]: I0930 12:37:22.612870 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dfp2f\" (UniqueName: \"kubernetes.io/projected/d69b4d7c-be99-4405-aae9-8a11b85632b8-kube-api-access-dfp2f\") pod \"ovsdbserver-nb-0\" (UID: \"d69b4d7c-be99-4405-aae9-8a11b85632b8\") " pod="openstack/ovsdbserver-nb-0" Sep 30 12:37:22 crc kubenswrapper[4672]: I0930 12:37:22.613015 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"ovsdbserver-nb-0\" (UID: \"d69b4d7c-be99-4405-aae9-8a11b85632b8\") " pod="openstack/ovsdbserver-nb-0" Sep 30 12:37:22 crc kubenswrapper[4672]: I0930 12:37:22.613175 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d69b4d7c-be99-4405-aae9-8a11b85632b8-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"d69b4d7c-be99-4405-aae9-8a11b85632b8\") " pod="openstack/ovsdbserver-nb-0" Sep 30 12:37:22 crc kubenswrapper[4672]: I0930 12:37:22.613356 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d69b4d7c-be99-4405-aae9-8a11b85632b8-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"d69b4d7c-be99-4405-aae9-8a11b85632b8\") " pod="openstack/ovsdbserver-nb-0" Sep 30 12:37:22 crc kubenswrapper[4672]: I0930 12:37:22.613468 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d69b4d7c-be99-4405-aae9-8a11b85632b8-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"d69b4d7c-be99-4405-aae9-8a11b85632b8\") " pod="openstack/ovsdbserver-nb-0" Sep 30 12:37:22 crc kubenswrapper[4672]: I0930 12:37:22.613594 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d69b4d7c-be99-4405-aae9-8a11b85632b8-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"d69b4d7c-be99-4405-aae9-8a11b85632b8\") " pod="openstack/ovsdbserver-nb-0" Sep 30 12:37:22 crc kubenswrapper[4672]: I0930 12:37:22.613747 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d69b4d7c-be99-4405-aae9-8a11b85632b8-config\") pod \"ovsdbserver-nb-0\" (UID: \"d69b4d7c-be99-4405-aae9-8a11b85632b8\") " pod="openstack/ovsdbserver-nb-0" Sep 30 12:37:22 crc kubenswrapper[4672]: I0930 12:37:22.613993 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d69b4d7c-be99-4405-aae9-8a11b85632b8-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"d69b4d7c-be99-4405-aae9-8a11b85632b8\") " pod="openstack/ovsdbserver-nb-0" Sep 30 12:37:22 crc kubenswrapper[4672]: I0930 12:37:22.614411 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d69b4d7c-be99-4405-aae9-8a11b85632b8-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"d69b4d7c-be99-4405-aae9-8a11b85632b8\") " pod="openstack/ovsdbserver-nb-0" Sep 30 12:37:22 crc kubenswrapper[4672]: I0930 12:37:22.614524 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d69b4d7c-be99-4405-aae9-8a11b85632b8-config\") pod \"ovsdbserver-nb-0\" (UID: \"d69b4d7c-be99-4405-aae9-8a11b85632b8\") " pod="openstack/ovsdbserver-nb-0" Sep 30 12:37:22 crc kubenswrapper[4672]: I0930 12:37:22.614837 4672 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"ovsdbserver-nb-0\" (UID: \"d69b4d7c-be99-4405-aae9-8a11b85632b8\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/ovsdbserver-nb-0" Sep 30 12:37:22 crc kubenswrapper[4672]: I0930 12:37:22.620000 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d69b4d7c-be99-4405-aae9-8a11b85632b8-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"d69b4d7c-be99-4405-aae9-8a11b85632b8\") " pod="openstack/ovsdbserver-nb-0" Sep 30 12:37:22 crc kubenswrapper[4672]: I0930 12:37:22.625133 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d69b4d7c-be99-4405-aae9-8a11b85632b8-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"d69b4d7c-be99-4405-aae9-8a11b85632b8\") " pod="openstack/ovsdbserver-nb-0" Sep 30 12:37:22 crc kubenswrapper[4672]: I0930 12:37:22.627960 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d69b4d7c-be99-4405-aae9-8a11b85632b8-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"d69b4d7c-be99-4405-aae9-8a11b85632b8\") " pod="openstack/ovsdbserver-nb-0" Sep 30 12:37:22 crc kubenswrapper[4672]: I0930 
12:37:22.631732 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dfp2f\" (UniqueName: \"kubernetes.io/projected/d69b4d7c-be99-4405-aae9-8a11b85632b8-kube-api-access-dfp2f\") pod \"ovsdbserver-nb-0\" (UID: \"d69b4d7c-be99-4405-aae9-8a11b85632b8\") " pod="openstack/ovsdbserver-nb-0" Sep 30 12:37:22 crc kubenswrapper[4672]: I0930 12:37:22.647823 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"ovsdbserver-nb-0\" (UID: \"d69b4d7c-be99-4405-aae9-8a11b85632b8\") " pod="openstack/ovsdbserver-nb-0" Sep 30 12:37:22 crc kubenswrapper[4672]: I0930 12:37:22.675108 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Sep 30 12:37:25 crc kubenswrapper[4672]: I0930 12:37:25.763897 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Sep 30 12:37:25 crc kubenswrapper[4672]: I0930 12:37:25.785856 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Sep 30 12:37:25 crc kubenswrapper[4672]: I0930 12:37:25.790418 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Sep 30 12:37:25 crc kubenswrapper[4672]: I0930 12:37:25.791002 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-7dkft" Sep 30 12:37:25 crc kubenswrapper[4672]: I0930 12:37:25.791092 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Sep 30 12:37:25 crc kubenswrapper[4672]: I0930 12:37:25.791483 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Sep 30 12:37:25 crc kubenswrapper[4672]: I0930 12:37:25.791670 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Sep 30 12:37:25 crc kubenswrapper[4672]: I0930 12:37:25.876151 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4e0bc671-11e7-442d-b5f3-4a901b0a0a80-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"4e0bc671-11e7-442d-b5f3-4a901b0a0a80\") " pod="openstack/ovsdbserver-sb-0" Sep 30 12:37:25 crc kubenswrapper[4672]: I0930 12:37:25.876386 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/4e0bc671-11e7-442d-b5f3-4a901b0a0a80-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"4e0bc671-11e7-442d-b5f3-4a901b0a0a80\") " pod="openstack/ovsdbserver-sb-0" Sep 30 12:37:25 crc kubenswrapper[4672]: I0930 12:37:25.876443 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4e0bc671-11e7-442d-b5f3-4a901b0a0a80-config\") pod \"ovsdbserver-sb-0\" (UID: \"4e0bc671-11e7-442d-b5f3-4a901b0a0a80\") " pod="openstack/ovsdbserver-sb-0" Sep 30 12:37:25 crc kubenswrapper[4672]: I0930 12:37:25.876483 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/4e0bc671-11e7-442d-b5f3-4a901b0a0a80-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"4e0bc671-11e7-442d-b5f3-4a901b0a0a80\") " pod="openstack/ovsdbserver-sb-0" Sep 30 12:37:25 crc 
kubenswrapper[4672]: I0930 12:37:25.876745 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e0bc671-11e7-442d-b5f3-4a901b0a0a80-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"4e0bc671-11e7-442d-b5f3-4a901b0a0a80\") " pod="openstack/ovsdbserver-sb-0" Sep 30 12:37:25 crc kubenswrapper[4672]: I0930 12:37:25.876860 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-sb-0\" (UID: \"4e0bc671-11e7-442d-b5f3-4a901b0a0a80\") " pod="openstack/ovsdbserver-sb-0" Sep 30 12:37:25 crc kubenswrapper[4672]: I0930 12:37:25.876896 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4e0bc671-11e7-442d-b5f3-4a901b0a0a80-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"4e0bc671-11e7-442d-b5f3-4a901b0a0a80\") " pod="openstack/ovsdbserver-sb-0" Sep 30 12:37:25 crc kubenswrapper[4672]: I0930 12:37:25.876915 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d9m6k\" (UniqueName: \"kubernetes.io/projected/4e0bc671-11e7-442d-b5f3-4a901b0a0a80-kube-api-access-d9m6k\") pod \"ovsdbserver-sb-0\" (UID: \"4e0bc671-11e7-442d-b5f3-4a901b0a0a80\") " pod="openstack/ovsdbserver-sb-0" Sep 30 12:37:25 crc kubenswrapper[4672]: I0930 12:37:25.978224 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e0bc671-11e7-442d-b5f3-4a901b0a0a80-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"4e0bc671-11e7-442d-b5f3-4a901b0a0a80\") " pod="openstack/ovsdbserver-sb-0" Sep 30 12:37:25 crc kubenswrapper[4672]: I0930 12:37:25.978325 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-sb-0\" (UID: \"4e0bc671-11e7-442d-b5f3-4a901b0a0a80\") " pod="openstack/ovsdbserver-sb-0" Sep 30 12:37:25 crc kubenswrapper[4672]: I0930 12:37:25.978348 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4e0bc671-11e7-442d-b5f3-4a901b0a0a80-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"4e0bc671-11e7-442d-b5f3-4a901b0a0a80\") " pod="openstack/ovsdbserver-sb-0" Sep 30 12:37:25 crc kubenswrapper[4672]: I0930 12:37:25.978363 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d9m6k\" (UniqueName: \"kubernetes.io/projected/4e0bc671-11e7-442d-b5f3-4a901b0a0a80-kube-api-access-d9m6k\") pod \"ovsdbserver-sb-0\" (UID: \"4e0bc671-11e7-442d-b5f3-4a901b0a0a80\") " pod="openstack/ovsdbserver-sb-0" Sep 30 12:37:25 crc kubenswrapper[4672]: I0930 12:37:25.978393 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4e0bc671-11e7-442d-b5f3-4a901b0a0a80-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"4e0bc671-11e7-442d-b5f3-4a901b0a0a80\") " pod="openstack/ovsdbserver-sb-0" Sep 30 12:37:25 crc kubenswrapper[4672]: I0930 12:37:25.978431 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/4e0bc671-11e7-442d-b5f3-4a901b0a0a80-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"4e0bc671-11e7-442d-b5f3-4a901b0a0a80\") " pod="openstack/ovsdbserver-sb-0" Sep 30 12:37:25 crc kubenswrapper[4672]: I0930 12:37:25.978618 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4e0bc671-11e7-442d-b5f3-4a901b0a0a80-config\") pod \"ovsdbserver-sb-0\" (UID: \"4e0bc671-11e7-442d-b5f3-4a901b0a0a80\") " pod="openstack/ovsdbserver-sb-0" Sep 30 12:37:25 crc kubenswrapper[4672]: I0930 12:37:25.978645 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/4e0bc671-11e7-442d-b5f3-4a901b0a0a80-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"4e0bc671-11e7-442d-b5f3-4a901b0a0a80\") " pod="openstack/ovsdbserver-sb-0" Sep 30 12:37:25 crc kubenswrapper[4672]: I0930 12:37:25.978774 4672 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-sb-0\" (UID: \"4e0bc671-11e7-442d-b5f3-4a901b0a0a80\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/ovsdbserver-sb-0" Sep 30 12:37:25 crc kubenswrapper[4672]: I0930 12:37:25.979712 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/4e0bc671-11e7-442d-b5f3-4a901b0a0a80-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"4e0bc671-11e7-442d-b5f3-4a901b0a0a80\") " pod="openstack/ovsdbserver-sb-0" Sep 30 12:37:25 crc kubenswrapper[4672]: I0930 12:37:25.980134 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4e0bc671-11e7-442d-b5f3-4a901b0a0a80-config\") pod \"ovsdbserver-sb-0\" (UID: \"4e0bc671-11e7-442d-b5f3-4a901b0a0a80\") " pod="openstack/ovsdbserver-sb-0" Sep 30 12:37:25 crc kubenswrapper[4672]: I0930 12:37:25.980280 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4e0bc671-11e7-442d-b5f3-4a901b0a0a80-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"4e0bc671-11e7-442d-b5f3-4a901b0a0a80\") " pod="openstack/ovsdbserver-sb-0" Sep 30 12:37:25 crc kubenswrapper[4672]: I0930 12:37:25.985133 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4e0bc671-11e7-442d-b5f3-4a901b0a0a80-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"4e0bc671-11e7-442d-b5f3-4a901b0a0a80\") " pod="openstack/ovsdbserver-sb-0" Sep 30 12:37:25 crc kubenswrapper[4672]: I0930 12:37:25.994006 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e0bc671-11e7-442d-b5f3-4a901b0a0a80-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"4e0bc671-11e7-442d-b5f3-4a901b0a0a80\") " pod="openstack/ovsdbserver-sb-0" Sep 30 12:37:25 crc kubenswrapper[4672]: I0930 12:37:25.994903 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/4e0bc671-11e7-442d-b5f3-4a901b0a0a80-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"4e0bc671-11e7-442d-b5f3-4a901b0a0a80\") " pod="openstack/ovsdbserver-sb-0" Sep 30 12:37:25 crc kubenswrapper[4672]: I0930 12:37:25.995452 4672 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"kube-api-access-d9m6k\" (UniqueName: \"kubernetes.io/projected/4e0bc671-11e7-442d-b5f3-4a901b0a0a80-kube-api-access-d9m6k\") pod \"ovsdbserver-sb-0\" (UID: \"4e0bc671-11e7-442d-b5f3-4a901b0a0a80\") " pod="openstack/ovsdbserver-sb-0" Sep 30 12:37:26 crc kubenswrapper[4672]: I0930 12:37:26.001122 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-sb-0\" (UID: \"4e0bc671-11e7-442d-b5f3-4a901b0a0a80\") " pod="openstack/ovsdbserver-sb-0" Sep 30 12:37:26 crc kubenswrapper[4672]: I0930 12:37:26.119964 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Sep 30 12:37:32 crc kubenswrapper[4672]: E0930 12:37:32.095024 4672 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.83:5001/podified-master-centos10/openstack-neutron-server:watcher_latest" Sep 30 12:37:32 crc kubenswrapper[4672]: E0930 12:37:32.095688 4672 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.83:5001/podified-master-centos10/openstack-neutron-server:watcher_latest" Sep 30 12:37:32 crc kubenswrapper[4672]: E0930 12:37:32.095834 4672 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:38.102.83.83:5001/podified-master-centos10/openstack-neutron-server:watcher_latest,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-67kmb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-698c778d7-shzg2_openstack(83f7f949-c461-4cd4-9aec-2cfd4507475e): ErrImagePull: rpc error: code = Canceled desc = copying config: context 
canceled" logger="UnhandledError" Sep 30 12:37:32 crc kubenswrapper[4672]: E0930 12:37:32.099509 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-698c778d7-shzg2" podUID="83f7f949-c461-4cd4-9aec-2cfd4507475e" Sep 30 12:37:32 crc kubenswrapper[4672]: E0930 12:37:32.107307 4672 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.83:5001/podified-master-centos10/openstack-neutron-server:watcher_latest" Sep 30 12:37:32 crc kubenswrapper[4672]: E0930 12:37:32.107369 4672 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.83:5001/podified-master-centos10/openstack-neutron-server:watcher_latest" Sep 30 12:37:32 crc kubenswrapper[4672]: E0930 12:37:32.107499 4672 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:38.102.83.83:5001/podified-master-centos10/openstack-neutron-server:watcher_latest,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-z4dzg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-6dcdf6f545-4hkf4_openstack(976b5844-b276-466e-a961-99fd518cf5f4): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Sep 30 12:37:32 crc kubenswrapper[4672]: E0930 12:37:32.108911 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: 
\"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-6dcdf6f545-4hkf4" podUID="976b5844-b276-466e-a961-99fd518cf5f4" Sep 30 12:37:32 crc kubenswrapper[4672]: E0930 12:37:32.295456 4672 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.83:5001/podified-master-centos10/openstack-neutron-server:watcher_latest" Sep 30 12:37:32 crc kubenswrapper[4672]: E0930 12:37:32.295866 4672 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.83:5001/podified-master-centos10/openstack-neutron-server:watcher_latest" Sep 30 12:37:32 crc kubenswrapper[4672]: E0930 12:37:32.296048 4672 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:38.102.83.83:5001/podified-master-centos10/openstack-neutron-server:watcher_latest,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nfdh5dfhb6h64h676hc4h78h97h669h54chfbh696hb5h54bh5d4h6bh64h644h677h584h5cbh698h9dh5bbh5f8h5b8hcdh644h5c7h694hbfh589q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-llqhv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-54d74754f7-6zpjc_openstack(fe558b41-786d-42ef-9de2-d0865c8afb44): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Sep 30 12:37:32 crc kubenswrapper[4672]: E0930 12:37:32.298499 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-54d74754f7-6zpjc" podUID="fe558b41-786d-42ef-9de2-d0865c8afb44" Sep 30 12:37:32 crc kubenswrapper[4672]: I0930 
12:37:32.785695 4672 generic.go:334] "Generic (PLEG): container finished" podID="9aae022e-aded-4f55-bfa2-9fa792516aac" containerID="a0c26de2aaefa436e0b67d8d59a37473150e386f619e455836997bfb3399389f" exitCode=0 Sep 30 12:37:32 crc kubenswrapper[4672]: I0930 12:37:32.785763 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5dff579849-cdrgm" event={"ID":"9aae022e-aded-4f55-bfa2-9fa792516aac","Type":"ContainerDied","Data":"a0c26de2aaefa436e0b67d8d59a37473150e386f619e455836997bfb3399389f"} Sep 30 12:37:32 crc kubenswrapper[4672]: I0930 12:37:32.789657 4672 generic.go:334] "Generic (PLEG): container finished" podID="148b676d-4484-44e1-8cb1-7136fdc07313" containerID="4bf3e24698e4bb343d7f18b55879c69c71e61aa556b76af57046f068916b2aac" exitCode=0 Sep 30 12:37:32 crc kubenswrapper[4672]: I0930 12:37:32.789704 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5fffdb5d5-mx5lk" event={"ID":"148b676d-4484-44e1-8cb1-7136fdc07313","Type":"ContainerDied","Data":"4bf3e24698e4bb343d7f18b55879c69c71e61aa556b76af57046f068916b2aac"} Sep 30 12:37:32 crc kubenswrapper[4672]: I0930 12:37:32.851485 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Sep 30 12:37:32 crc kubenswrapper[4672]: I0930 12:37:32.873712 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Sep 30 12:37:32 crc kubenswrapper[4672]: I0930 12:37:32.942442 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-vhs7r"] Sep 30 12:37:32 crc kubenswrapper[4672]: I0930 12:37:32.959774 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Sep 30 12:37:32 crc kubenswrapper[4672]: I0930 12:37:32.987116 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Sep 30 12:37:33 crc kubenswrapper[4672]: I0930 12:37:33.128821 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Sep 30 12:37:33 crc kubenswrapper[4672]: I0930 12:37:33.381078 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6dcdf6f545-4hkf4" Sep 30 12:37:33 crc kubenswrapper[4672]: I0930 12:37:33.387479 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-54d74754f7-6zpjc" Sep 30 12:37:33 crc kubenswrapper[4672]: I0930 12:37:33.391646 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-698c778d7-shzg2" Sep 30 12:37:33 crc kubenswrapper[4672]: I0930 12:37:33.431650 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z4dzg\" (UniqueName: \"kubernetes.io/projected/976b5844-b276-466e-a961-99fd518cf5f4-kube-api-access-z4dzg\") pod \"976b5844-b276-466e-a961-99fd518cf5f4\" (UID: \"976b5844-b276-466e-a961-99fd518cf5f4\") " Sep 30 12:37:33 crc kubenswrapper[4672]: I0930 12:37:33.431900 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/976b5844-b276-466e-a961-99fd518cf5f4-dns-svc\") pod \"976b5844-b276-466e-a961-99fd518cf5f4\" (UID: \"976b5844-b276-466e-a961-99fd518cf5f4\") " Sep 30 12:37:33 crc kubenswrapper[4672]: I0930 12:37:33.431936 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/976b5844-b276-466e-a961-99fd518cf5f4-config\") pod \"976b5844-b276-466e-a961-99fd518cf5f4\" (UID: \"976b5844-b276-466e-a961-99fd518cf5f4\") " Sep 30 12:37:33 crc kubenswrapper[4672]: I0930 12:37:33.432591 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/976b5844-b276-466e-a961-99fd518cf5f4-config" (OuterVolumeSpecName: "config") pod "976b5844-b276-466e-a961-99fd518cf5f4" (UID: "976b5844-b276-466e-a961-99fd518cf5f4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 12:37:33 crc kubenswrapper[4672]: I0930 12:37:33.432569 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/976b5844-b276-466e-a961-99fd518cf5f4-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "976b5844-b276-466e-a961-99fd518cf5f4" (UID: "976b5844-b276-466e-a961-99fd518cf5f4"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 12:37:33 crc kubenswrapper[4672]: I0930 12:37:33.448405 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/976b5844-b276-466e-a961-99fd518cf5f4-kube-api-access-z4dzg" (OuterVolumeSpecName: "kube-api-access-z4dzg") pod "976b5844-b276-466e-a961-99fd518cf5f4" (UID: "976b5844-b276-466e-a961-99fd518cf5f4"). InnerVolumeSpecName "kube-api-access-z4dzg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 12:37:33 crc kubenswrapper[4672]: I0930 12:37:33.533394 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/83f7f949-c461-4cd4-9aec-2cfd4507475e-config\") pod \"83f7f949-c461-4cd4-9aec-2cfd4507475e\" (UID: \"83f7f949-c461-4cd4-9aec-2cfd4507475e\") " Sep 30 12:37:33 crc kubenswrapper[4672]: I0930 12:37:33.533456 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-llqhv\" (UniqueName: \"kubernetes.io/projected/fe558b41-786d-42ef-9de2-d0865c8afb44-kube-api-access-llqhv\") pod \"fe558b41-786d-42ef-9de2-d0865c8afb44\" (UID: \"fe558b41-786d-42ef-9de2-d0865c8afb44\") " Sep 30 12:37:33 crc kubenswrapper[4672]: I0930 12:37:33.533580 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fe558b41-786d-42ef-9de2-d0865c8afb44-dns-svc\") pod \"fe558b41-786d-42ef-9de2-d0865c8afb44\" (UID: \"fe558b41-786d-42ef-9de2-d0865c8afb44\") " Sep 30 12:37:33 crc kubenswrapper[4672]: I0930 12:37:33.533668 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-67kmb\" (UniqueName: \"kubernetes.io/projected/83f7f949-c461-4cd4-9aec-2cfd4507475e-kube-api-access-67kmb\") pod \"83f7f949-c461-4cd4-9aec-2cfd4507475e\" (UID: \"83f7f949-c461-4cd4-9aec-2cfd4507475e\") " Sep 30 12:37:33 crc kubenswrapper[4672]: I0930 12:37:33.533693 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fe558b41-786d-42ef-9de2-d0865c8afb44-config\") pod \"fe558b41-786d-42ef-9de2-d0865c8afb44\" (UID: \"fe558b41-786d-42ef-9de2-d0865c8afb44\") " Sep 30 12:37:33 crc kubenswrapper[4672]: I0930 12:37:33.533916 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/83f7f949-c461-4cd4-9aec-2cfd4507475e-config" (OuterVolumeSpecName: "config") pod "83f7f949-c461-4cd4-9aec-2cfd4507475e" (UID: "83f7f949-c461-4cd4-9aec-2cfd4507475e"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 12:37:33 crc kubenswrapper[4672]: I0930 12:37:33.534176 4672 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/83f7f949-c461-4cd4-9aec-2cfd4507475e-config\") on node \"crc\" DevicePath \"\"" Sep 30 12:37:33 crc kubenswrapper[4672]: I0930 12:37:33.534191 4672 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/976b5844-b276-466e-a961-99fd518cf5f4-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 30 12:37:33 crc kubenswrapper[4672]: I0930 12:37:33.534200 4672 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/976b5844-b276-466e-a961-99fd518cf5f4-config\") on node \"crc\" DevicePath \"\"" Sep 30 12:37:33 crc kubenswrapper[4672]: I0930 12:37:33.534210 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z4dzg\" (UniqueName: \"kubernetes.io/projected/976b5844-b276-466e-a961-99fd518cf5f4-kube-api-access-z4dzg\") on node \"crc\" DevicePath \"\"" Sep 30 12:37:33 crc kubenswrapper[4672]: I0930 12:37:33.534952 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fe558b41-786d-42ef-9de2-d0865c8afb44-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "fe558b41-786d-42ef-9de2-d0865c8afb44" (UID: "fe558b41-786d-42ef-9de2-d0865c8afb44"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 12:37:33 crc kubenswrapper[4672]: I0930 12:37:33.535607 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fe558b41-786d-42ef-9de2-d0865c8afb44-config" (OuterVolumeSpecName: "config") pod "fe558b41-786d-42ef-9de2-d0865c8afb44" (UID: "fe558b41-786d-42ef-9de2-d0865c8afb44"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 12:37:33 crc kubenswrapper[4672]: I0930 12:37:33.538834 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe558b41-786d-42ef-9de2-d0865c8afb44-kube-api-access-llqhv" (OuterVolumeSpecName: "kube-api-access-llqhv") pod "fe558b41-786d-42ef-9de2-d0865c8afb44" (UID: "fe558b41-786d-42ef-9de2-d0865c8afb44"). InnerVolumeSpecName "kube-api-access-llqhv". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 12:37:33 crc kubenswrapper[4672]: I0930 12:37:33.540573 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/83f7f949-c461-4cd4-9aec-2cfd4507475e-kube-api-access-67kmb" (OuterVolumeSpecName: "kube-api-access-67kmb") pod "83f7f949-c461-4cd4-9aec-2cfd4507475e" (UID: "83f7f949-c461-4cd4-9aec-2cfd4507475e"). InnerVolumeSpecName "kube-api-access-67kmb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 12:37:33 crc kubenswrapper[4672]: I0930 12:37:33.645896 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-llqhv\" (UniqueName: \"kubernetes.io/projected/fe558b41-786d-42ef-9de2-d0865c8afb44-kube-api-access-llqhv\") on node \"crc\" DevicePath \"\"" Sep 30 12:37:33 crc kubenswrapper[4672]: I0930 12:37:33.646453 4672 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fe558b41-786d-42ef-9de2-d0865c8afb44-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 30 12:37:33 crc kubenswrapper[4672]: I0930 12:37:33.646464 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-67kmb\" (UniqueName: \"kubernetes.io/projected/83f7f949-c461-4cd4-9aec-2cfd4507475e-kube-api-access-67kmb\") on node \"crc\" DevicePath \"\"" Sep 30 12:37:33 crc kubenswrapper[4672]: I0930 12:37:33.646477 4672 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fe558b41-786d-42ef-9de2-d0865c8afb44-config\") on node \"crc\" DevicePath \"\"" Sep 30 12:37:33 crc kubenswrapper[4672]: I0930 12:37:33.649855 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Sep 30 12:37:33 crc kubenswrapper[4672]: I0930 12:37:33.656671 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-notifications-server-0"] Sep 30 12:37:33 crc kubenswrapper[4672]: W0930 12:37:33.668932 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod854a642c_e6c7_4859_8667_b64f9b54a872.slice/crio-93e5f067d880d9cb316fdf9e80464500acac020cb28941d3717a26711198a940 WatchSource:0}: Error finding container 93e5f067d880d9cb316fdf9e80464500acac020cb28941d3717a26711198a940: Status 404 returned error can't find the container with id 93e5f067d880d9cb316fdf9e80464500acac020cb28941d3717a26711198a940 Sep 30 12:37:33 crc kubenswrapper[4672]: I0930 12:37:33.672011 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Sep 30 12:37:33 crc kubenswrapper[4672]: I0930 12:37:33.748916 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Sep 30 12:37:33 crc kubenswrapper[4672]: W0930 12:37:33.765097 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd69b4d7c_be99_4405_aae9_8a11b85632b8.slice/crio-f5cf11e46455afb9655f21ca14a25045c0fe1e88479794b656eb968094e5cc86 WatchSource:0}: Error finding container f5cf11e46455afb9655f21ca14a25045c0fe1e88479794b656eb968094e5cc86: Status 404 returned error can't find the container with id f5cf11e46455afb9655f21ca14a25045c0fe1e88479794b656eb968094e5cc86 Sep 30 12:37:33 crc kubenswrapper[4672]: I0930 12:37:33.806163 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-vhs7r" event={"ID":"9f35be26-490e-49db-bd31-32ce35c84fab","Type":"ContainerStarted","Data":"9a692dced28105cf991a84a805736f9f1a1f4404457e4c643dcec55f10fa2a2e"} Sep 30 12:37:33 crc kubenswrapper[4672]: I0930 12:37:33.808826 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"2d795c24-8697-461f-9322-2c23bf7cb49b","Type":"ContainerStarted","Data":"b07965e30279b8eb5ec39eb632aba6618fb1387e02d40da9c8b1f2760d852e29"} Sep 30 12:37:33 crc kubenswrapper[4672]: I0930 12:37:33.812378 4672 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"056e0424-1faf-4d5a-8aea-e351214b3394","Type":"ContainerStarted","Data":"c287e9b9a9b153cfd19bb0620546cf884ecc1305d70fd1cafe0cdc4e4087a2b4"} Sep 30 12:37:33 crc kubenswrapper[4672]: I0930 12:37:33.815213 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6dcdf6f545-4hkf4" Sep 30 12:37:33 crc kubenswrapper[4672]: I0930 12:37:33.815485 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6dcdf6f545-4hkf4" event={"ID":"976b5844-b276-466e-a961-99fd518cf5f4","Type":"ContainerDied","Data":"ce57317930d9dee76925981bd9f80bc2efed304a438f8ab87a7b76a06ee9b6b7"} Sep 30 12:37:33 crc kubenswrapper[4672]: I0930 12:37:33.822370 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"8227da12-ad04-4956-bc6e-8bc6b49475a4","Type":"ContainerStarted","Data":"f7e81a5f7deebbf74d2f25fc4719169d6b25af9e4b802e956181f9d95885dfb4"} Sep 30 12:37:33 crc kubenswrapper[4672]: I0930 12:37:33.824533 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54d74754f7-6zpjc" event={"ID":"fe558b41-786d-42ef-9de2-d0865c8afb44","Type":"ContainerDied","Data":"07ab8c6b5d4d1f2b22e4a19f0aa7ebe0b280f1bee0b537703872bd932dbf3c8c"} Sep 30 12:37:33 crc kubenswrapper[4672]: I0930 12:37:33.824573 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-54d74754f7-6zpjc" Sep 30 12:37:33 crc kubenswrapper[4672]: I0930 12:37:33.827370 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5fffdb5d5-mx5lk" event={"ID":"148b676d-4484-44e1-8cb1-7136fdc07313","Type":"ContainerStarted","Data":"e103cc7305ad9a403b14ade6ff8c125bea89ce98a8b716662a08c09e7494a2f2"} Sep 30 12:37:33 crc kubenswrapper[4672]: I0930 12:37:33.827792 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5fffdb5d5-mx5lk" Sep 30 12:37:33 crc kubenswrapper[4672]: I0930 12:37:33.832020 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-notifications-server-0" event={"ID":"d165a3a8-6809-46e5-bd35-895200ab5bfc","Type":"ContainerStarted","Data":"095ffadce71570b86475085929bd32c07f4340e14817faefa02c4e0d311e5fb2"} Sep 30 12:37:33 crc kubenswrapper[4672]: I0930 12:37:33.833722 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"854a642c-e6c7-4859-8667-b64f9b54a872","Type":"ContainerStarted","Data":"93e5f067d880d9cb316fdf9e80464500acac020cb28941d3717a26711198a940"} Sep 30 12:37:33 crc kubenswrapper[4672]: I0930 12:37:33.841612 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"9159d76a-52b7-4262-a56a-ed28caec7f97","Type":"ContainerStarted","Data":"f45ba3f473f6b0cb00594e6574179f598349539ba635a27f0fb27c29ca465def"} Sep 30 12:37:33 crc kubenswrapper[4672]: I0930 12:37:33.881583 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698c778d7-shzg2" event={"ID":"83f7f949-c461-4cd4-9aec-2cfd4507475e","Type":"ContainerDied","Data":"67766e95130f93f049eda10f0cfcc67bceba5db4f76236083412e556549f7f14"} Sep 30 12:37:33 crc kubenswrapper[4672]: I0930 12:37:33.882847 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-698c778d7-shzg2" Sep 30 12:37:33 crc kubenswrapper[4672]: I0930 12:37:33.884850 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"9aea18e8-190e-470a-9330-a30621c96afd","Type":"ContainerStarted","Data":"a9b919759cfc32816cae2bd00d0496ee044489c2d56cad837cf6b3c3546e1e8f"} Sep 30 12:37:33 crc kubenswrapper[4672]: I0930 12:37:33.889287 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"34e59a30-14b8-4736-87b1-9d9581094598","Type":"ContainerStarted","Data":"ea338d9a9dd81d33eaf83c2d4527ac66110d2c4722ae28842ad6445f598bfc51"} Sep 30 12:37:33 crc kubenswrapper[4672]: I0930 12:37:33.891013 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"d69b4d7c-be99-4405-aae9-8a11b85632b8","Type":"ContainerStarted","Data":"f5cf11e46455afb9655f21ca14a25045c0fe1e88479794b656eb968094e5cc86"} Sep 30 12:37:33 crc kubenswrapper[4672]: I0930 12:37:33.894226 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5dff579849-cdrgm" event={"ID":"9aae022e-aded-4f55-bfa2-9fa792516aac","Type":"ContainerStarted","Data":"f7da0813cbfb0aed147e8014aaddea2aa24a882e5500ee46152703904bf5950f"} Sep 30 12:37:33 crc kubenswrapper[4672]: I0930 12:37:33.896576 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5dff579849-cdrgm" Sep 30 12:37:33 crc kubenswrapper[4672]: I0930 12:37:33.916713 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-gb7pq"] Sep 30 12:37:33 crc kubenswrapper[4672]: I0930 12:37:33.963990 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6dcdf6f545-4hkf4"] Sep 30 12:37:33 crc kubenswrapper[4672]: I0930 12:37:33.983375 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6dcdf6f545-4hkf4"] Sep 30 12:37:33 crc kubenswrapper[4672]: I0930 12:37:33.986720 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5fffdb5d5-mx5lk" podStartSLOduration=8.680497253 podStartE2EDuration="22.986702772s" podCreationTimestamp="2025-09-30 12:37:11 +0000 UTC" firstStartedPulling="2025-09-30 12:37:18.063673695 +0000 UTC m=+929.332911341" lastFinishedPulling="2025-09-30 12:37:32.369879214 +0000 UTC m=+943.639116860" observedRunningTime="2025-09-30 12:37:33.874117879 +0000 UTC m=+945.143355525" watchObservedRunningTime="2025-09-30 12:37:33.986702772 +0000 UTC m=+945.255940418" Sep 30 12:37:34 crc kubenswrapper[4672]: I0930 12:37:34.017853 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-54d74754f7-6zpjc"] Sep 30 12:37:34 crc kubenswrapper[4672]: I0930 12:37:34.026828 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-54d74754f7-6zpjc"] Sep 30 12:37:34 crc kubenswrapper[4672]: I0930 12:37:34.040588 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698c778d7-shzg2"] Sep 30 12:37:34 crc kubenswrapper[4672]: I0930 12:37:34.049358 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-698c778d7-shzg2"] Sep 30 12:37:34 crc kubenswrapper[4672]: I0930 12:37:34.050822 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5dff579849-cdrgm" podStartSLOduration=8.750358877 podStartE2EDuration="23.050810991s" podCreationTimestamp="2025-09-30 12:37:11 +0000 UTC" 
firstStartedPulling="2025-09-30 12:37:18.064053244 +0000 UTC m=+929.333290880" lastFinishedPulling="2025-09-30 12:37:32.364505338 +0000 UTC m=+943.633742994" observedRunningTime="2025-09-30 12:37:33.977647003 +0000 UTC m=+945.246884659" watchObservedRunningTime="2025-09-30 12:37:34.050810991 +0000 UTC m=+945.320048637" Sep 30 12:37:34 crc kubenswrapper[4672]: I0930 12:37:34.785162 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Sep 30 12:37:34 crc kubenswrapper[4672]: I0930 12:37:34.906011 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-gb7pq" event={"ID":"71bceb54-c562-417a-8897-525930836f44","Type":"ContainerStarted","Data":"15b06e5adf08f1df79fb29b2a268904555d61563042371e6d3986f7d40809f8c"} Sep 30 12:37:35 crc kubenswrapper[4672]: I0930 12:37:35.430211 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="83f7f949-c461-4cd4-9aec-2cfd4507475e" path="/var/lib/kubelet/pods/83f7f949-c461-4cd4-9aec-2cfd4507475e/volumes" Sep 30 12:37:35 crc kubenswrapper[4672]: I0930 12:37:35.430982 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="976b5844-b276-466e-a961-99fd518cf5f4" path="/var/lib/kubelet/pods/976b5844-b276-466e-a961-99fd518cf5f4/volumes" Sep 30 12:37:35 crc kubenswrapper[4672]: I0930 12:37:35.431697 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fe558b41-786d-42ef-9de2-d0865c8afb44" path="/var/lib/kubelet/pods/fe558b41-786d-42ef-9de2-d0865c8afb44/volumes" Sep 30 12:37:37 crc kubenswrapper[4672]: W0930 12:37:37.205522 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4e0bc671_11e7_442d_b5f3_4a901b0a0a80.slice/crio-6370bd51a6076ae15aadd4ae052bdf69a2d2f92842e124fb17252ce61d3e3327 WatchSource:0}: Error finding container 6370bd51a6076ae15aadd4ae052bdf69a2d2f92842e124fb17252ce61d3e3327: Status 404 returned error can't find the container with id 6370bd51a6076ae15aadd4ae052bdf69a2d2f92842e124fb17252ce61d3e3327 Sep 30 12:37:37 crc kubenswrapper[4672]: I0930 12:37:37.933674 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"4e0bc671-11e7-442d-b5f3-4a901b0a0a80","Type":"ContainerStarted","Data":"6370bd51a6076ae15aadd4ae052bdf69a2d2f92842e124fb17252ce61d3e3327"} Sep 30 12:37:40 crc kubenswrapper[4672]: I0930 12:37:40.959481 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"9aea18e8-190e-470a-9330-a30621c96afd","Type":"ContainerStarted","Data":"e01b88bbf1911514ef8474023e787f93d998277ea95c8dfcfe001656a8fbed44"} Sep 30 12:37:41 crc kubenswrapper[4672]: I0930 12:37:41.778429 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5fffdb5d5-mx5lk" Sep 30 12:37:41 crc kubenswrapper[4672]: I0930 12:37:41.983447 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"056e0424-1faf-4d5a-8aea-e351214b3394","Type":"ContainerStarted","Data":"ed7fa277c2c178542bfb2184ed2b8d694c0abd0f10f53c8b02ca5676a97b124a"} Sep 30 12:37:41 crc kubenswrapper[4672]: I0930 12:37:41.988668 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"9159d76a-52b7-4262-a56a-ed28caec7f97","Type":"ContainerStarted","Data":"83fd8133010ce3de2e6f2de7a381915acdefcd6cfa1fbf38dea11878e5e7d68b"} Sep 30 12:37:41 crc kubenswrapper[4672]: I0930 12:37:41.990865 4672 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-vhs7r" event={"ID":"9f35be26-490e-49db-bd31-32ce35c84fab","Type":"ContainerStarted","Data":"9e26a72a8d30da75289a67ad6fb6b4cf1d9f9121baca65c166baf9914a63c360"} Sep 30 12:37:41 crc kubenswrapper[4672]: I0930 12:37:41.991476 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-vhs7r" Sep 30 12:37:41 crc kubenswrapper[4672]: I0930 12:37:41.992742 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-notifications-server-0" event={"ID":"d165a3a8-6809-46e5-bd35-895200ab5bfc","Type":"ContainerStarted","Data":"52bb7904fb5094e742f944f6757153dc7d5299b545fb1e4f7fecd172fadc25cf"} Sep 30 12:37:42 crc kubenswrapper[4672]: I0930 12:37:42.000568 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"d69b4d7c-be99-4405-aae9-8a11b85632b8","Type":"ContainerStarted","Data":"c92b0d199d7002cef769c3cc455551ec2f51cee34c3a77277d1530f54eee9ba0"} Sep 30 12:37:42 crc kubenswrapper[4672]: I0930 12:37:42.002393 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"4e0bc671-11e7-442d-b5f3-4a901b0a0a80","Type":"ContainerStarted","Data":"b7062d53e194dabbb8d5d03f2f77275d3bdbac26e578f17f9ef64448d033bdae"} Sep 30 12:37:42 crc kubenswrapper[4672]: I0930 12:37:42.004599 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"34e59a30-14b8-4736-87b1-9d9581094598","Type":"ContainerStarted","Data":"55f6fb185fb8ab2aeb7d78b4784399e2b2bd89680b01c03c6eaf8e6c51379435"} Sep 30 12:37:42 crc kubenswrapper[4672]: I0930 12:37:42.004628 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Sep 30 12:37:42 crc kubenswrapper[4672]: I0930 12:37:42.032710 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=19.286958688 podStartE2EDuration="26.032694845s" podCreationTimestamp="2025-09-30 12:37:16 +0000 UTC" firstStartedPulling="2025-09-30 12:37:32.942131735 +0000 UTC m=+944.211369381" lastFinishedPulling="2025-09-30 12:37:39.687867902 +0000 UTC m=+950.957105538" observedRunningTime="2025-09-30 12:37:42.029682269 +0000 UTC m=+953.298919925" watchObservedRunningTime="2025-09-30 12:37:42.032694845 +0000 UTC m=+953.301932491" Sep 30 12:37:42 crc kubenswrapper[4672]: I0930 12:37:42.126882 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5dff579849-cdrgm" Sep 30 12:37:42 crc kubenswrapper[4672]: I0930 12:37:42.158540 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-vhs7r" podStartSLOduration=13.058601251 podStartE2EDuration="20.158513953s" podCreationTimestamp="2025-09-30 12:37:22 +0000 UTC" firstStartedPulling="2025-09-30 12:37:32.976195595 +0000 UTC m=+944.245433241" lastFinishedPulling="2025-09-30 12:37:40.076108287 +0000 UTC m=+951.345345943" observedRunningTime="2025-09-30 12:37:42.09704492 +0000 UTC m=+953.366282566" watchObservedRunningTime="2025-09-30 12:37:42.158513953 +0000 UTC m=+953.427751609" Sep 30 12:37:42 crc kubenswrapper[4672]: I0930 12:37:42.185764 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5fffdb5d5-mx5lk"] Sep 30 12:37:42 crc kubenswrapper[4672]: I0930 12:37:42.186111 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5fffdb5d5-mx5lk" 
podUID="148b676d-4484-44e1-8cb1-7136fdc07313" containerName="dnsmasq-dns" containerID="cri-o://e103cc7305ad9a403b14ade6ff8c125bea89ce98a8b716662a08c09e7494a2f2" gracePeriod=10 Sep 30 12:37:43 crc kubenswrapper[4672]: I0930 12:37:43.015503 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5fffdb5d5-mx5lk" Sep 30 12:37:43 crc kubenswrapper[4672]: I0930 12:37:43.015853 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"2d795c24-8697-461f-9322-2c23bf7cb49b","Type":"ContainerStarted","Data":"55f2020c2c1834b9ac6a092b6aa62719e415fbc93f0663463397655e3eff7cc9"} Sep 30 12:37:43 crc kubenswrapper[4672]: I0930 12:37:43.018047 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-gb7pq" event={"ID":"71bceb54-c562-417a-8897-525930836f44","Type":"ContainerStarted","Data":"1a4210de50b3eae549be46f74f0d34d4bc18598176775bfad43786e3325013db"} Sep 30 12:37:43 crc kubenswrapper[4672]: I0930 12:37:43.021438 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"8227da12-ad04-4956-bc6e-8bc6b49475a4","Type":"ContainerStarted","Data":"095606d69855d83f14140aae3133c63d23ffb62d9965c96fd951803a6a876cb9"} Sep 30 12:37:43 crc kubenswrapper[4672]: I0930 12:37:43.021498 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Sep 30 12:37:43 crc kubenswrapper[4672]: I0930 12:37:43.023827 4672 generic.go:334] "Generic (PLEG): container finished" podID="148b676d-4484-44e1-8cb1-7136fdc07313" containerID="e103cc7305ad9a403b14ade6ff8c125bea89ce98a8b716662a08c09e7494a2f2" exitCode=0 Sep 30 12:37:43 crc kubenswrapper[4672]: I0930 12:37:43.023899 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5fffdb5d5-mx5lk" Sep 30 12:37:43 crc kubenswrapper[4672]: I0930 12:37:43.023984 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5fffdb5d5-mx5lk" event={"ID":"148b676d-4484-44e1-8cb1-7136fdc07313","Type":"ContainerDied","Data":"e103cc7305ad9a403b14ade6ff8c125bea89ce98a8b716662a08c09e7494a2f2"} Sep 30 12:37:43 crc kubenswrapper[4672]: I0930 12:37:43.024074 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5fffdb5d5-mx5lk" event={"ID":"148b676d-4484-44e1-8cb1-7136fdc07313","Type":"ContainerDied","Data":"408dfe96632d473a62e39c01c512fd4c5bccc94cc2bd8232ba65720e39d74bab"} Sep 30 12:37:43 crc kubenswrapper[4672]: I0930 12:37:43.024106 4672 scope.go:117] "RemoveContainer" containerID="e103cc7305ad9a403b14ade6ff8c125bea89ce98a8b716662a08c09e7494a2f2" Sep 30 12:37:43 crc kubenswrapper[4672]: I0930 12:37:43.101518 4672 scope.go:117] "RemoveContainer" containerID="4bf3e24698e4bb343d7f18b55879c69c71e61aa556b76af57046f068916b2aac" Sep 30 12:37:43 crc kubenswrapper[4672]: I0930 12:37:43.122241 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=17.45531927 podStartE2EDuration="25.122173138s" podCreationTimestamp="2025-09-30 12:37:18 +0000 UTC" firstStartedPulling="2025-09-30 12:37:33.125518016 +0000 UTC m=+944.394755662" lastFinishedPulling="2025-09-30 12:37:40.792371884 +0000 UTC m=+952.061609530" observedRunningTime="2025-09-30 12:37:43.116069314 +0000 UTC m=+954.385306970" watchObservedRunningTime="2025-09-30 12:37:43.122173138 +0000 UTC m=+954.391410804" Sep 30 12:37:43 crc kubenswrapper[4672]: I0930 12:37:43.144382 4672 scope.go:117] "RemoveContainer" containerID="e103cc7305ad9a403b14ade6ff8c125bea89ce98a8b716662a08c09e7494a2f2" Sep 30 12:37:43 crc kubenswrapper[4672]: E0930 12:37:43.144841 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e103cc7305ad9a403b14ade6ff8c125bea89ce98a8b716662a08c09e7494a2f2\": container with ID starting with e103cc7305ad9a403b14ade6ff8c125bea89ce98a8b716662a08c09e7494a2f2 not found: ID does not exist" containerID="e103cc7305ad9a403b14ade6ff8c125bea89ce98a8b716662a08c09e7494a2f2" Sep 30 12:37:43 crc kubenswrapper[4672]: I0930 12:37:43.144893 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e103cc7305ad9a403b14ade6ff8c125bea89ce98a8b716662a08c09e7494a2f2"} err="failed to get container status \"e103cc7305ad9a403b14ade6ff8c125bea89ce98a8b716662a08c09e7494a2f2\": rpc error: code = NotFound desc = could not find container \"e103cc7305ad9a403b14ade6ff8c125bea89ce98a8b716662a08c09e7494a2f2\": container with ID starting with e103cc7305ad9a403b14ade6ff8c125bea89ce98a8b716662a08c09e7494a2f2 not found: ID does not exist" Sep 30 12:37:43 crc kubenswrapper[4672]: I0930 12:37:43.144920 4672 scope.go:117] "RemoveContainer" containerID="4bf3e24698e4bb343d7f18b55879c69c71e61aa556b76af57046f068916b2aac" Sep 30 12:37:43 crc kubenswrapper[4672]: E0930 12:37:43.145299 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4bf3e24698e4bb343d7f18b55879c69c71e61aa556b76af57046f068916b2aac\": container with ID starting with 4bf3e24698e4bb343d7f18b55879c69c71e61aa556b76af57046f068916b2aac not found: ID does not exist" containerID="4bf3e24698e4bb343d7f18b55879c69c71e61aa556b76af57046f068916b2aac" Sep 30 
12:37:43 crc kubenswrapper[4672]: I0930 12:37:43.145344 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4bf3e24698e4bb343d7f18b55879c69c71e61aa556b76af57046f068916b2aac"} err="failed to get container status \"4bf3e24698e4bb343d7f18b55879c69c71e61aa556b76af57046f068916b2aac\": rpc error: code = NotFound desc = could not find container \"4bf3e24698e4bb343d7f18b55879c69c71e61aa556b76af57046f068916b2aac\": container with ID starting with 4bf3e24698e4bb343d7f18b55879c69c71e61aa556b76af57046f068916b2aac not found: ID does not exist" Sep 30 12:37:43 crc kubenswrapper[4672]: I0930 12:37:43.156215 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n27lk\" (UniqueName: \"kubernetes.io/projected/148b676d-4484-44e1-8cb1-7136fdc07313-kube-api-access-n27lk\") pod \"148b676d-4484-44e1-8cb1-7136fdc07313\" (UID: \"148b676d-4484-44e1-8cb1-7136fdc07313\") " Sep 30 12:37:43 crc kubenswrapper[4672]: I0930 12:37:43.156445 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/148b676d-4484-44e1-8cb1-7136fdc07313-config\") pod \"148b676d-4484-44e1-8cb1-7136fdc07313\" (UID: \"148b676d-4484-44e1-8cb1-7136fdc07313\") " Sep 30 12:37:43 crc kubenswrapper[4672]: I0930 12:37:43.156467 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/148b676d-4484-44e1-8cb1-7136fdc07313-dns-svc\") pod \"148b676d-4484-44e1-8cb1-7136fdc07313\" (UID: \"148b676d-4484-44e1-8cb1-7136fdc07313\") " Sep 30 12:37:43 crc kubenswrapper[4672]: I0930 12:37:43.237832 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/148b676d-4484-44e1-8cb1-7136fdc07313-kube-api-access-n27lk" (OuterVolumeSpecName: "kube-api-access-n27lk") pod "148b676d-4484-44e1-8cb1-7136fdc07313" (UID: "148b676d-4484-44e1-8cb1-7136fdc07313"). InnerVolumeSpecName "kube-api-access-n27lk". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 12:37:43 crc kubenswrapper[4672]: I0930 12:37:43.258810 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n27lk\" (UniqueName: \"kubernetes.io/projected/148b676d-4484-44e1-8cb1-7136fdc07313-kube-api-access-n27lk\") on node \"crc\" DevicePath \"\"" Sep 30 12:37:43 crc kubenswrapper[4672]: I0930 12:37:43.351002 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/148b676d-4484-44e1-8cb1-7136fdc07313-config" (OuterVolumeSpecName: "config") pod "148b676d-4484-44e1-8cb1-7136fdc07313" (UID: "148b676d-4484-44e1-8cb1-7136fdc07313"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 12:37:43 crc kubenswrapper[4672]: I0930 12:37:43.352756 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/148b676d-4484-44e1-8cb1-7136fdc07313-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "148b676d-4484-44e1-8cb1-7136fdc07313" (UID: "148b676d-4484-44e1-8cb1-7136fdc07313"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 12:37:43 crc kubenswrapper[4672]: I0930 12:37:43.360163 4672 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/148b676d-4484-44e1-8cb1-7136fdc07313-config\") on node \"crc\" DevicePath \"\"" Sep 30 12:37:43 crc kubenswrapper[4672]: I0930 12:37:43.360189 4672 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/148b676d-4484-44e1-8cb1-7136fdc07313-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 30 12:37:43 crc kubenswrapper[4672]: I0930 12:37:43.646119 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5fffdb5d5-mx5lk"] Sep 30 12:37:43 crc kubenswrapper[4672]: I0930 12:37:43.656427 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5fffdb5d5-mx5lk"] Sep 30 12:37:44 crc kubenswrapper[4672]: I0930 12:37:44.033975 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"854a642c-e6c7-4859-8667-b64f9b54a872","Type":"ContainerStarted","Data":"baa06c426eba60d45465ff8502ff26b5d4b4ca2356b7db98bc5977e364fb8c66"} Sep 30 12:37:44 crc kubenswrapper[4672]: I0930 12:37:44.036797 4672 generic.go:334] "Generic (PLEG): container finished" podID="71bceb54-c562-417a-8897-525930836f44" containerID="1a4210de50b3eae549be46f74f0d34d4bc18598176775bfad43786e3325013db" exitCode=0 Sep 30 12:37:44 crc kubenswrapper[4672]: I0930 12:37:44.036915 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-gb7pq" event={"ID":"71bceb54-c562-417a-8897-525930836f44","Type":"ContainerDied","Data":"1a4210de50b3eae549be46f74f0d34d4bc18598176775bfad43786e3325013db"} Sep 30 12:37:45 crc kubenswrapper[4672]: I0930 12:37:45.428008 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="148b676d-4484-44e1-8cb1-7136fdc07313" path="/var/lib/kubelet/pods/148b676d-4484-44e1-8cb1-7136fdc07313/volumes" Sep 30 12:37:45 crc kubenswrapper[4672]: I0930 12:37:45.815059 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-gln7l"] Sep 30 12:37:45 crc kubenswrapper[4672]: E0930 12:37:45.816096 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="148b676d-4484-44e1-8cb1-7136fdc07313" containerName="init" Sep 30 12:37:45 crc kubenswrapper[4672]: I0930 12:37:45.816123 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="148b676d-4484-44e1-8cb1-7136fdc07313" containerName="init" Sep 30 12:37:45 crc kubenswrapper[4672]: E0930 12:37:45.816173 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="148b676d-4484-44e1-8cb1-7136fdc07313" containerName="dnsmasq-dns" Sep 30 12:37:45 crc kubenswrapper[4672]: I0930 12:37:45.816180 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="148b676d-4484-44e1-8cb1-7136fdc07313" containerName="dnsmasq-dns" Sep 30 12:37:45 crc kubenswrapper[4672]: I0930 12:37:45.817559 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="148b676d-4484-44e1-8cb1-7136fdc07313" containerName="dnsmasq-dns" Sep 30 12:37:45 crc kubenswrapper[4672]: I0930 12:37:45.818500 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-gln7l" Sep 30 12:37:45 crc kubenswrapper[4672]: I0930 12:37:45.820576 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Sep 30 12:37:45 crc kubenswrapper[4672]: I0930 12:37:45.833016 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-gln7l"] Sep 30 12:37:45 crc kubenswrapper[4672]: I0930 12:37:45.901356 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f11c1379-e576-40be-a37a-1d73f84cab81-combined-ca-bundle\") pod \"ovn-controller-metrics-gln7l\" (UID: \"f11c1379-e576-40be-a37a-1d73f84cab81\") " pod="openstack/ovn-controller-metrics-gln7l" Sep 30 12:37:45 crc kubenswrapper[4672]: I0930 12:37:45.901446 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t4q8m\" (UniqueName: \"kubernetes.io/projected/f11c1379-e576-40be-a37a-1d73f84cab81-kube-api-access-t4q8m\") pod \"ovn-controller-metrics-gln7l\" (UID: \"f11c1379-e576-40be-a37a-1d73f84cab81\") " pod="openstack/ovn-controller-metrics-gln7l" Sep 30 12:37:45 crc kubenswrapper[4672]: I0930 12:37:45.901485 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f11c1379-e576-40be-a37a-1d73f84cab81-config\") pod \"ovn-controller-metrics-gln7l\" (UID: \"f11c1379-e576-40be-a37a-1d73f84cab81\") " pod="openstack/ovn-controller-metrics-gln7l" Sep 30 12:37:45 crc kubenswrapper[4672]: I0930 12:37:45.901515 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/f11c1379-e576-40be-a37a-1d73f84cab81-ovs-rundir\") pod \"ovn-controller-metrics-gln7l\" (UID: \"f11c1379-e576-40be-a37a-1d73f84cab81\") " pod="openstack/ovn-controller-metrics-gln7l" Sep 30 12:37:45 crc kubenswrapper[4672]: I0930 12:37:45.901541 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/f11c1379-e576-40be-a37a-1d73f84cab81-ovn-rundir\") pod \"ovn-controller-metrics-gln7l\" (UID: \"f11c1379-e576-40be-a37a-1d73f84cab81\") " pod="openstack/ovn-controller-metrics-gln7l" Sep 30 12:37:45 crc kubenswrapper[4672]: I0930 12:37:45.901560 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/f11c1379-e576-40be-a37a-1d73f84cab81-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-gln7l\" (UID: \"f11c1379-e576-40be-a37a-1d73f84cab81\") " pod="openstack/ovn-controller-metrics-gln7l" Sep 30 12:37:45 crc kubenswrapper[4672]: I0930 12:37:45.957598 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-56cdb6c897-cjdnr"] Sep 30 12:37:45 crc kubenswrapper[4672]: I0930 12:37:45.958811 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-56cdb6c897-cjdnr" Sep 30 12:37:45 crc kubenswrapper[4672]: I0930 12:37:45.964525 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Sep 30 12:37:45 crc kubenswrapper[4672]: I0930 12:37:45.975146 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-56cdb6c897-cjdnr"] Sep 30 12:37:46 crc kubenswrapper[4672]: I0930 12:37:46.008215 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t4q8m\" (UniqueName: \"kubernetes.io/projected/f11c1379-e576-40be-a37a-1d73f84cab81-kube-api-access-t4q8m\") pod \"ovn-controller-metrics-gln7l\" (UID: \"f11c1379-e576-40be-a37a-1d73f84cab81\") " pod="openstack/ovn-controller-metrics-gln7l" Sep 30 12:37:46 crc kubenswrapper[4672]: I0930 12:37:46.010872 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/27236a15-64c2-46be-9186-02c66426c933-dns-svc\") pod \"dnsmasq-dns-56cdb6c897-cjdnr\" (UID: \"27236a15-64c2-46be-9186-02c66426c933\") " pod="openstack/dnsmasq-dns-56cdb6c897-cjdnr" Sep 30 12:37:46 crc kubenswrapper[4672]: I0930 12:37:46.011003 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f11c1379-e576-40be-a37a-1d73f84cab81-config\") pod \"ovn-controller-metrics-gln7l\" (UID: \"f11c1379-e576-40be-a37a-1d73f84cab81\") " pod="openstack/ovn-controller-metrics-gln7l" Sep 30 12:37:46 crc kubenswrapper[4672]: I0930 12:37:46.011089 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/f11c1379-e576-40be-a37a-1d73f84cab81-ovs-rundir\") pod \"ovn-controller-metrics-gln7l\" (UID: \"f11c1379-e576-40be-a37a-1d73f84cab81\") " pod="openstack/ovn-controller-metrics-gln7l" Sep 30 12:37:46 crc kubenswrapper[4672]: I0930 12:37:46.011144 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/f11c1379-e576-40be-a37a-1d73f84cab81-ovn-rundir\") pod \"ovn-controller-metrics-gln7l\" (UID: \"f11c1379-e576-40be-a37a-1d73f84cab81\") " pod="openstack/ovn-controller-metrics-gln7l" Sep 30 12:37:46 crc kubenswrapper[4672]: I0930 12:37:46.011187 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/f11c1379-e576-40be-a37a-1d73f84cab81-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-gln7l\" (UID: \"f11c1379-e576-40be-a37a-1d73f84cab81\") " pod="openstack/ovn-controller-metrics-gln7l" Sep 30 12:37:46 crc kubenswrapper[4672]: I0930 12:37:46.011316 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f11c1379-e576-40be-a37a-1d73f84cab81-combined-ca-bundle\") pod \"ovn-controller-metrics-gln7l\" (UID: \"f11c1379-e576-40be-a37a-1d73f84cab81\") " pod="openstack/ovn-controller-metrics-gln7l" Sep 30 12:37:46 crc kubenswrapper[4672]: I0930 12:37:46.011382 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/27236a15-64c2-46be-9186-02c66426c933-ovsdbserver-nb\") pod \"dnsmasq-dns-56cdb6c897-cjdnr\" (UID: \"27236a15-64c2-46be-9186-02c66426c933\") " pod="openstack/dnsmasq-dns-56cdb6c897-cjdnr" Sep 30 12:37:46 crc 
kubenswrapper[4672]: I0930 12:37:46.011414 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m92ql\" (UniqueName: \"kubernetes.io/projected/27236a15-64c2-46be-9186-02c66426c933-kube-api-access-m92ql\") pod \"dnsmasq-dns-56cdb6c897-cjdnr\" (UID: \"27236a15-64c2-46be-9186-02c66426c933\") " pod="openstack/dnsmasq-dns-56cdb6c897-cjdnr" Sep 30 12:37:46 crc kubenswrapper[4672]: I0930 12:37:46.011444 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/27236a15-64c2-46be-9186-02c66426c933-config\") pod \"dnsmasq-dns-56cdb6c897-cjdnr\" (UID: \"27236a15-64c2-46be-9186-02c66426c933\") " pod="openstack/dnsmasq-dns-56cdb6c897-cjdnr" Sep 30 12:37:46 crc kubenswrapper[4672]: I0930 12:37:46.012406 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f11c1379-e576-40be-a37a-1d73f84cab81-config\") pod \"ovn-controller-metrics-gln7l\" (UID: \"f11c1379-e576-40be-a37a-1d73f84cab81\") " pod="openstack/ovn-controller-metrics-gln7l" Sep 30 12:37:46 crc kubenswrapper[4672]: I0930 12:37:46.013336 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/f11c1379-e576-40be-a37a-1d73f84cab81-ovn-rundir\") pod \"ovn-controller-metrics-gln7l\" (UID: \"f11c1379-e576-40be-a37a-1d73f84cab81\") " pod="openstack/ovn-controller-metrics-gln7l" Sep 30 12:37:46 crc kubenswrapper[4672]: I0930 12:37:46.023721 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/f11c1379-e576-40be-a37a-1d73f84cab81-ovs-rundir\") pod \"ovn-controller-metrics-gln7l\" (UID: \"f11c1379-e576-40be-a37a-1d73f84cab81\") " pod="openstack/ovn-controller-metrics-gln7l" Sep 30 12:37:46 crc kubenswrapper[4672]: I0930 12:37:46.027632 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/f11c1379-e576-40be-a37a-1d73f84cab81-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-gln7l\" (UID: \"f11c1379-e576-40be-a37a-1d73f84cab81\") " pod="openstack/ovn-controller-metrics-gln7l" Sep 30 12:37:46 crc kubenswrapper[4672]: I0930 12:37:46.037507 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t4q8m\" (UniqueName: \"kubernetes.io/projected/f11c1379-e576-40be-a37a-1d73f84cab81-kube-api-access-t4q8m\") pod \"ovn-controller-metrics-gln7l\" (UID: \"f11c1379-e576-40be-a37a-1d73f84cab81\") " pod="openstack/ovn-controller-metrics-gln7l" Sep 30 12:37:46 crc kubenswrapper[4672]: I0930 12:37:46.051032 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f11c1379-e576-40be-a37a-1d73f84cab81-combined-ca-bundle\") pod \"ovn-controller-metrics-gln7l\" (UID: \"f11c1379-e576-40be-a37a-1d73f84cab81\") " pod="openstack/ovn-controller-metrics-gln7l" Sep 30 12:37:46 crc kubenswrapper[4672]: I0930 12:37:46.062392 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"4e0bc671-11e7-442d-b5f3-4a901b0a0a80","Type":"ContainerStarted","Data":"0e31fde614af218df0894cad43320c0787a54b9d820352a639764bf996c0e7d2"} Sep 30 12:37:46 crc kubenswrapper[4672]: I0930 12:37:46.065195 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-gb7pq" 
event={"ID":"71bceb54-c562-417a-8897-525930836f44","Type":"ContainerStarted","Data":"c11526f5083104e21f55ffd9b9b3e86c161444c1af27a2f20a8c520a06235e35"} Sep 30 12:37:46 crc kubenswrapper[4672]: I0930 12:37:46.065243 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-gb7pq" event={"ID":"71bceb54-c562-417a-8897-525930836f44","Type":"ContainerStarted","Data":"52ef1aea5ff356f574c37c389a1dd340708868e63776c628d7e071aad16671e8"} Sep 30 12:37:46 crc kubenswrapper[4672]: I0930 12:37:46.065465 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-gb7pq" Sep 30 12:37:46 crc kubenswrapper[4672]: I0930 12:37:46.065505 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-gb7pq" Sep 30 12:37:46 crc kubenswrapper[4672]: I0930 12:37:46.084555 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"d69b4d7c-be99-4405-aae9-8a11b85632b8","Type":"ContainerStarted","Data":"8481cbb7cf330bf336000f985fbc56b27d51a3c05e318e32c6d0bb001732c759"} Sep 30 12:37:46 crc kubenswrapper[4672]: I0930 12:37:46.098718 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=14.046957546 podStartE2EDuration="22.098694333s" podCreationTimestamp="2025-09-30 12:37:24 +0000 UTC" firstStartedPulling="2025-09-30 12:37:37.208658586 +0000 UTC m=+948.477896272" lastFinishedPulling="2025-09-30 12:37:45.260395413 +0000 UTC m=+956.529633059" observedRunningTime="2025-09-30 12:37:46.09659053 +0000 UTC m=+957.365828176" watchObservedRunningTime="2025-09-30 12:37:46.098694333 +0000 UTC m=+957.367931979" Sep 30 12:37:46 crc kubenswrapper[4672]: I0930 12:37:46.115206 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/27236a15-64c2-46be-9186-02c66426c933-ovsdbserver-nb\") pod \"dnsmasq-dns-56cdb6c897-cjdnr\" (UID: \"27236a15-64c2-46be-9186-02c66426c933\") " pod="openstack/dnsmasq-dns-56cdb6c897-cjdnr" Sep 30 12:37:46 crc kubenswrapper[4672]: I0930 12:37:46.115402 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m92ql\" (UniqueName: \"kubernetes.io/projected/27236a15-64c2-46be-9186-02c66426c933-kube-api-access-m92ql\") pod \"dnsmasq-dns-56cdb6c897-cjdnr\" (UID: \"27236a15-64c2-46be-9186-02c66426c933\") " pod="openstack/dnsmasq-dns-56cdb6c897-cjdnr" Sep 30 12:37:46 crc kubenswrapper[4672]: I0930 12:37:46.115436 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/27236a15-64c2-46be-9186-02c66426c933-config\") pod \"dnsmasq-dns-56cdb6c897-cjdnr\" (UID: \"27236a15-64c2-46be-9186-02c66426c933\") " pod="openstack/dnsmasq-dns-56cdb6c897-cjdnr" Sep 30 12:37:46 crc kubenswrapper[4672]: I0930 12:37:46.115525 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/27236a15-64c2-46be-9186-02c66426c933-dns-svc\") pod \"dnsmasq-dns-56cdb6c897-cjdnr\" (UID: \"27236a15-64c2-46be-9186-02c66426c933\") " pod="openstack/dnsmasq-dns-56cdb6c897-cjdnr" Sep 30 12:37:46 crc kubenswrapper[4672]: I0930 12:37:46.117033 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/27236a15-64c2-46be-9186-02c66426c933-config\") pod \"dnsmasq-dns-56cdb6c897-cjdnr\" (UID: 
\"27236a15-64c2-46be-9186-02c66426c933\") " pod="openstack/dnsmasq-dns-56cdb6c897-cjdnr" Sep 30 12:37:46 crc kubenswrapper[4672]: I0930 12:37:46.117665 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/27236a15-64c2-46be-9186-02c66426c933-ovsdbserver-nb\") pod \"dnsmasq-dns-56cdb6c897-cjdnr\" (UID: \"27236a15-64c2-46be-9186-02c66426c933\") " pod="openstack/dnsmasq-dns-56cdb6c897-cjdnr" Sep 30 12:37:46 crc kubenswrapper[4672]: I0930 12:37:46.119644 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/27236a15-64c2-46be-9186-02c66426c933-dns-svc\") pod \"dnsmasq-dns-56cdb6c897-cjdnr\" (UID: \"27236a15-64c2-46be-9186-02c66426c933\") " pod="openstack/dnsmasq-dns-56cdb6c897-cjdnr" Sep 30 12:37:46 crc kubenswrapper[4672]: I0930 12:37:46.124231 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Sep 30 12:37:46 crc kubenswrapper[4672]: I0930 12:37:46.126453 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=13.61701931 podStartE2EDuration="25.126430903s" podCreationTimestamp="2025-09-30 12:37:21 +0000 UTC" firstStartedPulling="2025-09-30 12:37:33.772120424 +0000 UTC m=+945.041358060" lastFinishedPulling="2025-09-30 12:37:45.281532007 +0000 UTC m=+956.550769653" observedRunningTime="2025-09-30 12:37:46.122256988 +0000 UTC m=+957.391494634" watchObservedRunningTime="2025-09-30 12:37:46.126430903 +0000 UTC m=+957.395668549" Sep 30 12:37:46 crc kubenswrapper[4672]: I0930 12:37:46.135952 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-gln7l" Sep 30 12:37:46 crc kubenswrapper[4672]: I0930 12:37:46.149951 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m92ql\" (UniqueName: \"kubernetes.io/projected/27236a15-64c2-46be-9186-02c66426c933-kube-api-access-m92ql\") pod \"dnsmasq-dns-56cdb6c897-cjdnr\" (UID: \"27236a15-64c2-46be-9186-02c66426c933\") " pod="openstack/dnsmasq-dns-56cdb6c897-cjdnr" Sep 30 12:37:46 crc kubenswrapper[4672]: I0930 12:37:46.164018 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-gb7pq" podStartSLOduration=18.129537006 podStartE2EDuration="24.163955731s" podCreationTimestamp="2025-09-30 12:37:22 +0000 UTC" firstStartedPulling="2025-09-30 12:37:33.889380525 +0000 UTC m=+945.158618171" lastFinishedPulling="2025-09-30 12:37:39.92379924 +0000 UTC m=+951.193036896" observedRunningTime="2025-09-30 12:37:46.158690898 +0000 UTC m=+957.427928554" watchObservedRunningTime="2025-09-30 12:37:46.163955731 +0000 UTC m=+957.433193377" Sep 30 12:37:46 crc kubenswrapper[4672]: I0930 12:37:46.215944 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-56cdb6c897-cjdnr"] Sep 30 12:37:46 crc kubenswrapper[4672]: I0930 12:37:46.216696 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-56cdb6c897-cjdnr" Sep 30 12:37:46 crc kubenswrapper[4672]: I0930 12:37:46.266681 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7555fbdd8f-68hzf"] Sep 30 12:37:46 crc kubenswrapper[4672]: I0930 12:37:46.269098 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7555fbdd8f-68hzf" Sep 30 12:37:46 crc kubenswrapper[4672]: I0930 12:37:46.288000 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7555fbdd8f-68hzf"] Sep 30 12:37:46 crc kubenswrapper[4672]: I0930 12:37:46.288250 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Sep 30 12:37:46 crc kubenswrapper[4672]: I0930 12:37:46.421458 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lkh77\" (UniqueName: \"kubernetes.io/projected/cf8b16b6-06e4-43b1-b72e-bb882ff5da17-kube-api-access-lkh77\") pod \"dnsmasq-dns-7555fbdd8f-68hzf\" (UID: \"cf8b16b6-06e4-43b1-b72e-bb882ff5da17\") " pod="openstack/dnsmasq-dns-7555fbdd8f-68hzf" Sep 30 12:37:46 crc kubenswrapper[4672]: I0930 12:37:46.422130 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cf8b16b6-06e4-43b1-b72e-bb882ff5da17-dns-svc\") pod \"dnsmasq-dns-7555fbdd8f-68hzf\" (UID: \"cf8b16b6-06e4-43b1-b72e-bb882ff5da17\") " pod="openstack/dnsmasq-dns-7555fbdd8f-68hzf" Sep 30 12:37:46 crc kubenswrapper[4672]: I0930 12:37:46.422176 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cf8b16b6-06e4-43b1-b72e-bb882ff5da17-ovsdbserver-nb\") pod \"dnsmasq-dns-7555fbdd8f-68hzf\" (UID: \"cf8b16b6-06e4-43b1-b72e-bb882ff5da17\") " pod="openstack/dnsmasq-dns-7555fbdd8f-68hzf" Sep 30 12:37:46 crc kubenswrapper[4672]: I0930 12:37:46.422208 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf8b16b6-06e4-43b1-b72e-bb882ff5da17-config\") pod \"dnsmasq-dns-7555fbdd8f-68hzf\" (UID: \"cf8b16b6-06e4-43b1-b72e-bb882ff5da17\") " pod="openstack/dnsmasq-dns-7555fbdd8f-68hzf" Sep 30 12:37:46 crc kubenswrapper[4672]: I0930 12:37:46.422233 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cf8b16b6-06e4-43b1-b72e-bb882ff5da17-ovsdbserver-sb\") pod \"dnsmasq-dns-7555fbdd8f-68hzf\" (UID: \"cf8b16b6-06e4-43b1-b72e-bb882ff5da17\") " pod="openstack/dnsmasq-dns-7555fbdd8f-68hzf" Sep 30 12:37:46 crc kubenswrapper[4672]: I0930 12:37:46.525811 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lkh77\" (UniqueName: \"kubernetes.io/projected/cf8b16b6-06e4-43b1-b72e-bb882ff5da17-kube-api-access-lkh77\") pod \"dnsmasq-dns-7555fbdd8f-68hzf\" (UID: \"cf8b16b6-06e4-43b1-b72e-bb882ff5da17\") " pod="openstack/dnsmasq-dns-7555fbdd8f-68hzf" Sep 30 12:37:46 crc kubenswrapper[4672]: I0930 12:37:46.525955 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cf8b16b6-06e4-43b1-b72e-bb882ff5da17-dns-svc\") pod \"dnsmasq-dns-7555fbdd8f-68hzf\" (UID: \"cf8b16b6-06e4-43b1-b72e-bb882ff5da17\") " pod="openstack/dnsmasq-dns-7555fbdd8f-68hzf" Sep 30 12:37:46 crc kubenswrapper[4672]: I0930 12:37:46.526002 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cf8b16b6-06e4-43b1-b72e-bb882ff5da17-ovsdbserver-nb\") pod \"dnsmasq-dns-7555fbdd8f-68hzf\" (UID: \"cf8b16b6-06e4-43b1-b72e-bb882ff5da17\") " 
pod="openstack/dnsmasq-dns-7555fbdd8f-68hzf" Sep 30 12:37:46 crc kubenswrapper[4672]: I0930 12:37:46.526057 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf8b16b6-06e4-43b1-b72e-bb882ff5da17-config\") pod \"dnsmasq-dns-7555fbdd8f-68hzf\" (UID: \"cf8b16b6-06e4-43b1-b72e-bb882ff5da17\") " pod="openstack/dnsmasq-dns-7555fbdd8f-68hzf" Sep 30 12:37:46 crc kubenswrapper[4672]: I0930 12:37:46.526081 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cf8b16b6-06e4-43b1-b72e-bb882ff5da17-ovsdbserver-sb\") pod \"dnsmasq-dns-7555fbdd8f-68hzf\" (UID: \"cf8b16b6-06e4-43b1-b72e-bb882ff5da17\") " pod="openstack/dnsmasq-dns-7555fbdd8f-68hzf" Sep 30 12:37:46 crc kubenswrapper[4672]: I0930 12:37:46.527969 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cf8b16b6-06e4-43b1-b72e-bb882ff5da17-dns-svc\") pod \"dnsmasq-dns-7555fbdd8f-68hzf\" (UID: \"cf8b16b6-06e4-43b1-b72e-bb882ff5da17\") " pod="openstack/dnsmasq-dns-7555fbdd8f-68hzf" Sep 30 12:37:46 crc kubenswrapper[4672]: I0930 12:37:46.528855 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cf8b16b6-06e4-43b1-b72e-bb882ff5da17-ovsdbserver-nb\") pod \"dnsmasq-dns-7555fbdd8f-68hzf\" (UID: \"cf8b16b6-06e4-43b1-b72e-bb882ff5da17\") " pod="openstack/dnsmasq-dns-7555fbdd8f-68hzf" Sep 30 12:37:46 crc kubenswrapper[4672]: I0930 12:37:46.529408 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf8b16b6-06e4-43b1-b72e-bb882ff5da17-config\") pod \"dnsmasq-dns-7555fbdd8f-68hzf\" (UID: \"cf8b16b6-06e4-43b1-b72e-bb882ff5da17\") " pod="openstack/dnsmasq-dns-7555fbdd8f-68hzf" Sep 30 12:37:46 crc kubenswrapper[4672]: I0930 12:37:46.529784 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cf8b16b6-06e4-43b1-b72e-bb882ff5da17-ovsdbserver-sb\") pod \"dnsmasq-dns-7555fbdd8f-68hzf\" (UID: \"cf8b16b6-06e4-43b1-b72e-bb882ff5da17\") " pod="openstack/dnsmasq-dns-7555fbdd8f-68hzf" Sep 30 12:37:46 crc kubenswrapper[4672]: I0930 12:37:46.551307 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lkh77\" (UniqueName: \"kubernetes.io/projected/cf8b16b6-06e4-43b1-b72e-bb882ff5da17-kube-api-access-lkh77\") pod \"dnsmasq-dns-7555fbdd8f-68hzf\" (UID: \"cf8b16b6-06e4-43b1-b72e-bb882ff5da17\") " pod="openstack/dnsmasq-dns-7555fbdd8f-68hzf" Sep 30 12:37:46 crc kubenswrapper[4672]: I0930 12:37:46.670546 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7555fbdd8f-68hzf" Sep 30 12:37:46 crc kubenswrapper[4672]: I0930 12:37:46.676443 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Sep 30 12:37:46 crc kubenswrapper[4672]: I0930 12:37:46.739387 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Sep 30 12:37:46 crc kubenswrapper[4672]: I0930 12:37:46.826893 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-gln7l"] Sep 30 12:37:46 crc kubenswrapper[4672]: I0930 12:37:46.955158 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-56cdb6c897-cjdnr"] Sep 30 12:37:46 crc kubenswrapper[4672]: W0930 12:37:46.958354 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod27236a15_64c2_46be_9186_02c66426c933.slice/crio-d3d41cfe5e9fff589a8a9ad00e5b0e7e3b40c6680eb2b8d10296c8ce63128e16 WatchSource:0}: Error finding container d3d41cfe5e9fff589a8a9ad00e5b0e7e3b40c6680eb2b8d10296c8ce63128e16: Status 404 returned error can't find the container with id d3d41cfe5e9fff589a8a9ad00e5b0e7e3b40c6680eb2b8d10296c8ce63128e16 Sep 30 12:37:47 crc kubenswrapper[4672]: I0930 12:37:47.056293 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Sep 30 12:37:47 crc kubenswrapper[4672]: I0930 12:37:47.112607 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56cdb6c897-cjdnr" event={"ID":"27236a15-64c2-46be-9186-02c66426c933","Type":"ContainerStarted","Data":"d3d41cfe5e9fff589a8a9ad00e5b0e7e3b40c6680eb2b8d10296c8ce63128e16"} Sep 30 12:37:47 crc kubenswrapper[4672]: I0930 12:37:47.114199 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-gln7l" event={"ID":"f11c1379-e576-40be-a37a-1d73f84cab81","Type":"ContainerStarted","Data":"e0cc404e6495c1a5d3bbc142a395063406d61810b2881e9579d3a55dafceb0d9"} Sep 30 12:37:47 crc kubenswrapper[4672]: I0930 12:37:47.115088 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Sep 30 12:37:47 crc kubenswrapper[4672]: I0930 12:37:47.124923 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Sep 30 12:37:47 crc kubenswrapper[4672]: I0930 12:37:47.181470 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-gln7l" podStartSLOduration=2.181441526 podStartE2EDuration="2.181441526s" podCreationTimestamp="2025-09-30 12:37:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 12:37:47.147090198 +0000 UTC m=+958.416327844" watchObservedRunningTime="2025-09-30 12:37:47.181441526 +0000 UTC m=+958.450679202" Sep 30 12:37:47 crc kubenswrapper[4672]: I0930 12:37:47.184289 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7555fbdd8f-68hzf"] Sep 30 12:37:47 crc kubenswrapper[4672]: I0930 12:37:47.206932 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Sep 30 12:37:47 crc kubenswrapper[4672]: I0930 12:37:47.234359 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Sep 30 12:37:48 crc kubenswrapper[4672]: I0930 12:37:48.122164 4672 
generic.go:334] "Generic (PLEG): container finished" podID="9159d76a-52b7-4262-a56a-ed28caec7f97" containerID="83fd8133010ce3de2e6f2de7a381915acdefcd6cfa1fbf38dea11878e5e7d68b" exitCode=0 Sep 30 12:37:48 crc kubenswrapper[4672]: I0930 12:37:48.122236 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"9159d76a-52b7-4262-a56a-ed28caec7f97","Type":"ContainerDied","Data":"83fd8133010ce3de2e6f2de7a381915acdefcd6cfa1fbf38dea11878e5e7d68b"} Sep 30 12:37:48 crc kubenswrapper[4672]: I0930 12:37:48.125075 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-gln7l" event={"ID":"f11c1379-e576-40be-a37a-1d73f84cab81","Type":"ContainerStarted","Data":"332b3f2db936ba9bed778a4c2f13fff6aff9ceca2a9a676e0277fd3237138b59"} Sep 30 12:37:48 crc kubenswrapper[4672]: I0930 12:37:48.127371 4672 generic.go:334] "Generic (PLEG): container finished" podID="cf8b16b6-06e4-43b1-b72e-bb882ff5da17" containerID="f307297ef0aeebf02dc921e446034290aaa71c44ed690f5ebe14720b6adeb5a1" exitCode=0 Sep 30 12:37:48 crc kubenswrapper[4672]: I0930 12:37:48.127406 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7555fbdd8f-68hzf" event={"ID":"cf8b16b6-06e4-43b1-b72e-bb882ff5da17","Type":"ContainerDied","Data":"f307297ef0aeebf02dc921e446034290aaa71c44ed690f5ebe14720b6adeb5a1"} Sep 30 12:37:48 crc kubenswrapper[4672]: I0930 12:37:48.127603 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7555fbdd8f-68hzf" event={"ID":"cf8b16b6-06e4-43b1-b72e-bb882ff5da17","Type":"ContainerStarted","Data":"2d1edd5a96ecbd49dd5b374b57e59e65979dd0d7733dea1ebe36a32e6cdd5a16"} Sep 30 12:37:48 crc kubenswrapper[4672]: I0930 12:37:48.129545 4672 generic.go:334] "Generic (PLEG): container finished" podID="27236a15-64c2-46be-9186-02c66426c933" containerID="ae6223bd71037970f1f60d2a9351aacaa6656e217daebd469b8c18ade78600b8" exitCode=0 Sep 30 12:37:48 crc kubenswrapper[4672]: I0930 12:37:48.129636 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56cdb6c897-cjdnr" event={"ID":"27236a15-64c2-46be-9186-02c66426c933","Type":"ContainerDied","Data":"ae6223bd71037970f1f60d2a9351aacaa6656e217daebd469b8c18ade78600b8"} Sep 30 12:37:48 crc kubenswrapper[4672]: I0930 12:37:48.193139 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Sep 30 12:37:48 crc kubenswrapper[4672]: I0930 12:37:48.347722 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Sep 30 12:37:48 crc kubenswrapper[4672]: I0930 12:37:48.349114 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Sep 30 12:37:48 crc kubenswrapper[4672]: I0930 12:37:48.352465 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Sep 30 12:37:48 crc kubenswrapper[4672]: I0930 12:37:48.352711 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Sep 30 12:37:48 crc kubenswrapper[4672]: I0930 12:37:48.358744 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-8jdjw" Sep 30 12:37:48 crc kubenswrapper[4672]: I0930 12:37:48.358872 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Sep 30 12:37:48 crc kubenswrapper[4672]: I0930 12:37:48.378653 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Sep 30 12:37:48 crc kubenswrapper[4672]: I0930 12:37:48.467394 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/879d3e20-6df1-45be-bebd-b7e990e0aa5f-config\") pod \"ovn-northd-0\" (UID: \"879d3e20-6df1-45be-bebd-b7e990e0aa5f\") " pod="openstack/ovn-northd-0" Sep 30 12:37:48 crc kubenswrapper[4672]: I0930 12:37:48.467473 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/879d3e20-6df1-45be-bebd-b7e990e0aa5f-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"879d3e20-6df1-45be-bebd-b7e990e0aa5f\") " pod="openstack/ovn-northd-0" Sep 30 12:37:48 crc kubenswrapper[4672]: I0930 12:37:48.467511 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/879d3e20-6df1-45be-bebd-b7e990e0aa5f-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"879d3e20-6df1-45be-bebd-b7e990e0aa5f\") " pod="openstack/ovn-northd-0" Sep 30 12:37:48 crc kubenswrapper[4672]: I0930 12:37:48.467585 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/879d3e20-6df1-45be-bebd-b7e990e0aa5f-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"879d3e20-6df1-45be-bebd-b7e990e0aa5f\") " pod="openstack/ovn-northd-0" Sep 30 12:37:48 crc kubenswrapper[4672]: I0930 12:37:48.467681 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fpppv\" (UniqueName: \"kubernetes.io/projected/879d3e20-6df1-45be-bebd-b7e990e0aa5f-kube-api-access-fpppv\") pod \"ovn-northd-0\" (UID: \"879d3e20-6df1-45be-bebd-b7e990e0aa5f\") " pod="openstack/ovn-northd-0" Sep 30 12:37:48 crc kubenswrapper[4672]: I0930 12:37:48.468681 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/879d3e20-6df1-45be-bebd-b7e990e0aa5f-scripts\") pod \"ovn-northd-0\" (UID: \"879d3e20-6df1-45be-bebd-b7e990e0aa5f\") " pod="openstack/ovn-northd-0" Sep 30 12:37:48 crc kubenswrapper[4672]: I0930 12:37:48.469243 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/879d3e20-6df1-45be-bebd-b7e990e0aa5f-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"879d3e20-6df1-45be-bebd-b7e990e0aa5f\") " pod="openstack/ovn-northd-0" Sep 30 12:37:48 crc kubenswrapper[4672]: 
I0930 12:37:48.522310 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-56cdb6c897-cjdnr" Sep 30 12:37:48 crc kubenswrapper[4672]: I0930 12:37:48.576103 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/27236a15-64c2-46be-9186-02c66426c933-config\") pod \"27236a15-64c2-46be-9186-02c66426c933\" (UID: \"27236a15-64c2-46be-9186-02c66426c933\") " Sep 30 12:37:48 crc kubenswrapper[4672]: I0930 12:37:48.576187 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m92ql\" (UniqueName: \"kubernetes.io/projected/27236a15-64c2-46be-9186-02c66426c933-kube-api-access-m92ql\") pod \"27236a15-64c2-46be-9186-02c66426c933\" (UID: \"27236a15-64c2-46be-9186-02c66426c933\") " Sep 30 12:37:48 crc kubenswrapper[4672]: I0930 12:37:48.576230 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/27236a15-64c2-46be-9186-02c66426c933-ovsdbserver-nb\") pod \"27236a15-64c2-46be-9186-02c66426c933\" (UID: \"27236a15-64c2-46be-9186-02c66426c933\") " Sep 30 12:37:48 crc kubenswrapper[4672]: I0930 12:37:48.576331 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/27236a15-64c2-46be-9186-02c66426c933-dns-svc\") pod \"27236a15-64c2-46be-9186-02c66426c933\" (UID: \"27236a15-64c2-46be-9186-02c66426c933\") " Sep 30 12:37:48 crc kubenswrapper[4672]: I0930 12:37:48.576638 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/879d3e20-6df1-45be-bebd-b7e990e0aa5f-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"879d3e20-6df1-45be-bebd-b7e990e0aa5f\") " pod="openstack/ovn-northd-0" Sep 30 12:37:48 crc kubenswrapper[4672]: I0930 12:37:48.576675 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/879d3e20-6df1-45be-bebd-b7e990e0aa5f-config\") pod \"ovn-northd-0\" (UID: \"879d3e20-6df1-45be-bebd-b7e990e0aa5f\") " pod="openstack/ovn-northd-0" Sep 30 12:37:48 crc kubenswrapper[4672]: I0930 12:37:48.576699 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/879d3e20-6df1-45be-bebd-b7e990e0aa5f-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"879d3e20-6df1-45be-bebd-b7e990e0aa5f\") " pod="openstack/ovn-northd-0" Sep 30 12:37:48 crc kubenswrapper[4672]: I0930 12:37:48.576728 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/879d3e20-6df1-45be-bebd-b7e990e0aa5f-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"879d3e20-6df1-45be-bebd-b7e990e0aa5f\") " pod="openstack/ovn-northd-0" Sep 30 12:37:48 crc kubenswrapper[4672]: I0930 12:37:48.576768 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/879d3e20-6df1-45be-bebd-b7e990e0aa5f-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"879d3e20-6df1-45be-bebd-b7e990e0aa5f\") " pod="openstack/ovn-northd-0" Sep 30 12:37:48 crc kubenswrapper[4672]: I0930 12:37:48.576822 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fpppv\" (UniqueName: 
\"kubernetes.io/projected/879d3e20-6df1-45be-bebd-b7e990e0aa5f-kube-api-access-fpppv\") pod \"ovn-northd-0\" (UID: \"879d3e20-6df1-45be-bebd-b7e990e0aa5f\") " pod="openstack/ovn-northd-0" Sep 30 12:37:48 crc kubenswrapper[4672]: I0930 12:37:48.576857 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/879d3e20-6df1-45be-bebd-b7e990e0aa5f-scripts\") pod \"ovn-northd-0\" (UID: \"879d3e20-6df1-45be-bebd-b7e990e0aa5f\") " pod="openstack/ovn-northd-0" Sep 30 12:37:48 crc kubenswrapper[4672]: I0930 12:37:48.579590 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/879d3e20-6df1-45be-bebd-b7e990e0aa5f-scripts\") pod \"ovn-northd-0\" (UID: \"879d3e20-6df1-45be-bebd-b7e990e0aa5f\") " pod="openstack/ovn-northd-0" Sep 30 12:37:48 crc kubenswrapper[4672]: I0930 12:37:48.582450 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/879d3e20-6df1-45be-bebd-b7e990e0aa5f-config\") pod \"ovn-northd-0\" (UID: \"879d3e20-6df1-45be-bebd-b7e990e0aa5f\") " pod="openstack/ovn-northd-0" Sep 30 12:37:48 crc kubenswrapper[4672]: I0930 12:37:48.585873 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/879d3e20-6df1-45be-bebd-b7e990e0aa5f-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"879d3e20-6df1-45be-bebd-b7e990e0aa5f\") " pod="openstack/ovn-northd-0" Sep 30 12:37:48 crc kubenswrapper[4672]: I0930 12:37:48.591741 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/27236a15-64c2-46be-9186-02c66426c933-kube-api-access-m92ql" (OuterVolumeSpecName: "kube-api-access-m92ql") pod "27236a15-64c2-46be-9186-02c66426c933" (UID: "27236a15-64c2-46be-9186-02c66426c933"). InnerVolumeSpecName "kube-api-access-m92ql". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 12:37:48 crc kubenswrapper[4672]: I0930 12:37:48.597100 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/879d3e20-6df1-45be-bebd-b7e990e0aa5f-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"879d3e20-6df1-45be-bebd-b7e990e0aa5f\") " pod="openstack/ovn-northd-0" Sep 30 12:37:48 crc kubenswrapper[4672]: I0930 12:37:48.597405 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/879d3e20-6df1-45be-bebd-b7e990e0aa5f-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"879d3e20-6df1-45be-bebd-b7e990e0aa5f\") " pod="openstack/ovn-northd-0" Sep 30 12:37:48 crc kubenswrapper[4672]: I0930 12:37:48.598469 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/879d3e20-6df1-45be-bebd-b7e990e0aa5f-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"879d3e20-6df1-45be-bebd-b7e990e0aa5f\") " pod="openstack/ovn-northd-0" Sep 30 12:37:48 crc kubenswrapper[4672]: I0930 12:37:48.607290 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fpppv\" (UniqueName: \"kubernetes.io/projected/879d3e20-6df1-45be-bebd-b7e990e0aa5f-kube-api-access-fpppv\") pod \"ovn-northd-0\" (UID: \"879d3e20-6df1-45be-bebd-b7e990e0aa5f\") " pod="openstack/ovn-northd-0" Sep 30 12:37:48 crc kubenswrapper[4672]: I0930 12:37:48.619935 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/27236a15-64c2-46be-9186-02c66426c933-config" (OuterVolumeSpecName: "config") pod "27236a15-64c2-46be-9186-02c66426c933" (UID: "27236a15-64c2-46be-9186-02c66426c933"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 12:37:48 crc kubenswrapper[4672]: I0930 12:37:48.643512 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/27236a15-64c2-46be-9186-02c66426c933-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "27236a15-64c2-46be-9186-02c66426c933" (UID: "27236a15-64c2-46be-9186-02c66426c933"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 12:37:48 crc kubenswrapper[4672]: I0930 12:37:48.643965 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/27236a15-64c2-46be-9186-02c66426c933-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "27236a15-64c2-46be-9186-02c66426c933" (UID: "27236a15-64c2-46be-9186-02c66426c933"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 12:37:48 crc kubenswrapper[4672]: I0930 12:37:48.659776 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7555fbdd8f-68hzf"] Sep 30 12:37:48 crc kubenswrapper[4672]: I0930 12:37:48.663763 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Sep 30 12:37:48 crc kubenswrapper[4672]: I0930 12:37:48.678781 4672 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/27236a15-64c2-46be-9186-02c66426c933-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 30 12:37:48 crc kubenswrapper[4672]: I0930 12:37:48.678819 4672 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/27236a15-64c2-46be-9186-02c66426c933-config\") on node \"crc\" DevicePath \"\"" Sep 30 12:37:48 crc kubenswrapper[4672]: I0930 12:37:48.678829 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m92ql\" (UniqueName: \"kubernetes.io/projected/27236a15-64c2-46be-9186-02c66426c933-kube-api-access-m92ql\") on node \"crc\" DevicePath \"\"" Sep 30 12:37:48 crc kubenswrapper[4672]: I0930 12:37:48.678843 4672 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/27236a15-64c2-46be-9186-02c66426c933-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Sep 30 12:37:48 crc kubenswrapper[4672]: I0930 12:37:48.697767 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-fdf9c6749-ssvf6"] Sep 30 12:37:48 crc kubenswrapper[4672]: E0930 12:37:48.698216 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27236a15-64c2-46be-9186-02c66426c933" containerName="init" Sep 30 12:37:48 crc kubenswrapper[4672]: I0930 12:37:48.698244 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="27236a15-64c2-46be-9186-02c66426c933" containerName="init" Sep 30 12:37:48 crc kubenswrapper[4672]: I0930 12:37:48.698473 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="27236a15-64c2-46be-9186-02c66426c933" containerName="init" Sep 30 12:37:48 crc kubenswrapper[4672]: I0930 12:37:48.699623 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-fdf9c6749-ssvf6" Sep 30 12:37:48 crc kubenswrapper[4672]: I0930 12:37:48.705220 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Sep 30 12:37:48 crc kubenswrapper[4672]: I0930 12:37:48.710197 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-fdf9c6749-ssvf6"] Sep 30 12:37:48 crc kubenswrapper[4672]: I0930 12:37:48.780833 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9e623096-32ac-4373-b477-1a8bfcc4a137-config\") pod \"dnsmasq-dns-fdf9c6749-ssvf6\" (UID: \"9e623096-32ac-4373-b477-1a8bfcc4a137\") " pod="openstack/dnsmasq-dns-fdf9c6749-ssvf6" Sep 30 12:37:48 crc kubenswrapper[4672]: I0930 12:37:48.780875 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vz5pj\" (UniqueName: \"kubernetes.io/projected/9e623096-32ac-4373-b477-1a8bfcc4a137-kube-api-access-vz5pj\") pod \"dnsmasq-dns-fdf9c6749-ssvf6\" (UID: \"9e623096-32ac-4373-b477-1a8bfcc4a137\") " pod="openstack/dnsmasq-dns-fdf9c6749-ssvf6" Sep 30 12:37:48 crc kubenswrapper[4672]: I0930 12:37:48.780904 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9e623096-32ac-4373-b477-1a8bfcc4a137-ovsdbserver-nb\") pod \"dnsmasq-dns-fdf9c6749-ssvf6\" (UID: \"9e623096-32ac-4373-b477-1a8bfcc4a137\") " pod="openstack/dnsmasq-dns-fdf9c6749-ssvf6" Sep 30 12:37:48 crc kubenswrapper[4672]: I0930 12:37:48.780951 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9e623096-32ac-4373-b477-1a8bfcc4a137-ovsdbserver-sb\") pod \"dnsmasq-dns-fdf9c6749-ssvf6\" (UID: \"9e623096-32ac-4373-b477-1a8bfcc4a137\") " pod="openstack/dnsmasq-dns-fdf9c6749-ssvf6" Sep 30 12:37:48 crc kubenswrapper[4672]: I0930 12:37:48.781171 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9e623096-32ac-4373-b477-1a8bfcc4a137-dns-svc\") pod \"dnsmasq-dns-fdf9c6749-ssvf6\" (UID: \"9e623096-32ac-4373-b477-1a8bfcc4a137\") " pod="openstack/dnsmasq-dns-fdf9c6749-ssvf6" Sep 30 12:37:48 crc kubenswrapper[4672]: I0930 12:37:48.882228 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9e623096-32ac-4373-b477-1a8bfcc4a137-config\") pod \"dnsmasq-dns-fdf9c6749-ssvf6\" (UID: \"9e623096-32ac-4373-b477-1a8bfcc4a137\") " pod="openstack/dnsmasq-dns-fdf9c6749-ssvf6" Sep 30 12:37:48 crc kubenswrapper[4672]: I0930 12:37:48.882280 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vz5pj\" (UniqueName: \"kubernetes.io/projected/9e623096-32ac-4373-b477-1a8bfcc4a137-kube-api-access-vz5pj\") pod \"dnsmasq-dns-fdf9c6749-ssvf6\" (UID: \"9e623096-32ac-4373-b477-1a8bfcc4a137\") " pod="openstack/dnsmasq-dns-fdf9c6749-ssvf6" Sep 30 12:37:48 crc kubenswrapper[4672]: I0930 12:37:48.882306 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9e623096-32ac-4373-b477-1a8bfcc4a137-ovsdbserver-nb\") pod \"dnsmasq-dns-fdf9c6749-ssvf6\" (UID: \"9e623096-32ac-4373-b477-1a8bfcc4a137\") " pod="openstack/dnsmasq-dns-fdf9c6749-ssvf6" Sep 30 12:37:48 crc kubenswrapper[4672]: I0930 12:37:48.882347 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" 
(UniqueName: \"kubernetes.io/configmap/9e623096-32ac-4373-b477-1a8bfcc4a137-ovsdbserver-sb\") pod \"dnsmasq-dns-fdf9c6749-ssvf6\" (UID: \"9e623096-32ac-4373-b477-1a8bfcc4a137\") " pod="openstack/dnsmasq-dns-fdf9c6749-ssvf6" Sep 30 12:37:48 crc kubenswrapper[4672]: I0930 12:37:48.882409 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9e623096-32ac-4373-b477-1a8bfcc4a137-dns-svc\") pod \"dnsmasq-dns-fdf9c6749-ssvf6\" (UID: \"9e623096-32ac-4373-b477-1a8bfcc4a137\") " pod="openstack/dnsmasq-dns-fdf9c6749-ssvf6" Sep 30 12:37:48 crc kubenswrapper[4672]: I0930 12:37:48.883371 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9e623096-32ac-4373-b477-1a8bfcc4a137-config\") pod \"dnsmasq-dns-fdf9c6749-ssvf6\" (UID: \"9e623096-32ac-4373-b477-1a8bfcc4a137\") " pod="openstack/dnsmasq-dns-fdf9c6749-ssvf6" Sep 30 12:37:48 crc kubenswrapper[4672]: I0930 12:37:48.883377 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9e623096-32ac-4373-b477-1a8bfcc4a137-dns-svc\") pod \"dnsmasq-dns-fdf9c6749-ssvf6\" (UID: \"9e623096-32ac-4373-b477-1a8bfcc4a137\") " pod="openstack/dnsmasq-dns-fdf9c6749-ssvf6" Sep 30 12:37:48 crc kubenswrapper[4672]: I0930 12:37:48.883909 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9e623096-32ac-4373-b477-1a8bfcc4a137-ovsdbserver-nb\") pod \"dnsmasq-dns-fdf9c6749-ssvf6\" (UID: \"9e623096-32ac-4373-b477-1a8bfcc4a137\") " pod="openstack/dnsmasq-dns-fdf9c6749-ssvf6" Sep 30 12:37:48 crc kubenswrapper[4672]: I0930 12:37:48.884460 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9e623096-32ac-4373-b477-1a8bfcc4a137-ovsdbserver-sb\") pod \"dnsmasq-dns-fdf9c6749-ssvf6\" (UID: \"9e623096-32ac-4373-b477-1a8bfcc4a137\") " pod="openstack/dnsmasq-dns-fdf9c6749-ssvf6" Sep 30 12:37:48 crc kubenswrapper[4672]: I0930 12:37:48.900741 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vz5pj\" (UniqueName: \"kubernetes.io/projected/9e623096-32ac-4373-b477-1a8bfcc4a137-kube-api-access-vz5pj\") pod \"dnsmasq-dns-fdf9c6749-ssvf6\" (UID: \"9e623096-32ac-4373-b477-1a8bfcc4a137\") " pod="openstack/dnsmasq-dns-fdf9c6749-ssvf6" Sep 30 12:37:49 crc kubenswrapper[4672]: I0930 12:37:49.036348 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-fdf9c6749-ssvf6" Sep 30 12:37:49 crc kubenswrapper[4672]: I0930 12:37:49.141251 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7555fbdd8f-68hzf" event={"ID":"cf8b16b6-06e4-43b1-b72e-bb882ff5da17","Type":"ContainerStarted","Data":"2c32dcf9cf85dafe3c4bcb137e332bb5de6f64a61c85edb0514cd0881b94cc52"} Sep 30 12:37:49 crc kubenswrapper[4672]: I0930 12:37:49.141875 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7555fbdd8f-68hzf" Sep 30 12:37:49 crc kubenswrapper[4672]: I0930 12:37:49.149336 4672 generic.go:334] "Generic (PLEG): container finished" podID="056e0424-1faf-4d5a-8aea-e351214b3394" containerID="ed7fa277c2c178542bfb2184ed2b8d694c0abd0f10f53c8b02ca5676a97b124a" exitCode=0 Sep 30 12:37:49 crc kubenswrapper[4672]: I0930 12:37:49.149411 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"056e0424-1faf-4d5a-8aea-e351214b3394","Type":"ContainerDied","Data":"ed7fa277c2c178542bfb2184ed2b8d694c0abd0f10f53c8b02ca5676a97b124a"} Sep 30 12:37:49 crc kubenswrapper[4672]: I0930 12:37:49.155083 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-56cdb6c897-cjdnr" Sep 30 12:37:49 crc kubenswrapper[4672]: I0930 12:37:49.155306 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56cdb6c897-cjdnr" event={"ID":"27236a15-64c2-46be-9186-02c66426c933","Type":"ContainerDied","Data":"d3d41cfe5e9fff589a8a9ad00e5b0e7e3b40c6680eb2b8d10296c8ce63128e16"} Sep 30 12:37:49 crc kubenswrapper[4672]: I0930 12:37:49.155370 4672 scope.go:117] "RemoveContainer" containerID="ae6223bd71037970f1f60d2a9351aacaa6656e217daebd469b8c18ade78600b8" Sep 30 12:37:49 crc kubenswrapper[4672]: I0930 12:37:49.175153 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"9159d76a-52b7-4262-a56a-ed28caec7f97","Type":"ContainerStarted","Data":"c1f1aa8d34c5eace0d957c227d0c9ca56d0e165694a251f0a884786184b1931a"} Sep 30 12:37:49 crc kubenswrapper[4672]: I0930 12:37:49.193176 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7555fbdd8f-68hzf" podStartSLOduration=3.193149976 podStartE2EDuration="3.193149976s" podCreationTimestamp="2025-09-30 12:37:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 12:37:49.165323903 +0000 UTC m=+960.434561569" watchObservedRunningTime="2025-09-30 12:37:49.193149976 +0000 UTC m=+960.462387622" Sep 30 12:37:49 crc kubenswrapper[4672]: I0930 12:37:49.261108 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=27.013331466 podStartE2EDuration="34.261090642s" podCreationTimestamp="2025-09-30 12:37:15 +0000 UTC" firstStartedPulling="2025-09-30 12:37:32.941516039 +0000 UTC m=+944.210753685" lastFinishedPulling="2025-09-30 12:37:40.189275215 +0000 UTC m=+951.458512861" observedRunningTime="2025-09-30 12:37:49.228377066 +0000 UTC m=+960.497614722" watchObservedRunningTime="2025-09-30 12:37:49.261090642 +0000 UTC m=+960.530328288" Sep 30 12:37:49 crc kubenswrapper[4672]: I0930 12:37:49.295048 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Sep 30 12:37:49 crc kubenswrapper[4672]: W0930 12:37:49.300074 4672 manager.go:1169] Failed to process 
watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod879d3e20_6df1_45be_bebd_b7e990e0aa5f.slice/crio-caed46f3492af0ea073b1ebfa689d90a6974f9be12e47dc25331f7f42fa198b2 WatchSource:0}: Error finding container caed46f3492af0ea073b1ebfa689d90a6974f9be12e47dc25331f7f42fa198b2: Status 404 returned error can't find the container with id caed46f3492af0ea073b1ebfa689d90a6974f9be12e47dc25331f7f42fa198b2 Sep 30 12:37:49 crc kubenswrapper[4672]: I0930 12:37:49.461109 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-56cdb6c897-cjdnr"] Sep 30 12:37:49 crc kubenswrapper[4672]: I0930 12:37:49.478440 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-56cdb6c897-cjdnr"] Sep 30 12:37:49 crc kubenswrapper[4672]: I0930 12:37:49.592445 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-fdf9c6749-ssvf6"] Sep 30 12:37:49 crc kubenswrapper[4672]: W0930 12:37:49.597400 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9e623096_32ac_4373_b477_1a8bfcc4a137.slice/crio-e5a7e1456e8ddf4a5f40a86906e1675e3497cae23fe4180d0bce9b911bd2ec43 WatchSource:0}: Error finding container e5a7e1456e8ddf4a5f40a86906e1675e3497cae23fe4180d0bce9b911bd2ec43: Status 404 returned error can't find the container with id e5a7e1456e8ddf4a5f40a86906e1675e3497cae23fe4180d0bce9b911bd2ec43 Sep 30 12:37:49 crc kubenswrapper[4672]: I0930 12:37:49.816498 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Sep 30 12:37:49 crc kubenswrapper[4672]: I0930 12:37:49.824175 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Sep 30 12:37:49 crc kubenswrapper[4672]: I0930 12:37:49.831147 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Sep 30 12:37:49 crc kubenswrapper[4672]: I0930 12:37:49.831147 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Sep 30 12:37:49 crc kubenswrapper[4672]: I0930 12:37:49.831357 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Sep 30 12:37:49 crc kubenswrapper[4672]: I0930 12:37:49.833670 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-bqtsw" Sep 30 12:37:49 crc kubenswrapper[4672]: I0930 12:37:49.843021 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Sep 30 12:37:50 crc kubenswrapper[4672]: I0930 12:37:50.009065 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"swift-storage-0\" (UID: \"3cc34662-d100-4436-9067-c615b7b3f83f\") " pod="openstack/swift-storage-0" Sep 30 12:37:50 crc kubenswrapper[4672]: I0930 12:37:50.009381 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/3cc34662-d100-4436-9067-c615b7b3f83f-etc-swift\") pod \"swift-storage-0\" (UID: \"3cc34662-d100-4436-9067-c615b7b3f83f\") " pod="openstack/swift-storage-0" Sep 30 12:37:50 crc kubenswrapper[4672]: I0930 12:37:50.009779 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: 
\"kubernetes.io/empty-dir/3cc34662-d100-4436-9067-c615b7b3f83f-lock\") pod \"swift-storage-0\" (UID: \"3cc34662-d100-4436-9067-c615b7b3f83f\") " pod="openstack/swift-storage-0" Sep 30 12:37:50 crc kubenswrapper[4672]: I0930 12:37:50.009964 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/3cc34662-d100-4436-9067-c615b7b3f83f-cache\") pod \"swift-storage-0\" (UID: \"3cc34662-d100-4436-9067-c615b7b3f83f\") " pod="openstack/swift-storage-0" Sep 30 12:37:50 crc kubenswrapper[4672]: I0930 12:37:50.010089 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xkdln\" (UniqueName: \"kubernetes.io/projected/3cc34662-d100-4436-9067-c615b7b3f83f-kube-api-access-xkdln\") pod \"swift-storage-0\" (UID: \"3cc34662-d100-4436-9067-c615b7b3f83f\") " pod="openstack/swift-storage-0" Sep 30 12:37:50 crc kubenswrapper[4672]: I0930 12:37:50.112102 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"swift-storage-0\" (UID: \"3cc34662-d100-4436-9067-c615b7b3f83f\") " pod="openstack/swift-storage-0" Sep 30 12:37:50 crc kubenswrapper[4672]: I0930 12:37:50.112478 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/3cc34662-d100-4436-9067-c615b7b3f83f-etc-swift\") pod \"swift-storage-0\" (UID: \"3cc34662-d100-4436-9067-c615b7b3f83f\") " pod="openstack/swift-storage-0" Sep 30 12:37:50 crc kubenswrapper[4672]: I0930 12:37:50.112516 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/3cc34662-d100-4436-9067-c615b7b3f83f-lock\") pod \"swift-storage-0\" (UID: \"3cc34662-d100-4436-9067-c615b7b3f83f\") " pod="openstack/swift-storage-0" Sep 30 12:37:50 crc kubenswrapper[4672]: I0930 12:37:50.112540 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/3cc34662-d100-4436-9067-c615b7b3f83f-cache\") pod \"swift-storage-0\" (UID: \"3cc34662-d100-4436-9067-c615b7b3f83f\") " pod="openstack/swift-storage-0" Sep 30 12:37:50 crc kubenswrapper[4672]: I0930 12:37:50.112528 4672 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"swift-storage-0\" (UID: \"3cc34662-d100-4436-9067-c615b7b3f83f\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/swift-storage-0" Sep 30 12:37:50 crc kubenswrapper[4672]: I0930 12:37:50.112576 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xkdln\" (UniqueName: \"kubernetes.io/projected/3cc34662-d100-4436-9067-c615b7b3f83f-kube-api-access-xkdln\") pod \"swift-storage-0\" (UID: \"3cc34662-d100-4436-9067-c615b7b3f83f\") " pod="openstack/swift-storage-0" Sep 30 12:37:50 crc kubenswrapper[4672]: I0930 12:37:50.112985 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/3cc34662-d100-4436-9067-c615b7b3f83f-lock\") pod \"swift-storage-0\" (UID: \"3cc34662-d100-4436-9067-c615b7b3f83f\") " pod="openstack/swift-storage-0" Sep 30 12:37:50 crc kubenswrapper[4672]: I0930 12:37:50.113083 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" 
(UniqueName: \"kubernetes.io/empty-dir/3cc34662-d100-4436-9067-c615b7b3f83f-cache\") pod \"swift-storage-0\" (UID: \"3cc34662-d100-4436-9067-c615b7b3f83f\") " pod="openstack/swift-storage-0" Sep 30 12:37:50 crc kubenswrapper[4672]: E0930 12:37:50.113145 4672 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Sep 30 12:37:50 crc kubenswrapper[4672]: E0930 12:37:50.113161 4672 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Sep 30 12:37:50 crc kubenswrapper[4672]: E0930 12:37:50.113235 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3cc34662-d100-4436-9067-c615b7b3f83f-etc-swift podName:3cc34662-d100-4436-9067-c615b7b3f83f nodeName:}" failed. No retries permitted until 2025-09-30 12:37:50.613214211 +0000 UTC m=+961.882451927 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/3cc34662-d100-4436-9067-c615b7b3f83f-etc-swift") pod "swift-storage-0" (UID: "3cc34662-d100-4436-9067-c615b7b3f83f") : configmap "swift-ring-files" not found Sep 30 12:37:50 crc kubenswrapper[4672]: I0930 12:37:50.166098 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xkdln\" (UniqueName: \"kubernetes.io/projected/3cc34662-d100-4436-9067-c615b7b3f83f-kube-api-access-xkdln\") pod \"swift-storage-0\" (UID: \"3cc34662-d100-4436-9067-c615b7b3f83f\") " pod="openstack/swift-storage-0" Sep 30 12:37:50 crc kubenswrapper[4672]: I0930 12:37:50.174604 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"swift-storage-0\" (UID: \"3cc34662-d100-4436-9067-c615b7b3f83f\") " pod="openstack/swift-storage-0" Sep 30 12:37:50 crc kubenswrapper[4672]: I0930 12:37:50.253447 4672 generic.go:334] "Generic (PLEG): container finished" podID="9e623096-32ac-4373-b477-1a8bfcc4a137" containerID="fc9c2942b2821de4568976774fe67fcccb9bec8b576f74232c26a558ff04f845" exitCode=0 Sep 30 12:37:50 crc kubenswrapper[4672]: I0930 12:37:50.253525 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fdf9c6749-ssvf6" event={"ID":"9e623096-32ac-4373-b477-1a8bfcc4a137","Type":"ContainerDied","Data":"fc9c2942b2821de4568976774fe67fcccb9bec8b576f74232c26a558ff04f845"} Sep 30 12:37:50 crc kubenswrapper[4672]: I0930 12:37:50.253558 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fdf9c6749-ssvf6" event={"ID":"9e623096-32ac-4373-b477-1a8bfcc4a137","Type":"ContainerStarted","Data":"e5a7e1456e8ddf4a5f40a86906e1675e3497cae23fe4180d0bce9b911bd2ec43"} Sep 30 12:37:50 crc kubenswrapper[4672]: I0930 12:37:50.284176 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"056e0424-1faf-4d5a-8aea-e351214b3394","Type":"ContainerStarted","Data":"a430d3871513bb54995f16dfbfbd68f6c4eea1ac3526ace3bbb6cfbaf3cd82db"} Sep 30 12:37:50 crc kubenswrapper[4672]: I0930 12:37:50.290612 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"879d3e20-6df1-45be-bebd-b7e990e0aa5f","Type":"ContainerStarted","Data":"caed46f3492af0ea073b1ebfa689d90a6974f9be12e47dc25331f7f42fa198b2"} Sep 30 12:37:50 crc kubenswrapper[4672]: I0930 12:37:50.291039 4672 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/dnsmasq-dns-7555fbdd8f-68hzf" podUID="cf8b16b6-06e4-43b1-b72e-bb882ff5da17" containerName="dnsmasq-dns" containerID="cri-o://2c32dcf9cf85dafe3c4bcb137e332bb5de6f64a61c85edb0514cd0881b94cc52" gracePeriod=10 Sep 30 12:37:50 crc kubenswrapper[4672]: I0930 12:37:50.323107 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=30.290494029 podStartE2EDuration="37.323090121s" podCreationTimestamp="2025-09-30 12:37:13 +0000 UTC" firstStartedPulling="2025-09-30 12:37:32.963204067 +0000 UTC m=+944.232441713" lastFinishedPulling="2025-09-30 12:37:39.995800159 +0000 UTC m=+951.265037805" observedRunningTime="2025-09-30 12:37:50.321744537 +0000 UTC m=+961.590982183" watchObservedRunningTime="2025-09-30 12:37:50.323090121 +0000 UTC m=+961.592327767" Sep 30 12:37:50 crc kubenswrapper[4672]: I0930 12:37:50.623377 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/3cc34662-d100-4436-9067-c615b7b3f83f-etc-swift\") pod \"swift-storage-0\" (UID: \"3cc34662-d100-4436-9067-c615b7b3f83f\") " pod="openstack/swift-storage-0" Sep 30 12:37:50 crc kubenswrapper[4672]: E0930 12:37:50.623597 4672 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Sep 30 12:37:50 crc kubenswrapper[4672]: E0930 12:37:50.623797 4672 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Sep 30 12:37:50 crc kubenswrapper[4672]: E0930 12:37:50.623845 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3cc34662-d100-4436-9067-c615b7b3f83f-etc-swift podName:3cc34662-d100-4436-9067-c615b7b3f83f nodeName:}" failed. No retries permitted until 2025-09-30 12:37:51.623830645 +0000 UTC m=+962.893068291 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/3cc34662-d100-4436-9067-c615b7b3f83f-etc-swift") pod "swift-storage-0" (UID: "3cc34662-d100-4436-9067-c615b7b3f83f") : configmap "swift-ring-files" not found Sep 30 12:37:50 crc kubenswrapper[4672]: I0930 12:37:50.748292 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7555fbdd8f-68hzf" Sep 30 12:37:50 crc kubenswrapper[4672]: I0930 12:37:50.825367 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cf8b16b6-06e4-43b1-b72e-bb882ff5da17-ovsdbserver-sb\") pod \"cf8b16b6-06e4-43b1-b72e-bb882ff5da17\" (UID: \"cf8b16b6-06e4-43b1-b72e-bb882ff5da17\") " Sep 30 12:37:50 crc kubenswrapper[4672]: I0930 12:37:50.825406 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cf8b16b6-06e4-43b1-b72e-bb882ff5da17-ovsdbserver-nb\") pod \"cf8b16b6-06e4-43b1-b72e-bb882ff5da17\" (UID: \"cf8b16b6-06e4-43b1-b72e-bb882ff5da17\") " Sep 30 12:37:50 crc kubenswrapper[4672]: I0930 12:37:50.825466 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lkh77\" (UniqueName: \"kubernetes.io/projected/cf8b16b6-06e4-43b1-b72e-bb882ff5da17-kube-api-access-lkh77\") pod \"cf8b16b6-06e4-43b1-b72e-bb882ff5da17\" (UID: \"cf8b16b6-06e4-43b1-b72e-bb882ff5da17\") " Sep 30 12:37:50 crc kubenswrapper[4672]: I0930 12:37:50.825528 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cf8b16b6-06e4-43b1-b72e-bb882ff5da17-dns-svc\") pod \"cf8b16b6-06e4-43b1-b72e-bb882ff5da17\" (UID: \"cf8b16b6-06e4-43b1-b72e-bb882ff5da17\") " Sep 30 12:37:50 crc kubenswrapper[4672]: I0930 12:37:50.825978 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf8b16b6-06e4-43b1-b72e-bb882ff5da17-config\") pod \"cf8b16b6-06e4-43b1-b72e-bb882ff5da17\" (UID: \"cf8b16b6-06e4-43b1-b72e-bb882ff5da17\") " Sep 30 12:37:50 crc kubenswrapper[4672]: I0930 12:37:50.835184 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf8b16b6-06e4-43b1-b72e-bb882ff5da17-kube-api-access-lkh77" (OuterVolumeSpecName: "kube-api-access-lkh77") pod "cf8b16b6-06e4-43b1-b72e-bb882ff5da17" (UID: "cf8b16b6-06e4-43b1-b72e-bb882ff5da17"). InnerVolumeSpecName "kube-api-access-lkh77". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 12:37:50 crc kubenswrapper[4672]: I0930 12:37:50.887896 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf8b16b6-06e4-43b1-b72e-bb882ff5da17-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "cf8b16b6-06e4-43b1-b72e-bb882ff5da17" (UID: "cf8b16b6-06e4-43b1-b72e-bb882ff5da17"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 12:37:50 crc kubenswrapper[4672]: I0930 12:37:50.928582 4672 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cf8b16b6-06e4-43b1-b72e-bb882ff5da17-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Sep 30 12:37:50 crc kubenswrapper[4672]: I0930 12:37:50.928624 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lkh77\" (UniqueName: \"kubernetes.io/projected/cf8b16b6-06e4-43b1-b72e-bb882ff5da17-kube-api-access-lkh77\") on node \"crc\" DevicePath \"\"" Sep 30 12:37:50 crc kubenswrapper[4672]: I0930 12:37:50.970244 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf8b16b6-06e4-43b1-b72e-bb882ff5da17-config" (OuterVolumeSpecName: "config") pod "cf8b16b6-06e4-43b1-b72e-bb882ff5da17" (UID: "cf8b16b6-06e4-43b1-b72e-bb882ff5da17"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 12:37:50 crc kubenswrapper[4672]: I0930 12:37:50.979974 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf8b16b6-06e4-43b1-b72e-bb882ff5da17-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "cf8b16b6-06e4-43b1-b72e-bb882ff5da17" (UID: "cf8b16b6-06e4-43b1-b72e-bb882ff5da17"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 12:37:50 crc kubenswrapper[4672]: I0930 12:37:50.981080 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf8b16b6-06e4-43b1-b72e-bb882ff5da17-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "cf8b16b6-06e4-43b1-b72e-bb882ff5da17" (UID: "cf8b16b6-06e4-43b1-b72e-bb882ff5da17"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 12:37:51 crc kubenswrapper[4672]: I0930 12:37:51.029752 4672 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cf8b16b6-06e4-43b1-b72e-bb882ff5da17-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 30 12:37:51 crc kubenswrapper[4672]: I0930 12:37:51.029789 4672 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf8b16b6-06e4-43b1-b72e-bb882ff5da17-config\") on node \"crc\" DevicePath \"\"" Sep 30 12:37:51 crc kubenswrapper[4672]: I0930 12:37:51.029798 4672 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cf8b16b6-06e4-43b1-b72e-bb882ff5da17-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Sep 30 12:37:51 crc kubenswrapper[4672]: I0930 12:37:51.300898 4672 generic.go:334] "Generic (PLEG): container finished" podID="cf8b16b6-06e4-43b1-b72e-bb882ff5da17" containerID="2c32dcf9cf85dafe3c4bcb137e332bb5de6f64a61c85edb0514cd0881b94cc52" exitCode=0 Sep 30 12:37:51 crc kubenswrapper[4672]: I0930 12:37:51.300980 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7555fbdd8f-68hzf" Sep 30 12:37:51 crc kubenswrapper[4672]: I0930 12:37:51.300977 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7555fbdd8f-68hzf" event={"ID":"cf8b16b6-06e4-43b1-b72e-bb882ff5da17","Type":"ContainerDied","Data":"2c32dcf9cf85dafe3c4bcb137e332bb5de6f64a61c85edb0514cd0881b94cc52"} Sep 30 12:37:51 crc kubenswrapper[4672]: I0930 12:37:51.301881 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7555fbdd8f-68hzf" event={"ID":"cf8b16b6-06e4-43b1-b72e-bb882ff5da17","Type":"ContainerDied","Data":"2d1edd5a96ecbd49dd5b374b57e59e65979dd0d7733dea1ebe36a32e6cdd5a16"} Sep 30 12:37:51 crc kubenswrapper[4672]: I0930 12:37:51.301908 4672 scope.go:117] "RemoveContainer" containerID="2c32dcf9cf85dafe3c4bcb137e332bb5de6f64a61c85edb0514cd0881b94cc52" Sep 30 12:37:51 crc kubenswrapper[4672]: I0930 12:37:51.304328 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fdf9c6749-ssvf6" event={"ID":"9e623096-32ac-4373-b477-1a8bfcc4a137","Type":"ContainerStarted","Data":"9218a27f1ea35ce71baa8fb8de4f974c2328655f8eff994c47dbbea51ec1fa07"} Sep 30 12:37:51 crc kubenswrapper[4672]: I0930 12:37:51.304502 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-fdf9c6749-ssvf6" Sep 30 12:37:51 crc kubenswrapper[4672]: I0930 12:37:51.309129 4672 generic.go:334] "Generic (PLEG): container finished" podID="854a642c-e6c7-4859-8667-b64f9b54a872" containerID="baa06c426eba60d45465ff8502ff26b5d4b4ca2356b7db98bc5977e364fb8c66" exitCode=0 Sep 30 12:37:51 crc kubenswrapper[4672]: I0930 12:37:51.309187 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"854a642c-e6c7-4859-8667-b64f9b54a872","Type":"ContainerDied","Data":"baa06c426eba60d45465ff8502ff26b5d4b4ca2356b7db98bc5977e364fb8c66"} Sep 30 12:37:51 crc kubenswrapper[4672]: I0930 12:37:51.314331 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"879d3e20-6df1-45be-bebd-b7e990e0aa5f","Type":"ContainerStarted","Data":"20dab577b119d0bac5286af80b9792b5532b163fb67becbaec55a954bc5427c4"} Sep 30 12:37:51 crc kubenswrapper[4672]: I0930 12:37:51.314371 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"879d3e20-6df1-45be-bebd-b7e990e0aa5f","Type":"ContainerStarted","Data":"596919c1990c51e132e1926219efbceadad36ba29c8b70b1446b8b9a789a23e4"} Sep 30 12:37:51 crc kubenswrapper[4672]: I0930 12:37:51.315169 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Sep 30 12:37:51 crc kubenswrapper[4672]: I0930 12:37:51.323383 4672 scope.go:117] "RemoveContainer" containerID="f307297ef0aeebf02dc921e446034290aaa71c44ed690f5ebe14720b6adeb5a1" Sep 30 12:37:51 crc kubenswrapper[4672]: I0930 12:37:51.370424 4672 scope.go:117] "RemoveContainer" containerID="2c32dcf9cf85dafe3c4bcb137e332bb5de6f64a61c85edb0514cd0881b94cc52" Sep 30 12:37:51 crc kubenswrapper[4672]: E0930 12:37:51.371142 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2c32dcf9cf85dafe3c4bcb137e332bb5de6f64a61c85edb0514cd0881b94cc52\": container with ID starting with 2c32dcf9cf85dafe3c4bcb137e332bb5de6f64a61c85edb0514cd0881b94cc52 not found: ID does not exist" containerID="2c32dcf9cf85dafe3c4bcb137e332bb5de6f64a61c85edb0514cd0881b94cc52" Sep 30 12:37:51 crc kubenswrapper[4672]: 
I0930 12:37:51.371217 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2c32dcf9cf85dafe3c4bcb137e332bb5de6f64a61c85edb0514cd0881b94cc52"} err="failed to get container status \"2c32dcf9cf85dafe3c4bcb137e332bb5de6f64a61c85edb0514cd0881b94cc52\": rpc error: code = NotFound desc = could not find container \"2c32dcf9cf85dafe3c4bcb137e332bb5de6f64a61c85edb0514cd0881b94cc52\": container with ID starting with 2c32dcf9cf85dafe3c4bcb137e332bb5de6f64a61c85edb0514cd0881b94cc52 not found: ID does not exist" Sep 30 12:37:51 crc kubenswrapper[4672]: I0930 12:37:51.371250 4672 scope.go:117] "RemoveContainer" containerID="f307297ef0aeebf02dc921e446034290aaa71c44ed690f5ebe14720b6adeb5a1" Sep 30 12:37:51 crc kubenswrapper[4672]: I0930 12:37:51.373297 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-fdf9c6749-ssvf6" podStartSLOduration=3.373279651 podStartE2EDuration="3.373279651s" podCreationTimestamp="2025-09-30 12:37:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 12:37:51.334664706 +0000 UTC m=+962.603902362" watchObservedRunningTime="2025-09-30 12:37:51.373279651 +0000 UTC m=+962.642517287" Sep 30 12:37:51 crc kubenswrapper[4672]: E0930 12:37:51.377549 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f307297ef0aeebf02dc921e446034290aaa71c44ed690f5ebe14720b6adeb5a1\": container with ID starting with f307297ef0aeebf02dc921e446034290aaa71c44ed690f5ebe14720b6adeb5a1 not found: ID does not exist" containerID="f307297ef0aeebf02dc921e446034290aaa71c44ed690f5ebe14720b6adeb5a1" Sep 30 12:37:51 crc kubenswrapper[4672]: I0930 12:37:51.377607 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f307297ef0aeebf02dc921e446034290aaa71c44ed690f5ebe14720b6adeb5a1"} err="failed to get container status \"f307297ef0aeebf02dc921e446034290aaa71c44ed690f5ebe14720b6adeb5a1\": rpc error: code = NotFound desc = could not find container \"f307297ef0aeebf02dc921e446034290aaa71c44ed690f5ebe14720b6adeb5a1\": container with ID starting with f307297ef0aeebf02dc921e446034290aaa71c44ed690f5ebe14720b6adeb5a1 not found: ID does not exist" Sep 30 12:37:51 crc kubenswrapper[4672]: I0930 12:37:51.387441 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7555fbdd8f-68hzf"] Sep 30 12:37:51 crc kubenswrapper[4672]: I0930 12:37:51.404109 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7555fbdd8f-68hzf"] Sep 30 12:37:51 crc kubenswrapper[4672]: I0930 12:37:51.408388 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.718560017 podStartE2EDuration="3.408368317s" podCreationTimestamp="2025-09-30 12:37:48 +0000 UTC" firstStartedPulling="2025-09-30 12:37:49.314413178 +0000 UTC m=+960.583650824" lastFinishedPulling="2025-09-30 12:37:50.004221478 +0000 UTC m=+961.273459124" observedRunningTime="2025-09-30 12:37:51.398844286 +0000 UTC m=+962.668081932" watchObservedRunningTime="2025-09-30 12:37:51.408368317 +0000 UTC m=+962.677605963" Sep 30 12:37:51 crc kubenswrapper[4672]: I0930 12:37:51.426405 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="27236a15-64c2-46be-9186-02c66426c933" path="/var/lib/kubelet/pods/27236a15-64c2-46be-9186-02c66426c933/volumes" Sep 30 
12:37:51 crc kubenswrapper[4672]: I0930 12:37:51.426963 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cf8b16b6-06e4-43b1-b72e-bb882ff5da17" path="/var/lib/kubelet/pods/cf8b16b6-06e4-43b1-b72e-bb882ff5da17/volumes" Sep 30 12:37:51 crc kubenswrapper[4672]: I0930 12:37:51.639132 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/3cc34662-d100-4436-9067-c615b7b3f83f-etc-swift\") pod \"swift-storage-0\" (UID: \"3cc34662-d100-4436-9067-c615b7b3f83f\") " pod="openstack/swift-storage-0" Sep 30 12:37:51 crc kubenswrapper[4672]: E0930 12:37:51.639387 4672 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Sep 30 12:37:51 crc kubenswrapper[4672]: E0930 12:37:51.639421 4672 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Sep 30 12:37:51 crc kubenswrapper[4672]: E0930 12:37:51.639495 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3cc34662-d100-4436-9067-c615b7b3f83f-etc-swift podName:3cc34662-d100-4436-9067-c615b7b3f83f nodeName:}" failed. No retries permitted until 2025-09-30 12:37:53.639471013 +0000 UTC m=+964.908708659 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/3cc34662-d100-4436-9067-c615b7b3f83f-etc-swift") pod "swift-storage-0" (UID: "3cc34662-d100-4436-9067-c615b7b3f83f") : configmap "swift-ring-files" not found Sep 30 12:37:53 crc kubenswrapper[4672]: I0930 12:37:53.675639 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/3cc34662-d100-4436-9067-c615b7b3f83f-etc-swift\") pod \"swift-storage-0\" (UID: \"3cc34662-d100-4436-9067-c615b7b3f83f\") " pod="openstack/swift-storage-0" Sep 30 12:37:53 crc kubenswrapper[4672]: E0930 12:37:53.675852 4672 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Sep 30 12:37:53 crc kubenswrapper[4672]: E0930 12:37:53.676079 4672 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Sep 30 12:37:53 crc kubenswrapper[4672]: E0930 12:37:53.676144 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3cc34662-d100-4436-9067-c615b7b3f83f-etc-swift podName:3cc34662-d100-4436-9067-c615b7b3f83f nodeName:}" failed. No retries permitted until 2025-09-30 12:37:57.676120523 +0000 UTC m=+968.945358169 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/3cc34662-d100-4436-9067-c615b7b3f83f-etc-swift") pod "swift-storage-0" (UID: "3cc34662-d100-4436-9067-c615b7b3f83f") : configmap "swift-ring-files" not found Sep 30 12:37:53 crc kubenswrapper[4672]: I0930 12:37:53.795380 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-d7dm7"] Sep 30 12:37:53 crc kubenswrapper[4672]: E0930 12:37:53.795839 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf8b16b6-06e4-43b1-b72e-bb882ff5da17" containerName="init" Sep 30 12:37:53 crc kubenswrapper[4672]: I0930 12:37:53.795869 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf8b16b6-06e4-43b1-b72e-bb882ff5da17" containerName="init" Sep 30 12:37:53 crc kubenswrapper[4672]: E0930 12:37:53.795899 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf8b16b6-06e4-43b1-b72e-bb882ff5da17" containerName="dnsmasq-dns" Sep 30 12:37:53 crc kubenswrapper[4672]: I0930 12:37:53.795909 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf8b16b6-06e4-43b1-b72e-bb882ff5da17" containerName="dnsmasq-dns" Sep 30 12:37:53 crc kubenswrapper[4672]: I0930 12:37:53.796112 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf8b16b6-06e4-43b1-b72e-bb882ff5da17" containerName="dnsmasq-dns" Sep 30 12:37:53 crc kubenswrapper[4672]: I0930 12:37:53.796899 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-d7dm7" Sep 30 12:37:53 crc kubenswrapper[4672]: I0930 12:37:53.812283 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Sep 30 12:37:53 crc kubenswrapper[4672]: I0930 12:37:53.812841 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Sep 30 12:37:53 crc kubenswrapper[4672]: I0930 12:37:53.813090 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Sep 30 12:37:53 crc kubenswrapper[4672]: I0930 12:37:53.825044 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-d7dm7"] Sep 30 12:37:53 crc kubenswrapper[4672]: I0930 12:37:53.880254 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/4b567440-2a47-4032-bd02-6d7d53ea35b8-dispersionconf\") pod \"swift-ring-rebalance-d7dm7\" (UID: \"4b567440-2a47-4032-bd02-6d7d53ea35b8\") " pod="openstack/swift-ring-rebalance-d7dm7" Sep 30 12:37:53 crc kubenswrapper[4672]: I0930 12:37:53.880347 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b567440-2a47-4032-bd02-6d7d53ea35b8-combined-ca-bundle\") pod \"swift-ring-rebalance-d7dm7\" (UID: \"4b567440-2a47-4032-bd02-6d7d53ea35b8\") " pod="openstack/swift-ring-rebalance-d7dm7" Sep 30 12:37:53 crc kubenswrapper[4672]: I0930 12:37:53.880379 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/4b567440-2a47-4032-bd02-6d7d53ea35b8-swiftconf\") pod \"swift-ring-rebalance-d7dm7\" (UID: \"4b567440-2a47-4032-bd02-6d7d53ea35b8\") " pod="openstack/swift-ring-rebalance-d7dm7" Sep 30 12:37:53 crc kubenswrapper[4672]: I0930 12:37:53.880445 4672 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-stpm7\" (UniqueName: \"kubernetes.io/projected/4b567440-2a47-4032-bd02-6d7d53ea35b8-kube-api-access-stpm7\") pod \"swift-ring-rebalance-d7dm7\" (UID: \"4b567440-2a47-4032-bd02-6d7d53ea35b8\") " pod="openstack/swift-ring-rebalance-d7dm7" Sep 30 12:37:53 crc kubenswrapper[4672]: I0930 12:37:53.880475 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/4b567440-2a47-4032-bd02-6d7d53ea35b8-etc-swift\") pod \"swift-ring-rebalance-d7dm7\" (UID: \"4b567440-2a47-4032-bd02-6d7d53ea35b8\") " pod="openstack/swift-ring-rebalance-d7dm7" Sep 30 12:37:53 crc kubenswrapper[4672]: I0930 12:37:53.880553 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/4b567440-2a47-4032-bd02-6d7d53ea35b8-ring-data-devices\") pod \"swift-ring-rebalance-d7dm7\" (UID: \"4b567440-2a47-4032-bd02-6d7d53ea35b8\") " pod="openstack/swift-ring-rebalance-d7dm7" Sep 30 12:37:53 crc kubenswrapper[4672]: I0930 12:37:53.880603 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4b567440-2a47-4032-bd02-6d7d53ea35b8-scripts\") pod \"swift-ring-rebalance-d7dm7\" (UID: \"4b567440-2a47-4032-bd02-6d7d53ea35b8\") " pod="openstack/swift-ring-rebalance-d7dm7" Sep 30 12:37:53 crc kubenswrapper[4672]: I0930 12:37:53.982482 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/4b567440-2a47-4032-bd02-6d7d53ea35b8-ring-data-devices\") pod \"swift-ring-rebalance-d7dm7\" (UID: \"4b567440-2a47-4032-bd02-6d7d53ea35b8\") " pod="openstack/swift-ring-rebalance-d7dm7" Sep 30 12:37:53 crc kubenswrapper[4672]: I0930 12:37:53.982848 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4b567440-2a47-4032-bd02-6d7d53ea35b8-scripts\") pod \"swift-ring-rebalance-d7dm7\" (UID: \"4b567440-2a47-4032-bd02-6d7d53ea35b8\") " pod="openstack/swift-ring-rebalance-d7dm7" Sep 30 12:37:53 crc kubenswrapper[4672]: I0930 12:37:53.982902 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/4b567440-2a47-4032-bd02-6d7d53ea35b8-dispersionconf\") pod \"swift-ring-rebalance-d7dm7\" (UID: \"4b567440-2a47-4032-bd02-6d7d53ea35b8\") " pod="openstack/swift-ring-rebalance-d7dm7" Sep 30 12:37:53 crc kubenswrapper[4672]: I0930 12:37:53.982936 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b567440-2a47-4032-bd02-6d7d53ea35b8-combined-ca-bundle\") pod \"swift-ring-rebalance-d7dm7\" (UID: \"4b567440-2a47-4032-bd02-6d7d53ea35b8\") " pod="openstack/swift-ring-rebalance-d7dm7" Sep 30 12:37:53 crc kubenswrapper[4672]: I0930 12:37:53.982955 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/4b567440-2a47-4032-bd02-6d7d53ea35b8-swiftconf\") pod \"swift-ring-rebalance-d7dm7\" (UID: \"4b567440-2a47-4032-bd02-6d7d53ea35b8\") " pod="openstack/swift-ring-rebalance-d7dm7" Sep 30 12:37:53 crc kubenswrapper[4672]: I0930 12:37:53.983001 4672 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-stpm7\" (UniqueName: \"kubernetes.io/projected/4b567440-2a47-4032-bd02-6d7d53ea35b8-kube-api-access-stpm7\") pod \"swift-ring-rebalance-d7dm7\" (UID: \"4b567440-2a47-4032-bd02-6d7d53ea35b8\") " pod="openstack/swift-ring-rebalance-d7dm7" Sep 30 12:37:53 crc kubenswrapper[4672]: I0930 12:37:53.983044 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/4b567440-2a47-4032-bd02-6d7d53ea35b8-etc-swift\") pod \"swift-ring-rebalance-d7dm7\" (UID: \"4b567440-2a47-4032-bd02-6d7d53ea35b8\") " pod="openstack/swift-ring-rebalance-d7dm7" Sep 30 12:37:53 crc kubenswrapper[4672]: I0930 12:37:53.983506 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/4b567440-2a47-4032-bd02-6d7d53ea35b8-ring-data-devices\") pod \"swift-ring-rebalance-d7dm7\" (UID: \"4b567440-2a47-4032-bd02-6d7d53ea35b8\") " pod="openstack/swift-ring-rebalance-d7dm7" Sep 30 12:37:53 crc kubenswrapper[4672]: I0930 12:37:53.983603 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/4b567440-2a47-4032-bd02-6d7d53ea35b8-etc-swift\") pod \"swift-ring-rebalance-d7dm7\" (UID: \"4b567440-2a47-4032-bd02-6d7d53ea35b8\") " pod="openstack/swift-ring-rebalance-d7dm7" Sep 30 12:37:53 crc kubenswrapper[4672]: I0930 12:37:53.984622 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4b567440-2a47-4032-bd02-6d7d53ea35b8-scripts\") pod \"swift-ring-rebalance-d7dm7\" (UID: \"4b567440-2a47-4032-bd02-6d7d53ea35b8\") " pod="openstack/swift-ring-rebalance-d7dm7" Sep 30 12:37:53 crc kubenswrapper[4672]: I0930 12:37:53.990373 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/4b567440-2a47-4032-bd02-6d7d53ea35b8-dispersionconf\") pod \"swift-ring-rebalance-d7dm7\" (UID: \"4b567440-2a47-4032-bd02-6d7d53ea35b8\") " pod="openstack/swift-ring-rebalance-d7dm7" Sep 30 12:37:53 crc kubenswrapper[4672]: I0930 12:37:53.990626 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b567440-2a47-4032-bd02-6d7d53ea35b8-combined-ca-bundle\") pod \"swift-ring-rebalance-d7dm7\" (UID: \"4b567440-2a47-4032-bd02-6d7d53ea35b8\") " pod="openstack/swift-ring-rebalance-d7dm7" Sep 30 12:37:53 crc kubenswrapper[4672]: I0930 12:37:53.991723 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/4b567440-2a47-4032-bd02-6d7d53ea35b8-swiftconf\") pod \"swift-ring-rebalance-d7dm7\" (UID: \"4b567440-2a47-4032-bd02-6d7d53ea35b8\") " pod="openstack/swift-ring-rebalance-d7dm7" Sep 30 12:37:54 crc kubenswrapper[4672]: I0930 12:37:54.001411 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-stpm7\" (UniqueName: \"kubernetes.io/projected/4b567440-2a47-4032-bd02-6d7d53ea35b8-kube-api-access-stpm7\") pod \"swift-ring-rebalance-d7dm7\" (UID: \"4b567440-2a47-4032-bd02-6d7d53ea35b8\") " pod="openstack/swift-ring-rebalance-d7dm7" Sep 30 12:37:54 crc kubenswrapper[4672]: I0930 12:37:54.124643 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-d7dm7" Sep 30 12:37:54 crc kubenswrapper[4672]: I0930 12:37:54.569480 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-d7dm7"] Sep 30 12:37:54 crc kubenswrapper[4672]: I0930 12:37:54.739599 4672 patch_prober.go:28] interesting pod/machine-config-daemon-dpqrd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 12:37:54 crc kubenswrapper[4672]: I0930 12:37:54.739668 4672 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 12:37:55 crc kubenswrapper[4672]: I0930 12:37:55.246570 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Sep 30 12:37:55 crc kubenswrapper[4672]: I0930 12:37:55.246674 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Sep 30 12:37:55 crc kubenswrapper[4672]: I0930 12:37:55.332205 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Sep 30 12:37:55 crc kubenswrapper[4672]: I0930 12:37:55.351415 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-d7dm7" event={"ID":"4b567440-2a47-4032-bd02-6d7d53ea35b8","Type":"ContainerStarted","Data":"6c8dbe297c9163646c3e504a6a5d14f4a137de197dc0e36c79d76c6fe83eef08"} Sep 30 12:37:55 crc kubenswrapper[4672]: I0930 12:37:55.429071 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Sep 30 12:37:56 crc kubenswrapper[4672]: I0930 12:37:56.642188 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Sep 30 12:37:56 crc kubenswrapper[4672]: I0930 12:37:56.642671 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Sep 30 12:37:56 crc kubenswrapper[4672]: I0930 12:37:56.645378 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-tqggx"] Sep 30 12:37:56 crc kubenswrapper[4672]: I0930 12:37:56.646903 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-tqggx" Sep 30 12:37:56 crc kubenswrapper[4672]: I0930 12:37:56.666735 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-tqggx"] Sep 30 12:37:56 crc kubenswrapper[4672]: I0930 12:37:56.732395 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rsgfp\" (UniqueName: \"kubernetes.io/projected/a5a3488a-d32c-4723-b27b-7ec68566fba6-kube-api-access-rsgfp\") pod \"keystone-db-create-tqggx\" (UID: \"a5a3488a-d32c-4723-b27b-7ec68566fba6\") " pod="openstack/keystone-db-create-tqggx" Sep 30 12:37:56 crc kubenswrapper[4672]: I0930 12:37:56.834171 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rsgfp\" (UniqueName: \"kubernetes.io/projected/a5a3488a-d32c-4723-b27b-7ec68566fba6-kube-api-access-rsgfp\") pod \"keystone-db-create-tqggx\" (UID: \"a5a3488a-d32c-4723-b27b-7ec68566fba6\") " pod="openstack/keystone-db-create-tqggx" Sep 30 12:37:56 crc kubenswrapper[4672]: I0930 12:37:56.853410 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rsgfp\" (UniqueName: \"kubernetes.io/projected/a5a3488a-d32c-4723-b27b-7ec68566fba6-kube-api-access-rsgfp\") pod \"keystone-db-create-tqggx\" (UID: \"a5a3488a-d32c-4723-b27b-7ec68566fba6\") " pod="openstack/keystone-db-create-tqggx" Sep 30 12:37:56 crc kubenswrapper[4672]: I0930 12:37:56.889767 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-9sjc4"] Sep 30 12:37:56 crc kubenswrapper[4672]: I0930 12:37:56.891206 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-9sjc4" Sep 30 12:37:56 crc kubenswrapper[4672]: I0930 12:37:56.900241 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-9sjc4"] Sep 30 12:37:56 crc kubenswrapper[4672]: I0930 12:37:56.951515 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9kt7b\" (UniqueName: \"kubernetes.io/projected/b7f9649f-28e4-4025-b17b-c21ad09eedee-kube-api-access-9kt7b\") pod \"placement-db-create-9sjc4\" (UID: \"b7f9649f-28e4-4025-b17b-c21ad09eedee\") " pod="openstack/placement-db-create-9sjc4" Sep 30 12:37:57 crc kubenswrapper[4672]: I0930 12:37:57.015474 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-tqggx" Sep 30 12:37:57 crc kubenswrapper[4672]: I0930 12:37:57.053907 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9kt7b\" (UniqueName: \"kubernetes.io/projected/b7f9649f-28e4-4025-b17b-c21ad09eedee-kube-api-access-9kt7b\") pod \"placement-db-create-9sjc4\" (UID: \"b7f9649f-28e4-4025-b17b-c21ad09eedee\") " pod="openstack/placement-db-create-9sjc4" Sep 30 12:37:57 crc kubenswrapper[4672]: I0930 12:37:57.073547 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9kt7b\" (UniqueName: \"kubernetes.io/projected/b7f9649f-28e4-4025-b17b-c21ad09eedee-kube-api-access-9kt7b\") pod \"placement-db-create-9sjc4\" (UID: \"b7f9649f-28e4-4025-b17b-c21ad09eedee\") " pod="openstack/placement-db-create-9sjc4" Sep 30 12:37:57 crc kubenswrapper[4672]: I0930 12:37:57.212155 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-9sjc4" Sep 30 12:37:57 crc kubenswrapper[4672]: I0930 12:37:57.326361 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-gpr5b"] Sep 30 12:37:57 crc kubenswrapper[4672]: I0930 12:37:57.327770 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-gpr5b" Sep 30 12:37:57 crc kubenswrapper[4672]: I0930 12:37:57.334149 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-gpr5b"] Sep 30 12:37:57 crc kubenswrapper[4672]: I0930 12:37:57.438302 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-tqggx"] Sep 30 12:37:57 crc kubenswrapper[4672]: I0930 12:37:57.459710 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zm49b\" (UniqueName: \"kubernetes.io/projected/d6da0670-d4cb-4b8c-9a9a-2f82e32a767f-kube-api-access-zm49b\") pod \"glance-db-create-gpr5b\" (UID: \"d6da0670-d4cb-4b8c-9a9a-2f82e32a767f\") " pod="openstack/glance-db-create-gpr5b" Sep 30 12:37:57 crc kubenswrapper[4672]: I0930 12:37:57.562076 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zm49b\" (UniqueName: \"kubernetes.io/projected/d6da0670-d4cb-4b8c-9a9a-2f82e32a767f-kube-api-access-zm49b\") pod \"glance-db-create-gpr5b\" (UID: \"d6da0670-d4cb-4b8c-9a9a-2f82e32a767f\") " pod="openstack/glance-db-create-gpr5b" Sep 30 12:37:57 crc kubenswrapper[4672]: I0930 12:37:57.580583 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zm49b\" (UniqueName: \"kubernetes.io/projected/d6da0670-d4cb-4b8c-9a9a-2f82e32a767f-kube-api-access-zm49b\") pod \"glance-db-create-gpr5b\" (UID: \"d6da0670-d4cb-4b8c-9a9a-2f82e32a767f\") " pod="openstack/glance-db-create-gpr5b" Sep 30 12:37:57 crc kubenswrapper[4672]: I0930 12:37:57.651452 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-gpr5b" Sep 30 12:37:57 crc kubenswrapper[4672]: I0930 12:37:57.665196 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-9sjc4"] Sep 30 12:37:57 crc kubenswrapper[4672]: W0930 12:37:57.665223 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb7f9649f_28e4_4025_b17b_c21ad09eedee.slice/crio-3cc038247238aa1cbfba7a6c1c1e6b48b456096c13cdc83ca5382a129f18d5b0 WatchSource:0}: Error finding container 3cc038247238aa1cbfba7a6c1c1e6b48b456096c13cdc83ca5382a129f18d5b0: Status 404 returned error can't find the container with id 3cc038247238aa1cbfba7a6c1c1e6b48b456096c13cdc83ca5382a129f18d5b0 Sep 30 12:37:57 crc kubenswrapper[4672]: I0930 12:37:57.765227 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/3cc34662-d100-4436-9067-c615b7b3f83f-etc-swift\") pod \"swift-storage-0\" (UID: \"3cc34662-d100-4436-9067-c615b7b3f83f\") " pod="openstack/swift-storage-0" Sep 30 12:37:57 crc kubenswrapper[4672]: E0930 12:37:57.765462 4672 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Sep 30 12:37:57 crc kubenswrapper[4672]: E0930 12:37:57.765495 4672 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Sep 30 12:37:57 crc kubenswrapper[4672]: E0930 12:37:57.765567 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3cc34662-d100-4436-9067-c615b7b3f83f-etc-swift podName:3cc34662-d100-4436-9067-c615b7b3f83f nodeName:}" failed. No retries permitted until 2025-09-30 12:38:05.765542192 +0000 UTC m=+977.034779838 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/3cc34662-d100-4436-9067-c615b7b3f83f-etc-swift") pod "swift-storage-0" (UID: "3cc34662-d100-4436-9067-c615b7b3f83f") : configmap "swift-ring-files" not found Sep 30 12:37:58 crc kubenswrapper[4672]: W0930 12:37:58.070283 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd6da0670_d4cb_4b8c_9a9a_2f82e32a767f.slice/crio-34b13821ea519c4f5d56c60c5134d662b60e14b0ec651e74c9ff0e2d010cb7b1 WatchSource:0}: Error finding container 34b13821ea519c4f5d56c60c5134d662b60e14b0ec651e74c9ff0e2d010cb7b1: Status 404 returned error can't find the container with id 34b13821ea519c4f5d56c60c5134d662b60e14b0ec651e74c9ff0e2d010cb7b1 Sep 30 12:37:58 crc kubenswrapper[4672]: I0930 12:37:58.072076 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-gpr5b"] Sep 30 12:37:58 crc kubenswrapper[4672]: I0930 12:37:58.381290 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-gpr5b" event={"ID":"d6da0670-d4cb-4b8c-9a9a-2f82e32a767f","Type":"ContainerStarted","Data":"34b13821ea519c4f5d56c60c5134d662b60e14b0ec651e74c9ff0e2d010cb7b1"} Sep 30 12:37:58 crc kubenswrapper[4672]: I0930 12:37:58.382470 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-9sjc4" event={"ID":"b7f9649f-28e4-4025-b17b-c21ad09eedee","Type":"ContainerStarted","Data":"3cc038247238aa1cbfba7a6c1c1e6b48b456096c13cdc83ca5382a129f18d5b0"} Sep 30 12:37:58 crc kubenswrapper[4672]: I0930 12:37:58.389122 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-tqggx" event={"ID":"a5a3488a-d32c-4723-b27b-7ec68566fba6","Type":"ContainerStarted","Data":"8b1c5b8e1df0e37e130a513fe1f601f277a460b880d0ab93f90f7299f38220d7"} Sep 30 12:37:58 crc kubenswrapper[4672]: I0930 12:37:58.722436 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-db-create-j56fv"] Sep 30 12:37:58 crc kubenswrapper[4672]: I0930 12:37:58.724364 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-db-create-j56fv" Sep 30 12:37:58 crc kubenswrapper[4672]: I0930 12:37:58.735937 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-db-create-j56fv"] Sep 30 12:37:58 crc kubenswrapper[4672]: I0930 12:37:58.787113 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6jm2g\" (UniqueName: \"kubernetes.io/projected/2c0f6826-1d24-44eb-9bc2-5f666ce0e0a2-kube-api-access-6jm2g\") pod \"watcher-db-create-j56fv\" (UID: \"2c0f6826-1d24-44eb-9bc2-5f666ce0e0a2\") " pod="openstack/watcher-db-create-j56fv" Sep 30 12:37:58 crc kubenswrapper[4672]: I0930 12:37:58.889415 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6jm2g\" (UniqueName: \"kubernetes.io/projected/2c0f6826-1d24-44eb-9bc2-5f666ce0e0a2-kube-api-access-6jm2g\") pod \"watcher-db-create-j56fv\" (UID: \"2c0f6826-1d24-44eb-9bc2-5f666ce0e0a2\") " pod="openstack/watcher-db-create-j56fv" Sep 30 12:37:58 crc kubenswrapper[4672]: I0930 12:37:58.909754 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6jm2g\" (UniqueName: \"kubernetes.io/projected/2c0f6826-1d24-44eb-9bc2-5f666ce0e0a2-kube-api-access-6jm2g\") pod \"watcher-db-create-j56fv\" (UID: \"2c0f6826-1d24-44eb-9bc2-5f666ce0e0a2\") " pod="openstack/watcher-db-create-j56fv" Sep 30 12:37:59 crc kubenswrapper[4672]: I0930 12:37:59.038358 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-fdf9c6749-ssvf6" Sep 30 12:37:59 crc kubenswrapper[4672]: I0930 12:37:59.046514 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-db-create-j56fv" Sep 30 12:37:59 crc kubenswrapper[4672]: I0930 12:37:59.104334 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5dff579849-cdrgm"] Sep 30 12:37:59 crc kubenswrapper[4672]: I0930 12:37:59.104549 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5dff579849-cdrgm" podUID="9aae022e-aded-4f55-bfa2-9fa792516aac" containerName="dnsmasq-dns" containerID="cri-o://f7da0813cbfb0aed147e8014aaddea2aa24a882e5500ee46152703904bf5950f" gracePeriod=10 Sep 30 12:37:59 crc kubenswrapper[4672]: I0930 12:37:59.399108 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-gpr5b" event={"ID":"d6da0670-d4cb-4b8c-9a9a-2f82e32a767f","Type":"ContainerStarted","Data":"2223dd6fc8ffbac6d026daedd107cb8834ea7fcabbd224601c17a1aeb1a95410"} Sep 30 12:37:59 crc kubenswrapper[4672]: I0930 12:37:59.408187 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-tqggx" event={"ID":"a5a3488a-d32c-4723-b27b-7ec68566fba6","Type":"ContainerStarted","Data":"42b0421f9782d814de1249138402274f5cdd8afe3f2dd52f8bd5b0e49b42ce23"} Sep 30 12:37:59 crc kubenswrapper[4672]: I0930 12:37:59.416202 4672 generic.go:334] "Generic (PLEG): container finished" podID="9aae022e-aded-4f55-bfa2-9fa792516aac" containerID="f7da0813cbfb0aed147e8014aaddea2aa24a882e5500ee46152703904bf5950f" exitCode=0 Sep 30 12:37:59 crc kubenswrapper[4672]: I0930 12:37:59.416332 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5dff579849-cdrgm" event={"ID":"9aae022e-aded-4f55-bfa2-9fa792516aac","Type":"ContainerDied","Data":"f7da0813cbfb0aed147e8014aaddea2aa24a882e5500ee46152703904bf5950f"} Sep 30 12:37:59 crc kubenswrapper[4672]: I0930 12:37:59.433187 4672 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-create-gpr5b" podStartSLOduration=2.433172644 podStartE2EDuration="2.433172644s" podCreationTimestamp="2025-09-30 12:37:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 12:37:59.420845293 +0000 UTC m=+970.690082949" watchObservedRunningTime="2025-09-30 12:37:59.433172644 +0000 UTC m=+970.702410290" Sep 30 12:37:59 crc kubenswrapper[4672]: I0930 12:37:59.434689 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-9sjc4" event={"ID":"b7f9649f-28e4-4025-b17b-c21ad09eedee","Type":"ContainerStarted","Data":"568dfedaa369dfbd8aa6a30e85bc44fe065bad4bb0fcf83751bc2b792cf9bca9"} Sep 30 12:37:59 crc kubenswrapper[4672]: I0930 12:37:59.476884 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-create-9sjc4" podStartSLOduration=3.476857907 podStartE2EDuration="3.476857907s" podCreationTimestamp="2025-09-30 12:37:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 12:37:59.46983085 +0000 UTC m=+970.739068506" watchObservedRunningTime="2025-09-30 12:37:59.476857907 +0000 UTC m=+970.746095553" Sep 30 12:37:59 crc kubenswrapper[4672]: I0930 12:37:59.488101 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-create-tqggx" podStartSLOduration=3.488082261 podStartE2EDuration="3.488082261s" podCreationTimestamp="2025-09-30 12:37:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 12:37:59.480668873 +0000 UTC m=+970.749906519" watchObservedRunningTime="2025-09-30 12:37:59.488082261 +0000 UTC m=+970.757319907" Sep 30 12:37:59 crc kubenswrapper[4672]: I0930 12:37:59.591789 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-db-create-j56fv"] Sep 30 12:38:00 crc kubenswrapper[4672]: I0930 12:38:00.300102 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Sep 30 12:38:00 crc kubenswrapper[4672]: I0930 12:38:00.367927 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Sep 30 12:38:00 crc kubenswrapper[4672]: I0930 12:38:00.429797 4672 generic.go:334] "Generic (PLEG): container finished" podID="d6da0670-d4cb-4b8c-9a9a-2f82e32a767f" containerID="2223dd6fc8ffbac6d026daedd107cb8834ea7fcabbd224601c17a1aeb1a95410" exitCode=0 Sep 30 12:38:00 crc kubenswrapper[4672]: I0930 12:38:00.429880 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-gpr5b" event={"ID":"d6da0670-d4cb-4b8c-9a9a-2f82e32a767f","Type":"ContainerDied","Data":"2223dd6fc8ffbac6d026daedd107cb8834ea7fcabbd224601c17a1aeb1a95410"} Sep 30 12:38:00 crc kubenswrapper[4672]: I0930 12:38:00.432743 4672 generic.go:334] "Generic (PLEG): container finished" podID="a5a3488a-d32c-4723-b27b-7ec68566fba6" containerID="42b0421f9782d814de1249138402274f5cdd8afe3f2dd52f8bd5b0e49b42ce23" exitCode=0 Sep 30 12:38:00 crc kubenswrapper[4672]: I0930 12:38:00.432803 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-tqggx" 
event={"ID":"a5a3488a-d32c-4723-b27b-7ec68566fba6","Type":"ContainerDied","Data":"42b0421f9782d814de1249138402274f5cdd8afe3f2dd52f8bd5b0e49b42ce23"} Sep 30 12:38:00 crc kubenswrapper[4672]: I0930 12:38:00.434019 4672 generic.go:334] "Generic (PLEG): container finished" podID="b7f9649f-28e4-4025-b17b-c21ad09eedee" containerID="568dfedaa369dfbd8aa6a30e85bc44fe065bad4bb0fcf83751bc2b792cf9bca9" exitCode=0 Sep 30 12:38:00 crc kubenswrapper[4672]: I0930 12:38:00.434417 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-9sjc4" event={"ID":"b7f9649f-28e4-4025-b17b-c21ad09eedee","Type":"ContainerDied","Data":"568dfedaa369dfbd8aa6a30e85bc44fe065bad4bb0fcf83751bc2b792cf9bca9"} Sep 30 12:38:01 crc kubenswrapper[4672]: I0930 12:38:01.453455 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5dff579849-cdrgm" event={"ID":"9aae022e-aded-4f55-bfa2-9fa792516aac","Type":"ContainerDied","Data":"420716377d921dfb9a49bb721ebd9d28486cd87830d8263676c46201f24f315c"} Sep 30 12:38:01 crc kubenswrapper[4672]: I0930 12:38:01.453771 4672 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="420716377d921dfb9a49bb721ebd9d28486cd87830d8263676c46201f24f315c" Sep 30 12:38:01 crc kubenswrapper[4672]: I0930 12:38:01.504251 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5dff579849-cdrgm" Sep 30 12:38:01 crc kubenswrapper[4672]: I0930 12:38:01.639458 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9aae022e-aded-4f55-bfa2-9fa792516aac-config\") pod \"9aae022e-aded-4f55-bfa2-9fa792516aac\" (UID: \"9aae022e-aded-4f55-bfa2-9fa792516aac\") " Sep 30 12:38:01 crc kubenswrapper[4672]: I0930 12:38:01.639660 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9aae022e-aded-4f55-bfa2-9fa792516aac-dns-svc\") pod \"9aae022e-aded-4f55-bfa2-9fa792516aac\" (UID: \"9aae022e-aded-4f55-bfa2-9fa792516aac\") " Sep 30 12:38:01 crc kubenswrapper[4672]: I0930 12:38:01.639756 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gzxqj\" (UniqueName: \"kubernetes.io/projected/9aae022e-aded-4f55-bfa2-9fa792516aac-kube-api-access-gzxqj\") pod \"9aae022e-aded-4f55-bfa2-9fa792516aac\" (UID: \"9aae022e-aded-4f55-bfa2-9fa792516aac\") " Sep 30 12:38:01 crc kubenswrapper[4672]: I0930 12:38:01.646205 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9aae022e-aded-4f55-bfa2-9fa792516aac-kube-api-access-gzxqj" (OuterVolumeSpecName: "kube-api-access-gzxqj") pod "9aae022e-aded-4f55-bfa2-9fa792516aac" (UID: "9aae022e-aded-4f55-bfa2-9fa792516aac"). InnerVolumeSpecName "kube-api-access-gzxqj". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 12:38:01 crc kubenswrapper[4672]: I0930 12:38:01.696953 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9aae022e-aded-4f55-bfa2-9fa792516aac-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "9aae022e-aded-4f55-bfa2-9fa792516aac" (UID: "9aae022e-aded-4f55-bfa2-9fa792516aac"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 12:38:01 crc kubenswrapper[4672]: I0930 12:38:01.704341 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9aae022e-aded-4f55-bfa2-9fa792516aac-config" (OuterVolumeSpecName: "config") pod "9aae022e-aded-4f55-bfa2-9fa792516aac" (UID: "9aae022e-aded-4f55-bfa2-9fa792516aac"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 12:38:01 crc kubenswrapper[4672]: I0930 12:38:01.741923 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gzxqj\" (UniqueName: \"kubernetes.io/projected/9aae022e-aded-4f55-bfa2-9fa792516aac-kube-api-access-gzxqj\") on node \"crc\" DevicePath \"\"" Sep 30 12:38:01 crc kubenswrapper[4672]: I0930 12:38:01.742252 4672 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9aae022e-aded-4f55-bfa2-9fa792516aac-config\") on node \"crc\" DevicePath \"\"" Sep 30 12:38:01 crc kubenswrapper[4672]: I0930 12:38:01.742282 4672 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9aae022e-aded-4f55-bfa2-9fa792516aac-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 30 12:38:02 crc kubenswrapper[4672]: I0930 12:38:02.469468 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-create-j56fv" event={"ID":"2c0f6826-1d24-44eb-9bc2-5f666ce0e0a2","Type":"ContainerStarted","Data":"cb651fef5e4667cc5162146f4ed32576f946f81b63c3241537ce483982959e70"} Sep 30 12:38:02 crc kubenswrapper[4672]: I0930 12:38:02.469745 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5dff579849-cdrgm" Sep 30 12:38:02 crc kubenswrapper[4672]: I0930 12:38:02.501141 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5dff579849-cdrgm"] Sep 30 12:38:02 crc kubenswrapper[4672]: I0930 12:38:02.507693 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5dff579849-cdrgm"] Sep 30 12:38:03 crc kubenswrapper[4672]: I0930 12:38:03.430765 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9aae022e-aded-4f55-bfa2-9fa792516aac" path="/var/lib/kubelet/pods/9aae022e-aded-4f55-bfa2-9fa792516aac/volumes" Sep 30 12:38:03 crc kubenswrapper[4672]: I0930 12:38:03.482198 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-9sjc4" event={"ID":"b7f9649f-28e4-4025-b17b-c21ad09eedee","Type":"ContainerDied","Data":"3cc038247238aa1cbfba7a6c1c1e6b48b456096c13cdc83ca5382a129f18d5b0"} Sep 30 12:38:03 crc kubenswrapper[4672]: I0930 12:38:03.482280 4672 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3cc038247238aa1cbfba7a6c1c1e6b48b456096c13cdc83ca5382a129f18d5b0" Sep 30 12:38:03 crc kubenswrapper[4672]: I0930 12:38:03.484881 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-tqggx" event={"ID":"a5a3488a-d32c-4723-b27b-7ec68566fba6","Type":"ContainerDied","Data":"8b1c5b8e1df0e37e130a513fe1f601f277a460b880d0ab93f90f7299f38220d7"} Sep 30 12:38:03 crc kubenswrapper[4672]: I0930 12:38:03.484920 4672 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8b1c5b8e1df0e37e130a513fe1f601f277a460b880d0ab93f90f7299f38220d7" Sep 30 12:38:03 crc kubenswrapper[4672]: I0930 12:38:03.487118 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-gpr5b" 
event={"ID":"d6da0670-d4cb-4b8c-9a9a-2f82e32a767f","Type":"ContainerDied","Data":"34b13821ea519c4f5d56c60c5134d662b60e14b0ec651e74c9ff0e2d010cb7b1"} Sep 30 12:38:03 crc kubenswrapper[4672]: I0930 12:38:03.487159 4672 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="34b13821ea519c4f5d56c60c5134d662b60e14b0ec651e74c9ff0e2d010cb7b1" Sep 30 12:38:03 crc kubenswrapper[4672]: I0930 12:38:03.633338 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-9sjc4" Sep 30 12:38:03 crc kubenswrapper[4672]: I0930 12:38:03.677045 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-gpr5b" Sep 30 12:38:03 crc kubenswrapper[4672]: I0930 12:38:03.680657 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9kt7b\" (UniqueName: \"kubernetes.io/projected/b7f9649f-28e4-4025-b17b-c21ad09eedee-kube-api-access-9kt7b\") pod \"b7f9649f-28e4-4025-b17b-c21ad09eedee\" (UID: \"b7f9649f-28e4-4025-b17b-c21ad09eedee\") " Sep 30 12:38:03 crc kubenswrapper[4672]: I0930 12:38:03.686724 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-tqggx" Sep 30 12:38:03 crc kubenswrapper[4672]: I0930 12:38:03.688492 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b7f9649f-28e4-4025-b17b-c21ad09eedee-kube-api-access-9kt7b" (OuterVolumeSpecName: "kube-api-access-9kt7b") pod "b7f9649f-28e4-4025-b17b-c21ad09eedee" (UID: "b7f9649f-28e4-4025-b17b-c21ad09eedee"). InnerVolumeSpecName "kube-api-access-9kt7b". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 12:38:03 crc kubenswrapper[4672]: I0930 12:38:03.783195 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zm49b\" (UniqueName: \"kubernetes.io/projected/d6da0670-d4cb-4b8c-9a9a-2f82e32a767f-kube-api-access-zm49b\") pod \"d6da0670-d4cb-4b8c-9a9a-2f82e32a767f\" (UID: \"d6da0670-d4cb-4b8c-9a9a-2f82e32a767f\") " Sep 30 12:38:03 crc kubenswrapper[4672]: I0930 12:38:03.783376 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rsgfp\" (UniqueName: \"kubernetes.io/projected/a5a3488a-d32c-4723-b27b-7ec68566fba6-kube-api-access-rsgfp\") pod \"a5a3488a-d32c-4723-b27b-7ec68566fba6\" (UID: \"a5a3488a-d32c-4723-b27b-7ec68566fba6\") " Sep 30 12:38:03 crc kubenswrapper[4672]: I0930 12:38:03.783874 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9kt7b\" (UniqueName: \"kubernetes.io/projected/b7f9649f-28e4-4025-b17b-c21ad09eedee-kube-api-access-9kt7b\") on node \"crc\" DevicePath \"\"" Sep 30 12:38:03 crc kubenswrapper[4672]: I0930 12:38:03.789703 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Sep 30 12:38:03 crc kubenswrapper[4672]: I0930 12:38:03.791768 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d6da0670-d4cb-4b8c-9a9a-2f82e32a767f-kube-api-access-zm49b" (OuterVolumeSpecName: "kube-api-access-zm49b") pod "d6da0670-d4cb-4b8c-9a9a-2f82e32a767f" (UID: "d6da0670-d4cb-4b8c-9a9a-2f82e32a767f"). InnerVolumeSpecName "kube-api-access-zm49b". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 12:38:03 crc kubenswrapper[4672]: I0930 12:38:03.796923 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a5a3488a-d32c-4723-b27b-7ec68566fba6-kube-api-access-rsgfp" (OuterVolumeSpecName: "kube-api-access-rsgfp") pod "a5a3488a-d32c-4723-b27b-7ec68566fba6" (UID: "a5a3488a-d32c-4723-b27b-7ec68566fba6"). InnerVolumeSpecName "kube-api-access-rsgfp". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 12:38:03 crc kubenswrapper[4672]: I0930 12:38:03.885959 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zm49b\" (UniqueName: \"kubernetes.io/projected/d6da0670-d4cb-4b8c-9a9a-2f82e32a767f-kube-api-access-zm49b\") on node \"crc\" DevicePath \"\"" Sep 30 12:38:03 crc kubenswrapper[4672]: I0930 12:38:03.886393 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rsgfp\" (UniqueName: \"kubernetes.io/projected/a5a3488a-d32c-4723-b27b-7ec68566fba6-kube-api-access-rsgfp\") on node \"crc\" DevicePath \"\"" Sep 30 12:38:04 crc kubenswrapper[4672]: I0930 12:38:04.495639 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"854a642c-e6c7-4859-8667-b64f9b54a872","Type":"ContainerStarted","Data":"2934ad25de8c5191bc745d09dea09116aa584b50437cb4aa7ae6cb3d87af0794"} Sep 30 12:38:04 crc kubenswrapper[4672]: I0930 12:38:04.501706 4672 generic.go:334] "Generic (PLEG): container finished" podID="2c0f6826-1d24-44eb-9bc2-5f666ce0e0a2" containerID="587c158adfe5ea1324c63d3ca415b396faa4cb8e2ca7fa5c7ac02497e329a7aa" exitCode=0 Sep 30 12:38:04 crc kubenswrapper[4672]: I0930 12:38:04.501989 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-create-j56fv" event={"ID":"2c0f6826-1d24-44eb-9bc2-5f666ce0e0a2","Type":"ContainerDied","Data":"587c158adfe5ea1324c63d3ca415b396faa4cb8e2ca7fa5c7ac02497e329a7aa"} Sep 30 12:38:04 crc kubenswrapper[4672]: I0930 12:38:04.504759 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-9sjc4" Sep 30 12:38:04 crc kubenswrapper[4672]: I0930 12:38:04.511410 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-d7dm7" event={"ID":"4b567440-2a47-4032-bd02-6d7d53ea35b8","Type":"ContainerStarted","Data":"a452038d3b79c2539dac7bf267c8648594e7f2c05a8f050c79ec505dd8c5f88b"} Sep 30 12:38:04 crc kubenswrapper[4672]: I0930 12:38:04.511503 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-gpr5b" Sep 30 12:38:04 crc kubenswrapper[4672]: I0930 12:38:04.511569 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-tqggx" Sep 30 12:38:04 crc kubenswrapper[4672]: I0930 12:38:04.544915 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-d7dm7" podStartSLOduration=2.6921304729999997 podStartE2EDuration="11.544900507s" podCreationTimestamp="2025-09-30 12:37:53 +0000 UTC" firstStartedPulling="2025-09-30 12:37:54.582618695 +0000 UTC m=+965.851856341" lastFinishedPulling="2025-09-30 12:38:03.435388729 +0000 UTC m=+974.704626375" observedRunningTime="2025-09-30 12:38:04.540425883 +0000 UTC m=+975.809663539" watchObservedRunningTime="2025-09-30 12:38:04.544900507 +0000 UTC m=+975.814138153" Sep 30 12:38:05 crc kubenswrapper[4672]: I0930 12:38:05.822605 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/3cc34662-d100-4436-9067-c615b7b3f83f-etc-swift\") pod \"swift-storage-0\" (UID: \"3cc34662-d100-4436-9067-c615b7b3f83f\") " pod="openstack/swift-storage-0" Sep 30 12:38:05 crc kubenswrapper[4672]: E0930 12:38:05.822764 4672 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Sep 30 12:38:05 crc kubenswrapper[4672]: E0930 12:38:05.822932 4672 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Sep 30 12:38:05 crc kubenswrapper[4672]: E0930 12:38:05.822977 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3cc34662-d100-4436-9067-c615b7b3f83f-etc-swift podName:3cc34662-d100-4436-9067-c615b7b3f83f nodeName:}" failed. No retries permitted until 2025-09-30 12:38:21.822964068 +0000 UTC m=+993.092201714 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/3cc34662-d100-4436-9067-c615b7b3f83f-etc-swift") pod "swift-storage-0" (UID: "3cc34662-d100-4436-9067-c615b7b3f83f") : configmap "swift-ring-files" not found Sep 30 12:38:05 crc kubenswrapper[4672]: I0930 12:38:05.831010 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-db-create-j56fv" Sep 30 12:38:05 crc kubenswrapper[4672]: I0930 12:38:05.924862 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6jm2g\" (UniqueName: \"kubernetes.io/projected/2c0f6826-1d24-44eb-9bc2-5f666ce0e0a2-kube-api-access-6jm2g\") pod \"2c0f6826-1d24-44eb-9bc2-5f666ce0e0a2\" (UID: \"2c0f6826-1d24-44eb-9bc2-5f666ce0e0a2\") " Sep 30 12:38:06 crc kubenswrapper[4672]: I0930 12:38:06.035476 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c0f6826-1d24-44eb-9bc2-5f666ce0e0a2-kube-api-access-6jm2g" (OuterVolumeSpecName: "kube-api-access-6jm2g") pod "2c0f6826-1d24-44eb-9bc2-5f666ce0e0a2" (UID: "2c0f6826-1d24-44eb-9bc2-5f666ce0e0a2"). InnerVolumeSpecName "kube-api-access-6jm2g". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 12:38:06 crc kubenswrapper[4672]: I0930 12:38:06.129628 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6jm2g\" (UniqueName: \"kubernetes.io/projected/2c0f6826-1d24-44eb-9bc2-5f666ce0e0a2-kube-api-access-6jm2g\") on node \"crc\" DevicePath \"\"" Sep 30 12:38:06 crc kubenswrapper[4672]: I0930 12:38:06.522424 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"854a642c-e6c7-4859-8667-b64f9b54a872","Type":"ContainerStarted","Data":"c4a50f78f9bd7603dd906100c0bad41391fe45e9900dbe4c685f8d553f3d68fa"} Sep 30 12:38:06 crc kubenswrapper[4672]: I0930 12:38:06.524418 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-create-j56fv" event={"ID":"2c0f6826-1d24-44eb-9bc2-5f666ce0e0a2","Type":"ContainerDied","Data":"cb651fef5e4667cc5162146f4ed32576f946f81b63c3241537ce483982959e70"} Sep 30 12:38:06 crc kubenswrapper[4672]: I0930 12:38:06.524444 4672 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cb651fef5e4667cc5162146f4ed32576f946f81b63c3241537ce483982959e70" Sep 30 12:38:06 crc kubenswrapper[4672]: I0930 12:38:06.524489 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-db-create-j56fv" Sep 30 12:38:07 crc kubenswrapper[4672]: I0930 12:38:07.019251 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-bdd5-account-create-vw97t"] Sep 30 12:38:07 crc kubenswrapper[4672]: E0930 12:38:07.019804 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9aae022e-aded-4f55-bfa2-9fa792516aac" containerName="dnsmasq-dns" Sep 30 12:38:07 crc kubenswrapper[4672]: I0930 12:38:07.019821 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="9aae022e-aded-4f55-bfa2-9fa792516aac" containerName="dnsmasq-dns" Sep 30 12:38:07 crc kubenswrapper[4672]: E0930 12:38:07.019848 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7f9649f-28e4-4025-b17b-c21ad09eedee" containerName="mariadb-database-create" Sep 30 12:38:07 crc kubenswrapper[4672]: I0930 12:38:07.019855 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7f9649f-28e4-4025-b17b-c21ad09eedee" containerName="mariadb-database-create" Sep 30 12:38:07 crc kubenswrapper[4672]: E0930 12:38:07.019864 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c0f6826-1d24-44eb-9bc2-5f666ce0e0a2" containerName="mariadb-database-create" Sep 30 12:38:07 crc kubenswrapper[4672]: I0930 12:38:07.019872 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c0f6826-1d24-44eb-9bc2-5f666ce0e0a2" containerName="mariadb-database-create" Sep 30 12:38:07 crc kubenswrapper[4672]: E0930 12:38:07.019881 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6da0670-d4cb-4b8c-9a9a-2f82e32a767f" containerName="mariadb-database-create" Sep 30 12:38:07 crc kubenswrapper[4672]: I0930 12:38:07.019887 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6da0670-d4cb-4b8c-9a9a-2f82e32a767f" containerName="mariadb-database-create" Sep 30 12:38:07 crc kubenswrapper[4672]: E0930 12:38:07.019897 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5a3488a-d32c-4723-b27b-7ec68566fba6" containerName="mariadb-database-create" Sep 30 12:38:07 crc kubenswrapper[4672]: I0930 12:38:07.019903 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5a3488a-d32c-4723-b27b-7ec68566fba6" 
containerName="mariadb-database-create" Sep 30 12:38:07 crc kubenswrapper[4672]: E0930 12:38:07.019915 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9aae022e-aded-4f55-bfa2-9fa792516aac" containerName="init" Sep 30 12:38:07 crc kubenswrapper[4672]: I0930 12:38:07.019921 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="9aae022e-aded-4f55-bfa2-9fa792516aac" containerName="init" Sep 30 12:38:07 crc kubenswrapper[4672]: I0930 12:38:07.020075 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c0f6826-1d24-44eb-9bc2-5f666ce0e0a2" containerName="mariadb-database-create" Sep 30 12:38:07 crc kubenswrapper[4672]: I0930 12:38:07.020102 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="b7f9649f-28e4-4025-b17b-c21ad09eedee" containerName="mariadb-database-create" Sep 30 12:38:07 crc kubenswrapper[4672]: I0930 12:38:07.020113 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="9aae022e-aded-4f55-bfa2-9fa792516aac" containerName="dnsmasq-dns" Sep 30 12:38:07 crc kubenswrapper[4672]: I0930 12:38:07.020121 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="a5a3488a-d32c-4723-b27b-7ec68566fba6" containerName="mariadb-database-create" Sep 30 12:38:07 crc kubenswrapper[4672]: I0930 12:38:07.020134 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6da0670-d4cb-4b8c-9a9a-2f82e32a767f" containerName="mariadb-database-create" Sep 30 12:38:07 crc kubenswrapper[4672]: I0930 12:38:07.020708 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-bdd5-account-create-vw97t" Sep 30 12:38:07 crc kubenswrapper[4672]: I0930 12:38:07.022938 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Sep 30 12:38:07 crc kubenswrapper[4672]: I0930 12:38:07.033797 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-bdd5-account-create-vw97t"] Sep 30 12:38:07 crc kubenswrapper[4672]: I0930 12:38:07.149097 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9x976\" (UniqueName: \"kubernetes.io/projected/98d43e73-0852-419a-88c0-de23410cf8b5-kube-api-access-9x976\") pod \"placement-bdd5-account-create-vw97t\" (UID: \"98d43e73-0852-419a-88c0-de23410cf8b5\") " pod="openstack/placement-bdd5-account-create-vw97t" Sep 30 12:38:07 crc kubenswrapper[4672]: I0930 12:38:07.250639 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9x976\" (UniqueName: \"kubernetes.io/projected/98d43e73-0852-419a-88c0-de23410cf8b5-kube-api-access-9x976\") pod \"placement-bdd5-account-create-vw97t\" (UID: \"98d43e73-0852-419a-88c0-de23410cf8b5\") " pod="openstack/placement-bdd5-account-create-vw97t" Sep 30 12:38:07 crc kubenswrapper[4672]: I0930 12:38:07.289617 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9x976\" (UniqueName: \"kubernetes.io/projected/98d43e73-0852-419a-88c0-de23410cf8b5-kube-api-access-9x976\") pod \"placement-bdd5-account-create-vw97t\" (UID: \"98d43e73-0852-419a-88c0-de23410cf8b5\") " pod="openstack/placement-bdd5-account-create-vw97t" Sep 30 12:38:07 crc kubenswrapper[4672]: I0930 12:38:07.342785 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-bdd5-account-create-vw97t" Sep 30 12:38:07 crc kubenswrapper[4672]: I0930 12:38:07.778674 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-bdd5-account-create-vw97t"] Sep 30 12:38:07 crc kubenswrapper[4672]: W0930 12:38:07.797406 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod98d43e73_0852_419a_88c0_de23410cf8b5.slice/crio-7ef69fe6c2ae7957e7683c21c9630b92117fbf1ba4329409dbd105aa1bbd7eef WatchSource:0}: Error finding container 7ef69fe6c2ae7957e7683c21c9630b92117fbf1ba4329409dbd105aa1bbd7eef: Status 404 returned error can't find the container with id 7ef69fe6c2ae7957e7683c21c9630b92117fbf1ba4329409dbd105aa1bbd7eef Sep 30 12:38:08 crc kubenswrapper[4672]: I0930 12:38:08.544722 4672 generic.go:334] "Generic (PLEG): container finished" podID="98d43e73-0852-419a-88c0-de23410cf8b5" containerID="b187d17de7439ed5f2364a3ddffb81c66e37d36f2d37f2c83dc70016bfee73ca" exitCode=0 Sep 30 12:38:08 crc kubenswrapper[4672]: I0930 12:38:08.544804 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-bdd5-account-create-vw97t" event={"ID":"98d43e73-0852-419a-88c0-de23410cf8b5","Type":"ContainerDied","Data":"b187d17de7439ed5f2364a3ddffb81c66e37d36f2d37f2c83dc70016bfee73ca"} Sep 30 12:38:08 crc kubenswrapper[4672]: I0930 12:38:08.545015 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-bdd5-account-create-vw97t" event={"ID":"98d43e73-0852-419a-88c0-de23410cf8b5","Type":"ContainerStarted","Data":"7ef69fe6c2ae7957e7683c21c9630b92117fbf1ba4329409dbd105aa1bbd7eef"} Sep 30 12:38:09 crc kubenswrapper[4672]: I0930 12:38:09.897761 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-bdd5-account-create-vw97t" Sep 30 12:38:10 crc kubenswrapper[4672]: I0930 12:38:10.008978 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9x976\" (UniqueName: \"kubernetes.io/projected/98d43e73-0852-419a-88c0-de23410cf8b5-kube-api-access-9x976\") pod \"98d43e73-0852-419a-88c0-de23410cf8b5\" (UID: \"98d43e73-0852-419a-88c0-de23410cf8b5\") " Sep 30 12:38:10 crc kubenswrapper[4672]: I0930 12:38:10.016936 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/98d43e73-0852-419a-88c0-de23410cf8b5-kube-api-access-9x976" (OuterVolumeSpecName: "kube-api-access-9x976") pod "98d43e73-0852-419a-88c0-de23410cf8b5" (UID: "98d43e73-0852-419a-88c0-de23410cf8b5"). InnerVolumeSpecName "kube-api-access-9x976". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 12:38:10 crc kubenswrapper[4672]: I0930 12:38:10.111540 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9x976\" (UniqueName: \"kubernetes.io/projected/98d43e73-0852-419a-88c0-de23410cf8b5-kube-api-access-9x976\") on node \"crc\" DevicePath \"\"" Sep 30 12:38:10 crc kubenswrapper[4672]: I0930 12:38:10.559770 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-bdd5-account-create-vw97t" event={"ID":"98d43e73-0852-419a-88c0-de23410cf8b5","Type":"ContainerDied","Data":"7ef69fe6c2ae7957e7683c21c9630b92117fbf1ba4329409dbd105aa1bbd7eef"} Sep 30 12:38:10 crc kubenswrapper[4672]: I0930 12:38:10.559814 4672 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7ef69fe6c2ae7957e7683c21c9630b92117fbf1ba4329409dbd105aa1bbd7eef" Sep 30 12:38:10 crc kubenswrapper[4672]: I0930 12:38:10.559867 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-bdd5-account-create-vw97t" Sep 30 12:38:10 crc kubenswrapper[4672]: I0930 12:38:10.571955 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"854a642c-e6c7-4859-8667-b64f9b54a872","Type":"ContainerStarted","Data":"42a52a6f6ee54c18936d64fc085f8bbe9f7ff1fc97d0c7b9c0c457d38b7f7338"} Sep 30 12:38:10 crc kubenswrapper[4672]: I0930 12:38:10.598939 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=16.750816634 podStartE2EDuration="52.59891864s" podCreationTimestamp="2025-09-30 12:37:18 +0000 UTC" firstStartedPulling="2025-09-30 12:37:33.68679245 +0000 UTC m=+944.956030096" lastFinishedPulling="2025-09-30 12:38:09.534894456 +0000 UTC m=+980.804132102" observedRunningTime="2025-09-30 12:38:10.596722444 +0000 UTC m=+981.865960080" watchObservedRunningTime="2025-09-30 12:38:10.59891864 +0000 UTC m=+981.868156286" Sep 30 12:38:11 crc kubenswrapper[4672]: I0930 12:38:11.581030 4672 generic.go:334] "Generic (PLEG): container finished" podID="4b567440-2a47-4032-bd02-6d7d53ea35b8" containerID="a452038d3b79c2539dac7bf267c8648594e7f2c05a8f050c79ec505dd8c5f88b" exitCode=0 Sep 30 12:38:11 crc kubenswrapper[4672]: I0930 12:38:11.581126 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-d7dm7" event={"ID":"4b567440-2a47-4032-bd02-6d7d53ea35b8","Type":"ContainerDied","Data":"a452038d3b79c2539dac7bf267c8648594e7f2c05a8f050c79ec505dd8c5f88b"} Sep 30 12:38:12 crc kubenswrapper[4672]: I0930 12:38:12.593923 4672 generic.go:334] "Generic (PLEG): container finished" podID="2d795c24-8697-461f-9322-2c23bf7cb49b" containerID="55f2020c2c1834b9ac6a092b6aa62719e415fbc93f0663463397655e3eff7cc9" exitCode=0 Sep 30 12:38:12 crc kubenswrapper[4672]: I0930 12:38:12.594025 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"2d795c24-8697-461f-9322-2c23bf7cb49b","Type":"ContainerDied","Data":"55f2020c2c1834b9ac6a092b6aa62719e415fbc93f0663463397655e3eff7cc9"} Sep 30 12:38:12 crc kubenswrapper[4672]: I0930 12:38:12.601259 4672 generic.go:334] "Generic (PLEG): container finished" podID="9aea18e8-190e-470a-9330-a30621c96afd" containerID="e01b88bbf1911514ef8474023e787f93d998277ea95c8dfcfe001656a8fbed44" exitCode=0 Sep 30 12:38:12 crc kubenswrapper[4672]: I0930 12:38:12.601450 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" 
event={"ID":"9aea18e8-190e-470a-9330-a30621c96afd","Type":"ContainerDied","Data":"e01b88bbf1911514ef8474023e787f93d998277ea95c8dfcfe001656a8fbed44"} Sep 30 12:38:12 crc kubenswrapper[4672]: I0930 12:38:12.613575 4672 generic.go:334] "Generic (PLEG): container finished" podID="d165a3a8-6809-46e5-bd35-895200ab5bfc" containerID="52bb7904fb5094e742f944f6757153dc7d5299b545fb1e4f7fecd172fadc25cf" exitCode=0 Sep 30 12:38:12 crc kubenswrapper[4672]: I0930 12:38:12.613800 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-notifications-server-0" event={"ID":"d165a3a8-6809-46e5-bd35-895200ab5bfc","Type":"ContainerDied","Data":"52bb7904fb5094e742f944f6757153dc7d5299b545fb1e4f7fecd172fadc25cf"} Sep 30 12:38:12 crc kubenswrapper[4672]: I0930 12:38:12.717001 4672 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-vhs7r" podUID="9f35be26-490e-49db-bd31-32ce35c84fab" containerName="ovn-controller" probeResult="failure" output=< Sep 30 12:38:12 crc kubenswrapper[4672]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Sep 30 12:38:12 crc kubenswrapper[4672]: > Sep 30 12:38:12 crc kubenswrapper[4672]: I0930 12:38:12.974058 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-d7dm7" Sep 30 12:38:13 crc kubenswrapper[4672]: I0930 12:38:13.066206 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/4b567440-2a47-4032-bd02-6d7d53ea35b8-swiftconf\") pod \"4b567440-2a47-4032-bd02-6d7d53ea35b8\" (UID: \"4b567440-2a47-4032-bd02-6d7d53ea35b8\") " Sep 30 12:38:13 crc kubenswrapper[4672]: I0930 12:38:13.066353 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/4b567440-2a47-4032-bd02-6d7d53ea35b8-etc-swift\") pod \"4b567440-2a47-4032-bd02-6d7d53ea35b8\" (UID: \"4b567440-2a47-4032-bd02-6d7d53ea35b8\") " Sep 30 12:38:13 crc kubenswrapper[4672]: I0930 12:38:13.066397 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/4b567440-2a47-4032-bd02-6d7d53ea35b8-dispersionconf\") pod \"4b567440-2a47-4032-bd02-6d7d53ea35b8\" (UID: \"4b567440-2a47-4032-bd02-6d7d53ea35b8\") " Sep 30 12:38:13 crc kubenswrapper[4672]: I0930 12:38:13.066438 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/4b567440-2a47-4032-bd02-6d7d53ea35b8-ring-data-devices\") pod \"4b567440-2a47-4032-bd02-6d7d53ea35b8\" (UID: \"4b567440-2a47-4032-bd02-6d7d53ea35b8\") " Sep 30 12:38:13 crc kubenswrapper[4672]: I0930 12:38:13.066465 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-stpm7\" (UniqueName: \"kubernetes.io/projected/4b567440-2a47-4032-bd02-6d7d53ea35b8-kube-api-access-stpm7\") pod \"4b567440-2a47-4032-bd02-6d7d53ea35b8\" (UID: \"4b567440-2a47-4032-bd02-6d7d53ea35b8\") " Sep 30 12:38:13 crc kubenswrapper[4672]: I0930 12:38:13.066490 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4b567440-2a47-4032-bd02-6d7d53ea35b8-scripts\") pod \"4b567440-2a47-4032-bd02-6d7d53ea35b8\" (UID: \"4b567440-2a47-4032-bd02-6d7d53ea35b8\") " Sep 30 12:38:13 crc kubenswrapper[4672]: I0930 12:38:13.066531 4672 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b567440-2a47-4032-bd02-6d7d53ea35b8-combined-ca-bundle\") pod \"4b567440-2a47-4032-bd02-6d7d53ea35b8\" (UID: \"4b567440-2a47-4032-bd02-6d7d53ea35b8\") " Sep 30 12:38:13 crc kubenswrapper[4672]: I0930 12:38:13.067324 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4b567440-2a47-4032-bd02-6d7d53ea35b8-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "4b567440-2a47-4032-bd02-6d7d53ea35b8" (UID: "4b567440-2a47-4032-bd02-6d7d53ea35b8"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 12:38:13 crc kubenswrapper[4672]: I0930 12:38:13.067920 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4b567440-2a47-4032-bd02-6d7d53ea35b8-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "4b567440-2a47-4032-bd02-6d7d53ea35b8" (UID: "4b567440-2a47-4032-bd02-6d7d53ea35b8"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 12:38:13 crc kubenswrapper[4672]: I0930 12:38:13.072685 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b567440-2a47-4032-bd02-6d7d53ea35b8-kube-api-access-stpm7" (OuterVolumeSpecName: "kube-api-access-stpm7") pod "4b567440-2a47-4032-bd02-6d7d53ea35b8" (UID: "4b567440-2a47-4032-bd02-6d7d53ea35b8"). InnerVolumeSpecName "kube-api-access-stpm7". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 12:38:13 crc kubenswrapper[4672]: I0930 12:38:13.075882 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b567440-2a47-4032-bd02-6d7d53ea35b8-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "4b567440-2a47-4032-bd02-6d7d53ea35b8" (UID: "4b567440-2a47-4032-bd02-6d7d53ea35b8"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:38:13 crc kubenswrapper[4672]: I0930 12:38:13.090790 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b567440-2a47-4032-bd02-6d7d53ea35b8-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "4b567440-2a47-4032-bd02-6d7d53ea35b8" (UID: "4b567440-2a47-4032-bd02-6d7d53ea35b8"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:38:13 crc kubenswrapper[4672]: I0930 12:38:13.091007 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4b567440-2a47-4032-bd02-6d7d53ea35b8-scripts" (OuterVolumeSpecName: "scripts") pod "4b567440-2a47-4032-bd02-6d7d53ea35b8" (UID: "4b567440-2a47-4032-bd02-6d7d53ea35b8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 12:38:13 crc kubenswrapper[4672]: I0930 12:38:13.093969 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b567440-2a47-4032-bd02-6d7d53ea35b8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4b567440-2a47-4032-bd02-6d7d53ea35b8" (UID: "4b567440-2a47-4032-bd02-6d7d53ea35b8"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:38:13 crc kubenswrapper[4672]: I0930 12:38:13.168837 4672 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/4b567440-2a47-4032-bd02-6d7d53ea35b8-dispersionconf\") on node \"crc\" DevicePath \"\"" Sep 30 12:38:13 crc kubenswrapper[4672]: I0930 12:38:13.168873 4672 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/4b567440-2a47-4032-bd02-6d7d53ea35b8-ring-data-devices\") on node \"crc\" DevicePath \"\"" Sep 30 12:38:13 crc kubenswrapper[4672]: I0930 12:38:13.168885 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-stpm7\" (UniqueName: \"kubernetes.io/projected/4b567440-2a47-4032-bd02-6d7d53ea35b8-kube-api-access-stpm7\") on node \"crc\" DevicePath \"\"" Sep 30 12:38:13 crc kubenswrapper[4672]: I0930 12:38:13.168896 4672 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4b567440-2a47-4032-bd02-6d7d53ea35b8-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 12:38:13 crc kubenswrapper[4672]: I0930 12:38:13.168908 4672 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b567440-2a47-4032-bd02-6d7d53ea35b8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 12:38:13 crc kubenswrapper[4672]: I0930 12:38:13.168916 4672 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/4b567440-2a47-4032-bd02-6d7d53ea35b8-swiftconf\") on node \"crc\" DevicePath \"\"" Sep 30 12:38:13 crc kubenswrapper[4672]: I0930 12:38:13.168923 4672 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/4b567440-2a47-4032-bd02-6d7d53ea35b8-etc-swift\") on node \"crc\" DevicePath \"\"" Sep 30 12:38:13 crc kubenswrapper[4672]: I0930 12:38:13.625348 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"9aea18e8-190e-470a-9330-a30621c96afd","Type":"ContainerStarted","Data":"eae7ec0f02b75b0ebecb5e8cbca220342abcfd8ad46ef71cd2d79e1275c25be3"} Sep 30 12:38:13 crc kubenswrapper[4672]: I0930 12:38:13.625586 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Sep 30 12:38:13 crc kubenswrapper[4672]: I0930 12:38:13.629547 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-notifications-server-0" event={"ID":"d165a3a8-6809-46e5-bd35-895200ab5bfc","Type":"ContainerStarted","Data":"3a27f363affad2b254373f3a0b0c3fdfa85b9a62aae76bda8e789f84eea3f43a"} Sep 30 12:38:13 crc kubenswrapper[4672]: I0930 12:38:13.629766 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-notifications-server-0" Sep 30 12:38:13 crc kubenswrapper[4672]: I0930 12:38:13.631636 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-d7dm7" event={"ID":"4b567440-2a47-4032-bd02-6d7d53ea35b8","Type":"ContainerDied","Data":"6c8dbe297c9163646c3e504a6a5d14f4a137de197dc0e36c79d76c6fe83eef08"} Sep 30 12:38:13 crc kubenswrapper[4672]: I0930 12:38:13.631680 4672 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6c8dbe297c9163646c3e504a6a5d14f4a137de197dc0e36c79d76c6fe83eef08" Sep 30 12:38:13 crc kubenswrapper[4672]: I0930 12:38:13.631679 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-d7dm7" Sep 30 12:38:13 crc kubenswrapper[4672]: I0930 12:38:13.633493 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"2d795c24-8697-461f-9322-2c23bf7cb49b","Type":"ContainerStarted","Data":"408d9bed949a4f49230469f002255b0b6c6e6b7751ecc4e90e3987e5ecc0a7d3"} Sep 30 12:38:13 crc kubenswrapper[4672]: I0930 12:38:13.634011 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Sep 30 12:38:13 crc kubenswrapper[4672]: I0930 12:38:13.667977 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=56.628946593 podStartE2EDuration="1m2.667950724s" podCreationTimestamp="2025-09-30 12:37:11 +0000 UTC" firstStartedPulling="2025-09-30 12:37:33.665677376 +0000 UTC m=+944.934915022" lastFinishedPulling="2025-09-30 12:37:39.704681507 +0000 UTC m=+950.973919153" observedRunningTime="2025-09-30 12:38:13.656976744 +0000 UTC m=+984.926214390" watchObservedRunningTime="2025-09-30 12:38:13.667950724 +0000 UTC m=+984.937188370" Sep 30 12:38:13 crc kubenswrapper[4672]: I0930 12:38:13.702616 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-notifications-server-0" podStartSLOduration=56.313687202 podStartE2EDuration="1m2.702594229s" podCreationTimestamp="2025-09-30 12:37:11 +0000 UTC" firstStartedPulling="2025-09-30 12:37:33.687178649 +0000 UTC m=+944.956416295" lastFinishedPulling="2025-09-30 12:37:40.076085686 +0000 UTC m=+951.345323322" observedRunningTime="2025-09-30 12:38:13.696679128 +0000 UTC m=+984.965916794" watchObservedRunningTime="2025-09-30 12:38:13.702594229 +0000 UTC m=+984.971831875" Sep 30 12:38:13 crc kubenswrapper[4672]: I0930 12:38:13.736919 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=55.569213112 podStartE2EDuration="1m2.736899025s" podCreationTimestamp="2025-09-30 12:37:11 +0000 UTC" firstStartedPulling="2025-09-30 12:37:32.966406628 +0000 UTC m=+944.235644274" lastFinishedPulling="2025-09-30 12:37:40.134092491 +0000 UTC m=+951.403330187" observedRunningTime="2025-09-30 12:38:13.724829066 +0000 UTC m=+984.994066722" watchObservedRunningTime="2025-09-30 12:38:13.736899025 +0000 UTC m=+985.006136671" Sep 30 12:38:14 crc kubenswrapper[4672]: I0930 12:38:14.954629 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Sep 30 12:38:16 crc kubenswrapper[4672]: I0930 12:38:16.643751 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-5afe-account-create-w6kdx"] Sep 30 12:38:16 crc kubenswrapper[4672]: E0930 12:38:16.644501 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98d43e73-0852-419a-88c0-de23410cf8b5" containerName="mariadb-account-create" Sep 30 12:38:16 crc kubenswrapper[4672]: I0930 12:38:16.644518 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="98d43e73-0852-419a-88c0-de23410cf8b5" containerName="mariadb-account-create" Sep 30 12:38:16 crc kubenswrapper[4672]: E0930 12:38:16.644533 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b567440-2a47-4032-bd02-6d7d53ea35b8" containerName="swift-ring-rebalance" Sep 30 12:38:16 crc kubenswrapper[4672]: I0930 12:38:16.644540 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b567440-2a47-4032-bd02-6d7d53ea35b8" 
containerName="swift-ring-rebalance" Sep 30 12:38:16 crc kubenswrapper[4672]: I0930 12:38:16.644736 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b567440-2a47-4032-bd02-6d7d53ea35b8" containerName="swift-ring-rebalance" Sep 30 12:38:16 crc kubenswrapper[4672]: I0930 12:38:16.644751 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="98d43e73-0852-419a-88c0-de23410cf8b5" containerName="mariadb-account-create" Sep 30 12:38:16 crc kubenswrapper[4672]: I0930 12:38:16.647652 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-5afe-account-create-w6kdx" Sep 30 12:38:16 crc kubenswrapper[4672]: I0930 12:38:16.656758 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Sep 30 12:38:16 crc kubenswrapper[4672]: I0930 12:38:16.672045 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-5afe-account-create-w6kdx"] Sep 30 12:38:16 crc kubenswrapper[4672]: I0930 12:38:16.734228 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b7zxd\" (UniqueName: \"kubernetes.io/projected/d49ba1b9-421a-41ad-9c95-49bd94444de0-kube-api-access-b7zxd\") pod \"keystone-5afe-account-create-w6kdx\" (UID: \"d49ba1b9-421a-41ad-9c95-49bd94444de0\") " pod="openstack/keystone-5afe-account-create-w6kdx" Sep 30 12:38:16 crc kubenswrapper[4672]: I0930 12:38:16.836074 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b7zxd\" (UniqueName: \"kubernetes.io/projected/d49ba1b9-421a-41ad-9c95-49bd94444de0-kube-api-access-b7zxd\") pod \"keystone-5afe-account-create-w6kdx\" (UID: \"d49ba1b9-421a-41ad-9c95-49bd94444de0\") " pod="openstack/keystone-5afe-account-create-w6kdx" Sep 30 12:38:16 crc kubenswrapper[4672]: I0930 12:38:16.855704 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b7zxd\" (UniqueName: \"kubernetes.io/projected/d49ba1b9-421a-41ad-9c95-49bd94444de0-kube-api-access-b7zxd\") pod \"keystone-5afe-account-create-w6kdx\" (UID: \"d49ba1b9-421a-41ad-9c95-49bd94444de0\") " pod="openstack/keystone-5afe-account-create-w6kdx" Sep 30 12:38:16 crc kubenswrapper[4672]: I0930 12:38:16.976669 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-5afe-account-create-w6kdx" Sep 30 12:38:17 crc kubenswrapper[4672]: I0930 12:38:17.333403 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-d431-account-create-v9pw9"] Sep 30 12:38:17 crc kubenswrapper[4672]: I0930 12:38:17.334966 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-d431-account-create-v9pw9" Sep 30 12:38:17 crc kubenswrapper[4672]: I0930 12:38:17.342127 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Sep 30 12:38:17 crc kubenswrapper[4672]: I0930 12:38:17.343862 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-d431-account-create-v9pw9"] Sep 30 12:38:17 crc kubenswrapper[4672]: I0930 12:38:17.410072 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-5afe-account-create-w6kdx"] Sep 30 12:38:17 crc kubenswrapper[4672]: I0930 12:38:17.447563 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d7hr9\" (UniqueName: \"kubernetes.io/projected/f5f7c9ac-7d3f-4477-819c-d5547e1e641d-kube-api-access-d7hr9\") pod \"glance-d431-account-create-v9pw9\" (UID: \"f5f7c9ac-7d3f-4477-819c-d5547e1e641d\") " pod="openstack/glance-d431-account-create-v9pw9" Sep 30 12:38:17 crc kubenswrapper[4672]: I0930 12:38:17.549931 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d7hr9\" (UniqueName: \"kubernetes.io/projected/f5f7c9ac-7d3f-4477-819c-d5547e1e641d-kube-api-access-d7hr9\") pod \"glance-d431-account-create-v9pw9\" (UID: \"f5f7c9ac-7d3f-4477-819c-d5547e1e641d\") " pod="openstack/glance-d431-account-create-v9pw9" Sep 30 12:38:17 crc kubenswrapper[4672]: I0930 12:38:17.571347 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d7hr9\" (UniqueName: \"kubernetes.io/projected/f5f7c9ac-7d3f-4477-819c-d5547e1e641d-kube-api-access-d7hr9\") pod \"glance-d431-account-create-v9pw9\" (UID: \"f5f7c9ac-7d3f-4477-819c-d5547e1e641d\") " pod="openstack/glance-d431-account-create-v9pw9" Sep 30 12:38:17 crc kubenswrapper[4672]: I0930 12:38:17.618201 4672 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-vhs7r" podUID="9f35be26-490e-49db-bd31-32ce35c84fab" containerName="ovn-controller" probeResult="failure" output=< Sep 30 12:38:17 crc kubenswrapper[4672]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Sep 30 12:38:17 crc kubenswrapper[4672]: > Sep 30 12:38:17 crc kubenswrapper[4672]: I0930 12:38:17.644363 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-gb7pq" Sep 30 12:38:17 crc kubenswrapper[4672]: I0930 12:38:17.648486 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-gb7pq" Sep 30 12:38:17 crc kubenswrapper[4672]: I0930 12:38:17.660834 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-d431-account-create-v9pw9" Sep 30 12:38:17 crc kubenswrapper[4672]: I0930 12:38:17.678393 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-5afe-account-create-w6kdx" event={"ID":"d49ba1b9-421a-41ad-9c95-49bd94444de0","Type":"ContainerStarted","Data":"3dcabd74158f182a797f4abe9811f1ebc96b6d0a03bd659a6b40003f62f28103"} Sep 30 12:38:17 crc kubenswrapper[4672]: I0930 12:38:17.678444 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-5afe-account-create-w6kdx" event={"ID":"d49ba1b9-421a-41ad-9c95-49bd94444de0","Type":"ContainerStarted","Data":"49998af650fe3b17eeb15d4156760ae21373e8e55b9bfcad67454e915643bf4e"} Sep 30 12:38:17 crc kubenswrapper[4672]: I0930 12:38:17.706623 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-5afe-account-create-w6kdx" podStartSLOduration=1.706603854 podStartE2EDuration="1.706603854s" podCreationTimestamp="2025-09-30 12:38:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 12:38:17.70448707 +0000 UTC m=+988.973724716" watchObservedRunningTime="2025-09-30 12:38:17.706603854 +0000 UTC m=+988.975841500" Sep 30 12:38:17 crc kubenswrapper[4672]: I0930 12:38:17.877477 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-vhs7r-config-sxld9"] Sep 30 12:38:17 crc kubenswrapper[4672]: I0930 12:38:17.879135 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-vhs7r-config-sxld9" Sep 30 12:38:17 crc kubenswrapper[4672]: I0930 12:38:17.884051 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Sep 30 12:38:17 crc kubenswrapper[4672]: I0930 12:38:17.895370 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-vhs7r-config-sxld9"] Sep 30 12:38:18 crc kubenswrapper[4672]: I0930 12:38:18.064716 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b54b83cb-86e4-4719-9307-e7fbc998d29a-var-run-ovn\") pod \"ovn-controller-vhs7r-config-sxld9\" (UID: \"b54b83cb-86e4-4719-9307-e7fbc998d29a\") " pod="openstack/ovn-controller-vhs7r-config-sxld9" Sep 30 12:38:18 crc kubenswrapper[4672]: I0930 12:38:18.064788 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b54b83cb-86e4-4719-9307-e7fbc998d29a-var-run\") pod \"ovn-controller-vhs7r-config-sxld9\" (UID: \"b54b83cb-86e4-4719-9307-e7fbc998d29a\") " pod="openstack/ovn-controller-vhs7r-config-sxld9" Sep 30 12:38:18 crc kubenswrapper[4672]: I0930 12:38:18.064830 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6kvl5\" (UniqueName: \"kubernetes.io/projected/b54b83cb-86e4-4719-9307-e7fbc998d29a-kube-api-access-6kvl5\") pod \"ovn-controller-vhs7r-config-sxld9\" (UID: \"b54b83cb-86e4-4719-9307-e7fbc998d29a\") " pod="openstack/ovn-controller-vhs7r-config-sxld9" Sep 30 12:38:18 crc kubenswrapper[4672]: I0930 12:38:18.064919 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/b54b83cb-86e4-4719-9307-e7fbc998d29a-additional-scripts\") pod 
\"ovn-controller-vhs7r-config-sxld9\" (UID: \"b54b83cb-86e4-4719-9307-e7fbc998d29a\") " pod="openstack/ovn-controller-vhs7r-config-sxld9" Sep 30 12:38:18 crc kubenswrapper[4672]: I0930 12:38:18.064984 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b54b83cb-86e4-4719-9307-e7fbc998d29a-var-log-ovn\") pod \"ovn-controller-vhs7r-config-sxld9\" (UID: \"b54b83cb-86e4-4719-9307-e7fbc998d29a\") " pod="openstack/ovn-controller-vhs7r-config-sxld9" Sep 30 12:38:18 crc kubenswrapper[4672]: I0930 12:38:18.065026 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b54b83cb-86e4-4719-9307-e7fbc998d29a-scripts\") pod \"ovn-controller-vhs7r-config-sxld9\" (UID: \"b54b83cb-86e4-4719-9307-e7fbc998d29a\") " pod="openstack/ovn-controller-vhs7r-config-sxld9" Sep 30 12:38:18 crc kubenswrapper[4672]: I0930 12:38:18.103411 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-d431-account-create-v9pw9"] Sep 30 12:38:18 crc kubenswrapper[4672]: I0930 12:38:18.166425 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b54b83cb-86e4-4719-9307-e7fbc998d29a-var-run\") pod \"ovn-controller-vhs7r-config-sxld9\" (UID: \"b54b83cb-86e4-4719-9307-e7fbc998d29a\") " pod="openstack/ovn-controller-vhs7r-config-sxld9" Sep 30 12:38:18 crc kubenswrapper[4672]: I0930 12:38:18.166498 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6kvl5\" (UniqueName: \"kubernetes.io/projected/b54b83cb-86e4-4719-9307-e7fbc998d29a-kube-api-access-6kvl5\") pod \"ovn-controller-vhs7r-config-sxld9\" (UID: \"b54b83cb-86e4-4719-9307-e7fbc998d29a\") " pod="openstack/ovn-controller-vhs7r-config-sxld9" Sep 30 12:38:18 crc kubenswrapper[4672]: I0930 12:38:18.166553 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/b54b83cb-86e4-4719-9307-e7fbc998d29a-additional-scripts\") pod \"ovn-controller-vhs7r-config-sxld9\" (UID: \"b54b83cb-86e4-4719-9307-e7fbc998d29a\") " pod="openstack/ovn-controller-vhs7r-config-sxld9" Sep 30 12:38:18 crc kubenswrapper[4672]: I0930 12:38:18.166602 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b54b83cb-86e4-4719-9307-e7fbc998d29a-var-log-ovn\") pod \"ovn-controller-vhs7r-config-sxld9\" (UID: \"b54b83cb-86e4-4719-9307-e7fbc998d29a\") " pod="openstack/ovn-controller-vhs7r-config-sxld9" Sep 30 12:38:18 crc kubenswrapper[4672]: I0930 12:38:18.166632 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b54b83cb-86e4-4719-9307-e7fbc998d29a-scripts\") pod \"ovn-controller-vhs7r-config-sxld9\" (UID: \"b54b83cb-86e4-4719-9307-e7fbc998d29a\") " pod="openstack/ovn-controller-vhs7r-config-sxld9" Sep 30 12:38:18 crc kubenswrapper[4672]: I0930 12:38:18.166709 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b54b83cb-86e4-4719-9307-e7fbc998d29a-var-run-ovn\") pod \"ovn-controller-vhs7r-config-sxld9\" (UID: \"b54b83cb-86e4-4719-9307-e7fbc998d29a\") " pod="openstack/ovn-controller-vhs7r-config-sxld9" Sep 30 12:38:18 crc kubenswrapper[4672]: I0930 
12:38:18.166884 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b54b83cb-86e4-4719-9307-e7fbc998d29a-var-log-ovn\") pod \"ovn-controller-vhs7r-config-sxld9\" (UID: \"b54b83cb-86e4-4719-9307-e7fbc998d29a\") " pod="openstack/ovn-controller-vhs7r-config-sxld9" Sep 30 12:38:18 crc kubenswrapper[4672]: I0930 12:38:18.166884 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b54b83cb-86e4-4719-9307-e7fbc998d29a-var-run-ovn\") pod \"ovn-controller-vhs7r-config-sxld9\" (UID: \"b54b83cb-86e4-4719-9307-e7fbc998d29a\") " pod="openstack/ovn-controller-vhs7r-config-sxld9" Sep 30 12:38:18 crc kubenswrapper[4672]: I0930 12:38:18.166888 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b54b83cb-86e4-4719-9307-e7fbc998d29a-var-run\") pod \"ovn-controller-vhs7r-config-sxld9\" (UID: \"b54b83cb-86e4-4719-9307-e7fbc998d29a\") " pod="openstack/ovn-controller-vhs7r-config-sxld9" Sep 30 12:38:18 crc kubenswrapper[4672]: I0930 12:38:18.167566 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/b54b83cb-86e4-4719-9307-e7fbc998d29a-additional-scripts\") pod \"ovn-controller-vhs7r-config-sxld9\" (UID: \"b54b83cb-86e4-4719-9307-e7fbc998d29a\") " pod="openstack/ovn-controller-vhs7r-config-sxld9" Sep 30 12:38:18 crc kubenswrapper[4672]: I0930 12:38:18.169560 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b54b83cb-86e4-4719-9307-e7fbc998d29a-scripts\") pod \"ovn-controller-vhs7r-config-sxld9\" (UID: \"b54b83cb-86e4-4719-9307-e7fbc998d29a\") " pod="openstack/ovn-controller-vhs7r-config-sxld9" Sep 30 12:38:18 crc kubenswrapper[4672]: I0930 12:38:18.190157 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6kvl5\" (UniqueName: \"kubernetes.io/projected/b54b83cb-86e4-4719-9307-e7fbc998d29a-kube-api-access-6kvl5\") pod \"ovn-controller-vhs7r-config-sxld9\" (UID: \"b54b83cb-86e4-4719-9307-e7fbc998d29a\") " pod="openstack/ovn-controller-vhs7r-config-sxld9" Sep 30 12:38:18 crc kubenswrapper[4672]: I0930 12:38:18.223393 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-vhs7r-config-sxld9" Sep 30 12:38:18 crc kubenswrapper[4672]: I0930 12:38:18.687088 4672 generic.go:334] "Generic (PLEG): container finished" podID="f5f7c9ac-7d3f-4477-819c-d5547e1e641d" containerID="6b63f5c1c5efc931527e000897c4386c666368c95cda2db4a5840e3c5b48f8a4" exitCode=0 Sep 30 12:38:18 crc kubenswrapper[4672]: I0930 12:38:18.687171 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-d431-account-create-v9pw9" event={"ID":"f5f7c9ac-7d3f-4477-819c-d5547e1e641d","Type":"ContainerDied","Data":"6b63f5c1c5efc931527e000897c4386c666368c95cda2db4a5840e3c5b48f8a4"} Sep 30 12:38:18 crc kubenswrapper[4672]: I0930 12:38:18.688433 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-d431-account-create-v9pw9" event={"ID":"f5f7c9ac-7d3f-4477-819c-d5547e1e641d","Type":"ContainerStarted","Data":"06bdceac62f2f8093429da0c1fef609d9f2a8951e27bc567c3da7aaba1234707"} Sep 30 12:38:18 crc kubenswrapper[4672]: I0930 12:38:18.690185 4672 generic.go:334] "Generic (PLEG): container finished" podID="d49ba1b9-421a-41ad-9c95-49bd94444de0" containerID="3dcabd74158f182a797f4abe9811f1ebc96b6d0a03bd659a6b40003f62f28103" exitCode=0 Sep 30 12:38:18 crc kubenswrapper[4672]: I0930 12:38:18.690251 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-5afe-account-create-w6kdx" event={"ID":"d49ba1b9-421a-41ad-9c95-49bd94444de0","Type":"ContainerDied","Data":"3dcabd74158f182a797f4abe9811f1ebc96b6d0a03bd659a6b40003f62f28103"} Sep 30 12:38:18 crc kubenswrapper[4672]: I0930 12:38:18.698120 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-vhs7r-config-sxld9"] Sep 30 12:38:18 crc kubenswrapper[4672]: W0930 12:38:18.702603 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb54b83cb_86e4_4719_9307_e7fbc998d29a.slice/crio-865eeae3ec8dcf559fbd4a76ea7d447fb9693a77f4817bc425c9df7d1d4497ff WatchSource:0}: Error finding container 865eeae3ec8dcf559fbd4a76ea7d447fb9693a77f4817bc425c9df7d1d4497ff: Status 404 returned error can't find the container with id 865eeae3ec8dcf559fbd4a76ea7d447fb9693a77f4817bc425c9df7d1d4497ff Sep 30 12:38:18 crc kubenswrapper[4672]: I0930 12:38:18.787375 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-8d32-account-create-bprd4"] Sep 30 12:38:18 crc kubenswrapper[4672]: I0930 12:38:18.788854 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-8d32-account-create-bprd4" Sep 30 12:38:18 crc kubenswrapper[4672]: I0930 12:38:18.794302 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-db-secret" Sep 30 12:38:18 crc kubenswrapper[4672]: I0930 12:38:18.799454 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-8d32-account-create-bprd4"] Sep 30 12:38:18 crc kubenswrapper[4672]: I0930 12:38:18.981636 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jzwbw\" (UniqueName: \"kubernetes.io/projected/09b8a4b9-3b36-4c93-b4fa-694539eb0e2e-kube-api-access-jzwbw\") pod \"watcher-8d32-account-create-bprd4\" (UID: \"09b8a4b9-3b36-4c93-b4fa-694539eb0e2e\") " pod="openstack/watcher-8d32-account-create-bprd4" Sep 30 12:38:19 crc kubenswrapper[4672]: I0930 12:38:19.083507 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jzwbw\" (UniqueName: \"kubernetes.io/projected/09b8a4b9-3b36-4c93-b4fa-694539eb0e2e-kube-api-access-jzwbw\") pod \"watcher-8d32-account-create-bprd4\" (UID: \"09b8a4b9-3b36-4c93-b4fa-694539eb0e2e\") " pod="openstack/watcher-8d32-account-create-bprd4" Sep 30 12:38:19 crc kubenswrapper[4672]: I0930 12:38:19.110012 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jzwbw\" (UniqueName: \"kubernetes.io/projected/09b8a4b9-3b36-4c93-b4fa-694539eb0e2e-kube-api-access-jzwbw\") pod \"watcher-8d32-account-create-bprd4\" (UID: \"09b8a4b9-3b36-4c93-b4fa-694539eb0e2e\") " pod="openstack/watcher-8d32-account-create-bprd4" Sep 30 12:38:19 crc kubenswrapper[4672]: I0930 12:38:19.149914 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-8d32-account-create-bprd4" Sep 30 12:38:19 crc kubenswrapper[4672]: I0930 12:38:19.612414 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-8d32-account-create-bprd4"] Sep 30 12:38:19 crc kubenswrapper[4672]: I0930 12:38:19.703554 4672 generic.go:334] "Generic (PLEG): container finished" podID="b54b83cb-86e4-4719-9307-e7fbc998d29a" containerID="ee790b61045bb54e632c53cbc9891b79507a6f4accac1b1ff2d861ef334c97b5" exitCode=0 Sep 30 12:38:19 crc kubenswrapper[4672]: I0930 12:38:19.703616 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-vhs7r-config-sxld9" event={"ID":"b54b83cb-86e4-4719-9307-e7fbc998d29a","Type":"ContainerDied","Data":"ee790b61045bb54e632c53cbc9891b79507a6f4accac1b1ff2d861ef334c97b5"} Sep 30 12:38:19 crc kubenswrapper[4672]: I0930 12:38:19.703642 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-vhs7r-config-sxld9" event={"ID":"b54b83cb-86e4-4719-9307-e7fbc998d29a","Type":"ContainerStarted","Data":"865eeae3ec8dcf559fbd4a76ea7d447fb9693a77f4817bc425c9df7d1d4497ff"} Sep 30 12:38:19 crc kubenswrapper[4672]: I0930 12:38:19.706363 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-8d32-account-create-bprd4" event={"ID":"09b8a4b9-3b36-4c93-b4fa-694539eb0e2e","Type":"ContainerStarted","Data":"75f22c9e8a3324b97e62a6c0eba7cc4eb29c2f43e1d3d8850749acbe28813bb6"} Sep 30 12:38:19 crc kubenswrapper[4672]: I0930 12:38:19.954642 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Sep 30 12:38:19 crc kubenswrapper[4672]: I0930 12:38:19.960456 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openstack/prometheus-metric-storage-0" Sep 30 12:38:20 crc kubenswrapper[4672]: I0930 12:38:20.185702 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-5afe-account-create-w6kdx" Sep 30 12:38:20 crc kubenswrapper[4672]: I0930 12:38:20.190602 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-d431-account-create-v9pw9" Sep 30 12:38:20 crc kubenswrapper[4672]: I0930 12:38:20.311723 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b7zxd\" (UniqueName: \"kubernetes.io/projected/d49ba1b9-421a-41ad-9c95-49bd94444de0-kube-api-access-b7zxd\") pod \"d49ba1b9-421a-41ad-9c95-49bd94444de0\" (UID: \"d49ba1b9-421a-41ad-9c95-49bd94444de0\") " Sep 30 12:38:20 crc kubenswrapper[4672]: I0930 12:38:20.311818 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d7hr9\" (UniqueName: \"kubernetes.io/projected/f5f7c9ac-7d3f-4477-819c-d5547e1e641d-kube-api-access-d7hr9\") pod \"f5f7c9ac-7d3f-4477-819c-d5547e1e641d\" (UID: \"f5f7c9ac-7d3f-4477-819c-d5547e1e641d\") " Sep 30 12:38:20 crc kubenswrapper[4672]: I0930 12:38:20.317135 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d49ba1b9-421a-41ad-9c95-49bd94444de0-kube-api-access-b7zxd" (OuterVolumeSpecName: "kube-api-access-b7zxd") pod "d49ba1b9-421a-41ad-9c95-49bd94444de0" (UID: "d49ba1b9-421a-41ad-9c95-49bd94444de0"). InnerVolumeSpecName "kube-api-access-b7zxd". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 12:38:20 crc kubenswrapper[4672]: I0930 12:38:20.317518 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f5f7c9ac-7d3f-4477-819c-d5547e1e641d-kube-api-access-d7hr9" (OuterVolumeSpecName: "kube-api-access-d7hr9") pod "f5f7c9ac-7d3f-4477-819c-d5547e1e641d" (UID: "f5f7c9ac-7d3f-4477-819c-d5547e1e641d"). InnerVolumeSpecName "kube-api-access-d7hr9". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 12:38:20 crc kubenswrapper[4672]: I0930 12:38:20.413865 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b7zxd\" (UniqueName: \"kubernetes.io/projected/d49ba1b9-421a-41ad-9c95-49bd94444de0-kube-api-access-b7zxd\") on node \"crc\" DevicePath \"\"" Sep 30 12:38:20 crc kubenswrapper[4672]: I0930 12:38:20.414109 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d7hr9\" (UniqueName: \"kubernetes.io/projected/f5f7c9ac-7d3f-4477-819c-d5547e1e641d-kube-api-access-d7hr9\") on node \"crc\" DevicePath \"\"" Sep 30 12:38:20 crc kubenswrapper[4672]: I0930 12:38:20.716035 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-d431-account-create-v9pw9" event={"ID":"f5f7c9ac-7d3f-4477-819c-d5547e1e641d","Type":"ContainerDied","Data":"06bdceac62f2f8093429da0c1fef609d9f2a8951e27bc567c3da7aaba1234707"} Sep 30 12:38:20 crc kubenswrapper[4672]: I0930 12:38:20.716893 4672 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="06bdceac62f2f8093429da0c1fef609d9f2a8951e27bc567c3da7aaba1234707" Sep 30 12:38:20 crc kubenswrapper[4672]: I0930 12:38:20.716094 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-d431-account-create-v9pw9" Sep 30 12:38:20 crc kubenswrapper[4672]: I0930 12:38:20.717579 4672 generic.go:334] "Generic (PLEG): container finished" podID="09b8a4b9-3b36-4c93-b4fa-694539eb0e2e" containerID="51aa37842efa35e5eed9529311a059dc19562e79b0d9db70ef66720087a3d59c" exitCode=0 Sep 30 12:38:20 crc kubenswrapper[4672]: I0930 12:38:20.717644 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-8d32-account-create-bprd4" event={"ID":"09b8a4b9-3b36-4c93-b4fa-694539eb0e2e","Type":"ContainerDied","Data":"51aa37842efa35e5eed9529311a059dc19562e79b0d9db70ef66720087a3d59c"} Sep 30 12:38:20 crc kubenswrapper[4672]: I0930 12:38:20.719363 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-5afe-account-create-w6kdx" event={"ID":"d49ba1b9-421a-41ad-9c95-49bd94444de0","Type":"ContainerDied","Data":"49998af650fe3b17eeb15d4156760ae21373e8e55b9bfcad67454e915643bf4e"} Sep 30 12:38:20 crc kubenswrapper[4672]: I0930 12:38:20.719426 4672 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="49998af650fe3b17eeb15d4156760ae21373e8e55b9bfcad67454e915643bf4e" Sep 30 12:38:20 crc kubenswrapper[4672]: I0930 12:38:20.719495 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-5afe-account-create-w6kdx" Sep 30 12:38:20 crc kubenswrapper[4672]: I0930 12:38:20.721349 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Sep 30 12:38:21 crc kubenswrapper[4672]: I0930 12:38:21.120336 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-vhs7r-config-sxld9" Sep 30 12:38:21 crc kubenswrapper[4672]: I0930 12:38:21.225674 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b54b83cb-86e4-4719-9307-e7fbc998d29a-scripts\") pod \"b54b83cb-86e4-4719-9307-e7fbc998d29a\" (UID: \"b54b83cb-86e4-4719-9307-e7fbc998d29a\") " Sep 30 12:38:21 crc kubenswrapper[4672]: I0930 12:38:21.225768 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/b54b83cb-86e4-4719-9307-e7fbc998d29a-additional-scripts\") pod \"b54b83cb-86e4-4719-9307-e7fbc998d29a\" (UID: \"b54b83cb-86e4-4719-9307-e7fbc998d29a\") " Sep 30 12:38:21 crc kubenswrapper[4672]: I0930 12:38:21.225794 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b54b83cb-86e4-4719-9307-e7fbc998d29a-var-log-ovn\") pod \"b54b83cb-86e4-4719-9307-e7fbc998d29a\" (UID: \"b54b83cb-86e4-4719-9307-e7fbc998d29a\") " Sep 30 12:38:21 crc kubenswrapper[4672]: I0930 12:38:21.225841 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b54b83cb-86e4-4719-9307-e7fbc998d29a-var-run-ovn\") pod \"b54b83cb-86e4-4719-9307-e7fbc998d29a\" (UID: \"b54b83cb-86e4-4719-9307-e7fbc998d29a\") " Sep 30 12:38:21 crc kubenswrapper[4672]: I0930 12:38:21.225931 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b54b83cb-86e4-4719-9307-e7fbc998d29a-var-run\") pod \"b54b83cb-86e4-4719-9307-e7fbc998d29a\" (UID: \"b54b83cb-86e4-4719-9307-e7fbc998d29a\") " Sep 30 12:38:21 crc kubenswrapper[4672]: I0930 
12:38:21.225953 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6kvl5\" (UniqueName: \"kubernetes.io/projected/b54b83cb-86e4-4719-9307-e7fbc998d29a-kube-api-access-6kvl5\") pod \"b54b83cb-86e4-4719-9307-e7fbc998d29a\" (UID: \"b54b83cb-86e4-4719-9307-e7fbc998d29a\") " Sep 30 12:38:21 crc kubenswrapper[4672]: I0930 12:38:21.226028 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b54b83cb-86e4-4719-9307-e7fbc998d29a-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "b54b83cb-86e4-4719-9307-e7fbc998d29a" (UID: "b54b83cb-86e4-4719-9307-e7fbc998d29a"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 12:38:21 crc kubenswrapper[4672]: I0930 12:38:21.226039 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b54b83cb-86e4-4719-9307-e7fbc998d29a-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "b54b83cb-86e4-4719-9307-e7fbc998d29a" (UID: "b54b83cb-86e4-4719-9307-e7fbc998d29a"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 12:38:21 crc kubenswrapper[4672]: I0930 12:38:21.226070 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b54b83cb-86e4-4719-9307-e7fbc998d29a-var-run" (OuterVolumeSpecName: "var-run") pod "b54b83cb-86e4-4719-9307-e7fbc998d29a" (UID: "b54b83cb-86e4-4719-9307-e7fbc998d29a"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 12:38:21 crc kubenswrapper[4672]: I0930 12:38:21.226323 4672 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b54b83cb-86e4-4719-9307-e7fbc998d29a-var-run\") on node \"crc\" DevicePath \"\"" Sep 30 12:38:21 crc kubenswrapper[4672]: I0930 12:38:21.226343 4672 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b54b83cb-86e4-4719-9307-e7fbc998d29a-var-log-ovn\") on node \"crc\" DevicePath \"\"" Sep 30 12:38:21 crc kubenswrapper[4672]: I0930 12:38:21.226353 4672 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b54b83cb-86e4-4719-9307-e7fbc998d29a-var-run-ovn\") on node \"crc\" DevicePath \"\"" Sep 30 12:38:21 crc kubenswrapper[4672]: I0930 12:38:21.226702 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b54b83cb-86e4-4719-9307-e7fbc998d29a-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "b54b83cb-86e4-4719-9307-e7fbc998d29a" (UID: "b54b83cb-86e4-4719-9307-e7fbc998d29a"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 12:38:21 crc kubenswrapper[4672]: I0930 12:38:21.227088 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b54b83cb-86e4-4719-9307-e7fbc998d29a-scripts" (OuterVolumeSpecName: "scripts") pod "b54b83cb-86e4-4719-9307-e7fbc998d29a" (UID: "b54b83cb-86e4-4719-9307-e7fbc998d29a"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 12:38:21 crc kubenswrapper[4672]: I0930 12:38:21.232142 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b54b83cb-86e4-4719-9307-e7fbc998d29a-kube-api-access-6kvl5" (OuterVolumeSpecName: "kube-api-access-6kvl5") pod "b54b83cb-86e4-4719-9307-e7fbc998d29a" (UID: "b54b83cb-86e4-4719-9307-e7fbc998d29a"). InnerVolumeSpecName "kube-api-access-6kvl5". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 12:38:21 crc kubenswrapper[4672]: I0930 12:38:21.328087 4672 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/b54b83cb-86e4-4719-9307-e7fbc998d29a-additional-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 12:38:21 crc kubenswrapper[4672]: I0930 12:38:21.328121 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6kvl5\" (UniqueName: \"kubernetes.io/projected/b54b83cb-86e4-4719-9307-e7fbc998d29a-kube-api-access-6kvl5\") on node \"crc\" DevicePath \"\"" Sep 30 12:38:21 crc kubenswrapper[4672]: I0930 12:38:21.328134 4672 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b54b83cb-86e4-4719-9307-e7fbc998d29a-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 12:38:21 crc kubenswrapper[4672]: I0930 12:38:21.728583 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-vhs7r-config-sxld9" event={"ID":"b54b83cb-86e4-4719-9307-e7fbc998d29a","Type":"ContainerDied","Data":"865eeae3ec8dcf559fbd4a76ea7d447fb9693a77f4817bc425c9df7d1d4497ff"} Sep 30 12:38:21 crc kubenswrapper[4672]: I0930 12:38:21.728630 4672 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="865eeae3ec8dcf559fbd4a76ea7d447fb9693a77f4817bc425c9df7d1d4497ff" Sep 30 12:38:21 crc kubenswrapper[4672]: I0930 12:38:21.728654 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-vhs7r-config-sxld9" Sep 30 12:38:21 crc kubenswrapper[4672]: I0930 12:38:21.837423 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/3cc34662-d100-4436-9067-c615b7b3f83f-etc-swift\") pod \"swift-storage-0\" (UID: \"3cc34662-d100-4436-9067-c615b7b3f83f\") " pod="openstack/swift-storage-0" Sep 30 12:38:21 crc kubenswrapper[4672]: I0930 12:38:21.843855 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/3cc34662-d100-4436-9067-c615b7b3f83f-etc-swift\") pod \"swift-storage-0\" (UID: \"3cc34662-d100-4436-9067-c615b7b3f83f\") " pod="openstack/swift-storage-0" Sep 30 12:38:21 crc kubenswrapper[4672]: I0930 12:38:21.956190 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Sep 30 12:38:22 crc kubenswrapper[4672]: I0930 12:38:22.115110 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-8d32-account-create-bprd4" Sep 30 12:38:22 crc kubenswrapper[4672]: I0930 12:38:22.233008 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-vhs7r-config-sxld9"] Sep 30 12:38:22 crc kubenswrapper[4672]: I0930 12:38:22.248045 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-vhs7r-config-sxld9"] Sep 30 12:38:22 crc kubenswrapper[4672]: I0930 12:38:22.250359 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jzwbw\" (UniqueName: \"kubernetes.io/projected/09b8a4b9-3b36-4c93-b4fa-694539eb0e2e-kube-api-access-jzwbw\") pod \"09b8a4b9-3b36-4c93-b4fa-694539eb0e2e\" (UID: \"09b8a4b9-3b36-4c93-b4fa-694539eb0e2e\") " Sep 30 12:38:22 crc kubenswrapper[4672]: I0930 12:38:22.273990 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09b8a4b9-3b36-4c93-b4fa-694539eb0e2e-kube-api-access-jzwbw" (OuterVolumeSpecName: "kube-api-access-jzwbw") pod "09b8a4b9-3b36-4c93-b4fa-694539eb0e2e" (UID: "09b8a4b9-3b36-4c93-b4fa-694539eb0e2e"). InnerVolumeSpecName "kube-api-access-jzwbw". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 12:38:22 crc kubenswrapper[4672]: I0930 12:38:22.339177 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-vhs7r-config-r2gbp"] Sep 30 12:38:22 crc kubenswrapper[4672]: E0930 12:38:22.339518 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09b8a4b9-3b36-4c93-b4fa-694539eb0e2e" containerName="mariadb-account-create" Sep 30 12:38:22 crc kubenswrapper[4672]: I0930 12:38:22.339533 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="09b8a4b9-3b36-4c93-b4fa-694539eb0e2e" containerName="mariadb-account-create" Sep 30 12:38:22 crc kubenswrapper[4672]: E0930 12:38:22.339545 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d49ba1b9-421a-41ad-9c95-49bd94444de0" containerName="mariadb-account-create" Sep 30 12:38:22 crc kubenswrapper[4672]: I0930 12:38:22.339553 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="d49ba1b9-421a-41ad-9c95-49bd94444de0" containerName="mariadb-account-create" Sep 30 12:38:22 crc kubenswrapper[4672]: E0930 12:38:22.339571 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5f7c9ac-7d3f-4477-819c-d5547e1e641d" containerName="mariadb-account-create" Sep 30 12:38:22 crc kubenswrapper[4672]: I0930 12:38:22.339577 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5f7c9ac-7d3f-4477-819c-d5547e1e641d" containerName="mariadb-account-create" Sep 30 12:38:22 crc kubenswrapper[4672]: E0930 12:38:22.339587 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b54b83cb-86e4-4719-9307-e7fbc998d29a" containerName="ovn-config" Sep 30 12:38:22 crc kubenswrapper[4672]: I0930 12:38:22.339592 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="b54b83cb-86e4-4719-9307-e7fbc998d29a" containerName="ovn-config" Sep 30 12:38:22 crc kubenswrapper[4672]: I0930 12:38:22.339746 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="b54b83cb-86e4-4719-9307-e7fbc998d29a" containerName="ovn-config" Sep 30 12:38:22 crc kubenswrapper[4672]: I0930 12:38:22.339761 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="f5f7c9ac-7d3f-4477-819c-d5547e1e641d" containerName="mariadb-account-create" Sep 30 12:38:22 crc kubenswrapper[4672]: I0930 12:38:22.339773 4672 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="d49ba1b9-421a-41ad-9c95-49bd94444de0" containerName="mariadb-account-create" Sep 30 12:38:22 crc kubenswrapper[4672]: I0930 12:38:22.339784 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="09b8a4b9-3b36-4c93-b4fa-694539eb0e2e" containerName="mariadb-account-create" Sep 30 12:38:22 crc kubenswrapper[4672]: I0930 12:38:22.340898 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-vhs7r-config-r2gbp" Sep 30 12:38:22 crc kubenswrapper[4672]: I0930 12:38:22.342664 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Sep 30 12:38:22 crc kubenswrapper[4672]: I0930 12:38:22.351802 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jzwbw\" (UniqueName: \"kubernetes.io/projected/09b8a4b9-3b36-4c93-b4fa-694539eb0e2e-kube-api-access-jzwbw\") on node \"crc\" DevicePath \"\"" Sep 30 12:38:22 crc kubenswrapper[4672]: I0930 12:38:22.353739 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-vhs7r-config-r2gbp"] Sep 30 12:38:22 crc kubenswrapper[4672]: I0930 12:38:22.453162 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/0f4f502e-922b-483d-a736-067eaeed5569-additional-scripts\") pod \"ovn-controller-vhs7r-config-r2gbp\" (UID: \"0f4f502e-922b-483d-a736-067eaeed5569\") " pod="openstack/ovn-controller-vhs7r-config-r2gbp" Sep 30 12:38:22 crc kubenswrapper[4672]: I0930 12:38:22.453226 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/0f4f502e-922b-483d-a736-067eaeed5569-var-run\") pod \"ovn-controller-vhs7r-config-r2gbp\" (UID: \"0f4f502e-922b-483d-a736-067eaeed5569\") " pod="openstack/ovn-controller-vhs7r-config-r2gbp" Sep 30 12:38:22 crc kubenswrapper[4672]: I0930 12:38:22.453542 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jzs4p\" (UniqueName: \"kubernetes.io/projected/0f4f502e-922b-483d-a736-067eaeed5569-kube-api-access-jzs4p\") pod \"ovn-controller-vhs7r-config-r2gbp\" (UID: \"0f4f502e-922b-483d-a736-067eaeed5569\") " pod="openstack/ovn-controller-vhs7r-config-r2gbp" Sep 30 12:38:22 crc kubenswrapper[4672]: I0930 12:38:22.453675 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/0f4f502e-922b-483d-a736-067eaeed5569-var-log-ovn\") pod \"ovn-controller-vhs7r-config-r2gbp\" (UID: \"0f4f502e-922b-483d-a736-067eaeed5569\") " pod="openstack/ovn-controller-vhs7r-config-r2gbp" Sep 30 12:38:22 crc kubenswrapper[4672]: I0930 12:38:22.453720 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0f4f502e-922b-483d-a736-067eaeed5569-scripts\") pod \"ovn-controller-vhs7r-config-r2gbp\" (UID: \"0f4f502e-922b-483d-a736-067eaeed5569\") " pod="openstack/ovn-controller-vhs7r-config-r2gbp" Sep 30 12:38:22 crc kubenswrapper[4672]: I0930 12:38:22.453822 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/0f4f502e-922b-483d-a736-067eaeed5569-var-run-ovn\") pod \"ovn-controller-vhs7r-config-r2gbp\" (UID: 
\"0f4f502e-922b-483d-a736-067eaeed5569\") " pod="openstack/ovn-controller-vhs7r-config-r2gbp" Sep 30 12:38:22 crc kubenswrapper[4672]: I0930 12:38:22.516140 4672 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="2d795c24-8697-461f-9322-2c23bf7cb49b" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.106:5671: connect: connection refused" Sep 30 12:38:22 crc kubenswrapper[4672]: I0930 12:38:22.554801 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-4ddgl"] Sep 30 12:38:22 crc kubenswrapper[4672]: I0930 12:38:22.558832 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-4ddgl" Sep 30 12:38:22 crc kubenswrapper[4672]: I0930 12:38:22.570352 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/0f4f502e-922b-483d-a736-067eaeed5569-var-run\") pod \"ovn-controller-vhs7r-config-r2gbp\" (UID: \"0f4f502e-922b-483d-a736-067eaeed5569\") " pod="openstack/ovn-controller-vhs7r-config-r2gbp" Sep 30 12:38:22 crc kubenswrapper[4672]: I0930 12:38:22.570626 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jzs4p\" (UniqueName: \"kubernetes.io/projected/0f4f502e-922b-483d-a736-067eaeed5569-kube-api-access-jzs4p\") pod \"ovn-controller-vhs7r-config-r2gbp\" (UID: \"0f4f502e-922b-483d-a736-067eaeed5569\") " pod="openstack/ovn-controller-vhs7r-config-r2gbp" Sep 30 12:38:22 crc kubenswrapper[4672]: I0930 12:38:22.570746 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/0f4f502e-922b-483d-a736-067eaeed5569-var-log-ovn\") pod \"ovn-controller-vhs7r-config-r2gbp\" (UID: \"0f4f502e-922b-483d-a736-067eaeed5569\") " pod="openstack/ovn-controller-vhs7r-config-r2gbp" Sep 30 12:38:22 crc kubenswrapper[4672]: I0930 12:38:22.570795 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0f4f502e-922b-483d-a736-067eaeed5569-scripts\") pod \"ovn-controller-vhs7r-config-r2gbp\" (UID: \"0f4f502e-922b-483d-a736-067eaeed5569\") " pod="openstack/ovn-controller-vhs7r-config-r2gbp" Sep 30 12:38:22 crc kubenswrapper[4672]: I0930 12:38:22.570887 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/0f4f502e-922b-483d-a736-067eaeed5569-var-run-ovn\") pod \"ovn-controller-vhs7r-config-r2gbp\" (UID: \"0f4f502e-922b-483d-a736-067eaeed5569\") " pod="openstack/ovn-controller-vhs7r-config-r2gbp" Sep 30 12:38:22 crc kubenswrapper[4672]: I0930 12:38:22.571024 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/0f4f502e-922b-483d-a736-067eaeed5569-additional-scripts\") pod \"ovn-controller-vhs7r-config-r2gbp\" (UID: \"0f4f502e-922b-483d-a736-067eaeed5569\") " pod="openstack/ovn-controller-vhs7r-config-r2gbp" Sep 30 12:38:22 crc kubenswrapper[4672]: I0930 12:38:22.571054 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/0f4f502e-922b-483d-a736-067eaeed5569-var-log-ovn\") pod \"ovn-controller-vhs7r-config-r2gbp\" (UID: \"0f4f502e-922b-483d-a736-067eaeed5569\") " pod="openstack/ovn-controller-vhs7r-config-r2gbp" Sep 30 12:38:22 crc kubenswrapper[4672]: I0930 
12:38:22.571082 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/0f4f502e-922b-483d-a736-067eaeed5569-var-run-ovn\") pod \"ovn-controller-vhs7r-config-r2gbp\" (UID: \"0f4f502e-922b-483d-a736-067eaeed5569\") " pod="openstack/ovn-controller-vhs7r-config-r2gbp" Sep 30 12:38:22 crc kubenswrapper[4672]: I0930 12:38:22.571025 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/0f4f502e-922b-483d-a736-067eaeed5569-var-run\") pod \"ovn-controller-vhs7r-config-r2gbp\" (UID: \"0f4f502e-922b-483d-a736-067eaeed5569\") " pod="openstack/ovn-controller-vhs7r-config-r2gbp" Sep 30 12:38:22 crc kubenswrapper[4672]: I0930 12:38:22.571685 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/0f4f502e-922b-483d-a736-067eaeed5569-additional-scripts\") pod \"ovn-controller-vhs7r-config-r2gbp\" (UID: \"0f4f502e-922b-483d-a736-067eaeed5569\") " pod="openstack/ovn-controller-vhs7r-config-r2gbp" Sep 30 12:38:22 crc kubenswrapper[4672]: I0930 12:38:22.572986 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0f4f502e-922b-483d-a736-067eaeed5569-scripts\") pod \"ovn-controller-vhs7r-config-r2gbp\" (UID: \"0f4f502e-922b-483d-a736-067eaeed5569\") " pod="openstack/ovn-controller-vhs7r-config-r2gbp" Sep 30 12:38:22 crc kubenswrapper[4672]: I0930 12:38:22.574821 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-8npkv" Sep 30 12:38:22 crc kubenswrapper[4672]: I0930 12:38:22.574946 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Sep 30 12:38:22 crc kubenswrapper[4672]: I0930 12:38:22.586564 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-4ddgl"] Sep 30 12:38:22 crc kubenswrapper[4672]: I0930 12:38:22.602784 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jzs4p\" (UniqueName: \"kubernetes.io/projected/0f4f502e-922b-483d-a736-067eaeed5569-kube-api-access-jzs4p\") pod \"ovn-controller-vhs7r-config-r2gbp\" (UID: \"0f4f502e-922b-483d-a736-067eaeed5569\") " pod="openstack/ovn-controller-vhs7r-config-r2gbp" Sep 30 12:38:22 crc kubenswrapper[4672]: I0930 12:38:22.638064 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-vhs7r" Sep 30 12:38:22 crc kubenswrapper[4672]: I0930 12:38:22.657184 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-vhs7r-config-r2gbp" Sep 30 12:38:22 crc kubenswrapper[4672]: I0930 12:38:22.673009 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3cb781ee-9755-4258-bd4a-165461961834-config-data\") pod \"glance-db-sync-4ddgl\" (UID: \"3cb781ee-9755-4258-bd4a-165461961834\") " pod="openstack/glance-db-sync-4ddgl" Sep 30 12:38:22 crc kubenswrapper[4672]: I0930 12:38:22.673493 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3cb781ee-9755-4258-bd4a-165461961834-combined-ca-bundle\") pod \"glance-db-sync-4ddgl\" (UID: \"3cb781ee-9755-4258-bd4a-165461961834\") " pod="openstack/glance-db-sync-4ddgl" Sep 30 12:38:22 crc kubenswrapper[4672]: I0930 12:38:22.673547 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kbhwg\" (UniqueName: \"kubernetes.io/projected/3cb781ee-9755-4258-bd4a-165461961834-kube-api-access-kbhwg\") pod \"glance-db-sync-4ddgl\" (UID: \"3cb781ee-9755-4258-bd4a-165461961834\") " pod="openstack/glance-db-sync-4ddgl" Sep 30 12:38:22 crc kubenswrapper[4672]: I0930 12:38:22.673615 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3cb781ee-9755-4258-bd4a-165461961834-db-sync-config-data\") pod \"glance-db-sync-4ddgl\" (UID: \"3cb781ee-9755-4258-bd4a-165461961834\") " pod="openstack/glance-db-sync-4ddgl" Sep 30 12:38:22 crc kubenswrapper[4672]: I0930 12:38:22.705556 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Sep 30 12:38:22 crc kubenswrapper[4672]: W0930 12:38:22.716545 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3cc34662_d100_4436_9067_c615b7b3f83f.slice/crio-338cdee2bfb6d84c4561c8611bd7e6609991b4d5c6fa540ae67f384f65518dae WatchSource:0}: Error finding container 338cdee2bfb6d84c4561c8611bd7e6609991b4d5c6fa540ae67f384f65518dae: Status 404 returned error can't find the container with id 338cdee2bfb6d84c4561c8611bd7e6609991b4d5c6fa540ae67f384f65518dae Sep 30 12:38:22 crc kubenswrapper[4672]: I0930 12:38:22.737775 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"3cc34662-d100-4436-9067-c615b7b3f83f","Type":"ContainerStarted","Data":"338cdee2bfb6d84c4561c8611bd7e6609991b4d5c6fa540ae67f384f65518dae"} Sep 30 12:38:22 crc kubenswrapper[4672]: I0930 12:38:22.739049 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-8d32-account-create-bprd4" event={"ID":"09b8a4b9-3b36-4c93-b4fa-694539eb0e2e","Type":"ContainerDied","Data":"75f22c9e8a3324b97e62a6c0eba7cc4eb29c2f43e1d3d8850749acbe28813bb6"} Sep 30 12:38:22 crc kubenswrapper[4672]: I0930 12:38:22.739069 4672 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="75f22c9e8a3324b97e62a6c0eba7cc4eb29c2f43e1d3d8850749acbe28813bb6" Sep 30 12:38:22 crc kubenswrapper[4672]: I0930 12:38:22.739132 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-8d32-account-create-bprd4" Sep 30 12:38:22 crc kubenswrapper[4672]: I0930 12:38:22.776653 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3cb781ee-9755-4258-bd4a-165461961834-config-data\") pod \"glance-db-sync-4ddgl\" (UID: \"3cb781ee-9755-4258-bd4a-165461961834\") " pod="openstack/glance-db-sync-4ddgl" Sep 30 12:38:22 crc kubenswrapper[4672]: I0930 12:38:22.777159 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3cb781ee-9755-4258-bd4a-165461961834-combined-ca-bundle\") pod \"glance-db-sync-4ddgl\" (UID: \"3cb781ee-9755-4258-bd4a-165461961834\") " pod="openstack/glance-db-sync-4ddgl" Sep 30 12:38:22 crc kubenswrapper[4672]: I0930 12:38:22.777194 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kbhwg\" (UniqueName: \"kubernetes.io/projected/3cb781ee-9755-4258-bd4a-165461961834-kube-api-access-kbhwg\") pod \"glance-db-sync-4ddgl\" (UID: \"3cb781ee-9755-4258-bd4a-165461961834\") " pod="openstack/glance-db-sync-4ddgl" Sep 30 12:38:22 crc kubenswrapper[4672]: I0930 12:38:22.777242 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3cb781ee-9755-4258-bd4a-165461961834-db-sync-config-data\") pod \"glance-db-sync-4ddgl\" (UID: \"3cb781ee-9755-4258-bd4a-165461961834\") " pod="openstack/glance-db-sync-4ddgl" Sep 30 12:38:22 crc kubenswrapper[4672]: I0930 12:38:22.786543 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3cb781ee-9755-4258-bd4a-165461961834-db-sync-config-data\") pod \"glance-db-sync-4ddgl\" (UID: \"3cb781ee-9755-4258-bd4a-165461961834\") " pod="openstack/glance-db-sync-4ddgl" Sep 30 12:38:22 crc kubenswrapper[4672]: I0930 12:38:22.786557 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3cb781ee-9755-4258-bd4a-165461961834-config-data\") pod \"glance-db-sync-4ddgl\" (UID: \"3cb781ee-9755-4258-bd4a-165461961834\") " pod="openstack/glance-db-sync-4ddgl" Sep 30 12:38:22 crc kubenswrapper[4672]: I0930 12:38:22.786690 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3cb781ee-9755-4258-bd4a-165461961834-combined-ca-bundle\") pod \"glance-db-sync-4ddgl\" (UID: \"3cb781ee-9755-4258-bd4a-165461961834\") " pod="openstack/glance-db-sync-4ddgl" Sep 30 12:38:22 crc kubenswrapper[4672]: I0930 12:38:22.797448 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kbhwg\" (UniqueName: \"kubernetes.io/projected/3cb781ee-9755-4258-bd4a-165461961834-kube-api-access-kbhwg\") pod \"glance-db-sync-4ddgl\" (UID: \"3cb781ee-9755-4258-bd4a-165461961834\") " pod="openstack/glance-db-sync-4ddgl" Sep 30 12:38:22 crc kubenswrapper[4672]: I0930 12:38:22.894468 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-4ddgl" Sep 30 12:38:22 crc kubenswrapper[4672]: I0930 12:38:22.921463 4672 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-notifications-server-0" podUID="d165a3a8-6809-46e5-bd35-895200ab5bfc" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.107:5671: connect: connection refused" Sep 30 12:38:23 crc kubenswrapper[4672]: I0930 12:38:23.170458 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-vhs7r-config-r2gbp"] Sep 30 12:38:23 crc kubenswrapper[4672]: I0930 12:38:23.269075 4672 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="9aea18e8-190e-470a-9330-a30621c96afd" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.108:5671: connect: connection refused" Sep 30 12:38:23 crc kubenswrapper[4672]: I0930 12:38:23.363820 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-4ddgl"] Sep 30 12:38:23 crc kubenswrapper[4672]: I0930 12:38:23.428467 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b54b83cb-86e4-4719-9307-e7fbc998d29a" path="/var/lib/kubelet/pods/b54b83cb-86e4-4719-9307-e7fbc998d29a/volumes" Sep 30 12:38:23 crc kubenswrapper[4672]: W0930 12:38:23.606096 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0f4f502e_922b_483d_a736_067eaeed5569.slice/crio-359a14e1d21aefc376e56107933a489b7b51b56fcd9b5edd88a1df30b55db2eb WatchSource:0}: Error finding container 359a14e1d21aefc376e56107933a489b7b51b56fcd9b5edd88a1df30b55db2eb: Status 404 returned error can't find the container with id 359a14e1d21aefc376e56107933a489b7b51b56fcd9b5edd88a1df30b55db2eb Sep 30 12:38:23 crc kubenswrapper[4672]: I0930 12:38:23.752611 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-vhs7r-config-r2gbp" event={"ID":"0f4f502e-922b-483d-a736-067eaeed5569","Type":"ContainerStarted","Data":"359a14e1d21aefc376e56107933a489b7b51b56fcd9b5edd88a1df30b55db2eb"} Sep 30 12:38:23 crc kubenswrapper[4672]: I0930 12:38:23.755526 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-4ddgl" event={"ID":"3cb781ee-9755-4258-bd4a-165461961834","Type":"ContainerStarted","Data":"5d0e433e0b46aa517a07f495fc679d65dc5d070999037f8725484e7a26fa1323"} Sep 30 12:38:24 crc kubenswrapper[4672]: I0930 12:38:24.299988 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Sep 30 12:38:24 crc kubenswrapper[4672]: I0930 12:38:24.300662 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="854a642c-e6c7-4859-8667-b64f9b54a872" containerName="prometheus" containerID="cri-o://2934ad25de8c5191bc745d09dea09116aa584b50437cb4aa7ae6cb3d87af0794" gracePeriod=600 Sep 30 12:38:24 crc kubenswrapper[4672]: I0930 12:38:24.300738 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="854a642c-e6c7-4859-8667-b64f9b54a872" containerName="config-reloader" containerID="cri-o://c4a50f78f9bd7603dd906100c0bad41391fe45e9900dbe4c685f8d553f3d68fa" gracePeriod=600 Sep 30 12:38:24 crc kubenswrapper[4672]: I0930 12:38:24.300751 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="854a642c-e6c7-4859-8667-b64f9b54a872" 
containerName="thanos-sidecar" containerID="cri-o://42a52a6f6ee54c18936d64fc085f8bbe9f7ff1fc97d0c7b9c0c457d38b7f7338" gracePeriod=600 Sep 30 12:38:24 crc kubenswrapper[4672]: I0930 12:38:24.739562 4672 patch_prober.go:28] interesting pod/machine-config-daemon-dpqrd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 12:38:24 crc kubenswrapper[4672]: I0930 12:38:24.739942 4672 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 12:38:24 crc kubenswrapper[4672]: I0930 12:38:24.772298 4672 generic.go:334] "Generic (PLEG): container finished" podID="854a642c-e6c7-4859-8667-b64f9b54a872" containerID="42a52a6f6ee54c18936d64fc085f8bbe9f7ff1fc97d0c7b9c0c457d38b7f7338" exitCode=0 Sep 30 12:38:24 crc kubenswrapper[4672]: I0930 12:38:24.772335 4672 generic.go:334] "Generic (PLEG): container finished" podID="854a642c-e6c7-4859-8667-b64f9b54a872" containerID="c4a50f78f9bd7603dd906100c0bad41391fe45e9900dbe4c685f8d553f3d68fa" exitCode=0 Sep 30 12:38:24 crc kubenswrapper[4672]: I0930 12:38:24.772344 4672 generic.go:334] "Generic (PLEG): container finished" podID="854a642c-e6c7-4859-8667-b64f9b54a872" containerID="2934ad25de8c5191bc745d09dea09116aa584b50437cb4aa7ae6cb3d87af0794" exitCode=0 Sep 30 12:38:24 crc kubenswrapper[4672]: I0930 12:38:24.772412 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"854a642c-e6c7-4859-8667-b64f9b54a872","Type":"ContainerDied","Data":"42a52a6f6ee54c18936d64fc085f8bbe9f7ff1fc97d0c7b9c0c457d38b7f7338"} Sep 30 12:38:24 crc kubenswrapper[4672]: I0930 12:38:24.772503 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"854a642c-e6c7-4859-8667-b64f9b54a872","Type":"ContainerDied","Data":"c4a50f78f9bd7603dd906100c0bad41391fe45e9900dbe4c685f8d553f3d68fa"} Sep 30 12:38:24 crc kubenswrapper[4672]: I0930 12:38:24.772517 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"854a642c-e6c7-4859-8667-b64f9b54a872","Type":"ContainerDied","Data":"2934ad25de8c5191bc745d09dea09116aa584b50437cb4aa7ae6cb3d87af0794"} Sep 30 12:38:24 crc kubenswrapper[4672]: I0930 12:38:24.775761 4672 generic.go:334] "Generic (PLEG): container finished" podID="0f4f502e-922b-483d-a736-067eaeed5569" containerID="ea25e628b0692f4a89ec401553909cda0a636e4beb6528d5202d826964a1948d" exitCode=0 Sep 30 12:38:24 crc kubenswrapper[4672]: I0930 12:38:24.775829 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-vhs7r-config-r2gbp" event={"ID":"0f4f502e-922b-483d-a736-067eaeed5569","Type":"ContainerDied","Data":"ea25e628b0692f4a89ec401553909cda0a636e4beb6528d5202d826964a1948d"} Sep 30 12:38:24 crc kubenswrapper[4672]: I0930 12:38:24.784996 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"3cc34662-d100-4436-9067-c615b7b3f83f","Type":"ContainerStarted","Data":"2f496e98a3413b9a5e60169bf81d6d3ebfdaffee0f7157fa2e2ddbef5d855593"} Sep 30 12:38:24 crc kubenswrapper[4672]: I0930 12:38:24.785050 4672 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"3cc34662-d100-4436-9067-c615b7b3f83f","Type":"ContainerStarted","Data":"87d451e0e65bd42e38b9121879ed935b4dceb415cb75bb61cfa5d46f3400d5a4"} Sep 30 12:38:24 crc kubenswrapper[4672]: I0930 12:38:24.785063 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"3cc34662-d100-4436-9067-c615b7b3f83f","Type":"ContainerStarted","Data":"0499d8f4a31c06a57e220e1d6cbdca30925281861b3773109e34c0dd38be48fb"} Sep 30 12:38:24 crc kubenswrapper[4672]: I0930 12:38:24.785077 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"3cc34662-d100-4436-9067-c615b7b3f83f","Type":"ContainerStarted","Data":"d282e1e78675876aea075ae0e48226b28e52c3d722638926e4e8160ab9f44a26"} Sep 30 12:38:24 crc kubenswrapper[4672]: I0930 12:38:24.955249 4672 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/prometheus-metric-storage-0" podUID="854a642c-e6c7-4859-8667-b64f9b54a872" containerName="prometheus" probeResult="failure" output="Get \"http://10.217.0.113:9090/-/ready\": dial tcp 10.217.0.113:9090: connect: connection refused" Sep 30 12:38:25 crc kubenswrapper[4672]: I0930 12:38:25.557774 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Sep 30 12:38:25 crc kubenswrapper[4672]: I0930 12:38:25.750739 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/854a642c-e6c7-4859-8667-b64f9b54a872-web-config\") pod \"854a642c-e6c7-4859-8667-b64f9b54a872\" (UID: \"854a642c-e6c7-4859-8667-b64f9b54a872\") " Sep 30 12:38:25 crc kubenswrapper[4672]: I0930 12:38:25.751245 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-db\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-590f2ab7-9519-4427-af7c-904176dbd8cb\") pod \"854a642c-e6c7-4859-8667-b64f9b54a872\" (UID: \"854a642c-e6c7-4859-8667-b64f9b54a872\") " Sep 30 12:38:25 crc kubenswrapper[4672]: I0930 12:38:25.751395 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/854a642c-e6c7-4859-8667-b64f9b54a872-tls-assets\") pod \"854a642c-e6c7-4859-8667-b64f9b54a872\" (UID: \"854a642c-e6c7-4859-8667-b64f9b54a872\") " Sep 30 12:38:25 crc kubenswrapper[4672]: I0930 12:38:25.751454 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rq72c\" (UniqueName: \"kubernetes.io/projected/854a642c-e6c7-4859-8667-b64f9b54a872-kube-api-access-rq72c\") pod \"854a642c-e6c7-4859-8667-b64f9b54a872\" (UID: \"854a642c-e6c7-4859-8667-b64f9b54a872\") " Sep 30 12:38:25 crc kubenswrapper[4672]: I0930 12:38:25.751531 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/854a642c-e6c7-4859-8667-b64f9b54a872-thanos-prometheus-http-client-file\") pod \"854a642c-e6c7-4859-8667-b64f9b54a872\" (UID: \"854a642c-e6c7-4859-8667-b64f9b54a872\") " Sep 30 12:38:25 crc kubenswrapper[4672]: I0930 12:38:25.751566 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/854a642c-e6c7-4859-8667-b64f9b54a872-config\") pod \"854a642c-e6c7-4859-8667-b64f9b54a872\" (UID: \"854a642c-e6c7-4859-8667-b64f9b54a872\") " Sep 30 12:38:25 crc 
kubenswrapper[4672]: I0930 12:38:25.751604 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/854a642c-e6c7-4859-8667-b64f9b54a872-config-out\") pod \"854a642c-e6c7-4859-8667-b64f9b54a872\" (UID: \"854a642c-e6c7-4859-8667-b64f9b54a872\") " Sep 30 12:38:25 crc kubenswrapper[4672]: I0930 12:38:25.751658 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/854a642c-e6c7-4859-8667-b64f9b54a872-prometheus-metric-storage-rulefiles-0\") pod \"854a642c-e6c7-4859-8667-b64f9b54a872\" (UID: \"854a642c-e6c7-4859-8667-b64f9b54a872\") " Sep 30 12:38:25 crc kubenswrapper[4672]: I0930 12:38:25.752941 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/854a642c-e6c7-4859-8667-b64f9b54a872-prometheus-metric-storage-rulefiles-0" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-0") pod "854a642c-e6c7-4859-8667-b64f9b54a872" (UID: "854a642c-e6c7-4859-8667-b64f9b54a872"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 12:38:25 crc kubenswrapper[4672]: I0930 12:38:25.758848 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/854a642c-e6c7-4859-8667-b64f9b54a872-config" (OuterVolumeSpecName: "config") pod "854a642c-e6c7-4859-8667-b64f9b54a872" (UID: "854a642c-e6c7-4859-8667-b64f9b54a872"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:38:25 crc kubenswrapper[4672]: I0930 12:38:25.765568 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/854a642c-e6c7-4859-8667-b64f9b54a872-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "854a642c-e6c7-4859-8667-b64f9b54a872" (UID: "854a642c-e6c7-4859-8667-b64f9b54a872"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 12:38:25 crc kubenswrapper[4672]: I0930 12:38:25.771987 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/854a642c-e6c7-4859-8667-b64f9b54a872-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "854a642c-e6c7-4859-8667-b64f9b54a872" (UID: "854a642c-e6c7-4859-8667-b64f9b54a872"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:38:25 crc kubenswrapper[4672]: I0930 12:38:25.775191 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/854a642c-e6c7-4859-8667-b64f9b54a872-kube-api-access-rq72c" (OuterVolumeSpecName: "kube-api-access-rq72c") pod "854a642c-e6c7-4859-8667-b64f9b54a872" (UID: "854a642c-e6c7-4859-8667-b64f9b54a872"). InnerVolumeSpecName "kube-api-access-rq72c". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 12:38:25 crc kubenswrapper[4672]: I0930 12:38:25.778424 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/854a642c-e6c7-4859-8667-b64f9b54a872-config-out" (OuterVolumeSpecName: "config-out") pod "854a642c-e6c7-4859-8667-b64f9b54a872" (UID: "854a642c-e6c7-4859-8667-b64f9b54a872"). InnerVolumeSpecName "config-out". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 12:38:25 crc kubenswrapper[4672]: I0930 12:38:25.781425 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/854a642c-e6c7-4859-8667-b64f9b54a872-web-config" (OuterVolumeSpecName: "web-config") pod "854a642c-e6c7-4859-8667-b64f9b54a872" (UID: "854a642c-e6c7-4859-8667-b64f9b54a872"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:38:25 crc kubenswrapper[4672]: I0930 12:38:25.796354 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-590f2ab7-9519-4427-af7c-904176dbd8cb" (OuterVolumeSpecName: "prometheus-metric-storage-db") pod "854a642c-e6c7-4859-8667-b64f9b54a872" (UID: "854a642c-e6c7-4859-8667-b64f9b54a872"). InnerVolumeSpecName "pvc-590f2ab7-9519-4427-af7c-904176dbd8cb". PluginName "kubernetes.io/csi", VolumeGidValue "" Sep 30 12:38:25 crc kubenswrapper[4672]: I0930 12:38:25.831868 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Sep 30 12:38:25 crc kubenswrapper[4672]: I0930 12:38:25.832417 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"854a642c-e6c7-4859-8667-b64f9b54a872","Type":"ContainerDied","Data":"93e5f067d880d9cb316fdf9e80464500acac020cb28941d3717a26711198a940"} Sep 30 12:38:25 crc kubenswrapper[4672]: I0930 12:38:25.832510 4672 scope.go:117] "RemoveContainer" containerID="42a52a6f6ee54c18936d64fc085f8bbe9f7ff1fc97d0c7b9c0c457d38b7f7338" Sep 30 12:38:25 crc kubenswrapper[4672]: I0930 12:38:25.854161 4672 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/854a642c-e6c7-4859-8667-b64f9b54a872-prometheus-metric-storage-rulefiles-0\") on node \"crc\" DevicePath \"\"" Sep 30 12:38:25 crc kubenswrapper[4672]: I0930 12:38:25.854212 4672 reconciler_common.go:293] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/854a642c-e6c7-4859-8667-b64f9b54a872-web-config\") on node \"crc\" DevicePath \"\"" Sep 30 12:38:25 crc kubenswrapper[4672]: I0930 12:38:25.854249 4672 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-590f2ab7-9519-4427-af7c-904176dbd8cb\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-590f2ab7-9519-4427-af7c-904176dbd8cb\") on node \"crc\" " Sep 30 12:38:25 crc kubenswrapper[4672]: I0930 12:38:25.854276 4672 reconciler_common.go:293] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/854a642c-e6c7-4859-8667-b64f9b54a872-tls-assets\") on node \"crc\" DevicePath \"\"" Sep 30 12:38:25 crc kubenswrapper[4672]: I0930 12:38:25.854287 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rq72c\" (UniqueName: \"kubernetes.io/projected/854a642c-e6c7-4859-8667-b64f9b54a872-kube-api-access-rq72c\") on node \"crc\" DevicePath \"\"" Sep 30 12:38:25 crc kubenswrapper[4672]: I0930 12:38:25.854296 4672 reconciler_common.go:293] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/854a642c-e6c7-4859-8667-b64f9b54a872-thanos-prometheus-http-client-file\") on node \"crc\" DevicePath \"\"" Sep 30 12:38:25 crc kubenswrapper[4672]: I0930 12:38:25.854307 4672 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/secret/854a642c-e6c7-4859-8667-b64f9b54a872-config\") on node \"crc\" DevicePath \"\"" Sep 30 12:38:25 crc kubenswrapper[4672]: I0930 12:38:25.854315 4672 reconciler_common.go:293] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/854a642c-e6c7-4859-8667-b64f9b54a872-config-out\") on node \"crc\" DevicePath \"\"" Sep 30 12:38:25 crc kubenswrapper[4672]: I0930 12:38:25.885341 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Sep 30 12:38:25 crc kubenswrapper[4672]: I0930 12:38:25.901969 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/prometheus-metric-storage-0"] Sep 30 12:38:25 crc kubenswrapper[4672]: I0930 12:38:25.909696 4672 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Sep 30 12:38:25 crc kubenswrapper[4672]: I0930 12:38:25.909908 4672 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-590f2ab7-9519-4427-af7c-904176dbd8cb" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-590f2ab7-9519-4427-af7c-904176dbd8cb") on node "crc" Sep 30 12:38:25 crc kubenswrapper[4672]: I0930 12:38:25.922305 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Sep 30 12:38:25 crc kubenswrapper[4672]: E0930 12:38:25.922658 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="854a642c-e6c7-4859-8667-b64f9b54a872" containerName="prometheus" Sep 30 12:38:25 crc kubenswrapper[4672]: I0930 12:38:25.922674 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="854a642c-e6c7-4859-8667-b64f9b54a872" containerName="prometheus" Sep 30 12:38:25 crc kubenswrapper[4672]: E0930 12:38:25.922693 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="854a642c-e6c7-4859-8667-b64f9b54a872" containerName="init-config-reloader" Sep 30 12:38:25 crc kubenswrapper[4672]: I0930 12:38:25.922699 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="854a642c-e6c7-4859-8667-b64f9b54a872" containerName="init-config-reloader" Sep 30 12:38:25 crc kubenswrapper[4672]: E0930 12:38:25.922712 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="854a642c-e6c7-4859-8667-b64f9b54a872" containerName="config-reloader" Sep 30 12:38:25 crc kubenswrapper[4672]: I0930 12:38:25.922720 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="854a642c-e6c7-4859-8667-b64f9b54a872" containerName="config-reloader" Sep 30 12:38:25 crc kubenswrapper[4672]: E0930 12:38:25.922743 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="854a642c-e6c7-4859-8667-b64f9b54a872" containerName="thanos-sidecar" Sep 30 12:38:25 crc kubenswrapper[4672]: I0930 12:38:25.922749 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="854a642c-e6c7-4859-8667-b64f9b54a872" containerName="thanos-sidecar" Sep 30 12:38:25 crc kubenswrapper[4672]: I0930 12:38:25.922948 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="854a642c-e6c7-4859-8667-b64f9b54a872" containerName="prometheus" Sep 30 12:38:25 crc kubenswrapper[4672]: I0930 12:38:25.922971 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="854a642c-e6c7-4859-8667-b64f9b54a872" containerName="thanos-sidecar" Sep 30 12:38:25 crc kubenswrapper[4672]: I0930 12:38:25.923019 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="854a642c-e6c7-4859-8667-b64f9b54a872" containerName="config-reloader" Sep 30 12:38:25 crc 
kubenswrapper[4672]: I0930 12:38:25.925288 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Sep 30 12:38:25 crc kubenswrapper[4672]: I0930 12:38:25.931517 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Sep 30 12:38:25 crc kubenswrapper[4672]: I0930 12:38:25.931884 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-l8qzv" Sep 30 12:38:25 crc kubenswrapper[4672]: I0930 12:38:25.932044 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Sep 30 12:38:25 crc kubenswrapper[4672]: I0930 12:38:25.932050 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Sep 30 12:38:25 crc kubenswrapper[4672]: I0930 12:38:25.935656 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-metric-storage-prometheus-svc" Sep 30 12:38:25 crc kubenswrapper[4672]: I0930 12:38:25.936856 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Sep 30 12:38:25 crc kubenswrapper[4672]: I0930 12:38:25.942074 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Sep 30 12:38:25 crc kubenswrapper[4672]: I0930 12:38:25.955780 4672 reconciler_common.go:293] "Volume detached for volume \"pvc-590f2ab7-9519-4427-af7c-904176dbd8cb\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-590f2ab7-9519-4427-af7c-904176dbd8cb\") on node \"crc\" DevicePath \"\"" Sep 30 12:38:25 crc kubenswrapper[4672]: I0930 12:38:25.976364 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Sep 30 12:38:26 crc kubenswrapper[4672]: I0930 12:38:26.058036 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/cf82624d-33e5-4298-8cd9-53ef50e87f12-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"cf82624d-33e5-4298-8cd9-53ef50e87f12\") " pod="openstack/prometheus-metric-storage-0" Sep 30 12:38:26 crc kubenswrapper[4672]: I0930 12:38:26.058087 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p9x6r\" (UniqueName: \"kubernetes.io/projected/cf82624d-33e5-4298-8cd9-53ef50e87f12-kube-api-access-p9x6r\") pod \"prometheus-metric-storage-0\" (UID: \"cf82624d-33e5-4298-8cd9-53ef50e87f12\") " pod="openstack/prometheus-metric-storage-0" Sep 30 12:38:26 crc kubenswrapper[4672]: I0930 12:38:26.058127 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/cf82624d-33e5-4298-8cd9-53ef50e87f12-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"cf82624d-33e5-4298-8cd9-53ef50e87f12\") " pod="openstack/prometheus-metric-storage-0" Sep 30 12:38:26 crc kubenswrapper[4672]: I0930 12:38:26.058157 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/cf82624d-33e5-4298-8cd9-53ef50e87f12-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"cf82624d-33e5-4298-8cd9-53ef50e87f12\") " pod="openstack/prometheus-metric-storage-0" Sep 30 12:38:26 
crc kubenswrapper[4672]: I0930 12:38:26.058207 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/cf82624d-33e5-4298-8cd9-53ef50e87f12-config\") pod \"prometheus-metric-storage-0\" (UID: \"cf82624d-33e5-4298-8cd9-53ef50e87f12\") " pod="openstack/prometheus-metric-storage-0" Sep 30 12:38:26 crc kubenswrapper[4672]: I0930 12:38:26.058239 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/cf82624d-33e5-4298-8cd9-53ef50e87f12-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"cf82624d-33e5-4298-8cd9-53ef50e87f12\") " pod="openstack/prometheus-metric-storage-0" Sep 30 12:38:26 crc kubenswrapper[4672]: I0930 12:38:26.058300 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/cf82624d-33e5-4298-8cd9-53ef50e87f12-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"cf82624d-33e5-4298-8cd9-53ef50e87f12\") " pod="openstack/prometheus-metric-storage-0" Sep 30 12:38:26 crc kubenswrapper[4672]: I0930 12:38:26.058374 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf82624d-33e5-4298-8cd9-53ef50e87f12-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"cf82624d-33e5-4298-8cd9-53ef50e87f12\") " pod="openstack/prometheus-metric-storage-0" Sep 30 12:38:26 crc kubenswrapper[4672]: I0930 12:38:26.058405 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-590f2ab7-9519-4427-af7c-904176dbd8cb\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-590f2ab7-9519-4427-af7c-904176dbd8cb\") pod \"prometheus-metric-storage-0\" (UID: \"cf82624d-33e5-4298-8cd9-53ef50e87f12\") " pod="openstack/prometheus-metric-storage-0" Sep 30 12:38:26 crc kubenswrapper[4672]: I0930 12:38:26.058460 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/cf82624d-33e5-4298-8cd9-53ef50e87f12-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"cf82624d-33e5-4298-8cd9-53ef50e87f12\") " pod="openstack/prometheus-metric-storage-0" Sep 30 12:38:26 crc kubenswrapper[4672]: I0930 12:38:26.058489 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/cf82624d-33e5-4298-8cd9-53ef50e87f12-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"cf82624d-33e5-4298-8cd9-53ef50e87f12\") " pod="openstack/prometheus-metric-storage-0" Sep 30 12:38:26 crc kubenswrapper[4672]: I0930 12:38:26.160256 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/cf82624d-33e5-4298-8cd9-53ef50e87f12-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"cf82624d-33e5-4298-8cd9-53ef50e87f12\") " 
pod="openstack/prometheus-metric-storage-0" Sep 30 12:38:26 crc kubenswrapper[4672]: I0930 12:38:26.160329 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/cf82624d-33e5-4298-8cd9-53ef50e87f12-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"cf82624d-33e5-4298-8cd9-53ef50e87f12\") " pod="openstack/prometheus-metric-storage-0" Sep 30 12:38:26 crc kubenswrapper[4672]: I0930 12:38:26.160410 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/cf82624d-33e5-4298-8cd9-53ef50e87f12-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"cf82624d-33e5-4298-8cd9-53ef50e87f12\") " pod="openstack/prometheus-metric-storage-0" Sep 30 12:38:26 crc kubenswrapper[4672]: I0930 12:38:26.160440 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p9x6r\" (UniqueName: \"kubernetes.io/projected/cf82624d-33e5-4298-8cd9-53ef50e87f12-kube-api-access-p9x6r\") pod \"prometheus-metric-storage-0\" (UID: \"cf82624d-33e5-4298-8cd9-53ef50e87f12\") " pod="openstack/prometheus-metric-storage-0" Sep 30 12:38:26 crc kubenswrapper[4672]: I0930 12:38:26.160469 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/cf82624d-33e5-4298-8cd9-53ef50e87f12-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"cf82624d-33e5-4298-8cd9-53ef50e87f12\") " pod="openstack/prometheus-metric-storage-0" Sep 30 12:38:26 crc kubenswrapper[4672]: I0930 12:38:26.160495 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/cf82624d-33e5-4298-8cd9-53ef50e87f12-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"cf82624d-33e5-4298-8cd9-53ef50e87f12\") " pod="openstack/prometheus-metric-storage-0" Sep 30 12:38:26 crc kubenswrapper[4672]: I0930 12:38:26.160546 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/cf82624d-33e5-4298-8cd9-53ef50e87f12-config\") pod \"prometheus-metric-storage-0\" (UID: \"cf82624d-33e5-4298-8cd9-53ef50e87f12\") " pod="openstack/prometheus-metric-storage-0" Sep 30 12:38:26 crc kubenswrapper[4672]: I0930 12:38:26.160580 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/cf82624d-33e5-4298-8cd9-53ef50e87f12-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"cf82624d-33e5-4298-8cd9-53ef50e87f12\") " pod="openstack/prometheus-metric-storage-0" Sep 30 12:38:26 crc kubenswrapper[4672]: I0930 12:38:26.160620 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/cf82624d-33e5-4298-8cd9-53ef50e87f12-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"cf82624d-33e5-4298-8cd9-53ef50e87f12\") " pod="openstack/prometheus-metric-storage-0" Sep 30 12:38:26 crc kubenswrapper[4672]: I0930 12:38:26.160662 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/cf82624d-33e5-4298-8cd9-53ef50e87f12-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"cf82624d-33e5-4298-8cd9-53ef50e87f12\") " pod="openstack/prometheus-metric-storage-0" Sep 30 12:38:26 crc kubenswrapper[4672]: I0930 12:38:26.160692 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-590f2ab7-9519-4427-af7c-904176dbd8cb\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-590f2ab7-9519-4427-af7c-904176dbd8cb\") pod \"prometheus-metric-storage-0\" (UID: \"cf82624d-33e5-4298-8cd9-53ef50e87f12\") " pod="openstack/prometheus-metric-storage-0" Sep 30 12:38:26 crc kubenswrapper[4672]: I0930 12:38:26.161727 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/cf82624d-33e5-4298-8cd9-53ef50e87f12-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"cf82624d-33e5-4298-8cd9-53ef50e87f12\") " pod="openstack/prometheus-metric-storage-0" Sep 30 12:38:26 crc kubenswrapper[4672]: I0930 12:38:26.165509 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/cf82624d-33e5-4298-8cd9-53ef50e87f12-config\") pod \"prometheus-metric-storage-0\" (UID: \"cf82624d-33e5-4298-8cd9-53ef50e87f12\") " pod="openstack/prometheus-metric-storage-0" Sep 30 12:38:26 crc kubenswrapper[4672]: I0930 12:38:26.167892 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/cf82624d-33e5-4298-8cd9-53ef50e87f12-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"cf82624d-33e5-4298-8cd9-53ef50e87f12\") " pod="openstack/prometheus-metric-storage-0" Sep 30 12:38:26 crc kubenswrapper[4672]: I0930 12:38:26.169775 4672 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Sep 30 12:38:26 crc kubenswrapper[4672]: I0930 12:38:26.169822 4672 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-590f2ab7-9519-4427-af7c-904176dbd8cb\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-590f2ab7-9519-4427-af7c-904176dbd8cb\") pod \"prometheus-metric-storage-0\" (UID: \"cf82624d-33e5-4298-8cd9-53ef50e87f12\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/13f5ba172743275de48f8b63cc56ba623f099037d0437073c4c58e3661633e39/globalmount\"" pod="openstack/prometheus-metric-storage-0"
Sep 30 12:38:26 crc kubenswrapper[4672]: I0930 12:38:26.179442 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/cf82624d-33e5-4298-8cd9-53ef50e87f12-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"cf82624d-33e5-4298-8cd9-53ef50e87f12\") " pod="openstack/prometheus-metric-storage-0"
Sep 30 12:38:26 crc kubenswrapper[4672]: I0930 12:38:26.179794 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf82624d-33e5-4298-8cd9-53ef50e87f12-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"cf82624d-33e5-4298-8cd9-53ef50e87f12\") " pod="openstack/prometheus-metric-storage-0"
Sep 30 12:38:26 crc kubenswrapper[4672]: I0930 12:38:26.180296 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/cf82624d-33e5-4298-8cd9-53ef50e87f12-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"cf82624d-33e5-4298-8cd9-53ef50e87f12\") " pod="openstack/prometheus-metric-storage-0"
Sep 30 12:38:26 crc kubenswrapper[4672]: I0930 12:38:26.180362 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/cf82624d-33e5-4298-8cd9-53ef50e87f12-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"cf82624d-33e5-4298-8cd9-53ef50e87f12\") " pod="openstack/prometheus-metric-storage-0"
Sep 30 12:38:26 crc kubenswrapper[4672]: I0930 12:38:26.192185 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/cf82624d-33e5-4298-8cd9-53ef50e87f12-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"cf82624d-33e5-4298-8cd9-53ef50e87f12\") " pod="openstack/prometheus-metric-storage-0"
Sep 30 12:38:26 crc kubenswrapper[4672]: I0930 12:38:26.196986 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/cf82624d-33e5-4298-8cd9-53ef50e87f12-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"cf82624d-33e5-4298-8cd9-53ef50e87f12\") " pod="openstack/prometheus-metric-storage-0"
Sep 30 12:38:26 crc kubenswrapper[4672]: I0930 12:38:26.211070 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p9x6r\" (UniqueName: \"kubernetes.io/projected/cf82624d-33e5-4298-8cd9-53ef50e87f12-kube-api-access-p9x6r\") pod \"prometheus-metric-storage-0\" (UID: \"cf82624d-33e5-4298-8cd9-53ef50e87f12\") " pod="openstack/prometheus-metric-storage-0"
Sep 30 12:38:26 crc kubenswrapper[4672]: I0930 12:38:26.216055 4672 scope.go:117] "RemoveContainer" containerID="c4a50f78f9bd7603dd906100c0bad41391fe45e9900dbe4c685f8d553f3d68fa"
Sep 30 12:38:26 crc kubenswrapper[4672]: I0930 12:38:26.254398 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-vhs7r-config-r2gbp"
Sep 30 12:38:26 crc kubenswrapper[4672]: I0930 12:38:26.261194 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-590f2ab7-9519-4427-af7c-904176dbd8cb\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-590f2ab7-9519-4427-af7c-904176dbd8cb\") pod \"prometheus-metric-storage-0\" (UID: \"cf82624d-33e5-4298-8cd9-53ef50e87f12\") " pod="openstack/prometheus-metric-storage-0"
Sep 30 12:38:26 crc kubenswrapper[4672]: I0930 12:38:26.270855 4672 scope.go:117] "RemoveContainer" containerID="2934ad25de8c5191bc745d09dea09116aa584b50437cb4aa7ae6cb3d87af0794"
Sep 30 12:38:26 crc kubenswrapper[4672]: I0930 12:38:26.271191 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0"
Sep 30 12:38:26 crc kubenswrapper[4672]: I0930 12:38:26.274862 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-590f2ab7-9519-4427-af7c-904176dbd8cb\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-590f2ab7-9519-4427-af7c-904176dbd8cb\") pod \"prometheus-metric-storage-0\" (UID: \"cf82624d-33e5-4298-8cd9-53ef50e87f12\") " pod="openstack/prometheus-metric-storage-0"
Sep 30 12:38:26 crc kubenswrapper[4672]: I0930 12:38:26.315742 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-590f2ab7-9519-4427-af7c-904176dbd8cb\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-590f2ab7-9519-4427-af7c-904176dbd8cb\") pod \"prometheus-metric-storage-0\" (UID: \"cf82624d-33e5-4298-8cd9-53ef50e87f12\") " pod="openstack/prometheus-metric-storage-0"
Sep 30 12:38:26 crc kubenswrapper[4672]: I0930 12:38:26.316633 4672 scope.go:117] "RemoveContainer" containerID="baa06c426eba60d45465ff8502ff26b5d4b4ca2356b7db98bc5977e364fb8c66"
Sep 30 12:38:26 crc kubenswrapper[4672]: I0930 12:38:26.376898 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/0f4f502e-922b-483d-a736-067eaeed5569-additional-scripts\") pod \"0f4f502e-922b-483d-a736-067eaeed5569\" (UID: \"0f4f502e-922b-483d-a736-067eaeed5569\") "
Sep 30 12:38:26 crc kubenswrapper[4672]: I0930 12:38:26.377143 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jzs4p\" (UniqueName: \"kubernetes.io/projected/0f4f502e-922b-483d-a736-067eaeed5569-kube-api-access-jzs4p\") pod \"0f4f502e-922b-483d-a736-067eaeed5569\" (UID: \"0f4f502e-922b-483d-a736-067eaeed5569\") "
Sep 30 12:38:26 crc kubenswrapper[4672]: I0930 12:38:26.377255 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/0f4f502e-922b-483d-a736-067eaeed5569-var-run-ovn\") pod \"0f4f502e-922b-483d-a736-067eaeed5569\" (UID: \"0f4f502e-922b-483d-a736-067eaeed5569\") "
Sep 30 12:38:26 crc kubenswrapper[4672]: I0930 12:38:26.377325 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0f4f502e-922b-483d-a736-067eaeed5569-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "0f4f502e-922b-483d-a736-067eaeed5569" (UID: "0f4f502e-922b-483d-a736-067eaeed5569"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Sep 30 12:38:26 crc kubenswrapper[4672]: I0930 12:38:26.377350 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/0f4f502e-922b-483d-a736-067eaeed5569-var-run\") pod \"0f4f502e-922b-483d-a736-067eaeed5569\" (UID: \"0f4f502e-922b-483d-a736-067eaeed5569\") "
Sep 30 12:38:26 crc kubenswrapper[4672]: I0930 12:38:26.377388 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0f4f502e-922b-483d-a736-067eaeed5569-var-run" (OuterVolumeSpecName: "var-run") pod "0f4f502e-922b-483d-a736-067eaeed5569" (UID: "0f4f502e-922b-483d-a736-067eaeed5569"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Sep 30 12:38:26 crc kubenswrapper[4672]: I0930 12:38:26.377397 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/0f4f502e-922b-483d-a736-067eaeed5569-var-log-ovn\") pod \"0f4f502e-922b-483d-a736-067eaeed5569\" (UID: \"0f4f502e-922b-483d-a736-067eaeed5569\") "
Sep 30 12:38:26 crc kubenswrapper[4672]: I0930 12:38:26.377422 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0f4f502e-922b-483d-a736-067eaeed5569-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "0f4f502e-922b-483d-a736-067eaeed5569" (UID: "0f4f502e-922b-483d-a736-067eaeed5569"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Sep 30 12:38:26 crc kubenswrapper[4672]: I0930 12:38:26.377455 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0f4f502e-922b-483d-a736-067eaeed5569-scripts\") pod \"0f4f502e-922b-483d-a736-067eaeed5569\" (UID: \"0f4f502e-922b-483d-a736-067eaeed5569\") "
Sep 30 12:38:26 crc kubenswrapper[4672]: I0930 12:38:26.377715 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0f4f502e-922b-483d-a736-067eaeed5569-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "0f4f502e-922b-483d-a736-067eaeed5569" (UID: "0f4f502e-922b-483d-a736-067eaeed5569"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 12:38:26 crc kubenswrapper[4672]: I0930 12:38:26.378201 4672 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/0f4f502e-922b-483d-a736-067eaeed5569-additional-scripts\") on node \"crc\" DevicePath \"\""
Sep 30 12:38:26 crc kubenswrapper[4672]: I0930 12:38:26.378238 4672 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/0f4f502e-922b-483d-a736-067eaeed5569-var-run-ovn\") on node \"crc\" DevicePath \"\""
Sep 30 12:38:26 crc kubenswrapper[4672]: I0930 12:38:26.378251 4672 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/0f4f502e-922b-483d-a736-067eaeed5569-var-run\") on node \"crc\" DevicePath \"\""
Sep 30 12:38:26 crc kubenswrapper[4672]: I0930 12:38:26.378287 4672 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/0f4f502e-922b-483d-a736-067eaeed5569-var-log-ovn\") on node \"crc\" DevicePath \"\""
Sep 30 12:38:26 crc kubenswrapper[4672]: I0930 12:38:26.378402 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0f4f502e-922b-483d-a736-067eaeed5569-scripts" (OuterVolumeSpecName: "scripts") pod "0f4f502e-922b-483d-a736-067eaeed5569" (UID: "0f4f502e-922b-483d-a736-067eaeed5569"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 12:38:26 crc kubenswrapper[4672]: I0930 12:38:26.380369 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f4f502e-922b-483d-a736-067eaeed5569-kube-api-access-jzs4p" (OuterVolumeSpecName: "kube-api-access-jzs4p") pod "0f4f502e-922b-483d-a736-067eaeed5569" (UID: "0f4f502e-922b-483d-a736-067eaeed5569"). InnerVolumeSpecName "kube-api-access-jzs4p". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 12:38:26 crc kubenswrapper[4672]: I0930 12:38:26.479533 4672 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0f4f502e-922b-483d-a736-067eaeed5569-scripts\") on node \"crc\" DevicePath \"\""
Sep 30 12:38:26 crc kubenswrapper[4672]: I0930 12:38:26.479563 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jzs4p\" (UniqueName: \"kubernetes.io/projected/0f4f502e-922b-483d-a736-067eaeed5569-kube-api-access-jzs4p\") on node \"crc\" DevicePath \"\""
Sep 30 12:38:26 crc kubenswrapper[4672]: I0930 12:38:26.814896 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"]
Sep 30 12:38:26 crc kubenswrapper[4672]: W0930 12:38:26.828317 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcf82624d_33e5_4298_8cd9_53ef50e87f12.slice/crio-b32e02bce319b498b182bb6caa3a71d8c75d043cdbfa2f9863d6d1eaa1803627 WatchSource:0}: Error finding container b32e02bce319b498b182bb6caa3a71d8c75d043cdbfa2f9863d6d1eaa1803627: Status 404 returned error can't find the container with id b32e02bce319b498b182bb6caa3a71d8c75d043cdbfa2f9863d6d1eaa1803627
Sep 30 12:38:26 crc kubenswrapper[4672]: I0930 12:38:26.846889 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-vhs7r-config-r2gbp" event={"ID":"0f4f502e-922b-483d-a736-067eaeed5569","Type":"ContainerDied","Data":"359a14e1d21aefc376e56107933a489b7b51b56fcd9b5edd88a1df30b55db2eb"}
Sep 30 12:38:26 crc kubenswrapper[4672]: I0930 12:38:26.846951 4672 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="359a14e1d21aefc376e56107933a489b7b51b56fcd9b5edd88a1df30b55db2eb"
Sep 30 12:38:26 crc kubenswrapper[4672]: I0930 12:38:26.846917 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-vhs7r-config-r2gbp"
Sep 30 12:38:26 crc kubenswrapper[4672]: I0930 12:38:26.847971 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"cf82624d-33e5-4298-8cd9-53ef50e87f12","Type":"ContainerStarted","Data":"b32e02bce319b498b182bb6caa3a71d8c75d043cdbfa2f9863d6d1eaa1803627"}
Sep 30 12:38:26 crc kubenswrapper[4672]: I0930 12:38:26.850696 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"3cc34662-d100-4436-9067-c615b7b3f83f","Type":"ContainerStarted","Data":"378b68850f0d2d7813584e74c67fc0e0fd8ab2a56106e80c4801920c913592d8"}
Sep 30 12:38:26 crc kubenswrapper[4672]: I0930 12:38:26.850745 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"3cc34662-d100-4436-9067-c615b7b3f83f","Type":"ContainerStarted","Data":"090c73e9814fb142bccdbb9e3842a832d90609adae557e0a554cce3d8440c92e"}
Sep 30 12:38:27 crc kubenswrapper[4672]: I0930 12:38:27.356078 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-vhs7r-config-r2gbp"]
Sep 30 12:38:27 crc kubenswrapper[4672]: I0930 12:38:27.365507 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-vhs7r-config-r2gbp"]
Sep 30 12:38:27 crc kubenswrapper[4672]: I0930 12:38:27.429442 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0f4f502e-922b-483d-a736-067eaeed5569" path="/var/lib/kubelet/pods/0f4f502e-922b-483d-a736-067eaeed5569/volumes"
Sep 30 12:38:27 crc kubenswrapper[4672]: I0930 12:38:27.430194 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="854a642c-e6c7-4859-8667-b64f9b54a872" path="/var/lib/kubelet/pods/854a642c-e6c7-4859-8667-b64f9b54a872/volumes"
Sep 30 12:38:27 crc kubenswrapper[4672]: I0930 12:38:27.873558 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"3cc34662-d100-4436-9067-c615b7b3f83f","Type":"ContainerStarted","Data":"85f2f810f46653bc17fc739536f88abd01eaefb03f317bf4ac99e1bc618cb040"}
Sep 30 12:38:27 crc kubenswrapper[4672]: I0930 12:38:27.873615 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"3cc34662-d100-4436-9067-c615b7b3f83f","Type":"ContainerStarted","Data":"503466c3061ede6206b3d06d93a073e5af068d325e9a0b6a572a8f8d50d1a902"}
Sep 30 12:38:28 crc kubenswrapper[4672]: I0930 12:38:28.886728 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"3cc34662-d100-4436-9067-c615b7b3f83f","Type":"ContainerStarted","Data":"948efd70567ab6c4f10a4705641c07cc7fe2ffb268a77e98ca36ce705d8451a8"}
Sep 30 12:38:28 crc kubenswrapper[4672]: I0930 12:38:28.887057 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"3cc34662-d100-4436-9067-c615b7b3f83f","Type":"ContainerStarted","Data":"010376d2c2eae796cdbf380068cb7c7f114222d7e499ad8648b1a8353da47e6f"}
Sep 30 12:38:29 crc kubenswrapper[4672]: I0930 12:38:29.898174 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"cf82624d-33e5-4298-8cd9-53ef50e87f12","Type":"ContainerStarted","Data":"86da6325be6a9d6498e87f7cbf2c59e7de3ad5cf48b207dfdc31d82d1b8cf56d"}
Sep 30 12:38:29 crc kubenswrapper[4672]: I0930 12:38:29.909080 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"3cc34662-d100-4436-9067-c615b7b3f83f","Type":"ContainerStarted","Data":"19b4e5ada69eb11aa7e4f91330e41d235edd66a1ab7b63eca10ab6f94549a88d"}
Sep 30 12:38:29 crc kubenswrapper[4672]: I0930 12:38:29.909129 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"3cc34662-d100-4436-9067-c615b7b3f83f","Type":"ContainerStarted","Data":"93866d29d0a4578af8697b4a7872430310521df76905e86b581dee1fafe3c958"}
Sep 30 12:38:29 crc kubenswrapper[4672]: I0930 12:38:29.909141 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"3cc34662-d100-4436-9067-c615b7b3f83f","Type":"ContainerStarted","Data":"7c2b50d4cfc2a8a5d97de1447502762f791b23de9b5171b053908a7b079596ce"}
Sep 30 12:38:29 crc kubenswrapper[4672]: I0930 12:38:29.909150 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"3cc34662-d100-4436-9067-c615b7b3f83f","Type":"ContainerStarted","Data":"4a991f0160aad00ae695b49b25a468e43884b1f47eef81bc3d84785b592b445e"}
Sep 30 12:38:30 crc kubenswrapper[4672]: I0930 12:38:30.926972 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"3cc34662-d100-4436-9067-c615b7b3f83f","Type":"ContainerStarted","Data":"726f0a8f929ca7c6ad7ef2f6ed686c16356f11c4713493aaa78a4ba4e21e4f1e"}
Sep 30 12:38:30 crc kubenswrapper[4672]: I0930 12:38:30.976123 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=37.5938029 podStartE2EDuration="42.976099364s" podCreationTimestamp="2025-09-30 12:37:48 +0000 UTC" firstStartedPulling="2025-09-30 12:38:22.71885043 +0000 UTC m=+993.988088076" lastFinishedPulling="2025-09-30 12:38:28.101146894 +0000 UTC m=+999.370384540" observedRunningTime="2025-09-30 12:38:30.963497132 +0000 UTC m=+1002.232734778" watchObservedRunningTime="2025-09-30 12:38:30.976099364 +0000 UTC m=+1002.245337010"
Sep 30 12:38:31 crc kubenswrapper[4672]: I0930 12:38:31.238360 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6fcd88f6c5-l7hjg"]
Sep 30 12:38:31 crc kubenswrapper[4672]: E0930 12:38:31.238815 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f4f502e-922b-483d-a736-067eaeed5569" containerName="ovn-config"
Sep 30 12:38:31 crc kubenswrapper[4672]: I0930 12:38:31.238839 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f4f502e-922b-483d-a736-067eaeed5569" containerName="ovn-config"
Sep 30 12:38:31 crc kubenswrapper[4672]: I0930 12:38:31.239042 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f4f502e-922b-483d-a736-067eaeed5569" containerName="ovn-config"
Sep 30 12:38:31 crc kubenswrapper[4672]: I0930 12:38:31.240152 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6fcd88f6c5-l7hjg"
Sep 30 12:38:31 crc kubenswrapper[4672]: I0930 12:38:31.243063 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0"
Sep 30 12:38:31 crc kubenswrapper[4672]: I0930 12:38:31.260828 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6fcd88f6c5-l7hjg"]
Sep 30 12:38:31 crc kubenswrapper[4672]: I0930 12:38:31.370306 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a2fff124-9dc4-46ee-a5fa-9cef98b3eed9-dns-svc\") pod \"dnsmasq-dns-6fcd88f6c5-l7hjg\" (UID: \"a2fff124-9dc4-46ee-a5fa-9cef98b3eed9\") " pod="openstack/dnsmasq-dns-6fcd88f6c5-l7hjg"
Sep 30 12:38:31 crc kubenswrapper[4672]: I0930 12:38:31.370537 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a2fff124-9dc4-46ee-a5fa-9cef98b3eed9-config\") pod \"dnsmasq-dns-6fcd88f6c5-l7hjg\" (UID: \"a2fff124-9dc4-46ee-a5fa-9cef98b3eed9\") " pod="openstack/dnsmasq-dns-6fcd88f6c5-l7hjg"
Sep 30 12:38:31 crc kubenswrapper[4672]: I0930 12:38:31.370583 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a2fff124-9dc4-46ee-a5fa-9cef98b3eed9-ovsdbserver-sb\") pod \"dnsmasq-dns-6fcd88f6c5-l7hjg\" (UID: \"a2fff124-9dc4-46ee-a5fa-9cef98b3eed9\") " pod="openstack/dnsmasq-dns-6fcd88f6c5-l7hjg"
Sep 30 12:38:31 crc kubenswrapper[4672]: I0930 12:38:31.370709 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a2fff124-9dc4-46ee-a5fa-9cef98b3eed9-ovsdbserver-nb\") pod \"dnsmasq-dns-6fcd88f6c5-l7hjg\" (UID: \"a2fff124-9dc4-46ee-a5fa-9cef98b3eed9\") " pod="openstack/dnsmasq-dns-6fcd88f6c5-l7hjg"
Sep 30 12:38:31 crc kubenswrapper[4672]: I0930 12:38:31.370734 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a2fff124-9dc4-46ee-a5fa-9cef98b3eed9-dns-swift-storage-0\") pod \"dnsmasq-dns-6fcd88f6c5-l7hjg\" (UID: \"a2fff124-9dc4-46ee-a5fa-9cef98b3eed9\") " pod="openstack/dnsmasq-dns-6fcd88f6c5-l7hjg"
Sep 30 12:38:31 crc kubenswrapper[4672]: I0930 12:38:31.370768 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tpntp\" (UniqueName: \"kubernetes.io/projected/a2fff124-9dc4-46ee-a5fa-9cef98b3eed9-kube-api-access-tpntp\") pod \"dnsmasq-dns-6fcd88f6c5-l7hjg\" (UID: \"a2fff124-9dc4-46ee-a5fa-9cef98b3eed9\") " pod="openstack/dnsmasq-dns-6fcd88f6c5-l7hjg"
Sep 30 12:38:31 crc kubenswrapper[4672]: I0930 12:38:31.471789 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a2fff124-9dc4-46ee-a5fa-9cef98b3eed9-dns-svc\") pod \"dnsmasq-dns-6fcd88f6c5-l7hjg\" (UID: \"a2fff124-9dc4-46ee-a5fa-9cef98b3eed9\") " pod="openstack/dnsmasq-dns-6fcd88f6c5-l7hjg"
Sep 30 12:38:31 crc kubenswrapper[4672]: I0930 12:38:31.471834 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a2fff124-9dc4-46ee-a5fa-9cef98b3eed9-config\") pod \"dnsmasq-dns-6fcd88f6c5-l7hjg\" (UID: \"a2fff124-9dc4-46ee-a5fa-9cef98b3eed9\") " pod="openstack/dnsmasq-dns-6fcd88f6c5-l7hjg"
Sep 30 12:38:31 crc kubenswrapper[4672]: I0930 12:38:31.471880 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a2fff124-9dc4-46ee-a5fa-9cef98b3eed9-ovsdbserver-sb\") pod \"dnsmasq-dns-6fcd88f6c5-l7hjg\" (UID: \"a2fff124-9dc4-46ee-a5fa-9cef98b3eed9\") " pod="openstack/dnsmasq-dns-6fcd88f6c5-l7hjg"
Sep 30 12:38:31 crc kubenswrapper[4672]: I0930 12:38:31.472734 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a2fff124-9dc4-46ee-a5fa-9cef98b3eed9-ovsdbserver-nb\") pod \"dnsmasq-dns-6fcd88f6c5-l7hjg\" (UID: \"a2fff124-9dc4-46ee-a5fa-9cef98b3eed9\") " pod="openstack/dnsmasq-dns-6fcd88f6c5-l7hjg"
Sep 30 12:38:31 crc kubenswrapper[4672]: I0930 12:38:31.472773 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a2fff124-9dc4-46ee-a5fa-9cef98b3eed9-dns-swift-storage-0\") pod \"dnsmasq-dns-6fcd88f6c5-l7hjg\" (UID: \"a2fff124-9dc4-46ee-a5fa-9cef98b3eed9\") " pod="openstack/dnsmasq-dns-6fcd88f6c5-l7hjg"
Sep 30 12:38:31 crc kubenswrapper[4672]: I0930 12:38:31.472853 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tpntp\" (UniqueName: \"kubernetes.io/projected/a2fff124-9dc4-46ee-a5fa-9cef98b3eed9-kube-api-access-tpntp\") pod \"dnsmasq-dns-6fcd88f6c5-l7hjg\" (UID: \"a2fff124-9dc4-46ee-a5fa-9cef98b3eed9\") " pod="openstack/dnsmasq-dns-6fcd88f6c5-l7hjg"
Sep 30 12:38:31 crc kubenswrapper[4672]: I0930 12:38:31.473330 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a2fff124-9dc4-46ee-a5fa-9cef98b3eed9-config\") pod \"dnsmasq-dns-6fcd88f6c5-l7hjg\" (UID: \"a2fff124-9dc4-46ee-a5fa-9cef98b3eed9\") " pod="openstack/dnsmasq-dns-6fcd88f6c5-l7hjg"
Sep 30 12:38:31 crc kubenswrapper[4672]: I0930 12:38:31.473564 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a2fff124-9dc4-46ee-a5fa-9cef98b3eed9-ovsdbserver-nb\") pod \"dnsmasq-dns-6fcd88f6c5-l7hjg\" (UID: \"a2fff124-9dc4-46ee-a5fa-9cef98b3eed9\") " pod="openstack/dnsmasq-dns-6fcd88f6c5-l7hjg"
Sep 30 12:38:31 crc kubenswrapper[4672]: I0930 12:38:31.473584 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a2fff124-9dc4-46ee-a5fa-9cef98b3eed9-dns-swift-storage-0\") pod \"dnsmasq-dns-6fcd88f6c5-l7hjg\" (UID: \"a2fff124-9dc4-46ee-a5fa-9cef98b3eed9\") " pod="openstack/dnsmasq-dns-6fcd88f6c5-l7hjg"
Sep 30 12:38:31 crc kubenswrapper[4672]: I0930 12:38:31.473592 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a2fff124-9dc4-46ee-a5fa-9cef98b3eed9-dns-svc\") pod \"dnsmasq-dns-6fcd88f6c5-l7hjg\" (UID: \"a2fff124-9dc4-46ee-a5fa-9cef98b3eed9\") " pod="openstack/dnsmasq-dns-6fcd88f6c5-l7hjg"
Sep 30 12:38:31 crc kubenswrapper[4672]: I0930 12:38:31.474159 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a2fff124-9dc4-46ee-a5fa-9cef98b3eed9-ovsdbserver-sb\") pod \"dnsmasq-dns-6fcd88f6c5-l7hjg\" (UID: \"a2fff124-9dc4-46ee-a5fa-9cef98b3eed9\") " pod="openstack/dnsmasq-dns-6fcd88f6c5-l7hjg"
Sep 30 12:38:31 crc kubenswrapper[4672]: I0930 12:38:31.493094 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tpntp\" (UniqueName: \"kubernetes.io/projected/a2fff124-9dc4-46ee-a5fa-9cef98b3eed9-kube-api-access-tpntp\") pod \"dnsmasq-dns-6fcd88f6c5-l7hjg\" (UID: \"a2fff124-9dc4-46ee-a5fa-9cef98b3eed9\") " pod="openstack/dnsmasq-dns-6fcd88f6c5-l7hjg"
Sep 30 12:38:31 crc kubenswrapper[4672]: I0930 12:38:31.560854 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6fcd88f6c5-l7hjg"
Sep 30 12:38:32 crc kubenswrapper[4672]: I0930 12:38:32.514253 4672 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="2d795c24-8697-461f-9322-2c23bf7cb49b" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.106:5671: connect: connection refused"
Sep 30 12:38:32 crc kubenswrapper[4672]: I0930 12:38:32.920842 4672 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-notifications-server-0" podUID="d165a3a8-6809-46e5-bd35-895200ab5bfc" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.107:5671: connect: connection refused"
Sep 30 12:38:33 crc kubenswrapper[4672]: I0930 12:38:33.268576 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0"
Sep 30 12:38:33 crc kubenswrapper[4672]: I0930 12:38:33.647195 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-f2ltm"]
Sep 30 12:38:33 crc kubenswrapper[4672]: I0930 12:38:33.651959 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-f2ltm"
Sep 30 12:38:33 crc kubenswrapper[4672]: I0930 12:38:33.664708 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-f2ltm"]
Sep 30 12:38:33 crc kubenswrapper[4672]: I0930 12:38:33.724954 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h5lwj\" (UniqueName: \"kubernetes.io/projected/f636d1ae-c979-4ed4-b0f2-0a0000504e56-kube-api-access-h5lwj\") pod \"barbican-db-create-f2ltm\" (UID: \"f636d1ae-c979-4ed4-b0f2-0a0000504e56\") " pod="openstack/barbican-db-create-f2ltm"
Sep 30 12:38:33 crc kubenswrapper[4672]: I0930 12:38:33.757334 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-sfwqd"]
Sep 30 12:38:33 crc kubenswrapper[4672]: I0930 12:38:33.758723 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-sfwqd"
Sep 30 12:38:33 crc kubenswrapper[4672]: I0930 12:38:33.784957 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-sfwqd"]
Sep 30 12:38:33 crc kubenswrapper[4672]: I0930 12:38:33.836870 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h5lwj\" (UniqueName: \"kubernetes.io/projected/f636d1ae-c979-4ed4-b0f2-0a0000504e56-kube-api-access-h5lwj\") pod \"barbican-db-create-f2ltm\" (UID: \"f636d1ae-c979-4ed4-b0f2-0a0000504e56\") " pod="openstack/barbican-db-create-f2ltm"
Sep 30 12:38:33 crc kubenswrapper[4672]: I0930 12:38:33.836981 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zn5pn\" (UniqueName: \"kubernetes.io/projected/777dfdaa-1eff-4e53-85bb-03c7912d1b86-kube-api-access-zn5pn\") pod \"cinder-db-create-sfwqd\" (UID: \"777dfdaa-1eff-4e53-85bb-03c7912d1b86\") " pod="openstack/cinder-db-create-sfwqd"
Sep 30 12:38:33 crc kubenswrapper[4672]: I0930 12:38:33.879388 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h5lwj\" (UniqueName: \"kubernetes.io/projected/f636d1ae-c979-4ed4-b0f2-0a0000504e56-kube-api-access-h5lwj\") pod \"barbican-db-create-f2ltm\" (UID: \"f636d1ae-c979-4ed4-b0f2-0a0000504e56\") " pod="openstack/barbican-db-create-f2ltm"
Sep 30 12:38:33 crc kubenswrapper[4672]: I0930 12:38:33.894753 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-hv754"]
Sep 30 12:38:33 crc kubenswrapper[4672]: I0930 12:38:33.896637 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-hv754"
Sep 30 12:38:33 crc kubenswrapper[4672]: I0930 12:38:33.901735 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-hv754"]
Sep 30 12:38:33 crc kubenswrapper[4672]: I0930 12:38:33.954906 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-hl8kd"]
Sep 30 12:38:33 crc kubenswrapper[4672]: I0930 12:38:33.959029 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-hl8kd"
Sep 30 12:38:33 crc kubenswrapper[4672]: I0930 12:38:33.960402 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zn5pn\" (UniqueName: \"kubernetes.io/projected/777dfdaa-1eff-4e53-85bb-03c7912d1b86-kube-api-access-zn5pn\") pod \"cinder-db-create-sfwqd\" (UID: \"777dfdaa-1eff-4e53-85bb-03c7912d1b86\") " pod="openstack/cinder-db-create-sfwqd"
Sep 30 12:38:33 crc kubenswrapper[4672]: I0930 12:38:33.960796 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8p4g4\" (UniqueName: \"kubernetes.io/projected/6a2caae2-f3d5-4aa5-8b69-cc14b1cb88fc-kube-api-access-8p4g4\") pod \"neutron-db-create-hv754\" (UID: \"6a2caae2-f3d5-4aa5-8b69-cc14b1cb88fc\") " pod="openstack/neutron-db-create-hv754"
Sep 30 12:38:33 crc kubenswrapper[4672]: I0930 12:38:33.964917 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Sep 30 12:38:33 crc kubenswrapper[4672]: I0930 12:38:33.965159 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Sep 30 12:38:33 crc kubenswrapper[4672]: I0930 12:38:33.966461 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-4nbwb"
Sep 30 12:38:33 crc kubenswrapper[4672]: I0930 12:38:33.966576 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Sep 30 12:38:33 crc kubenswrapper[4672]: I0930 12:38:33.973925 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-f2ltm"
Sep 30 12:38:33 crc kubenswrapper[4672]: I0930 12:38:33.990083 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-hl8kd"]
Sep 30 12:38:33 crc kubenswrapper[4672]: I0930 12:38:33.999007 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zn5pn\" (UniqueName: \"kubernetes.io/projected/777dfdaa-1eff-4e53-85bb-03c7912d1b86-kube-api-access-zn5pn\") pod \"cinder-db-create-sfwqd\" (UID: \"777dfdaa-1eff-4e53-85bb-03c7912d1b86\") " pod="openstack/cinder-db-create-sfwqd"
Sep 30 12:38:34 crc kubenswrapper[4672]: I0930 12:38:34.062053 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee403f37-87a0-4068-b310-f178eb87ddf4-combined-ca-bundle\") pod \"keystone-db-sync-hl8kd\" (UID: \"ee403f37-87a0-4068-b310-f178eb87ddf4\") " pod="openstack/keystone-db-sync-hl8kd"
Sep 30 12:38:34 crc kubenswrapper[4672]: I0930 12:38:34.062097 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h6zwt\" (UniqueName: \"kubernetes.io/projected/ee403f37-87a0-4068-b310-f178eb87ddf4-kube-api-access-h6zwt\") pod \"keystone-db-sync-hl8kd\" (UID: \"ee403f37-87a0-4068-b310-f178eb87ddf4\") " pod="openstack/keystone-db-sync-hl8kd"
Sep 30 12:38:34 crc kubenswrapper[4672]: I0930 12:38:34.062220 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8p4g4\" (UniqueName: \"kubernetes.io/projected/6a2caae2-f3d5-4aa5-8b69-cc14b1cb88fc-kube-api-access-8p4g4\") pod \"neutron-db-create-hv754\" (UID: \"6a2caae2-f3d5-4aa5-8b69-cc14b1cb88fc\") " pod="openstack/neutron-db-create-hv754"
Sep 30 12:38:34 crc kubenswrapper[4672]: I0930 12:38:34.062247 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee403f37-87a0-4068-b310-f178eb87ddf4-config-data\") pod \"keystone-db-sync-hl8kd\" (UID: \"ee403f37-87a0-4068-b310-f178eb87ddf4\") " pod="openstack/keystone-db-sync-hl8kd"
Sep 30 12:38:34 crc kubenswrapper[4672]: I0930 12:38:34.081093 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8p4g4\" (UniqueName: \"kubernetes.io/projected/6a2caae2-f3d5-4aa5-8b69-cc14b1cb88fc-kube-api-access-8p4g4\") pod \"neutron-db-create-hv754\" (UID: \"6a2caae2-f3d5-4aa5-8b69-cc14b1cb88fc\") " pod="openstack/neutron-db-create-hv754"
Sep 30 12:38:34 crc kubenswrapper[4672]: I0930 12:38:34.083781 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-sfwqd"
Sep 30 12:38:34 crc kubenswrapper[4672]: I0930 12:38:34.163320 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee403f37-87a0-4068-b310-f178eb87ddf4-config-data\") pod \"keystone-db-sync-hl8kd\" (UID: \"ee403f37-87a0-4068-b310-f178eb87ddf4\") " pod="openstack/keystone-db-sync-hl8kd"
Sep 30 12:38:34 crc kubenswrapper[4672]: I0930 12:38:34.163773 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee403f37-87a0-4068-b310-f178eb87ddf4-combined-ca-bundle\") pod \"keystone-db-sync-hl8kd\" (UID: \"ee403f37-87a0-4068-b310-f178eb87ddf4\") " pod="openstack/keystone-db-sync-hl8kd"
Sep 30 12:38:34 crc kubenswrapper[4672]: I0930 12:38:34.163796 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h6zwt\" (UniqueName: \"kubernetes.io/projected/ee403f37-87a0-4068-b310-f178eb87ddf4-kube-api-access-h6zwt\") pod \"keystone-db-sync-hl8kd\" (UID: \"ee403f37-87a0-4068-b310-f178eb87ddf4\") " pod="openstack/keystone-db-sync-hl8kd"
Sep 30 12:38:34 crc kubenswrapper[4672]: I0930 12:38:34.167092 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee403f37-87a0-4068-b310-f178eb87ddf4-config-data\") pod \"keystone-db-sync-hl8kd\" (UID: \"ee403f37-87a0-4068-b310-f178eb87ddf4\") " pod="openstack/keystone-db-sync-hl8kd"
Sep 30 12:38:34 crc kubenswrapper[4672]: I0930 12:38:34.172031 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee403f37-87a0-4068-b310-f178eb87ddf4-combined-ca-bundle\") pod \"keystone-db-sync-hl8kd\" (UID: \"ee403f37-87a0-4068-b310-f178eb87ddf4\") " pod="openstack/keystone-db-sync-hl8kd"
Sep 30 12:38:34 crc kubenswrapper[4672]: I0930 12:38:34.183955 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h6zwt\" (UniqueName: \"kubernetes.io/projected/ee403f37-87a0-4068-b310-f178eb87ddf4-kube-api-access-h6zwt\") pod \"keystone-db-sync-hl8kd\" (UID: \"ee403f37-87a0-4068-b310-f178eb87ddf4\") " pod="openstack/keystone-db-sync-hl8kd"
Sep 30 12:38:34 crc kubenswrapper[4672]: I0930 12:38:34.260123 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-hv754"
Sep 30 12:38:34 crc kubenswrapper[4672]: I0930 12:38:34.276107 4672 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/keystone-db-sync-hl8kd" Sep 30 12:38:37 crc kubenswrapper[4672]: I0930 12:38:37.028007 4672 generic.go:334] "Generic (PLEG): container finished" podID="cf82624d-33e5-4298-8cd9-53ef50e87f12" containerID="86da6325be6a9d6498e87f7cbf2c59e7de3ad5cf48b207dfdc31d82d1b8cf56d" exitCode=0 Sep 30 12:38:37 crc kubenswrapper[4672]: I0930 12:38:37.028311 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"cf82624d-33e5-4298-8cd9-53ef50e87f12","Type":"ContainerDied","Data":"86da6325be6a9d6498e87f7cbf2c59e7de3ad5cf48b207dfdc31d82d1b8cf56d"} Sep 30 12:38:40 crc kubenswrapper[4672]: I0930 12:38:40.068518 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"cf82624d-33e5-4298-8cd9-53ef50e87f12","Type":"ContainerStarted","Data":"1a79e4562c4e332c07785fa3a46d2c31bee965d0a2cf8e35e3549ba2e4379334"} Sep 30 12:38:40 crc kubenswrapper[4672]: I0930 12:38:40.071845 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-4ddgl" event={"ID":"3cb781ee-9755-4258-bd4a-165461961834","Type":"ContainerStarted","Data":"af2978cc2b1acc0e874ae9b3657f79fdacb65e72f70dffff3bbf43f95a861057"} Sep 30 12:38:40 crc kubenswrapper[4672]: I0930 12:38:40.099699 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-4ddgl" podStartSLOduration=2.5365666190000002 podStartE2EDuration="18.099681065s" podCreationTimestamp="2025-09-30 12:38:22 +0000 UTC" firstStartedPulling="2025-09-30 12:38:23.610838844 +0000 UTC m=+994.880076490" lastFinishedPulling="2025-09-30 12:38:39.17395329 +0000 UTC m=+1010.443190936" observedRunningTime="2025-09-30 12:38:40.092762478 +0000 UTC m=+1011.362000154" watchObservedRunningTime="2025-09-30 12:38:40.099681065 +0000 UTC m=+1011.368918711" Sep 30 12:38:40 crc kubenswrapper[4672]: I0930 12:38:40.104203 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6fcd88f6c5-l7hjg"] Sep 30 12:38:40 crc kubenswrapper[4672]: W0930 12:38:40.105682 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda2fff124_9dc4_46ee_a5fa_9cef98b3eed9.slice/crio-92aaeccd6de8cec79d0e74405638807fc0a2003a87299f42d3e40a1c18614d6d WatchSource:0}: Error finding container 92aaeccd6de8cec79d0e74405638807fc0a2003a87299f42d3e40a1c18614d6d: Status 404 returned error can't find the container with id 92aaeccd6de8cec79d0e74405638807fc0a2003a87299f42d3e40a1c18614d6d Sep 30 12:38:40 crc kubenswrapper[4672]: I0930 12:38:40.225689 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-sfwqd"] Sep 30 12:38:40 crc kubenswrapper[4672]: I0930 12:38:40.234429 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-hv754"] Sep 30 12:38:40 crc kubenswrapper[4672]: W0930 12:38:40.238789 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podee403f37_87a0_4068_b310_f178eb87ddf4.slice/crio-9202d2237377a9adbc3cfde87afa0b5f65f42a1f6fca2e514ce018b01372eb5a WatchSource:0}: Error finding container 9202d2237377a9adbc3cfde87afa0b5f65f42a1f6fca2e514ce018b01372eb5a: Status 404 returned error can't find the container with id 9202d2237377a9adbc3cfde87afa0b5f65f42a1f6fca2e514ce018b01372eb5a Sep 30 12:38:40 crc kubenswrapper[4672]: I0930 12:38:40.243811 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/keystone-db-sync-hl8kd"] Sep 30 12:38:40 crc kubenswrapper[4672]: I0930 12:38:40.251955 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-f2ltm"] Sep 30 12:38:41 crc kubenswrapper[4672]: I0930 12:38:41.087094 4672 generic.go:334] "Generic (PLEG): container finished" podID="777dfdaa-1eff-4e53-85bb-03c7912d1b86" containerID="7b4d8e172b6acb6370418c6e94dc1b4ab74e6ab4143913675935e49352dd87c8" exitCode=0 Sep 30 12:38:41 crc kubenswrapper[4672]: I0930 12:38:41.087206 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-sfwqd" event={"ID":"777dfdaa-1eff-4e53-85bb-03c7912d1b86","Type":"ContainerDied","Data":"7b4d8e172b6acb6370418c6e94dc1b4ab74e6ab4143913675935e49352dd87c8"} Sep 30 12:38:41 crc kubenswrapper[4672]: I0930 12:38:41.087533 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-sfwqd" event={"ID":"777dfdaa-1eff-4e53-85bb-03c7912d1b86","Type":"ContainerStarted","Data":"cf806bf0703049b3c1644a47a4e7d82591a1838c6f9dd28e5bd66d68ed13883b"} Sep 30 12:38:41 crc kubenswrapper[4672]: I0930 12:38:41.088939 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-hl8kd" event={"ID":"ee403f37-87a0-4068-b310-f178eb87ddf4","Type":"ContainerStarted","Data":"9202d2237377a9adbc3cfde87afa0b5f65f42a1f6fca2e514ce018b01372eb5a"} Sep 30 12:38:41 crc kubenswrapper[4672]: I0930 12:38:41.097667 4672 generic.go:334] "Generic (PLEG): container finished" podID="f636d1ae-c979-4ed4-b0f2-0a0000504e56" containerID="343d7a17c97fa9b0d84265f7cd3a3f7d50155c98e64836dcbcf872159384b2e2" exitCode=0 Sep 30 12:38:41 crc kubenswrapper[4672]: I0930 12:38:41.097829 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-f2ltm" event={"ID":"f636d1ae-c979-4ed4-b0f2-0a0000504e56","Type":"ContainerDied","Data":"343d7a17c97fa9b0d84265f7cd3a3f7d50155c98e64836dcbcf872159384b2e2"} Sep 30 12:38:41 crc kubenswrapper[4672]: I0930 12:38:41.097874 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-f2ltm" event={"ID":"f636d1ae-c979-4ed4-b0f2-0a0000504e56","Type":"ContainerStarted","Data":"6873df4639686c485952db2345e9e6ce6c4297b2ac7e0c784cd9f9a140e5b90d"} Sep 30 12:38:41 crc kubenswrapper[4672]: I0930 12:38:41.104524 4672 generic.go:334] "Generic (PLEG): container finished" podID="6a2caae2-f3d5-4aa5-8b69-cc14b1cb88fc" containerID="16e69782ae512be3a27f3c323a122b936e1ffd08f6941ff938944265a3ed4d98" exitCode=0 Sep 30 12:38:41 crc kubenswrapper[4672]: I0930 12:38:41.104611 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-hv754" event={"ID":"6a2caae2-f3d5-4aa5-8b69-cc14b1cb88fc","Type":"ContainerDied","Data":"16e69782ae512be3a27f3c323a122b936e1ffd08f6941ff938944265a3ed4d98"} Sep 30 12:38:41 crc kubenswrapper[4672]: I0930 12:38:41.104648 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-hv754" event={"ID":"6a2caae2-f3d5-4aa5-8b69-cc14b1cb88fc","Type":"ContainerStarted","Data":"5161560ca7f41f53375c1cc5de4e4cf03585f1630da901730021517d6fe44604"} Sep 30 12:38:41 crc kubenswrapper[4672]: I0930 12:38:41.117233 4672 generic.go:334] "Generic (PLEG): container finished" podID="a2fff124-9dc4-46ee-a5fa-9cef98b3eed9" containerID="f0a4d1c0737f0f60f37572beb59a93b16e9d23c9eca3776125c61425326c05f1" exitCode=0 Sep 30 12:38:41 crc kubenswrapper[4672]: I0930 12:38:41.117475 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6fcd88f6c5-l7hjg" 
event={"ID":"a2fff124-9dc4-46ee-a5fa-9cef98b3eed9","Type":"ContainerDied","Data":"f0a4d1c0737f0f60f37572beb59a93b16e9d23c9eca3776125c61425326c05f1"} Sep 30 12:38:41 crc kubenswrapper[4672]: I0930 12:38:41.117557 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6fcd88f6c5-l7hjg" event={"ID":"a2fff124-9dc4-46ee-a5fa-9cef98b3eed9","Type":"ContainerStarted","Data":"92aaeccd6de8cec79d0e74405638807fc0a2003a87299f42d3e40a1c18614d6d"} Sep 30 12:38:42 crc kubenswrapper[4672]: I0930 12:38:42.133758 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6fcd88f6c5-l7hjg" event={"ID":"a2fff124-9dc4-46ee-a5fa-9cef98b3eed9","Type":"ContainerStarted","Data":"48aa12cd375c302528a88918d0cb09ef65b3c49af123b8758dbbb074de3bae10"} Sep 30 12:38:42 crc kubenswrapper[4672]: I0930 12:38:42.134295 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6fcd88f6c5-l7hjg" Sep 30 12:38:42 crc kubenswrapper[4672]: I0930 12:38:42.139891 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"cf82624d-33e5-4298-8cd9-53ef50e87f12","Type":"ContainerStarted","Data":"47aaeacfe5c09a9093d0d2b569d0c92744a18d97edfb1848c3bafa3e6462e1ef"} Sep 30 12:38:42 crc kubenswrapper[4672]: I0930 12:38:42.171500 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6fcd88f6c5-l7hjg" podStartSLOduration=11.171479909 podStartE2EDuration="11.171479909s" podCreationTimestamp="2025-09-30 12:38:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 12:38:42.16292361 +0000 UTC m=+1013.432161266" watchObservedRunningTime="2025-09-30 12:38:42.171479909 +0000 UTC m=+1013.440717555" Sep 30 12:38:42 crc kubenswrapper[4672]: I0930 12:38:42.517456 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Sep 30 12:38:42 crc kubenswrapper[4672]: I0930 12:38:42.922657 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-notifications-server-0" Sep 30 12:38:43 crc kubenswrapper[4672]: I0930 12:38:43.909725 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-db-sync-2w94k"] Sep 30 12:38:43 crc kubenswrapper[4672]: I0930 12:38:43.911399 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-db-sync-2w94k" Sep 30 12:38:43 crc kubenswrapper[4672]: I0930 12:38:43.915788 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-config-data" Sep 30 12:38:43 crc kubenswrapper[4672]: I0930 12:38:43.916722 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-watcher-dockercfg-cndwx" Sep 30 12:38:43 crc kubenswrapper[4672]: I0930 12:38:43.947389 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/728a5c57-a6b8-4201-bbff-34f2eee54b1a-combined-ca-bundle\") pod \"watcher-db-sync-2w94k\" (UID: \"728a5c57-a6b8-4201-bbff-34f2eee54b1a\") " pod="openstack/watcher-db-sync-2w94k" Sep 30 12:38:43 crc kubenswrapper[4672]: I0930 12:38:43.947426 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c7sfb\" (UniqueName: \"kubernetes.io/projected/728a5c57-a6b8-4201-bbff-34f2eee54b1a-kube-api-access-c7sfb\") pod \"watcher-db-sync-2w94k\" (UID: \"728a5c57-a6b8-4201-bbff-34f2eee54b1a\") " pod="openstack/watcher-db-sync-2w94k" Sep 30 12:38:43 crc kubenswrapper[4672]: I0930 12:38:43.947464 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/728a5c57-a6b8-4201-bbff-34f2eee54b1a-config-data\") pod \"watcher-db-sync-2w94k\" (UID: \"728a5c57-a6b8-4201-bbff-34f2eee54b1a\") " pod="openstack/watcher-db-sync-2w94k" Sep 30 12:38:43 crc kubenswrapper[4672]: I0930 12:38:43.947817 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/728a5c57-a6b8-4201-bbff-34f2eee54b1a-db-sync-config-data\") pod \"watcher-db-sync-2w94k\" (UID: \"728a5c57-a6b8-4201-bbff-34f2eee54b1a\") " pod="openstack/watcher-db-sync-2w94k" Sep 30 12:38:43 crc kubenswrapper[4672]: I0930 12:38:43.953340 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-db-sync-2w94k"] Sep 30 12:38:44 crc kubenswrapper[4672]: I0930 12:38:44.048944 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/728a5c57-a6b8-4201-bbff-34f2eee54b1a-combined-ca-bundle\") pod \"watcher-db-sync-2w94k\" (UID: \"728a5c57-a6b8-4201-bbff-34f2eee54b1a\") " pod="openstack/watcher-db-sync-2w94k" Sep 30 12:38:44 crc kubenswrapper[4672]: I0930 12:38:44.048978 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c7sfb\" (UniqueName: \"kubernetes.io/projected/728a5c57-a6b8-4201-bbff-34f2eee54b1a-kube-api-access-c7sfb\") pod \"watcher-db-sync-2w94k\" (UID: \"728a5c57-a6b8-4201-bbff-34f2eee54b1a\") " pod="openstack/watcher-db-sync-2w94k" Sep 30 12:38:44 crc kubenswrapper[4672]: I0930 12:38:44.049020 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/728a5c57-a6b8-4201-bbff-34f2eee54b1a-config-data\") pod \"watcher-db-sync-2w94k\" (UID: \"728a5c57-a6b8-4201-bbff-34f2eee54b1a\") " pod="openstack/watcher-db-sync-2w94k" Sep 30 12:38:44 crc kubenswrapper[4672]: I0930 12:38:44.049084 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/728a5c57-a6b8-4201-bbff-34f2eee54b1a-db-sync-config-data\") 
pod \"watcher-db-sync-2w94k\" (UID: \"728a5c57-a6b8-4201-bbff-34f2eee54b1a\") " pod="openstack/watcher-db-sync-2w94k" Sep 30 12:38:44 crc kubenswrapper[4672]: I0930 12:38:44.054547 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/728a5c57-a6b8-4201-bbff-34f2eee54b1a-combined-ca-bundle\") pod \"watcher-db-sync-2w94k\" (UID: \"728a5c57-a6b8-4201-bbff-34f2eee54b1a\") " pod="openstack/watcher-db-sync-2w94k" Sep 30 12:38:44 crc kubenswrapper[4672]: I0930 12:38:44.057088 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/728a5c57-a6b8-4201-bbff-34f2eee54b1a-config-data\") pod \"watcher-db-sync-2w94k\" (UID: \"728a5c57-a6b8-4201-bbff-34f2eee54b1a\") " pod="openstack/watcher-db-sync-2w94k" Sep 30 12:38:44 crc kubenswrapper[4672]: I0930 12:38:44.062676 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/728a5c57-a6b8-4201-bbff-34f2eee54b1a-db-sync-config-data\") pod \"watcher-db-sync-2w94k\" (UID: \"728a5c57-a6b8-4201-bbff-34f2eee54b1a\") " pod="openstack/watcher-db-sync-2w94k" Sep 30 12:38:44 crc kubenswrapper[4672]: I0930 12:38:44.070734 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c7sfb\" (UniqueName: \"kubernetes.io/projected/728a5c57-a6b8-4201-bbff-34f2eee54b1a-kube-api-access-c7sfb\") pod \"watcher-db-sync-2w94k\" (UID: \"728a5c57-a6b8-4201-bbff-34f2eee54b1a\") " pod="openstack/watcher-db-sync-2w94k" Sep 30 12:38:44 crc kubenswrapper[4672]: I0930 12:38:44.227943 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-db-sync-2w94k" Sep 30 12:38:45 crc kubenswrapper[4672]: I0930 12:38:45.983330 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-f2ltm" Sep 30 12:38:46 crc kubenswrapper[4672]: I0930 12:38:46.013016 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-sfwqd" Sep 30 12:38:46 crc kubenswrapper[4672]: I0930 12:38:46.018509 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-hv754" Sep 30 12:38:46 crc kubenswrapper[4672]: I0930 12:38:46.083179 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h5lwj\" (UniqueName: \"kubernetes.io/projected/f636d1ae-c979-4ed4-b0f2-0a0000504e56-kube-api-access-h5lwj\") pod \"f636d1ae-c979-4ed4-b0f2-0a0000504e56\" (UID: \"f636d1ae-c979-4ed4-b0f2-0a0000504e56\") " Sep 30 12:38:46 crc kubenswrapper[4672]: I0930 12:38:46.083237 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zn5pn\" (UniqueName: \"kubernetes.io/projected/777dfdaa-1eff-4e53-85bb-03c7912d1b86-kube-api-access-zn5pn\") pod \"777dfdaa-1eff-4e53-85bb-03c7912d1b86\" (UID: \"777dfdaa-1eff-4e53-85bb-03c7912d1b86\") " Sep 30 12:38:46 crc kubenswrapper[4672]: I0930 12:38:46.087988 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f636d1ae-c979-4ed4-b0f2-0a0000504e56-kube-api-access-h5lwj" (OuterVolumeSpecName: "kube-api-access-h5lwj") pod "f636d1ae-c979-4ed4-b0f2-0a0000504e56" (UID: "f636d1ae-c979-4ed4-b0f2-0a0000504e56"). InnerVolumeSpecName "kube-api-access-h5lwj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 12:38:46 crc kubenswrapper[4672]: I0930 12:38:46.101592 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/777dfdaa-1eff-4e53-85bb-03c7912d1b86-kube-api-access-zn5pn" (OuterVolumeSpecName: "kube-api-access-zn5pn") pod "777dfdaa-1eff-4e53-85bb-03c7912d1b86" (UID: "777dfdaa-1eff-4e53-85bb-03c7912d1b86"). InnerVolumeSpecName "kube-api-access-zn5pn". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 12:38:46 crc kubenswrapper[4672]: I0930 12:38:46.184592 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8p4g4\" (UniqueName: \"kubernetes.io/projected/6a2caae2-f3d5-4aa5-8b69-cc14b1cb88fc-kube-api-access-8p4g4\") pod \"6a2caae2-f3d5-4aa5-8b69-cc14b1cb88fc\" (UID: \"6a2caae2-f3d5-4aa5-8b69-cc14b1cb88fc\") " Sep 30 12:38:46 crc kubenswrapper[4672]: I0930 12:38:46.185436 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h5lwj\" (UniqueName: \"kubernetes.io/projected/f636d1ae-c979-4ed4-b0f2-0a0000504e56-kube-api-access-h5lwj\") on node \"crc\" DevicePath \"\"" Sep 30 12:38:46 crc kubenswrapper[4672]: I0930 12:38:46.185481 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zn5pn\" (UniqueName: \"kubernetes.io/projected/777dfdaa-1eff-4e53-85bb-03c7912d1b86-kube-api-access-zn5pn\") on node \"crc\" DevicePath \"\"" Sep 30 12:38:46 crc kubenswrapper[4672]: I0930 12:38:46.188967 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a2caae2-f3d5-4aa5-8b69-cc14b1cb88fc-kube-api-access-8p4g4" (OuterVolumeSpecName: "kube-api-access-8p4g4") pod "6a2caae2-f3d5-4aa5-8b69-cc14b1cb88fc" (UID: "6a2caae2-f3d5-4aa5-8b69-cc14b1cb88fc"). InnerVolumeSpecName "kube-api-access-8p4g4". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 12:38:46 crc kubenswrapper[4672]: I0930 12:38:46.197078 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-hv754" Sep 30 12:38:46 crc kubenswrapper[4672]: I0930 12:38:46.197079 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-hv754" event={"ID":"6a2caae2-f3d5-4aa5-8b69-cc14b1cb88fc","Type":"ContainerDied","Data":"5161560ca7f41f53375c1cc5de4e4cf03585f1630da901730021517d6fe44604"} Sep 30 12:38:46 crc kubenswrapper[4672]: I0930 12:38:46.197408 4672 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5161560ca7f41f53375c1cc5de4e4cf03585f1630da901730021517d6fe44604" Sep 30 12:38:46 crc kubenswrapper[4672]: I0930 12:38:46.202481 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"cf82624d-33e5-4298-8cd9-53ef50e87f12","Type":"ContainerStarted","Data":"fbb61f2821a324820bd79c008b10693dcee866e69aff58ca9caf772f6a483ae4"} Sep 30 12:38:46 crc kubenswrapper[4672]: I0930 12:38:46.204391 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-sfwqd" event={"ID":"777dfdaa-1eff-4e53-85bb-03c7912d1b86","Type":"ContainerDied","Data":"cf806bf0703049b3c1644a47a4e7d82591a1838c6f9dd28e5bd66d68ed13883b"} Sep 30 12:38:46 crc kubenswrapper[4672]: I0930 12:38:46.204430 4672 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cf806bf0703049b3c1644a47a4e7d82591a1838c6f9dd28e5bd66d68ed13883b" Sep 30 12:38:46 crc kubenswrapper[4672]: I0930 12:38:46.204489 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-sfwqd" Sep 30 12:38:46 crc kubenswrapper[4672]: I0930 12:38:46.206857 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-hl8kd" event={"ID":"ee403f37-87a0-4068-b310-f178eb87ddf4","Type":"ContainerStarted","Data":"a2f1002056dc934bfcb723781956fdb01175b6fa04dcbce2958649a062ab332d"} Sep 30 12:38:46 crc kubenswrapper[4672]: I0930 12:38:46.210032 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-f2ltm" event={"ID":"f636d1ae-c979-4ed4-b0f2-0a0000504e56","Type":"ContainerDied","Data":"6873df4639686c485952db2345e9e6ce6c4297b2ac7e0c784cd9f9a140e5b90d"} Sep 30 12:38:46 crc kubenswrapper[4672]: I0930 12:38:46.210142 4672 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6873df4639686c485952db2345e9e6ce6c4297b2ac7e0c784cd9f9a140e5b90d" Sep 30 12:38:46 crc kubenswrapper[4672]: I0930 12:38:46.210075 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-f2ltm" Sep 30 12:38:46 crc kubenswrapper[4672]: I0930 12:38:46.233316 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-db-sync-2w94k"] Sep 30 12:38:46 crc kubenswrapper[4672]: I0930 12:38:46.243316 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=21.243293965 podStartE2EDuration="21.243293965s" podCreationTimestamp="2025-09-30 12:38:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 12:38:46.22779065 +0000 UTC m=+1017.497028296" watchObservedRunningTime="2025-09-30 12:38:46.243293965 +0000 UTC m=+1017.512531611" Sep 30 12:38:46 crc kubenswrapper[4672]: I0930 12:38:46.255503 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-hl8kd" podStartSLOduration=7.729900965 podStartE2EDuration="13.255485457s" podCreationTimestamp="2025-09-30 12:38:33 +0000 UTC" firstStartedPulling="2025-09-30 12:38:40.248969216 +0000 UTC m=+1011.518206862" lastFinishedPulling="2025-09-30 12:38:45.774553708 +0000 UTC m=+1017.043791354" observedRunningTime="2025-09-30 12:38:46.249264828 +0000 UTC m=+1017.518502474" watchObservedRunningTime="2025-09-30 12:38:46.255485457 +0000 UTC m=+1017.524723103" Sep 30 12:38:46 crc kubenswrapper[4672]: I0930 12:38:46.272341 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Sep 30 12:38:46 crc kubenswrapper[4672]: I0930 12:38:46.287951 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8p4g4\" (UniqueName: \"kubernetes.io/projected/6a2caae2-f3d5-4aa5-8b69-cc14b1cb88fc-kube-api-access-8p4g4\") on node \"crc\" DevicePath \"\"" Sep 30 12:38:46 crc kubenswrapper[4672]: I0930 12:38:46.564400 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6fcd88f6c5-l7hjg" Sep 30 12:38:46 crc kubenswrapper[4672]: I0930 12:38:46.642844 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-fdf9c6749-ssvf6"] Sep 30 12:38:46 crc kubenswrapper[4672]: I0930 12:38:46.643053 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-fdf9c6749-ssvf6" podUID="9e623096-32ac-4373-b477-1a8bfcc4a137" containerName="dnsmasq-dns" containerID="cri-o://9218a27f1ea35ce71baa8fb8de4f974c2328655f8eff994c47dbbea51ec1fa07" gracePeriod=10 Sep 30 12:38:47 crc kubenswrapper[4672]: I0930 12:38:47.228606 4672 generic.go:334] "Generic (PLEG): container finished" podID="9e623096-32ac-4373-b477-1a8bfcc4a137" containerID="9218a27f1ea35ce71baa8fb8de4f974c2328655f8eff994c47dbbea51ec1fa07" exitCode=0 Sep 30 12:38:47 crc kubenswrapper[4672]: I0930 12:38:47.229006 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fdf9c6749-ssvf6" event={"ID":"9e623096-32ac-4373-b477-1a8bfcc4a137","Type":"ContainerDied","Data":"9218a27f1ea35ce71baa8fb8de4f974c2328655f8eff994c47dbbea51ec1fa07"} Sep 30 12:38:47 crc kubenswrapper[4672]: I0930 12:38:47.231763 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-sync-2w94k" event={"ID":"728a5c57-a6b8-4201-bbff-34f2eee54b1a","Type":"ContainerStarted","Data":"6a34361c29c7ed1a9de47c597e1477654b2bec034ecd553ff7c11e76a9657185"} Sep 30 12:38:47 crc kubenswrapper[4672]: I0930 12:38:47.401052 4672 util.go:48] "No ready sandbox for 
pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-fdf9c6749-ssvf6" Sep 30 12:38:47 crc kubenswrapper[4672]: I0930 12:38:47.520735 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vz5pj\" (UniqueName: \"kubernetes.io/projected/9e623096-32ac-4373-b477-1a8bfcc4a137-kube-api-access-vz5pj\") pod \"9e623096-32ac-4373-b477-1a8bfcc4a137\" (UID: \"9e623096-32ac-4373-b477-1a8bfcc4a137\") " Sep 30 12:38:47 crc kubenswrapper[4672]: I0930 12:38:47.520797 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9e623096-32ac-4373-b477-1a8bfcc4a137-config\") pod \"9e623096-32ac-4373-b477-1a8bfcc4a137\" (UID: \"9e623096-32ac-4373-b477-1a8bfcc4a137\") " Sep 30 12:38:47 crc kubenswrapper[4672]: I0930 12:38:47.520839 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9e623096-32ac-4373-b477-1a8bfcc4a137-ovsdbserver-nb\") pod \"9e623096-32ac-4373-b477-1a8bfcc4a137\" (UID: \"9e623096-32ac-4373-b477-1a8bfcc4a137\") " Sep 30 12:38:47 crc kubenswrapper[4672]: I0930 12:38:47.520864 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9e623096-32ac-4373-b477-1a8bfcc4a137-ovsdbserver-sb\") pod \"9e623096-32ac-4373-b477-1a8bfcc4a137\" (UID: \"9e623096-32ac-4373-b477-1a8bfcc4a137\") " Sep 30 12:38:47 crc kubenswrapper[4672]: I0930 12:38:47.520895 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9e623096-32ac-4373-b477-1a8bfcc4a137-dns-svc\") pod \"9e623096-32ac-4373-b477-1a8bfcc4a137\" (UID: \"9e623096-32ac-4373-b477-1a8bfcc4a137\") " Sep 30 12:38:47 crc kubenswrapper[4672]: I0930 12:38:47.542359 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e623096-32ac-4373-b477-1a8bfcc4a137-kube-api-access-vz5pj" (OuterVolumeSpecName: "kube-api-access-vz5pj") pod "9e623096-32ac-4373-b477-1a8bfcc4a137" (UID: "9e623096-32ac-4373-b477-1a8bfcc4a137"). InnerVolumeSpecName "kube-api-access-vz5pj". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 12:38:47 crc kubenswrapper[4672]: I0930 12:38:47.572169 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9e623096-32ac-4373-b477-1a8bfcc4a137-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "9e623096-32ac-4373-b477-1a8bfcc4a137" (UID: "9e623096-32ac-4373-b477-1a8bfcc4a137"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 12:38:47 crc kubenswrapper[4672]: I0930 12:38:47.572832 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9e623096-32ac-4373-b477-1a8bfcc4a137-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "9e623096-32ac-4373-b477-1a8bfcc4a137" (UID: "9e623096-32ac-4373-b477-1a8bfcc4a137"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 12:38:47 crc kubenswrapper[4672]: I0930 12:38:47.578364 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9e623096-32ac-4373-b477-1a8bfcc4a137-config" (OuterVolumeSpecName: "config") pod "9e623096-32ac-4373-b477-1a8bfcc4a137" (UID: "9e623096-32ac-4373-b477-1a8bfcc4a137"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 12:38:47 crc kubenswrapper[4672]: I0930 12:38:47.584794 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9e623096-32ac-4373-b477-1a8bfcc4a137-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "9e623096-32ac-4373-b477-1a8bfcc4a137" (UID: "9e623096-32ac-4373-b477-1a8bfcc4a137"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 12:38:47 crc kubenswrapper[4672]: I0930 12:38:47.623144 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vz5pj\" (UniqueName: \"kubernetes.io/projected/9e623096-32ac-4373-b477-1a8bfcc4a137-kube-api-access-vz5pj\") on node \"crc\" DevicePath \"\"" Sep 30 12:38:47 crc kubenswrapper[4672]: I0930 12:38:47.623174 4672 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9e623096-32ac-4373-b477-1a8bfcc4a137-config\") on node \"crc\" DevicePath \"\"" Sep 30 12:38:47 crc kubenswrapper[4672]: I0930 12:38:47.623184 4672 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9e623096-32ac-4373-b477-1a8bfcc4a137-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Sep 30 12:38:47 crc kubenswrapper[4672]: I0930 12:38:47.623192 4672 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9e623096-32ac-4373-b477-1a8bfcc4a137-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Sep 30 12:38:47 crc kubenswrapper[4672]: I0930 12:38:47.623201 4672 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9e623096-32ac-4373-b477-1a8bfcc4a137-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 30 12:38:48 crc kubenswrapper[4672]: I0930 12:38:48.256619 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fdf9c6749-ssvf6" event={"ID":"9e623096-32ac-4373-b477-1a8bfcc4a137","Type":"ContainerDied","Data":"e5a7e1456e8ddf4a5f40a86906e1675e3497cae23fe4180d0bce9b911bd2ec43"} Sep 30 12:38:48 crc kubenswrapper[4672]: I0930 12:38:48.256655 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-fdf9c6749-ssvf6" Sep 30 12:38:48 crc kubenswrapper[4672]: I0930 12:38:48.256691 4672 scope.go:117] "RemoveContainer" containerID="9218a27f1ea35ce71baa8fb8de4f974c2328655f8eff994c47dbbea51ec1fa07" Sep 30 12:38:48 crc kubenswrapper[4672]: I0930 12:38:48.290535 4672 scope.go:117] "RemoveContainer" containerID="fc9c2942b2821de4568976774fe67fcccb9bec8b576f74232c26a558ff04f845" Sep 30 12:38:48 crc kubenswrapper[4672]: I0930 12:38:48.292459 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-fdf9c6749-ssvf6"] Sep 30 12:38:48 crc kubenswrapper[4672]: I0930 12:38:48.298566 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-fdf9c6749-ssvf6"] Sep 30 12:38:49 crc kubenswrapper[4672]: I0930 12:38:49.431338 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9e623096-32ac-4373-b477-1a8bfcc4a137" path="/var/lib/kubelet/pods/9e623096-32ac-4373-b477-1a8bfcc4a137/volumes" Sep 30 12:38:52 crc kubenswrapper[4672]: I0930 12:38:52.296710 4672 generic.go:334] "Generic (PLEG): container finished" podID="ee403f37-87a0-4068-b310-f178eb87ddf4" containerID="a2f1002056dc934bfcb723781956fdb01175b6fa04dcbce2958649a062ab332d" exitCode=0 Sep 30 12:38:52 crc kubenswrapper[4672]: I0930 12:38:52.296801 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-hl8kd" event={"ID":"ee403f37-87a0-4068-b310-f178eb87ddf4","Type":"ContainerDied","Data":"a2f1002056dc934bfcb723781956fdb01175b6fa04dcbce2958649a062ab332d"} Sep 30 12:38:53 crc kubenswrapper[4672]: I0930 12:38:53.306560 4672 generic.go:334] "Generic (PLEG): container finished" podID="3cb781ee-9755-4258-bd4a-165461961834" containerID="af2978cc2b1acc0e874ae9b3657f79fdacb65e72f70dffff3bbf43f95a861057" exitCode=0 Sep 30 12:38:53 crc kubenswrapper[4672]: I0930 12:38:53.306625 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-4ddgl" event={"ID":"3cb781ee-9755-4258-bd4a-165461961834","Type":"ContainerDied","Data":"af2978cc2b1acc0e874ae9b3657f79fdacb65e72f70dffff3bbf43f95a861057"} Sep 30 12:38:53 crc kubenswrapper[4672]: I0930 12:38:53.716622 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-hl8kd" Sep 30 12:38:53 crc kubenswrapper[4672]: I0930 12:38:53.783083 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-47f4-account-create-cw9r4"] Sep 30 12:38:53 crc kubenswrapper[4672]: E0930 12:38:53.783956 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a2caae2-f3d5-4aa5-8b69-cc14b1cb88fc" containerName="mariadb-database-create" Sep 30 12:38:53 crc kubenswrapper[4672]: I0930 12:38:53.783977 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a2caae2-f3d5-4aa5-8b69-cc14b1cb88fc" containerName="mariadb-database-create" Sep 30 12:38:53 crc kubenswrapper[4672]: E0930 12:38:53.783999 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e623096-32ac-4373-b477-1a8bfcc4a137" containerName="init" Sep 30 12:38:53 crc kubenswrapper[4672]: I0930 12:38:53.784010 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e623096-32ac-4373-b477-1a8bfcc4a137" containerName="init" Sep 30 12:38:53 crc kubenswrapper[4672]: E0930 12:38:53.784030 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee403f37-87a0-4068-b310-f178eb87ddf4" containerName="keystone-db-sync" Sep 30 12:38:53 crc kubenswrapper[4672]: I0930 12:38:53.784038 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee403f37-87a0-4068-b310-f178eb87ddf4" containerName="keystone-db-sync" Sep 30 12:38:53 crc kubenswrapper[4672]: E0930 12:38:53.784070 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e623096-32ac-4373-b477-1a8bfcc4a137" containerName="dnsmasq-dns" Sep 30 12:38:53 crc kubenswrapper[4672]: I0930 12:38:53.784080 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e623096-32ac-4373-b477-1a8bfcc4a137" containerName="dnsmasq-dns" Sep 30 12:38:53 crc kubenswrapper[4672]: E0930 12:38:53.784088 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="777dfdaa-1eff-4e53-85bb-03c7912d1b86" containerName="mariadb-database-create" Sep 30 12:38:53 crc kubenswrapper[4672]: I0930 12:38:53.784096 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="777dfdaa-1eff-4e53-85bb-03c7912d1b86" containerName="mariadb-database-create" Sep 30 12:38:53 crc kubenswrapper[4672]: E0930 12:38:53.784112 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f636d1ae-c979-4ed4-b0f2-0a0000504e56" containerName="mariadb-database-create" Sep 30 12:38:53 crc kubenswrapper[4672]: I0930 12:38:53.784119 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="f636d1ae-c979-4ed4-b0f2-0a0000504e56" containerName="mariadb-database-create" Sep 30 12:38:53 crc kubenswrapper[4672]: I0930 12:38:53.784482 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="f636d1ae-c979-4ed4-b0f2-0a0000504e56" containerName="mariadb-database-create" Sep 30 12:38:53 crc kubenswrapper[4672]: I0930 12:38:53.784508 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee403f37-87a0-4068-b310-f178eb87ddf4" containerName="keystone-db-sync" Sep 30 12:38:53 crc kubenswrapper[4672]: I0930 12:38:53.784523 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="9e623096-32ac-4373-b477-1a8bfcc4a137" containerName="dnsmasq-dns" Sep 30 12:38:53 crc kubenswrapper[4672]: I0930 12:38:53.784538 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a2caae2-f3d5-4aa5-8b69-cc14b1cb88fc" containerName="mariadb-database-create" Sep 30 12:38:53 crc kubenswrapper[4672]: I0930 12:38:53.784554 4672 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="777dfdaa-1eff-4e53-85bb-03c7912d1b86" containerName="mariadb-database-create" Sep 30 12:38:53 crc kubenswrapper[4672]: I0930 12:38:53.785205 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-47f4-account-create-cw9r4" Sep 30 12:38:53 crc kubenswrapper[4672]: I0930 12:38:53.788655 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Sep 30 12:38:53 crc kubenswrapper[4672]: I0930 12:38:53.789364 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-47f4-account-create-cw9r4"] Sep 30 12:38:53 crc kubenswrapper[4672]: I0930 12:38:53.839601 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee403f37-87a0-4068-b310-f178eb87ddf4-config-data\") pod \"ee403f37-87a0-4068-b310-f178eb87ddf4\" (UID: \"ee403f37-87a0-4068-b310-f178eb87ddf4\") " Sep 30 12:38:53 crc kubenswrapper[4672]: I0930 12:38:53.839700 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee403f37-87a0-4068-b310-f178eb87ddf4-combined-ca-bundle\") pod \"ee403f37-87a0-4068-b310-f178eb87ddf4\" (UID: \"ee403f37-87a0-4068-b310-f178eb87ddf4\") " Sep 30 12:38:53 crc kubenswrapper[4672]: I0930 12:38:53.839859 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h6zwt\" (UniqueName: \"kubernetes.io/projected/ee403f37-87a0-4068-b310-f178eb87ddf4-kube-api-access-h6zwt\") pod \"ee403f37-87a0-4068-b310-f178eb87ddf4\" (UID: \"ee403f37-87a0-4068-b310-f178eb87ddf4\") " Sep 30 12:38:53 crc kubenswrapper[4672]: I0930 12:38:53.840204 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kbjlz\" (UniqueName: \"kubernetes.io/projected/d372bb6e-78aa-4329-b0f7-f2685c342bd3-kube-api-access-kbjlz\") pod \"cinder-47f4-account-create-cw9r4\" (UID: \"d372bb6e-78aa-4329-b0f7-f2685c342bd3\") " pod="openstack/cinder-47f4-account-create-cw9r4" Sep 30 12:38:53 crc kubenswrapper[4672]: I0930 12:38:53.849571 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee403f37-87a0-4068-b310-f178eb87ddf4-kube-api-access-h6zwt" (OuterVolumeSpecName: "kube-api-access-h6zwt") pod "ee403f37-87a0-4068-b310-f178eb87ddf4" (UID: "ee403f37-87a0-4068-b310-f178eb87ddf4"). InnerVolumeSpecName "kube-api-access-h6zwt". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 12:38:53 crc kubenswrapper[4672]: I0930 12:38:53.879892 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee403f37-87a0-4068-b310-f178eb87ddf4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ee403f37-87a0-4068-b310-f178eb87ddf4" (UID: "ee403f37-87a0-4068-b310-f178eb87ddf4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:38:53 crc kubenswrapper[4672]: I0930 12:38:53.891568 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee403f37-87a0-4068-b310-f178eb87ddf4-config-data" (OuterVolumeSpecName: "config-data") pod "ee403f37-87a0-4068-b310-f178eb87ddf4" (UID: "ee403f37-87a0-4068-b310-f178eb87ddf4"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:38:53 crc kubenswrapper[4672]: I0930 12:38:53.941724 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kbjlz\" (UniqueName: \"kubernetes.io/projected/d372bb6e-78aa-4329-b0f7-f2685c342bd3-kube-api-access-kbjlz\") pod \"cinder-47f4-account-create-cw9r4\" (UID: \"d372bb6e-78aa-4329-b0f7-f2685c342bd3\") " pod="openstack/cinder-47f4-account-create-cw9r4" Sep 30 12:38:53 crc kubenswrapper[4672]: I0930 12:38:53.941875 4672 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee403f37-87a0-4068-b310-f178eb87ddf4-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 12:38:53 crc kubenswrapper[4672]: I0930 12:38:53.941892 4672 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee403f37-87a0-4068-b310-f178eb87ddf4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 12:38:53 crc kubenswrapper[4672]: I0930 12:38:53.941905 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h6zwt\" (UniqueName: \"kubernetes.io/projected/ee403f37-87a0-4068-b310-f178eb87ddf4-kube-api-access-h6zwt\") on node \"crc\" DevicePath \"\"" Sep 30 12:38:53 crc kubenswrapper[4672]: I0930 12:38:53.959806 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kbjlz\" (UniqueName: \"kubernetes.io/projected/d372bb6e-78aa-4329-b0f7-f2685c342bd3-kube-api-access-kbjlz\") pod \"cinder-47f4-account-create-cw9r4\" (UID: \"d372bb6e-78aa-4329-b0f7-f2685c342bd3\") " pod="openstack/cinder-47f4-account-create-cw9r4" Sep 30 12:38:53 crc kubenswrapper[4672]: I0930 12:38:53.973376 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-55f8-account-create-j8z8l"] Sep 30 12:38:53 crc kubenswrapper[4672]: I0930 12:38:53.974451 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-55f8-account-create-j8z8l" Sep 30 12:38:53 crc kubenswrapper[4672]: I0930 12:38:53.976388 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Sep 30 12:38:53 crc kubenswrapper[4672]: I0930 12:38:53.986751 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-55f8-account-create-j8z8l"] Sep 30 12:38:54 crc kubenswrapper[4672]: I0930 12:38:54.043142 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-82f5h\" (UniqueName: \"kubernetes.io/projected/43cef9cd-4246-4707-93c8-2d1e0b49a99c-kube-api-access-82f5h\") pod \"barbican-55f8-account-create-j8z8l\" (UID: \"43cef9cd-4246-4707-93c8-2d1e0b49a99c\") " pod="openstack/barbican-55f8-account-create-j8z8l" Sep 30 12:38:54 crc kubenswrapper[4672]: I0930 12:38:54.079964 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-fe85-account-create-vdvbq"] Sep 30 12:38:54 crc kubenswrapper[4672]: I0930 12:38:54.081234 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-fe85-account-create-vdvbq" Sep 30 12:38:54 crc kubenswrapper[4672]: I0930 12:38:54.083911 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Sep 30 12:38:54 crc kubenswrapper[4672]: I0930 12:38:54.089855 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-fe85-account-create-vdvbq"] Sep 30 12:38:54 crc kubenswrapper[4672]: I0930 12:38:54.107371 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-47f4-account-create-cw9r4" Sep 30 12:38:54 crc kubenswrapper[4672]: I0930 12:38:54.144887 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-82f5h\" (UniqueName: \"kubernetes.io/projected/43cef9cd-4246-4707-93c8-2d1e0b49a99c-kube-api-access-82f5h\") pod \"barbican-55f8-account-create-j8z8l\" (UID: \"43cef9cd-4246-4707-93c8-2d1e0b49a99c\") " pod="openstack/barbican-55f8-account-create-j8z8l" Sep 30 12:38:54 crc kubenswrapper[4672]: I0930 12:38:54.162872 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-82f5h\" (UniqueName: \"kubernetes.io/projected/43cef9cd-4246-4707-93c8-2d1e0b49a99c-kube-api-access-82f5h\") pod \"barbican-55f8-account-create-j8z8l\" (UID: \"43cef9cd-4246-4707-93c8-2d1e0b49a99c\") " pod="openstack/barbican-55f8-account-create-j8z8l" Sep 30 12:38:54 crc kubenswrapper[4672]: I0930 12:38:54.246390 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bqsk6\" (UniqueName: \"kubernetes.io/projected/e7b77dbe-0a6b-4bef-a55d-3de78b2c25f8-kube-api-access-bqsk6\") pod \"neutron-fe85-account-create-vdvbq\" (UID: \"e7b77dbe-0a6b-4bef-a55d-3de78b2c25f8\") " pod="openstack/neutron-fe85-account-create-vdvbq" Sep 30 12:38:54 crc kubenswrapper[4672]: I0930 12:38:54.294322 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-55f8-account-create-j8z8l" Sep 30 12:38:54 crc kubenswrapper[4672]: I0930 12:38:54.318347 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-sync-2w94k" event={"ID":"728a5c57-a6b8-4201-bbff-34f2eee54b1a","Type":"ContainerStarted","Data":"0371a9eaaabf25ee5b460fadf31d64d04b11ff141406d3f2d9776b271f6432d1"} Sep 30 12:38:54 crc kubenswrapper[4672]: I0930 12:38:54.319746 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-hl8kd" Sep 30 12:38:54 crc kubenswrapper[4672]: I0930 12:38:54.320974 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-hl8kd" event={"ID":"ee403f37-87a0-4068-b310-f178eb87ddf4","Type":"ContainerDied","Data":"9202d2237377a9adbc3cfde87afa0b5f65f42a1f6fca2e514ce018b01372eb5a"} Sep 30 12:38:54 crc kubenswrapper[4672]: I0930 12:38:54.321007 4672 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9202d2237377a9adbc3cfde87afa0b5f65f42a1f6fca2e514ce018b01372eb5a" Sep 30 12:38:54 crc kubenswrapper[4672]: I0930 12:38:54.343766 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-db-sync-2w94k" podStartSLOduration=4.006823399 podStartE2EDuration="11.343744576s" podCreationTimestamp="2025-09-30 12:38:43 +0000 UTC" firstStartedPulling="2025-09-30 12:38:46.239870488 +0000 UTC m=+1017.509108134" lastFinishedPulling="2025-09-30 12:38:53.576791665 +0000 UTC m=+1024.846029311" observedRunningTime="2025-09-30 12:38:54.338006939 +0000 UTC m=+1025.607244595" watchObservedRunningTime="2025-09-30 12:38:54.343744576 +0000 UTC m=+1025.612982222" Sep 30 12:38:54 crc kubenswrapper[4672]: I0930 12:38:54.349709 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bqsk6\" (UniqueName: \"kubernetes.io/projected/e7b77dbe-0a6b-4bef-a55d-3de78b2c25f8-kube-api-access-bqsk6\") pod \"neutron-fe85-account-create-vdvbq\" (UID: \"e7b77dbe-0a6b-4bef-a55d-3de78b2c25f8\") " pod="openstack/neutron-fe85-account-create-vdvbq" Sep 30 12:38:54 crc kubenswrapper[4672]: I0930 12:38:54.369788 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bqsk6\" (UniqueName: \"kubernetes.io/projected/e7b77dbe-0a6b-4bef-a55d-3de78b2c25f8-kube-api-access-bqsk6\") pod \"neutron-fe85-account-create-vdvbq\" (UID: \"e7b77dbe-0a6b-4bef-a55d-3de78b2c25f8\") " pod="openstack/neutron-fe85-account-create-vdvbq" Sep 30 12:38:54 crc kubenswrapper[4672]: I0930 12:38:54.404536 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-fe85-account-create-vdvbq" Sep 30 12:38:54 crc kubenswrapper[4672]: I0930 12:38:54.560338 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-47f4-account-create-cw9r4"] Sep 30 12:38:54 crc kubenswrapper[4672]: I0930 12:38:54.570063 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-59d6fdf78c-8x78t"] Sep 30 12:38:54 crc kubenswrapper[4672]: I0930 12:38:54.573852 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-59d6fdf78c-8x78t" Sep 30 12:38:54 crc kubenswrapper[4672]: I0930 12:38:54.580069 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-59d6fdf78c-8x78t"] Sep 30 12:38:54 crc kubenswrapper[4672]: I0930 12:38:54.657433 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-ss5sv"] Sep 30 12:38:54 crc kubenswrapper[4672]: I0930 12:38:54.667684 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/26e78aa5-04e3-40fe-839b-b753ff7b6e77-ovsdbserver-nb\") pod \"dnsmasq-dns-59d6fdf78c-8x78t\" (UID: \"26e78aa5-04e3-40fe-839b-b753ff7b6e77\") " pod="openstack/dnsmasq-dns-59d6fdf78c-8x78t" Sep 30 12:38:54 crc kubenswrapper[4672]: I0930 12:38:54.667813 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/26e78aa5-04e3-40fe-839b-b753ff7b6e77-config\") pod \"dnsmasq-dns-59d6fdf78c-8x78t\" (UID: \"26e78aa5-04e3-40fe-839b-b753ff7b6e77\") " pod="openstack/dnsmasq-dns-59d6fdf78c-8x78t" Sep 30 12:38:54 crc kubenswrapper[4672]: I0930 12:38:54.667887 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/26e78aa5-04e3-40fe-839b-b753ff7b6e77-dns-swift-storage-0\") pod \"dnsmasq-dns-59d6fdf78c-8x78t\" (UID: \"26e78aa5-04e3-40fe-839b-b753ff7b6e77\") " pod="openstack/dnsmasq-dns-59d6fdf78c-8x78t" Sep 30 12:38:54 crc kubenswrapper[4672]: I0930 12:38:54.667920 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/26e78aa5-04e3-40fe-839b-b753ff7b6e77-dns-svc\") pod \"dnsmasq-dns-59d6fdf78c-8x78t\" (UID: \"26e78aa5-04e3-40fe-839b-b753ff7b6e77\") " pod="openstack/dnsmasq-dns-59d6fdf78c-8x78t" Sep 30 12:38:54 crc kubenswrapper[4672]: I0930 12:38:54.667990 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5t229\" (UniqueName: \"kubernetes.io/projected/26e78aa5-04e3-40fe-839b-b753ff7b6e77-kube-api-access-5t229\") pod \"dnsmasq-dns-59d6fdf78c-8x78t\" (UID: \"26e78aa5-04e3-40fe-839b-b753ff7b6e77\") " pod="openstack/dnsmasq-dns-59d6fdf78c-8x78t" Sep 30 12:38:54 crc kubenswrapper[4672]: I0930 12:38:54.668051 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/26e78aa5-04e3-40fe-839b-b753ff7b6e77-ovsdbserver-sb\") pod \"dnsmasq-dns-59d6fdf78c-8x78t\" (UID: \"26e78aa5-04e3-40fe-839b-b753ff7b6e77\") " pod="openstack/dnsmasq-dns-59d6fdf78c-8x78t" Sep 30 12:38:54 crc kubenswrapper[4672]: I0930 12:38:54.674297 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-ss5sv"] Sep 30 12:38:54 crc kubenswrapper[4672]: I0930 12:38:54.674464 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-ss5sv" Sep 30 12:38:54 crc kubenswrapper[4672]: I0930 12:38:54.679768 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-4nbwb" Sep 30 12:38:54 crc kubenswrapper[4672]: I0930 12:38:54.680096 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Sep 30 12:38:54 crc kubenswrapper[4672]: I0930 12:38:54.680251 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Sep 30 12:38:54 crc kubenswrapper[4672]: I0930 12:38:54.680415 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Sep 30 12:38:54 crc kubenswrapper[4672]: I0930 12:38:54.739081 4672 patch_prober.go:28] interesting pod/machine-config-daemon-dpqrd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 12:38:54 crc kubenswrapper[4672]: I0930 12:38:54.739134 4672 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 12:38:54 crc kubenswrapper[4672]: I0930 12:38:54.739183 4672 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" Sep 30 12:38:54 crc kubenswrapper[4672]: I0930 12:38:54.739899 4672 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"1712933e94420da648f449968b73ced3cfbd2790d2d92518ca79624030de9f70"} pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 30 12:38:54 crc kubenswrapper[4672]: I0930 12:38:54.739955 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" containerName="machine-config-daemon" containerID="cri-o://1712933e94420da648f449968b73ced3cfbd2790d2d92518ca79624030de9f70" gracePeriod=600 Sep 30 12:38:54 crc kubenswrapper[4672]: I0930 12:38:54.765977 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-548589fd89-v67ld"] Sep 30 12:38:54 crc kubenswrapper[4672]: I0930 12:38:54.769101 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/26e78aa5-04e3-40fe-839b-b753ff7b6e77-dns-svc\") pod \"dnsmasq-dns-59d6fdf78c-8x78t\" (UID: \"26e78aa5-04e3-40fe-839b-b753ff7b6e77\") " pod="openstack/dnsmasq-dns-59d6fdf78c-8x78t" Sep 30 12:38:54 crc kubenswrapper[4672]: I0930 12:38:54.769187 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5t229\" (UniqueName: \"kubernetes.io/projected/26e78aa5-04e3-40fe-839b-b753ff7b6e77-kube-api-access-5t229\") pod \"dnsmasq-dns-59d6fdf78c-8x78t\" (UID: \"26e78aa5-04e3-40fe-839b-b753ff7b6e77\") " pod="openstack/dnsmasq-dns-59d6fdf78c-8x78t" Sep 30 12:38:54 crc kubenswrapper[4672]: I0930 12:38:54.769229 4672 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/26e78aa5-04e3-40fe-839b-b753ff7b6e77-ovsdbserver-sb\") pod \"dnsmasq-dns-59d6fdf78c-8x78t\" (UID: \"26e78aa5-04e3-40fe-839b-b753ff7b6e77\") " pod="openstack/dnsmasq-dns-59d6fdf78c-8x78t" Sep 30 12:38:54 crc kubenswrapper[4672]: I0930 12:38:54.769274 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/26e78aa5-04e3-40fe-839b-b753ff7b6e77-ovsdbserver-nb\") pod \"dnsmasq-dns-59d6fdf78c-8x78t\" (UID: \"26e78aa5-04e3-40fe-839b-b753ff7b6e77\") " pod="openstack/dnsmasq-dns-59d6fdf78c-8x78t" Sep 30 12:38:54 crc kubenswrapper[4672]: I0930 12:38:54.769338 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/26e78aa5-04e3-40fe-839b-b753ff7b6e77-config\") pod \"dnsmasq-dns-59d6fdf78c-8x78t\" (UID: \"26e78aa5-04e3-40fe-839b-b753ff7b6e77\") " pod="openstack/dnsmasq-dns-59d6fdf78c-8x78t" Sep 30 12:38:54 crc kubenswrapper[4672]: I0930 12:38:54.769382 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/26e78aa5-04e3-40fe-839b-b753ff7b6e77-dns-swift-storage-0\") pod \"dnsmasq-dns-59d6fdf78c-8x78t\" (UID: \"26e78aa5-04e3-40fe-839b-b753ff7b6e77\") " pod="openstack/dnsmasq-dns-59d6fdf78c-8x78t" Sep 30 12:38:54 crc kubenswrapper[4672]: I0930 12:38:54.771723 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-548589fd89-v67ld" Sep 30 12:38:54 crc kubenswrapper[4672]: I0930 12:38:54.790317 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-42559" Sep 30 12:38:54 crc kubenswrapper[4672]: I0930 12:38:54.790677 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon" Sep 30 12:38:54 crc kubenswrapper[4672]: I0930 12:38:54.790878 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data" Sep 30 12:38:54 crc kubenswrapper[4672]: I0930 12:38:54.791655 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/26e78aa5-04e3-40fe-839b-b753ff7b6e77-dns-svc\") pod \"dnsmasq-dns-59d6fdf78c-8x78t\" (UID: \"26e78aa5-04e3-40fe-839b-b753ff7b6e77\") " pod="openstack/dnsmasq-dns-59d6fdf78c-8x78t" Sep 30 12:38:54 crc kubenswrapper[4672]: I0930 12:38:54.792307 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/26e78aa5-04e3-40fe-839b-b753ff7b6e77-ovsdbserver-sb\") pod \"dnsmasq-dns-59d6fdf78c-8x78t\" (UID: \"26e78aa5-04e3-40fe-839b-b753ff7b6e77\") " pod="openstack/dnsmasq-dns-59d6fdf78c-8x78t" Sep 30 12:38:54 crc kubenswrapper[4672]: I0930 12:38:54.792327 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/26e78aa5-04e3-40fe-839b-b753ff7b6e77-ovsdbserver-nb\") pod \"dnsmasq-dns-59d6fdf78c-8x78t\" (UID: \"26e78aa5-04e3-40fe-839b-b753ff7b6e77\") " pod="openstack/dnsmasq-dns-59d6fdf78c-8x78t" Sep 30 12:38:54 crc kubenswrapper[4672]: I0930 12:38:54.796976 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/26e78aa5-04e3-40fe-839b-b753ff7b6e77-config\") pod \"dnsmasq-dns-59d6fdf78c-8x78t\" (UID: 
\"26e78aa5-04e3-40fe-839b-b753ff7b6e77\") " pod="openstack/dnsmasq-dns-59d6fdf78c-8x78t" Sep 30 12:38:54 crc kubenswrapper[4672]: I0930 12:38:54.797166 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/26e78aa5-04e3-40fe-839b-b753ff7b6e77-dns-swift-storage-0\") pod \"dnsmasq-dns-59d6fdf78c-8x78t\" (UID: \"26e78aa5-04e3-40fe-839b-b753ff7b6e77\") " pod="openstack/dnsmasq-dns-59d6fdf78c-8x78t" Sep 30 12:38:54 crc kubenswrapper[4672]: I0930 12:38:54.808087 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-548589fd89-v67ld"] Sep 30 12:38:54 crc kubenswrapper[4672]: I0930 12:38:54.813044 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts" Sep 30 12:38:54 crc kubenswrapper[4672]: I0930 12:38:54.878318 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a3ef8453-d983-4c2a-94b0-c25645f83c23-fernet-keys\") pod \"keystone-bootstrap-ss5sv\" (UID: \"a3ef8453-d983-4c2a-94b0-c25645f83c23\") " pod="openstack/keystone-bootstrap-ss5sv" Sep 30 12:38:54 crc kubenswrapper[4672]: I0930 12:38:54.878363 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8dd596d1-129e-4bb2-9e9b-dac1d09323d2-config-data\") pod \"horizon-548589fd89-v67ld\" (UID: \"8dd596d1-129e-4bb2-9e9b-dac1d09323d2\") " pod="openstack/horizon-548589fd89-v67ld" Sep 30 12:38:54 crc kubenswrapper[4672]: I0930 12:38:54.878398 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8dd596d1-129e-4bb2-9e9b-dac1d09323d2-logs\") pod \"horizon-548589fd89-v67ld\" (UID: \"8dd596d1-129e-4bb2-9e9b-dac1d09323d2\") " pod="openstack/horizon-548589fd89-v67ld" Sep 30 12:38:54 crc kubenswrapper[4672]: I0930 12:38:54.878420 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3ef8453-d983-4c2a-94b0-c25645f83c23-combined-ca-bundle\") pod \"keystone-bootstrap-ss5sv\" (UID: \"a3ef8453-d983-4c2a-94b0-c25645f83c23\") " pod="openstack/keystone-bootstrap-ss5sv" Sep 30 12:38:54 crc kubenswrapper[4672]: I0930 12:38:54.878438 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a3ef8453-d983-4c2a-94b0-c25645f83c23-credential-keys\") pod \"keystone-bootstrap-ss5sv\" (UID: \"a3ef8453-d983-4c2a-94b0-c25645f83c23\") " pod="openstack/keystone-bootstrap-ss5sv" Sep 30 12:38:54 crc kubenswrapper[4672]: I0930 12:38:54.878468 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a3ef8453-d983-4c2a-94b0-c25645f83c23-scripts\") pod \"keystone-bootstrap-ss5sv\" (UID: \"a3ef8453-d983-4c2a-94b0-c25645f83c23\") " pod="openstack/keystone-bootstrap-ss5sv" Sep 30 12:38:54 crc kubenswrapper[4672]: I0930 12:38:54.878500 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3ef8453-d983-4c2a-94b0-c25645f83c23-config-data\") pod \"keystone-bootstrap-ss5sv\" (UID: \"a3ef8453-d983-4c2a-94b0-c25645f83c23\") " pod="openstack/keystone-bootstrap-ss5sv" Sep 30 
12:38:54 crc kubenswrapper[4672]: I0930 12:38:54.878521 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/8dd596d1-129e-4bb2-9e9b-dac1d09323d2-horizon-secret-key\") pod \"horizon-548589fd89-v67ld\" (UID: \"8dd596d1-129e-4bb2-9e9b-dac1d09323d2\") " pod="openstack/horizon-548589fd89-v67ld" Sep 30 12:38:54 crc kubenswrapper[4672]: I0930 12:38:54.878544 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bpv54\" (UniqueName: \"kubernetes.io/projected/a3ef8453-d983-4c2a-94b0-c25645f83c23-kube-api-access-bpv54\") pod \"keystone-bootstrap-ss5sv\" (UID: \"a3ef8453-d983-4c2a-94b0-c25645f83c23\") " pod="openstack/keystone-bootstrap-ss5sv" Sep 30 12:38:54 crc kubenswrapper[4672]: I0930 12:38:54.878566 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9l9cb\" (UniqueName: \"kubernetes.io/projected/8dd596d1-129e-4bb2-9e9b-dac1d09323d2-kube-api-access-9l9cb\") pod \"horizon-548589fd89-v67ld\" (UID: \"8dd596d1-129e-4bb2-9e9b-dac1d09323d2\") " pod="openstack/horizon-548589fd89-v67ld" Sep 30 12:38:54 crc kubenswrapper[4672]: I0930 12:38:54.878626 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8dd596d1-129e-4bb2-9e9b-dac1d09323d2-scripts\") pod \"horizon-548589fd89-v67ld\" (UID: \"8dd596d1-129e-4bb2-9e9b-dac1d09323d2\") " pod="openstack/horizon-548589fd89-v67ld" Sep 30 12:38:54 crc kubenswrapper[4672]: I0930 12:38:54.885571 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5t229\" (UniqueName: \"kubernetes.io/projected/26e78aa5-04e3-40fe-839b-b753ff7b6e77-kube-api-access-5t229\") pod \"dnsmasq-dns-59d6fdf78c-8x78t\" (UID: \"26e78aa5-04e3-40fe-839b-b753ff7b6e77\") " pod="openstack/dnsmasq-dns-59d6fdf78c-8x78t" Sep 30 12:38:54 crc kubenswrapper[4672]: I0930 12:38:54.968720 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-9kp7h"] Sep 30 12:38:54 crc kubenswrapper[4672]: E0930 12:38:54.976816 4672 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod95794952_d817_48f2_8956_f7a310f8d1d9.slice/crio-1712933e94420da648f449968b73ced3cfbd2790d2d92518ca79624030de9f70.scope\": RecentStats: unable to find data in memory cache]" Sep 30 12:38:54 crc kubenswrapper[4672]: I0930 12:38:54.987141 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8dd596d1-129e-4bb2-9e9b-dac1d09323d2-scripts\") pod \"horizon-548589fd89-v67ld\" (UID: \"8dd596d1-129e-4bb2-9e9b-dac1d09323d2\") " pod="openstack/horizon-548589fd89-v67ld" Sep 30 12:38:54 crc kubenswrapper[4672]: I0930 12:38:54.989020 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8dd596d1-129e-4bb2-9e9b-dac1d09323d2-scripts\") pod \"horizon-548589fd89-v67ld\" (UID: \"8dd596d1-129e-4bb2-9e9b-dac1d09323d2\") " pod="openstack/horizon-548589fd89-v67ld" Sep 30 12:38:54 crc kubenswrapper[4672]: I0930 12:38:54.991403 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-9kp7h" Sep 30 12:38:54 crc kubenswrapper[4672]: I0930 12:38:54.996901 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-59d6fdf78c-8x78t" Sep 30 12:38:54 crc kubenswrapper[4672]: I0930 12:38:54.996974 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Sep 30 12:38:54 crc kubenswrapper[4672]: I0930 12:38:54.998165 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-zqw22" Sep 30 12:38:54 crc kubenswrapper[4672]: I0930 12:38:54.998399 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Sep 30 12:38:54 crc kubenswrapper[4672]: I0930 12:38:54.998772 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-6f749887b9-hl4rd"] Sep 30 12:38:54 crc kubenswrapper[4672]: I0930 12:38:54.999573 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a3ef8453-d983-4c2a-94b0-c25645f83c23-fernet-keys\") pod \"keystone-bootstrap-ss5sv\" (UID: \"a3ef8453-d983-4c2a-94b0-c25645f83c23\") " pod="openstack/keystone-bootstrap-ss5sv" Sep 30 12:38:54 crc kubenswrapper[4672]: I0930 12:38:54.999719 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8dd596d1-129e-4bb2-9e9b-dac1d09323d2-config-data\") pod \"horizon-548589fd89-v67ld\" (UID: \"8dd596d1-129e-4bb2-9e9b-dac1d09323d2\") " pod="openstack/horizon-548589fd89-v67ld" Sep 30 12:38:54 crc kubenswrapper[4672]: I0930 12:38:54.999848 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8dd596d1-129e-4bb2-9e9b-dac1d09323d2-logs\") pod \"horizon-548589fd89-v67ld\" (UID: \"8dd596d1-129e-4bb2-9e9b-dac1d09323d2\") " pod="openstack/horizon-548589fd89-v67ld" Sep 30 12:38:55 crc kubenswrapper[4672]: I0930 12:38:54.999888 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3ef8453-d983-4c2a-94b0-c25645f83c23-combined-ca-bundle\") pod \"keystone-bootstrap-ss5sv\" (UID: \"a3ef8453-d983-4c2a-94b0-c25645f83c23\") " pod="openstack/keystone-bootstrap-ss5sv" Sep 30 12:38:55 crc kubenswrapper[4672]: I0930 12:38:55.000036 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a3ef8453-d983-4c2a-94b0-c25645f83c23-credential-keys\") pod \"keystone-bootstrap-ss5sv\" (UID: \"a3ef8453-d983-4c2a-94b0-c25645f83c23\") " pod="openstack/keystone-bootstrap-ss5sv" Sep 30 12:38:55 crc kubenswrapper[4672]: I0930 12:38:55.001084 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a3ef8453-d983-4c2a-94b0-c25645f83c23-scripts\") pod \"keystone-bootstrap-ss5sv\" (UID: \"a3ef8453-d983-4c2a-94b0-c25645f83c23\") " pod="openstack/keystone-bootstrap-ss5sv" Sep 30 12:38:55 crc kubenswrapper[4672]: I0930 12:38:55.001129 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8dd596d1-129e-4bb2-9e9b-dac1d09323d2-config-data\") pod \"horizon-548589fd89-v67ld\" (UID: \"8dd596d1-129e-4bb2-9e9b-dac1d09323d2\") " pod="openstack/horizon-548589fd89-v67ld" Sep 30 12:38:55 crc 
kubenswrapper[4672]: I0930 12:38:55.001175 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3ef8453-d983-4c2a-94b0-c25645f83c23-config-data\") pod \"keystone-bootstrap-ss5sv\" (UID: \"a3ef8453-d983-4c2a-94b0-c25645f83c23\") " pod="openstack/keystone-bootstrap-ss5sv" Sep 30 12:38:55 crc kubenswrapper[4672]: I0930 12:38:55.001221 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/8dd596d1-129e-4bb2-9e9b-dac1d09323d2-horizon-secret-key\") pod \"horizon-548589fd89-v67ld\" (UID: \"8dd596d1-129e-4bb2-9e9b-dac1d09323d2\") " pod="openstack/horizon-548589fd89-v67ld" Sep 30 12:38:55 crc kubenswrapper[4672]: I0930 12:38:55.001331 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bpv54\" (UniqueName: \"kubernetes.io/projected/a3ef8453-d983-4c2a-94b0-c25645f83c23-kube-api-access-bpv54\") pod \"keystone-bootstrap-ss5sv\" (UID: \"a3ef8453-d983-4c2a-94b0-c25645f83c23\") " pod="openstack/keystone-bootstrap-ss5sv" Sep 30 12:38:55 crc kubenswrapper[4672]: I0930 12:38:55.001375 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9l9cb\" (UniqueName: \"kubernetes.io/projected/8dd596d1-129e-4bb2-9e9b-dac1d09323d2-kube-api-access-9l9cb\") pod \"horizon-548589fd89-v67ld\" (UID: \"8dd596d1-129e-4bb2-9e9b-dac1d09323d2\") " pod="openstack/horizon-548589fd89-v67ld" Sep 30 12:38:55 crc kubenswrapper[4672]: I0930 12:38:55.002121 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6f749887b9-hl4rd" Sep 30 12:38:55 crc kubenswrapper[4672]: I0930 12:38:55.002615 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8dd596d1-129e-4bb2-9e9b-dac1d09323d2-logs\") pod \"horizon-548589fd89-v67ld\" (UID: \"8dd596d1-129e-4bb2-9e9b-dac1d09323d2\") " pod="openstack/horizon-548589fd89-v67ld" Sep 30 12:38:55 crc kubenswrapper[4672]: I0930 12:38:55.016786 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a3ef8453-d983-4c2a-94b0-c25645f83c23-credential-keys\") pod \"keystone-bootstrap-ss5sv\" (UID: \"a3ef8453-d983-4c2a-94b0-c25645f83c23\") " pod="openstack/keystone-bootstrap-ss5sv" Sep 30 12:38:55 crc kubenswrapper[4672]: I0930 12:38:55.017240 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a3ef8453-d983-4c2a-94b0-c25645f83c23-scripts\") pod \"keystone-bootstrap-ss5sv\" (UID: \"a3ef8453-d983-4c2a-94b0-c25645f83c23\") " pod="openstack/keystone-bootstrap-ss5sv" Sep 30 12:38:55 crc kubenswrapper[4672]: I0930 12:38:55.029325 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9l9cb\" (UniqueName: \"kubernetes.io/projected/8dd596d1-129e-4bb2-9e9b-dac1d09323d2-kube-api-access-9l9cb\") pod \"horizon-548589fd89-v67ld\" (UID: \"8dd596d1-129e-4bb2-9e9b-dac1d09323d2\") " pod="openstack/horizon-548589fd89-v67ld" Sep 30 12:38:55 crc kubenswrapper[4672]: I0930 12:38:55.030927 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/8dd596d1-129e-4bb2-9e9b-dac1d09323d2-horizon-secret-key\") pod \"horizon-548589fd89-v67ld\" (UID: \"8dd596d1-129e-4bb2-9e9b-dac1d09323d2\") " pod="openstack/horizon-548589fd89-v67ld" 
Sep 30 12:38:55 crc kubenswrapper[4672]: I0930 12:38:55.031780 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3ef8453-d983-4c2a-94b0-c25645f83c23-combined-ca-bundle\") pod \"keystone-bootstrap-ss5sv\" (UID: \"a3ef8453-d983-4c2a-94b0-c25645f83c23\") " pod="openstack/keystone-bootstrap-ss5sv" Sep 30 12:38:55 crc kubenswrapper[4672]: I0930 12:38:55.040917 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bpv54\" (UniqueName: \"kubernetes.io/projected/a3ef8453-d983-4c2a-94b0-c25645f83c23-kube-api-access-bpv54\") pod \"keystone-bootstrap-ss5sv\" (UID: \"a3ef8453-d983-4c2a-94b0-c25645f83c23\") " pod="openstack/keystone-bootstrap-ss5sv" Sep 30 12:38:55 crc kubenswrapper[4672]: I0930 12:38:55.087423 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-9kp7h"] Sep 30 12:38:55 crc kubenswrapper[4672]: I0930 12:38:55.104689 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4dcfda15-b815-4733-b3c7-0312473f7355-logs\") pod \"horizon-6f749887b9-hl4rd\" (UID: \"4dcfda15-b815-4733-b3c7-0312473f7355\") " pod="openstack/horizon-6f749887b9-hl4rd" Sep 30 12:38:55 crc kubenswrapper[4672]: I0930 12:38:55.104733 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c87bb11c-a04e-4018-ba41-d628795a926e-config-data\") pod \"placement-db-sync-9kp7h\" (UID: \"c87bb11c-a04e-4018-ba41-d628795a926e\") " pod="openstack/placement-db-sync-9kp7h" Sep 30 12:38:55 crc kubenswrapper[4672]: I0930 12:38:55.104773 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4dcfda15-b815-4733-b3c7-0312473f7355-config-data\") pod \"horizon-6f749887b9-hl4rd\" (UID: \"4dcfda15-b815-4733-b3c7-0312473f7355\") " pod="openstack/horizon-6f749887b9-hl4rd" Sep 30 12:38:55 crc kubenswrapper[4672]: I0930 12:38:55.104875 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c87bb11c-a04e-4018-ba41-d628795a926e-combined-ca-bundle\") pod \"placement-db-sync-9kp7h\" (UID: \"c87bb11c-a04e-4018-ba41-d628795a926e\") " pod="openstack/placement-db-sync-9kp7h" Sep 30 12:38:55 crc kubenswrapper[4672]: I0930 12:38:55.104905 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c87bb11c-a04e-4018-ba41-d628795a926e-scripts\") pod \"placement-db-sync-9kp7h\" (UID: \"c87bb11c-a04e-4018-ba41-d628795a926e\") " pod="openstack/placement-db-sync-9kp7h" Sep 30 12:38:55 crc kubenswrapper[4672]: I0930 12:38:55.104926 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6lpm6\" (UniqueName: \"kubernetes.io/projected/4dcfda15-b815-4733-b3c7-0312473f7355-kube-api-access-6lpm6\") pod \"horizon-6f749887b9-hl4rd\" (UID: \"4dcfda15-b815-4733-b3c7-0312473f7355\") " pod="openstack/horizon-6f749887b9-hl4rd" Sep 30 12:38:55 crc kubenswrapper[4672]: I0930 12:38:55.104956 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c87bb11c-a04e-4018-ba41-d628795a926e-logs\") pod 
\"placement-db-sync-9kp7h\" (UID: \"c87bb11c-a04e-4018-ba41-d628795a926e\") " pod="openstack/placement-db-sync-9kp7h" Sep 30 12:38:55 crc kubenswrapper[4672]: I0930 12:38:55.104980 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4dcfda15-b815-4733-b3c7-0312473f7355-scripts\") pod \"horizon-6f749887b9-hl4rd\" (UID: \"4dcfda15-b815-4733-b3c7-0312473f7355\") " pod="openstack/horizon-6f749887b9-hl4rd" Sep 30 12:38:55 crc kubenswrapper[4672]: I0930 12:38:55.105006 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t2sb2\" (UniqueName: \"kubernetes.io/projected/c87bb11c-a04e-4018-ba41-d628795a926e-kube-api-access-t2sb2\") pod \"placement-db-sync-9kp7h\" (UID: \"c87bb11c-a04e-4018-ba41-d628795a926e\") " pod="openstack/placement-db-sync-9kp7h" Sep 30 12:38:55 crc kubenswrapper[4672]: I0930 12:38:55.105024 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/4dcfda15-b815-4733-b3c7-0312473f7355-horizon-secret-key\") pod \"horizon-6f749887b9-hl4rd\" (UID: \"4dcfda15-b815-4733-b3c7-0312473f7355\") " pod="openstack/horizon-6f749887b9-hl4rd" Sep 30 12:38:55 crc kubenswrapper[4672]: I0930 12:38:55.112430 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3ef8453-d983-4c2a-94b0-c25645f83c23-config-data\") pod \"keystone-bootstrap-ss5sv\" (UID: \"a3ef8453-d983-4c2a-94b0-c25645f83c23\") " pod="openstack/keystone-bootstrap-ss5sv" Sep 30 12:38:55 crc kubenswrapper[4672]: I0930 12:38:55.126614 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a3ef8453-d983-4c2a-94b0-c25645f83c23-fernet-keys\") pod \"keystone-bootstrap-ss5sv\" (UID: \"a3ef8453-d983-4c2a-94b0-c25645f83c23\") " pod="openstack/keystone-bootstrap-ss5sv" Sep 30 12:38:55 crc kubenswrapper[4672]: I0930 12:38:55.132835 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6f749887b9-hl4rd"] Sep 30 12:38:55 crc kubenswrapper[4672]: I0930 12:38:55.187844 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-59d6fdf78c-8x78t"] Sep 30 12:38:55 crc kubenswrapper[4672]: I0930 12:38:55.209540 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-4ddgl" Sep 30 12:38:55 crc kubenswrapper[4672]: I0930 12:38:55.212198 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c87bb11c-a04e-4018-ba41-d628795a926e-combined-ca-bundle\") pod \"placement-db-sync-9kp7h\" (UID: \"c87bb11c-a04e-4018-ba41-d628795a926e\") " pod="openstack/placement-db-sync-9kp7h" Sep 30 12:38:55 crc kubenswrapper[4672]: I0930 12:38:55.212260 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c87bb11c-a04e-4018-ba41-d628795a926e-scripts\") pod \"placement-db-sync-9kp7h\" (UID: \"c87bb11c-a04e-4018-ba41-d628795a926e\") " pod="openstack/placement-db-sync-9kp7h" Sep 30 12:38:55 crc kubenswrapper[4672]: I0930 12:38:55.212315 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6lpm6\" (UniqueName: \"kubernetes.io/projected/4dcfda15-b815-4733-b3c7-0312473f7355-kube-api-access-6lpm6\") pod \"horizon-6f749887b9-hl4rd\" (UID: \"4dcfda15-b815-4733-b3c7-0312473f7355\") " pod="openstack/horizon-6f749887b9-hl4rd" Sep 30 12:38:55 crc kubenswrapper[4672]: I0930 12:38:55.212361 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c87bb11c-a04e-4018-ba41-d628795a926e-logs\") pod \"placement-db-sync-9kp7h\" (UID: \"c87bb11c-a04e-4018-ba41-d628795a926e\") " pod="openstack/placement-db-sync-9kp7h" Sep 30 12:38:55 crc kubenswrapper[4672]: I0930 12:38:55.212399 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4dcfda15-b815-4733-b3c7-0312473f7355-scripts\") pod \"horizon-6f749887b9-hl4rd\" (UID: \"4dcfda15-b815-4733-b3c7-0312473f7355\") " pod="openstack/horizon-6f749887b9-hl4rd" Sep 30 12:38:55 crc kubenswrapper[4672]: I0930 12:38:55.212426 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t2sb2\" (UniqueName: \"kubernetes.io/projected/c87bb11c-a04e-4018-ba41-d628795a926e-kube-api-access-t2sb2\") pod \"placement-db-sync-9kp7h\" (UID: \"c87bb11c-a04e-4018-ba41-d628795a926e\") " pod="openstack/placement-db-sync-9kp7h" Sep 30 12:38:55 crc kubenswrapper[4672]: I0930 12:38:55.212444 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/4dcfda15-b815-4733-b3c7-0312473f7355-horizon-secret-key\") pod \"horizon-6f749887b9-hl4rd\" (UID: \"4dcfda15-b815-4733-b3c7-0312473f7355\") " pod="openstack/horizon-6f749887b9-hl4rd" Sep 30 12:38:55 crc kubenswrapper[4672]: I0930 12:38:55.212515 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4dcfda15-b815-4733-b3c7-0312473f7355-logs\") pod \"horizon-6f749887b9-hl4rd\" (UID: \"4dcfda15-b815-4733-b3c7-0312473f7355\") " pod="openstack/horizon-6f749887b9-hl4rd" Sep 30 12:38:55 crc kubenswrapper[4672]: I0930 12:38:55.212542 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c87bb11c-a04e-4018-ba41-d628795a926e-config-data\") pod \"placement-db-sync-9kp7h\" (UID: \"c87bb11c-a04e-4018-ba41-d628795a926e\") " pod="openstack/placement-db-sync-9kp7h" Sep 30 12:38:55 crc kubenswrapper[4672]: I0930 12:38:55.212597 4672 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4dcfda15-b815-4733-b3c7-0312473f7355-config-data\") pod \"horizon-6f749887b9-hl4rd\" (UID: \"4dcfda15-b815-4733-b3c7-0312473f7355\") " pod="openstack/horizon-6f749887b9-hl4rd" Sep 30 12:38:55 crc kubenswrapper[4672]: I0930 12:38:55.213763 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4dcfda15-b815-4733-b3c7-0312473f7355-scripts\") pod \"horizon-6f749887b9-hl4rd\" (UID: \"4dcfda15-b815-4733-b3c7-0312473f7355\") " pod="openstack/horizon-6f749887b9-hl4rd" Sep 30 12:38:55 crc kubenswrapper[4672]: I0930 12:38:55.218929 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4dcfda15-b815-4733-b3c7-0312473f7355-logs\") pod \"horizon-6f749887b9-hl4rd\" (UID: \"4dcfda15-b815-4733-b3c7-0312473f7355\") " pod="openstack/horizon-6f749887b9-hl4rd" Sep 30 12:38:55 crc kubenswrapper[4672]: I0930 12:38:55.219099 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c87bb11c-a04e-4018-ba41-d628795a926e-logs\") pod \"placement-db-sync-9kp7h\" (UID: \"c87bb11c-a04e-4018-ba41-d628795a926e\") " pod="openstack/placement-db-sync-9kp7h" Sep 30 12:38:55 crc kubenswrapper[4672]: I0930 12:38:55.222127 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c87bb11c-a04e-4018-ba41-d628795a926e-config-data\") pod \"placement-db-sync-9kp7h\" (UID: \"c87bb11c-a04e-4018-ba41-d628795a926e\") " pod="openstack/placement-db-sync-9kp7h" Sep 30 12:38:55 crc kubenswrapper[4672]: I0930 12:38:55.222665 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c87bb11c-a04e-4018-ba41-d628795a926e-scripts\") pod \"placement-db-sync-9kp7h\" (UID: \"c87bb11c-a04e-4018-ba41-d628795a926e\") " pod="openstack/placement-db-sync-9kp7h" Sep 30 12:38:55 crc kubenswrapper[4672]: I0930 12:38:55.224009 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c87bb11c-a04e-4018-ba41-d628795a926e-combined-ca-bundle\") pod \"placement-db-sync-9kp7h\" (UID: \"c87bb11c-a04e-4018-ba41-d628795a926e\") " pod="openstack/placement-db-sync-9kp7h" Sep 30 12:38:55 crc kubenswrapper[4672]: I0930 12:38:55.224895 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4dcfda15-b815-4733-b3c7-0312473f7355-config-data\") pod \"horizon-6f749887b9-hl4rd\" (UID: \"4dcfda15-b815-4733-b3c7-0312473f7355\") " pod="openstack/horizon-6f749887b9-hl4rd" Sep 30 12:38:55 crc kubenswrapper[4672]: I0930 12:38:55.243943 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6lpm6\" (UniqueName: \"kubernetes.io/projected/4dcfda15-b815-4733-b3c7-0312473f7355-kube-api-access-6lpm6\") pod \"horizon-6f749887b9-hl4rd\" (UID: \"4dcfda15-b815-4733-b3c7-0312473f7355\") " pod="openstack/horizon-6f749887b9-hl4rd" Sep 30 12:38:55 crc kubenswrapper[4672]: I0930 12:38:55.244017 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-599d8d65c-6sv7v"] Sep 30 12:38:55 crc kubenswrapper[4672]: E0930 12:38:55.244460 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3cb781ee-9755-4258-bd4a-165461961834" containerName="glance-db-sync" 
Sep 30 12:38:55 crc kubenswrapper[4672]: I0930 12:38:55.244479 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="3cb781ee-9755-4258-bd4a-165461961834" containerName="glance-db-sync" Sep 30 12:38:55 crc kubenswrapper[4672]: I0930 12:38:55.244677 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="3cb781ee-9755-4258-bd4a-165461961834" containerName="glance-db-sync" Sep 30 12:38:55 crc kubenswrapper[4672]: I0930 12:38:55.251415 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-599d8d65c-6sv7v" Sep 30 12:38:55 crc kubenswrapper[4672]: I0930 12:38:55.252166 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-ss5sv" Sep 30 12:38:55 crc kubenswrapper[4672]: I0930 12:38:55.255394 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/4dcfda15-b815-4733-b3c7-0312473f7355-horizon-secret-key\") pod \"horizon-6f749887b9-hl4rd\" (UID: \"4dcfda15-b815-4733-b3c7-0312473f7355\") " pod="openstack/horizon-6f749887b9-hl4rd" Sep 30 12:38:55 crc kubenswrapper[4672]: I0930 12:38:55.259775 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t2sb2\" (UniqueName: \"kubernetes.io/projected/c87bb11c-a04e-4018-ba41-d628795a926e-kube-api-access-t2sb2\") pod \"placement-db-sync-9kp7h\" (UID: \"c87bb11c-a04e-4018-ba41-d628795a926e\") " pod="openstack/placement-db-sync-9kp7h" Sep 30 12:38:55 crc kubenswrapper[4672]: I0930 12:38:55.274729 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-548589fd89-v67ld" Sep 30 12:38:55 crc kubenswrapper[4672]: I0930 12:38:55.312908 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kbhwg\" (UniqueName: \"kubernetes.io/projected/3cb781ee-9755-4258-bd4a-165461961834-kube-api-access-kbhwg\") pod \"3cb781ee-9755-4258-bd4a-165461961834\" (UID: \"3cb781ee-9755-4258-bd4a-165461961834\") " Sep 30 12:38:55 crc kubenswrapper[4672]: I0930 12:38:55.313232 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3cb781ee-9755-4258-bd4a-165461961834-config-data\") pod \"3cb781ee-9755-4258-bd4a-165461961834\" (UID: \"3cb781ee-9755-4258-bd4a-165461961834\") " Sep 30 12:38:55 crc kubenswrapper[4672]: I0930 12:38:55.313443 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3cb781ee-9755-4258-bd4a-165461961834-combined-ca-bundle\") pod \"3cb781ee-9755-4258-bd4a-165461961834\" (UID: \"3cb781ee-9755-4258-bd4a-165461961834\") " Sep 30 12:38:55 crc kubenswrapper[4672]: I0930 12:38:55.313470 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3cb781ee-9755-4258-bd4a-165461961834-db-sync-config-data\") pod \"3cb781ee-9755-4258-bd4a-165461961834\" (UID: \"3cb781ee-9755-4258-bd4a-165461961834\") " Sep 30 12:38:55 crc kubenswrapper[4672]: I0930 12:38:55.317768 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb781ee-9755-4258-bd4a-165461961834-kube-api-access-kbhwg" (OuterVolumeSpecName: "kube-api-access-kbhwg") pod "3cb781ee-9755-4258-bd4a-165461961834" (UID: "3cb781ee-9755-4258-bd4a-165461961834"). InnerVolumeSpecName "kube-api-access-kbhwg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 12:38:55 crc kubenswrapper[4672]: I0930 12:38:55.342557 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-599d8d65c-6sv7v"] Sep 30 12:38:55 crc kubenswrapper[4672]: I0930 12:38:55.347150 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3cb781ee-9755-4258-bd4a-165461961834-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "3cb781ee-9755-4258-bd4a-165461961834" (UID: "3cb781ee-9755-4258-bd4a-165461961834"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:38:55 crc kubenswrapper[4672]: I0930 12:38:55.362050 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-55f8-account-create-j8z8l"] Sep 30 12:38:55 crc kubenswrapper[4672]: I0930 12:38:55.371620 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-47f4-account-create-cw9r4" event={"ID":"d372bb6e-78aa-4329-b0f7-f2685c342bd3","Type":"ContainerStarted","Data":"d96474d4b2160b6b5c841b736ae86bf936d0e70552b42921a134a127d6ae1e47"} Sep 30 12:38:55 crc kubenswrapper[4672]: I0930 12:38:55.371668 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-47f4-account-create-cw9r4" event={"ID":"d372bb6e-78aa-4329-b0f7-f2685c342bd3","Type":"ContainerStarted","Data":"9e4ecea1848257fd478ca457d9ee54e9052aaee5054e9268f80da68c9171fb51"} Sep 30 12:38:55 crc kubenswrapper[4672]: I0930 12:38:55.393999 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-fe85-account-create-vdvbq" event={"ID":"e7b77dbe-0a6b-4bef-a55d-3de78b2c25f8","Type":"ContainerStarted","Data":"386667b88e4d0eb7a828c77b62754b3a3b6427f2c88c8a53e47af261ada0199a"} Sep 30 12:38:55 crc kubenswrapper[4672]: I0930 12:38:55.396739 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-fe85-account-create-vdvbq"] Sep 30 12:38:55 crc kubenswrapper[4672]: I0930 12:38:55.417098 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4r6km\" (UniqueName: \"kubernetes.io/projected/d8f00365-71a5-4e33-bb1a-be3c3ff6ac1f-kube-api-access-4r6km\") pod \"dnsmasq-dns-599d8d65c-6sv7v\" (UID: \"d8f00365-71a5-4e33-bb1a-be3c3ff6ac1f\") " pod="openstack/dnsmasq-dns-599d8d65c-6sv7v" Sep 30 12:38:55 crc kubenswrapper[4672]: I0930 12:38:55.417141 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d8f00365-71a5-4e33-bb1a-be3c3ff6ac1f-ovsdbserver-nb\") pod \"dnsmasq-dns-599d8d65c-6sv7v\" (UID: \"d8f00365-71a5-4e33-bb1a-be3c3ff6ac1f\") " pod="openstack/dnsmasq-dns-599d8d65c-6sv7v" Sep 30 12:38:55 crc kubenswrapper[4672]: I0930 12:38:55.417234 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d8f00365-71a5-4e33-bb1a-be3c3ff6ac1f-dns-swift-storage-0\") pod \"dnsmasq-dns-599d8d65c-6sv7v\" (UID: \"d8f00365-71a5-4e33-bb1a-be3c3ff6ac1f\") " pod="openstack/dnsmasq-dns-599d8d65c-6sv7v" Sep 30 12:38:55 crc kubenswrapper[4672]: I0930 12:38:55.417254 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d8f00365-71a5-4e33-bb1a-be3c3ff6ac1f-config\") pod \"dnsmasq-dns-599d8d65c-6sv7v\" (UID: \"d8f00365-71a5-4e33-bb1a-be3c3ff6ac1f\") " 
pod="openstack/dnsmasq-dns-599d8d65c-6sv7v" Sep 30 12:38:55 crc kubenswrapper[4672]: I0930 12:38:55.417300 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d8f00365-71a5-4e33-bb1a-be3c3ff6ac1f-dns-svc\") pod \"dnsmasq-dns-599d8d65c-6sv7v\" (UID: \"d8f00365-71a5-4e33-bb1a-be3c3ff6ac1f\") " pod="openstack/dnsmasq-dns-599d8d65c-6sv7v" Sep 30 12:38:55 crc kubenswrapper[4672]: I0930 12:38:55.417326 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d8f00365-71a5-4e33-bb1a-be3c3ff6ac1f-ovsdbserver-sb\") pod \"dnsmasq-dns-599d8d65c-6sv7v\" (UID: \"d8f00365-71a5-4e33-bb1a-be3c3ff6ac1f\") " pod="openstack/dnsmasq-dns-599d8d65c-6sv7v" Sep 30 12:38:55 crc kubenswrapper[4672]: I0930 12:38:55.417514 4672 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3cb781ee-9755-4258-bd4a-165461961834-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 12:38:55 crc kubenswrapper[4672]: I0930 12:38:55.417534 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kbhwg\" (UniqueName: \"kubernetes.io/projected/3cb781ee-9755-4258-bd4a-165461961834-kube-api-access-kbhwg\") on node \"crc\" DevicePath \"\"" Sep 30 12:38:55 crc kubenswrapper[4672]: I0930 12:38:55.417676 4672 generic.go:334] "Generic (PLEG): container finished" podID="95794952-d817-48f2-8956-f7a310f8d1d9" containerID="1712933e94420da648f449968b73ced3cfbd2790d2d92518ca79624030de9f70" exitCode=0 Sep 30 12:38:55 crc kubenswrapper[4672]: I0930 12:38:55.420694 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3cb781ee-9755-4258-bd4a-165461961834-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3cb781ee-9755-4258-bd4a-165461961834" (UID: "3cb781ee-9755-4258-bd4a-165461961834"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:38:55 crc kubenswrapper[4672]: I0930 12:38:55.424100 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-9kp7h" Sep 30 12:38:55 crc kubenswrapper[4672]: I0930 12:38:55.442829 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-4ddgl" Sep 30 12:38:55 crc kubenswrapper[4672]: I0930 12:38:55.474253 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" event={"ID":"95794952-d817-48f2-8956-f7a310f8d1d9","Type":"ContainerDied","Data":"1712933e94420da648f449968b73ced3cfbd2790d2d92518ca79624030de9f70"} Sep 30 12:38:55 crc kubenswrapper[4672]: I0930 12:38:55.474321 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" event={"ID":"95794952-d817-48f2-8956-f7a310f8d1d9","Type":"ContainerStarted","Data":"c2e0fa5817adc74311f8929edf2f7fe8a5d38b2926c430c80278916d7abc9d3a"} Sep 30 12:38:55 crc kubenswrapper[4672]: I0930 12:38:55.474369 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Sep 30 12:38:55 crc kubenswrapper[4672]: I0930 12:38:55.475109 4672 scope.go:117] "RemoveContainer" containerID="f277ec5275d8b860ce8dcb4c0f3ecdd12eaede7bb4ef094f520236f925a1f1a1" Sep 30 12:38:55 crc kubenswrapper[4672]: I0930 12:38:55.477531 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-4ddgl" event={"ID":"3cb781ee-9755-4258-bd4a-165461961834","Type":"ContainerDied","Data":"5d0e433e0b46aa517a07f495fc679d65dc5d070999037f8725484e7a26fa1323"} Sep 30 12:38:55 crc kubenswrapper[4672]: I0930 12:38:55.477575 4672 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5d0e433e0b46aa517a07f495fc679d65dc5d070999037f8725484e7a26fa1323" Sep 30 12:38:55 crc kubenswrapper[4672]: I0930 12:38:55.477591 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Sep 30 12:38:55 crc kubenswrapper[4672]: I0930 12:38:55.477607 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-55f8-account-create-j8z8l" event={"ID":"43cef9cd-4246-4707-93c8-2d1e0b49a99c","Type":"ContainerStarted","Data":"09540849a103063070de27f7dec98a34c14e4311a187872d4ff85c9db4c0427e"} Sep 30 12:38:55 crc kubenswrapper[4672]: I0930 12:38:55.477697 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Sep 30 12:38:55 crc kubenswrapper[4672]: I0930 12:38:55.481288 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3cb781ee-9755-4258-bd4a-165461961834-config-data" (OuterVolumeSpecName: "config-data") pod "3cb781ee-9755-4258-bd4a-165461961834" (UID: "3cb781ee-9755-4258-bd4a-165461961834"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:38:55 crc kubenswrapper[4672]: I0930 12:38:55.493359 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Sep 30 12:38:55 crc kubenswrapper[4672]: I0930 12:38:55.493621 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Sep 30 12:38:55 crc kubenswrapper[4672]: I0930 12:38:55.494173 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-6f749887b9-hl4rd" Sep 30 12:38:55 crc kubenswrapper[4672]: I0930 12:38:55.528685 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4r6km\" (UniqueName: \"kubernetes.io/projected/d8f00365-71a5-4e33-bb1a-be3c3ff6ac1f-kube-api-access-4r6km\") pod \"dnsmasq-dns-599d8d65c-6sv7v\" (UID: \"d8f00365-71a5-4e33-bb1a-be3c3ff6ac1f\") " pod="openstack/dnsmasq-dns-599d8d65c-6sv7v" Sep 30 12:38:55 crc kubenswrapper[4672]: I0930 12:38:55.528993 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d8f00365-71a5-4e33-bb1a-be3c3ff6ac1f-ovsdbserver-nb\") pod \"dnsmasq-dns-599d8d65c-6sv7v\" (UID: \"d8f00365-71a5-4e33-bb1a-be3c3ff6ac1f\") " pod="openstack/dnsmasq-dns-599d8d65c-6sv7v" Sep 30 12:38:55 crc kubenswrapper[4672]: I0930 12:38:55.529100 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d8f00365-71a5-4e33-bb1a-be3c3ff6ac1f-dns-swift-storage-0\") pod \"dnsmasq-dns-599d8d65c-6sv7v\" (UID: \"d8f00365-71a5-4e33-bb1a-be3c3ff6ac1f\") " pod="openstack/dnsmasq-dns-599d8d65c-6sv7v" Sep 30 12:38:55 crc kubenswrapper[4672]: I0930 12:38:55.529125 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d8f00365-71a5-4e33-bb1a-be3c3ff6ac1f-config\") pod \"dnsmasq-dns-599d8d65c-6sv7v\" (UID: \"d8f00365-71a5-4e33-bb1a-be3c3ff6ac1f\") " pod="openstack/dnsmasq-dns-599d8d65c-6sv7v" Sep 30 12:38:55 crc kubenswrapper[4672]: I0930 12:38:55.529179 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d8f00365-71a5-4e33-bb1a-be3c3ff6ac1f-dns-svc\") pod \"dnsmasq-dns-599d8d65c-6sv7v\" (UID: \"d8f00365-71a5-4e33-bb1a-be3c3ff6ac1f\") " pod="openstack/dnsmasq-dns-599d8d65c-6sv7v" Sep 30 12:38:55 crc kubenswrapper[4672]: I0930 12:38:55.529215 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d8f00365-71a5-4e33-bb1a-be3c3ff6ac1f-ovsdbserver-sb\") pod \"dnsmasq-dns-599d8d65c-6sv7v\" (UID: \"d8f00365-71a5-4e33-bb1a-be3c3ff6ac1f\") " pod="openstack/dnsmasq-dns-599d8d65c-6sv7v" Sep 30 12:38:55 crc kubenswrapper[4672]: I0930 12:38:55.529342 4672 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3cb781ee-9755-4258-bd4a-165461961834-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 12:38:55 crc kubenswrapper[4672]: I0930 12:38:55.529358 4672 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3cb781ee-9755-4258-bd4a-165461961834-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 12:38:55 crc kubenswrapper[4672]: I0930 12:38:55.530674 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d8f00365-71a5-4e33-bb1a-be3c3ff6ac1f-dns-swift-storage-0\") pod \"dnsmasq-dns-599d8d65c-6sv7v\" (UID: \"d8f00365-71a5-4e33-bb1a-be3c3ff6ac1f\") " pod="openstack/dnsmasq-dns-599d8d65c-6sv7v" Sep 30 12:38:55 crc kubenswrapper[4672]: I0930 12:38:55.533956 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d8f00365-71a5-4e33-bb1a-be3c3ff6ac1f-ovsdbserver-nb\") 
pod \"dnsmasq-dns-599d8d65c-6sv7v\" (UID: \"d8f00365-71a5-4e33-bb1a-be3c3ff6ac1f\") " pod="openstack/dnsmasq-dns-599d8d65c-6sv7v" Sep 30 12:38:55 crc kubenswrapper[4672]: I0930 12:38:55.534142 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d8f00365-71a5-4e33-bb1a-be3c3ff6ac1f-dns-svc\") pod \"dnsmasq-dns-599d8d65c-6sv7v\" (UID: \"d8f00365-71a5-4e33-bb1a-be3c3ff6ac1f\") " pod="openstack/dnsmasq-dns-599d8d65c-6sv7v" Sep 30 12:38:55 crc kubenswrapper[4672]: I0930 12:38:55.535458 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d8f00365-71a5-4e33-bb1a-be3c3ff6ac1f-config\") pod \"dnsmasq-dns-599d8d65c-6sv7v\" (UID: \"d8f00365-71a5-4e33-bb1a-be3c3ff6ac1f\") " pod="openstack/dnsmasq-dns-599d8d65c-6sv7v" Sep 30 12:38:55 crc kubenswrapper[4672]: I0930 12:38:55.537476 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d8f00365-71a5-4e33-bb1a-be3c3ff6ac1f-ovsdbserver-sb\") pod \"dnsmasq-dns-599d8d65c-6sv7v\" (UID: \"d8f00365-71a5-4e33-bb1a-be3c3ff6ac1f\") " pod="openstack/dnsmasq-dns-599d8d65c-6sv7v" Sep 30 12:38:55 crc kubenswrapper[4672]: I0930 12:38:55.602616 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4r6km\" (UniqueName: \"kubernetes.io/projected/d8f00365-71a5-4e33-bb1a-be3c3ff6ac1f-kube-api-access-4r6km\") pod \"dnsmasq-dns-599d8d65c-6sv7v\" (UID: \"d8f00365-71a5-4e33-bb1a-be3c3ff6ac1f\") " pod="openstack/dnsmasq-dns-599d8d65c-6sv7v" Sep 30 12:38:55 crc kubenswrapper[4672]: I0930 12:38:55.639408 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a813f7c2-e727-42a0-8779-8619fe6e8165-config-data\") pod \"ceilometer-0\" (UID: \"a813f7c2-e727-42a0-8779-8619fe6e8165\") " pod="openstack/ceilometer-0" Sep 30 12:38:55 crc kubenswrapper[4672]: I0930 12:38:55.639522 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a813f7c2-e727-42a0-8779-8619fe6e8165-run-httpd\") pod \"ceilometer-0\" (UID: \"a813f7c2-e727-42a0-8779-8619fe6e8165\") " pod="openstack/ceilometer-0" Sep 30 12:38:55 crc kubenswrapper[4672]: I0930 12:38:55.639635 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a813f7c2-e727-42a0-8779-8619fe6e8165-scripts\") pod \"ceilometer-0\" (UID: \"a813f7c2-e727-42a0-8779-8619fe6e8165\") " pod="openstack/ceilometer-0" Sep 30 12:38:55 crc kubenswrapper[4672]: I0930 12:38:55.639684 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a813f7c2-e727-42a0-8779-8619fe6e8165-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a813f7c2-e727-42a0-8779-8619fe6e8165\") " pod="openstack/ceilometer-0" Sep 30 12:38:55 crc kubenswrapper[4672]: I0930 12:38:55.639729 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2r2qw\" (UniqueName: \"kubernetes.io/projected/a813f7c2-e727-42a0-8779-8619fe6e8165-kube-api-access-2r2qw\") pod \"ceilometer-0\" (UID: \"a813f7c2-e727-42a0-8779-8619fe6e8165\") " pod="openstack/ceilometer-0" Sep 30 12:38:55 crc kubenswrapper[4672]: I0930 
12:38:55.639752 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a813f7c2-e727-42a0-8779-8619fe6e8165-log-httpd\") pod \"ceilometer-0\" (UID: \"a813f7c2-e727-42a0-8779-8619fe6e8165\") " pod="openstack/ceilometer-0" Sep 30 12:38:55 crc kubenswrapper[4672]: I0930 12:38:55.639780 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a813f7c2-e727-42a0-8779-8619fe6e8165-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a813f7c2-e727-42a0-8779-8619fe6e8165\") " pod="openstack/ceilometer-0" Sep 30 12:38:55 crc kubenswrapper[4672]: I0930 12:38:55.682889 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-59d6fdf78c-8x78t"] Sep 30 12:38:55 crc kubenswrapper[4672]: I0930 12:38:55.749726 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a813f7c2-e727-42a0-8779-8619fe6e8165-config-data\") pod \"ceilometer-0\" (UID: \"a813f7c2-e727-42a0-8779-8619fe6e8165\") " pod="openstack/ceilometer-0" Sep 30 12:38:55 crc kubenswrapper[4672]: I0930 12:38:55.749797 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a813f7c2-e727-42a0-8779-8619fe6e8165-run-httpd\") pod \"ceilometer-0\" (UID: \"a813f7c2-e727-42a0-8779-8619fe6e8165\") " pod="openstack/ceilometer-0" Sep 30 12:38:55 crc kubenswrapper[4672]: I0930 12:38:55.749841 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a813f7c2-e727-42a0-8779-8619fe6e8165-scripts\") pod \"ceilometer-0\" (UID: \"a813f7c2-e727-42a0-8779-8619fe6e8165\") " pod="openstack/ceilometer-0" Sep 30 12:38:55 crc kubenswrapper[4672]: I0930 12:38:55.749861 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a813f7c2-e727-42a0-8779-8619fe6e8165-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a813f7c2-e727-42a0-8779-8619fe6e8165\") " pod="openstack/ceilometer-0" Sep 30 12:38:55 crc kubenswrapper[4672]: I0930 12:38:55.749886 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2r2qw\" (UniqueName: \"kubernetes.io/projected/a813f7c2-e727-42a0-8779-8619fe6e8165-kube-api-access-2r2qw\") pod \"ceilometer-0\" (UID: \"a813f7c2-e727-42a0-8779-8619fe6e8165\") " pod="openstack/ceilometer-0" Sep 30 12:38:55 crc kubenswrapper[4672]: I0930 12:38:55.749903 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a813f7c2-e727-42a0-8779-8619fe6e8165-log-httpd\") pod \"ceilometer-0\" (UID: \"a813f7c2-e727-42a0-8779-8619fe6e8165\") " pod="openstack/ceilometer-0" Sep 30 12:38:55 crc kubenswrapper[4672]: I0930 12:38:55.749920 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a813f7c2-e727-42a0-8779-8619fe6e8165-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a813f7c2-e727-42a0-8779-8619fe6e8165\") " pod="openstack/ceilometer-0" Sep 30 12:38:55 crc kubenswrapper[4672]: I0930 12:38:55.755997 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/a813f7c2-e727-42a0-8779-8619fe6e8165-run-httpd\") pod \"ceilometer-0\" (UID: \"a813f7c2-e727-42a0-8779-8619fe6e8165\") " pod="openstack/ceilometer-0" Sep 30 12:38:55 crc kubenswrapper[4672]: I0930 12:38:55.756209 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a813f7c2-e727-42a0-8779-8619fe6e8165-log-httpd\") pod \"ceilometer-0\" (UID: \"a813f7c2-e727-42a0-8779-8619fe6e8165\") " pod="openstack/ceilometer-0" Sep 30 12:38:55 crc kubenswrapper[4672]: I0930 12:38:55.761677 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a813f7c2-e727-42a0-8779-8619fe6e8165-config-data\") pod \"ceilometer-0\" (UID: \"a813f7c2-e727-42a0-8779-8619fe6e8165\") " pod="openstack/ceilometer-0" Sep 30 12:38:55 crc kubenswrapper[4672]: I0930 12:38:55.763231 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a813f7c2-e727-42a0-8779-8619fe6e8165-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a813f7c2-e727-42a0-8779-8619fe6e8165\") " pod="openstack/ceilometer-0" Sep 30 12:38:55 crc kubenswrapper[4672]: I0930 12:38:55.769946 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a813f7c2-e727-42a0-8779-8619fe6e8165-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a813f7c2-e727-42a0-8779-8619fe6e8165\") " pod="openstack/ceilometer-0" Sep 30 12:38:55 crc kubenswrapper[4672]: I0930 12:38:55.781911 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a813f7c2-e727-42a0-8779-8619fe6e8165-scripts\") pod \"ceilometer-0\" (UID: \"a813f7c2-e727-42a0-8779-8619fe6e8165\") " pod="openstack/ceilometer-0" Sep 30 12:38:55 crc kubenswrapper[4672]: I0930 12:38:55.793976 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2r2qw\" (UniqueName: \"kubernetes.io/projected/a813f7c2-e727-42a0-8779-8619fe6e8165-kube-api-access-2r2qw\") pod \"ceilometer-0\" (UID: \"a813f7c2-e727-42a0-8779-8619fe6e8165\") " pod="openstack/ceilometer-0" Sep 30 12:38:55 crc kubenswrapper[4672]: I0930 12:38:55.817124 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-599d8d65c-6sv7v"] Sep 30 12:38:55 crc kubenswrapper[4672]: I0930 12:38:55.817955 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-599d8d65c-6sv7v" Sep 30 12:38:55 crc kubenswrapper[4672]: I0930 12:38:55.855584 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Sep 30 12:38:55 crc kubenswrapper[4672]: I0930 12:38:55.893370 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7fc454b69-wncsb"] Sep 30 12:38:55 crc kubenswrapper[4672]: I0930 12:38:55.895248 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7fc454b69-wncsb" Sep 30 12:38:55 crc kubenswrapper[4672]: I0930 12:38:55.943602 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7fc454b69-wncsb"] Sep 30 12:38:56 crc kubenswrapper[4672]: I0930 12:38:56.060240 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/984ee7c7-9f78-45ea-876e-82c967e7c4fc-config\") pod \"dnsmasq-dns-7fc454b69-wncsb\" (UID: \"984ee7c7-9f78-45ea-876e-82c967e7c4fc\") " pod="openstack/dnsmasq-dns-7fc454b69-wncsb" Sep 30 12:38:56 crc kubenswrapper[4672]: I0930 12:38:56.060676 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/984ee7c7-9f78-45ea-876e-82c967e7c4fc-ovsdbserver-nb\") pod \"dnsmasq-dns-7fc454b69-wncsb\" (UID: \"984ee7c7-9f78-45ea-876e-82c967e7c4fc\") " pod="openstack/dnsmasq-dns-7fc454b69-wncsb" Sep 30 12:38:56 crc kubenswrapper[4672]: I0930 12:38:56.061240 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/984ee7c7-9f78-45ea-876e-82c967e7c4fc-dns-swift-storage-0\") pod \"dnsmasq-dns-7fc454b69-wncsb\" (UID: \"984ee7c7-9f78-45ea-876e-82c967e7c4fc\") " pod="openstack/dnsmasq-dns-7fc454b69-wncsb" Sep 30 12:38:56 crc kubenswrapper[4672]: I0930 12:38:56.061328 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/984ee7c7-9f78-45ea-876e-82c967e7c4fc-ovsdbserver-sb\") pod \"dnsmasq-dns-7fc454b69-wncsb\" (UID: \"984ee7c7-9f78-45ea-876e-82c967e7c4fc\") " pod="openstack/dnsmasq-dns-7fc454b69-wncsb" Sep 30 12:38:56 crc kubenswrapper[4672]: I0930 12:38:56.062400 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/984ee7c7-9f78-45ea-876e-82c967e7c4fc-dns-svc\") pod \"dnsmasq-dns-7fc454b69-wncsb\" (UID: \"984ee7c7-9f78-45ea-876e-82c967e7c4fc\") " pod="openstack/dnsmasq-dns-7fc454b69-wncsb" Sep 30 12:38:56 crc kubenswrapper[4672]: I0930 12:38:56.062493 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8mpj4\" (UniqueName: \"kubernetes.io/projected/984ee7c7-9f78-45ea-876e-82c967e7c4fc-kube-api-access-8mpj4\") pod \"dnsmasq-dns-7fc454b69-wncsb\" (UID: \"984ee7c7-9f78-45ea-876e-82c967e7c4fc\") " pod="openstack/dnsmasq-dns-7fc454b69-wncsb" Sep 30 12:38:56 crc kubenswrapper[4672]: I0930 12:38:56.163839 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/984ee7c7-9f78-45ea-876e-82c967e7c4fc-dns-svc\") pod \"dnsmasq-dns-7fc454b69-wncsb\" (UID: \"984ee7c7-9f78-45ea-876e-82c967e7c4fc\") " pod="openstack/dnsmasq-dns-7fc454b69-wncsb" Sep 30 12:38:56 crc kubenswrapper[4672]: I0930 12:38:56.163883 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8mpj4\" (UniqueName: \"kubernetes.io/projected/984ee7c7-9f78-45ea-876e-82c967e7c4fc-kube-api-access-8mpj4\") pod \"dnsmasq-dns-7fc454b69-wncsb\" (UID: \"984ee7c7-9f78-45ea-876e-82c967e7c4fc\") " pod="openstack/dnsmasq-dns-7fc454b69-wncsb" Sep 30 12:38:56 crc kubenswrapper[4672]: I0930 12:38:56.163962 4672 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/984ee7c7-9f78-45ea-876e-82c967e7c4fc-config\") pod \"dnsmasq-dns-7fc454b69-wncsb\" (UID: \"984ee7c7-9f78-45ea-876e-82c967e7c4fc\") " pod="openstack/dnsmasq-dns-7fc454b69-wncsb" Sep 30 12:38:56 crc kubenswrapper[4672]: I0930 12:38:56.163985 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/984ee7c7-9f78-45ea-876e-82c967e7c4fc-ovsdbserver-nb\") pod \"dnsmasq-dns-7fc454b69-wncsb\" (UID: \"984ee7c7-9f78-45ea-876e-82c967e7c4fc\") " pod="openstack/dnsmasq-dns-7fc454b69-wncsb" Sep 30 12:38:56 crc kubenswrapper[4672]: I0930 12:38:56.164022 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/984ee7c7-9f78-45ea-876e-82c967e7c4fc-dns-swift-storage-0\") pod \"dnsmasq-dns-7fc454b69-wncsb\" (UID: \"984ee7c7-9f78-45ea-876e-82c967e7c4fc\") " pod="openstack/dnsmasq-dns-7fc454b69-wncsb" Sep 30 12:38:56 crc kubenswrapper[4672]: I0930 12:38:56.164043 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/984ee7c7-9f78-45ea-876e-82c967e7c4fc-ovsdbserver-sb\") pod \"dnsmasq-dns-7fc454b69-wncsb\" (UID: \"984ee7c7-9f78-45ea-876e-82c967e7c4fc\") " pod="openstack/dnsmasq-dns-7fc454b69-wncsb" Sep 30 12:38:56 crc kubenswrapper[4672]: I0930 12:38:56.164723 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/984ee7c7-9f78-45ea-876e-82c967e7c4fc-dns-svc\") pod \"dnsmasq-dns-7fc454b69-wncsb\" (UID: \"984ee7c7-9f78-45ea-876e-82c967e7c4fc\") " pod="openstack/dnsmasq-dns-7fc454b69-wncsb" Sep 30 12:38:56 crc kubenswrapper[4672]: I0930 12:38:56.164883 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/984ee7c7-9f78-45ea-876e-82c967e7c4fc-ovsdbserver-sb\") pod \"dnsmasq-dns-7fc454b69-wncsb\" (UID: \"984ee7c7-9f78-45ea-876e-82c967e7c4fc\") " pod="openstack/dnsmasq-dns-7fc454b69-wncsb" Sep 30 12:38:56 crc kubenswrapper[4672]: I0930 12:38:56.167233 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/984ee7c7-9f78-45ea-876e-82c967e7c4fc-ovsdbserver-nb\") pod \"dnsmasq-dns-7fc454b69-wncsb\" (UID: \"984ee7c7-9f78-45ea-876e-82c967e7c4fc\") " pod="openstack/dnsmasq-dns-7fc454b69-wncsb" Sep 30 12:38:56 crc kubenswrapper[4672]: I0930 12:38:56.169876 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/984ee7c7-9f78-45ea-876e-82c967e7c4fc-config\") pod \"dnsmasq-dns-7fc454b69-wncsb\" (UID: \"984ee7c7-9f78-45ea-876e-82c967e7c4fc\") " pod="openstack/dnsmasq-dns-7fc454b69-wncsb" Sep 30 12:38:56 crc kubenswrapper[4672]: I0930 12:38:56.170956 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/984ee7c7-9f78-45ea-876e-82c967e7c4fc-dns-swift-storage-0\") pod \"dnsmasq-dns-7fc454b69-wncsb\" (UID: \"984ee7c7-9f78-45ea-876e-82c967e7c4fc\") " pod="openstack/dnsmasq-dns-7fc454b69-wncsb" Sep 30 12:38:56 crc kubenswrapper[4672]: I0930 12:38:56.187172 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8mpj4\" (UniqueName: 
\"kubernetes.io/projected/984ee7c7-9f78-45ea-876e-82c967e7c4fc-kube-api-access-8mpj4\") pod \"dnsmasq-dns-7fc454b69-wncsb\" (UID: \"984ee7c7-9f78-45ea-876e-82c967e7c4fc\") " pod="openstack/dnsmasq-dns-7fc454b69-wncsb" Sep 30 12:38:56 crc kubenswrapper[4672]: I0930 12:38:56.271925 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Sep 30 12:38:56 crc kubenswrapper[4672]: I0930 12:38:56.274037 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-ss5sv"] Sep 30 12:38:56 crc kubenswrapper[4672]: I0930 12:38:56.290092 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Sep 30 12:38:56 crc kubenswrapper[4672]: I0930 12:38:56.437091 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fc454b69-wncsb" Sep 30 12:38:56 crc kubenswrapper[4672]: I0930 12:38:56.521509 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-ss5sv" event={"ID":"a3ef8453-d983-4c2a-94b0-c25645f83c23","Type":"ContainerStarted","Data":"a10376c8bde6edabec9e5e9593fdc1fce5625725f0014366dd8c6a1481a2e899"} Sep 30 12:38:56 crc kubenswrapper[4672]: I0930 12:38:56.532734 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-548589fd89-v67ld"] Sep 30 12:38:56 crc kubenswrapper[4672]: W0930 12:38:56.588472 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8dd596d1_129e_4bb2_9e9b_dac1d09323d2.slice/crio-cbabc5d93198a8af257188261dd554e49bb4683b41e53b0d825d378f3a35bd21 WatchSource:0}: Error finding container cbabc5d93198a8af257188261dd554e49bb4683b41e53b0d825d378f3a35bd21: Status 404 returned error can't find the container with id cbabc5d93198a8af257188261dd554e49bb4683b41e53b0d825d378f3a35bd21 Sep 30 12:38:56 crc kubenswrapper[4672]: I0930 12:38:56.615699 4672 generic.go:334] "Generic (PLEG): container finished" podID="43cef9cd-4246-4707-93c8-2d1e0b49a99c" containerID="c6671192b403ba0381d0c338083170ddad87278a332092c99bbfe68b29def9df" exitCode=0 Sep 30 12:38:56 crc kubenswrapper[4672]: I0930 12:38:56.616276 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-55f8-account-create-j8z8l" event={"ID":"43cef9cd-4246-4707-93c8-2d1e0b49a99c","Type":"ContainerDied","Data":"c6671192b403ba0381d0c338083170ddad87278a332092c99bbfe68b29def9df"} Sep 30 12:38:56 crc kubenswrapper[4672]: I0930 12:38:56.616436 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-599d8d65c-6sv7v"] Sep 30 12:38:56 crc kubenswrapper[4672]: I0930 12:38:56.704563 4672 generic.go:334] "Generic (PLEG): container finished" podID="26e78aa5-04e3-40fe-839b-b753ff7b6e77" containerID="c23062b57a4250ac477935173cb6801b75a51ef6ea9a95a8732a9b350236a917" exitCode=0 Sep 30 12:38:56 crc kubenswrapper[4672]: I0930 12:38:56.704795 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59d6fdf78c-8x78t" event={"ID":"26e78aa5-04e3-40fe-839b-b753ff7b6e77","Type":"ContainerDied","Data":"c23062b57a4250ac477935173cb6801b75a51ef6ea9a95a8732a9b350236a917"} Sep 30 12:38:56 crc kubenswrapper[4672]: I0930 12:38:56.704911 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59d6fdf78c-8x78t" 
event={"ID":"26e78aa5-04e3-40fe-839b-b753ff7b6e77","Type":"ContainerStarted","Data":"97e33f045d25c87838a129fc782f3d713e1d0519eacc21acd89efd57b4f99063"} Sep 30 12:38:56 crc kubenswrapper[4672]: I0930 12:38:56.749186 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-9kp7h"] Sep 30 12:38:56 crc kubenswrapper[4672]: I0930 12:38:56.753044 4672 generic.go:334] "Generic (PLEG): container finished" podID="d372bb6e-78aa-4329-b0f7-f2685c342bd3" containerID="d96474d4b2160b6b5c841b736ae86bf936d0e70552b42921a134a127d6ae1e47" exitCode=0 Sep 30 12:38:56 crc kubenswrapper[4672]: I0930 12:38:56.753122 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-47f4-account-create-cw9r4" event={"ID":"d372bb6e-78aa-4329-b0f7-f2685c342bd3","Type":"ContainerDied","Data":"d96474d4b2160b6b5c841b736ae86bf936d0e70552b42921a134a127d6ae1e47"} Sep 30 12:38:56 crc kubenswrapper[4672]: I0930 12:38:56.799661 4672 generic.go:334] "Generic (PLEG): container finished" podID="e7b77dbe-0a6b-4bef-a55d-3de78b2c25f8" containerID="32df13751c16a9753fbac6226c8a88e584bc29b566655afd5f987b5175e2b5bd" exitCode=0 Sep 30 12:38:56 crc kubenswrapper[4672]: I0930 12:38:56.808765 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-fe85-account-create-vdvbq" event={"ID":"e7b77dbe-0a6b-4bef-a55d-3de78b2c25f8","Type":"ContainerDied","Data":"32df13751c16a9753fbac6226c8a88e584bc29b566655afd5f987b5175e2b5bd"} Sep 30 12:38:56 crc kubenswrapper[4672]: I0930 12:38:56.808810 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Sep 30 12:38:56 crc kubenswrapper[4672]: I0930 12:38:56.813822 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Sep 30 12:38:56 crc kubenswrapper[4672]: I0930 12:38:56.820729 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Sep 30 12:38:56 crc kubenswrapper[4672]: I0930 12:38:56.820898 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-8npkv" Sep 30 12:38:56 crc kubenswrapper[4672]: I0930 12:38:56.821012 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Sep 30 12:38:56 crc kubenswrapper[4672]: I0930 12:38:56.822559 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Sep 30 12:38:56 crc kubenswrapper[4672]: I0930 12:38:56.849243 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Sep 30 12:38:56 crc kubenswrapper[4672]: I0930 12:38:56.867864 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Sep 30 12:38:56 crc kubenswrapper[4672]: I0930 12:38:56.891461 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/69e757cd-3187-40f9-828a-cec497ae50dd-logs\") pod \"glance-default-external-api-0\" (UID: \"69e757cd-3187-40f9-828a-cec497ae50dd\") " pod="openstack/glance-default-external-api-0" Sep 30 12:38:56 crc kubenswrapper[4672]: I0930 12:38:56.891553 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/69e757cd-3187-40f9-828a-cec497ae50dd-scripts\") pod \"glance-default-external-api-0\" (UID: \"69e757cd-3187-40f9-828a-cec497ae50dd\") " 
pod="openstack/glance-default-external-api-0" Sep 30 12:38:56 crc kubenswrapper[4672]: I0930 12:38:56.891574 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"69e757cd-3187-40f9-828a-cec497ae50dd\") " pod="openstack/glance-default-external-api-0" Sep 30 12:38:56 crc kubenswrapper[4672]: I0930 12:38:56.891607 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69e757cd-3187-40f9-828a-cec497ae50dd-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"69e757cd-3187-40f9-828a-cec497ae50dd\") " pod="openstack/glance-default-external-api-0" Sep 30 12:38:56 crc kubenswrapper[4672]: I0930 12:38:56.891689 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/69e757cd-3187-40f9-828a-cec497ae50dd-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"69e757cd-3187-40f9-828a-cec497ae50dd\") " pod="openstack/glance-default-external-api-0" Sep 30 12:38:56 crc kubenswrapper[4672]: I0930 12:38:56.891717 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69e757cd-3187-40f9-828a-cec497ae50dd-config-data\") pod \"glance-default-external-api-0\" (UID: \"69e757cd-3187-40f9-828a-cec497ae50dd\") " pod="openstack/glance-default-external-api-0" Sep 30 12:38:56 crc kubenswrapper[4672]: I0930 12:38:56.891751 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jppd5\" (UniqueName: \"kubernetes.io/projected/69e757cd-3187-40f9-828a-cec497ae50dd-kube-api-access-jppd5\") pod \"glance-default-external-api-0\" (UID: \"69e757cd-3187-40f9-828a-cec497ae50dd\") " pod="openstack/glance-default-external-api-0" Sep 30 12:38:56 crc kubenswrapper[4672]: I0930 12:38:56.912839 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6f749887b9-hl4rd"] Sep 30 12:38:56 crc kubenswrapper[4672]: I0930 12:38:56.997052 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/69e757cd-3187-40f9-828a-cec497ae50dd-scripts\") pod \"glance-default-external-api-0\" (UID: \"69e757cd-3187-40f9-828a-cec497ae50dd\") " pod="openstack/glance-default-external-api-0" Sep 30 12:38:56 crc kubenswrapper[4672]: I0930 12:38:56.997105 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"69e757cd-3187-40f9-828a-cec497ae50dd\") " pod="openstack/glance-default-external-api-0" Sep 30 12:38:56 crc kubenswrapper[4672]: I0930 12:38:56.997145 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69e757cd-3187-40f9-828a-cec497ae50dd-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"69e757cd-3187-40f9-828a-cec497ae50dd\") " pod="openstack/glance-default-external-api-0" Sep 30 12:38:56 crc kubenswrapper[4672]: I0930 12:38:56.997177 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/69e757cd-3187-40f9-828a-cec497ae50dd-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"69e757cd-3187-40f9-828a-cec497ae50dd\") " pod="openstack/glance-default-external-api-0" Sep 30 12:38:56 crc kubenswrapper[4672]: I0930 12:38:56.997205 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69e757cd-3187-40f9-828a-cec497ae50dd-config-data\") pod \"glance-default-external-api-0\" (UID: \"69e757cd-3187-40f9-828a-cec497ae50dd\") " pod="openstack/glance-default-external-api-0" Sep 30 12:38:56 crc kubenswrapper[4672]: I0930 12:38:56.997233 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jppd5\" (UniqueName: \"kubernetes.io/projected/69e757cd-3187-40f9-828a-cec497ae50dd-kube-api-access-jppd5\") pod \"glance-default-external-api-0\" (UID: \"69e757cd-3187-40f9-828a-cec497ae50dd\") " pod="openstack/glance-default-external-api-0" Sep 30 12:38:56 crc kubenswrapper[4672]: I0930 12:38:56.997311 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/69e757cd-3187-40f9-828a-cec497ae50dd-logs\") pod \"glance-default-external-api-0\" (UID: \"69e757cd-3187-40f9-828a-cec497ae50dd\") " pod="openstack/glance-default-external-api-0" Sep 30 12:38:56 crc kubenswrapper[4672]: I0930 12:38:56.997843 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/69e757cd-3187-40f9-828a-cec497ae50dd-logs\") pod \"glance-default-external-api-0\" (UID: \"69e757cd-3187-40f9-828a-cec497ae50dd\") " pod="openstack/glance-default-external-api-0" Sep 30 12:38:56 crc kubenswrapper[4672]: I0930 12:38:56.998972 4672 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"69e757cd-3187-40f9-828a-cec497ae50dd\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/glance-default-external-api-0" Sep 30 12:38:56 crc kubenswrapper[4672]: I0930 12:38:56.999060 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/69e757cd-3187-40f9-828a-cec497ae50dd-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"69e757cd-3187-40f9-828a-cec497ae50dd\") " pod="openstack/glance-default-external-api-0" Sep 30 12:38:57 crc kubenswrapper[4672]: I0930 12:38:57.012295 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69e757cd-3187-40f9-828a-cec497ae50dd-config-data\") pod \"glance-default-external-api-0\" (UID: \"69e757cd-3187-40f9-828a-cec497ae50dd\") " pod="openstack/glance-default-external-api-0" Sep 30 12:38:57 crc kubenswrapper[4672]: I0930 12:38:57.031060 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/69e757cd-3187-40f9-828a-cec497ae50dd-scripts\") pod \"glance-default-external-api-0\" (UID: \"69e757cd-3187-40f9-828a-cec497ae50dd\") " pod="openstack/glance-default-external-api-0" Sep 30 12:38:57 crc kubenswrapper[4672]: I0930 12:38:57.041073 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jppd5\" (UniqueName: \"kubernetes.io/projected/69e757cd-3187-40f9-828a-cec497ae50dd-kube-api-access-jppd5\") pod \"glance-default-external-api-0\" 
(UID: \"69e757cd-3187-40f9-828a-cec497ae50dd\") " pod="openstack/glance-default-external-api-0" Sep 30 12:38:57 crc kubenswrapper[4672]: I0930 12:38:57.065059 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69e757cd-3187-40f9-828a-cec497ae50dd-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"69e757cd-3187-40f9-828a-cec497ae50dd\") " pod="openstack/glance-default-external-api-0" Sep 30 12:38:57 crc kubenswrapper[4672]: I0930 12:38:57.070452 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"69e757cd-3187-40f9-828a-cec497ae50dd\") " pod="openstack/glance-default-external-api-0" Sep 30 12:38:57 crc kubenswrapper[4672]: I0930 12:38:57.067879 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 30 12:38:57 crc kubenswrapper[4672]: I0930 12:38:57.089137 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Sep 30 12:38:57 crc kubenswrapper[4672]: I0930 12:38:57.100091 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Sep 30 12:38:57 crc kubenswrapper[4672]: I0930 12:38:57.174672 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 30 12:38:57 crc kubenswrapper[4672]: I0930 12:38:57.205257 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c515ed89-0f3e-4a37-b5e9-53602578d30a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"c515ed89-0f3e-4a37-b5e9-53602578d30a\") " pod="openstack/glance-default-internal-api-0" Sep 30 12:38:57 crc kubenswrapper[4672]: I0930 12:38:57.206576 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c515ed89-0f3e-4a37-b5e9-53602578d30a-logs\") pod \"glance-default-internal-api-0\" (UID: \"c515ed89-0f3e-4a37-b5e9-53602578d30a\") " pod="openstack/glance-default-internal-api-0" Sep 30 12:38:57 crc kubenswrapper[4672]: I0930 12:38:57.206778 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c515ed89-0f3e-4a37-b5e9-53602578d30a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"c515ed89-0f3e-4a37-b5e9-53602578d30a\") " pod="openstack/glance-default-internal-api-0" Sep 30 12:38:57 crc kubenswrapper[4672]: I0930 12:38:57.206860 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c515ed89-0f3e-4a37-b5e9-53602578d30a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"c515ed89-0f3e-4a37-b5e9-53602578d30a\") " pod="openstack/glance-default-internal-api-0" Sep 30 12:38:57 crc kubenswrapper[4672]: I0930 12:38:57.206930 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c515ed89-0f3e-4a37-b5e9-53602578d30a-config-data\") pod \"glance-default-internal-api-0\" (UID: \"c515ed89-0f3e-4a37-b5e9-53602578d30a\") " pod="openstack/glance-default-internal-api-0" Sep 30 12:38:57 crc 
kubenswrapper[4672]: I0930 12:38:57.207032 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"c515ed89-0f3e-4a37-b5e9-53602578d30a\") " pod="openstack/glance-default-internal-api-0" Sep 30 12:38:57 crc kubenswrapper[4672]: I0930 12:38:57.207123 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-96bqx\" (UniqueName: \"kubernetes.io/projected/c515ed89-0f3e-4a37-b5e9-53602578d30a-kube-api-access-96bqx\") pod \"glance-default-internal-api-0\" (UID: \"c515ed89-0f3e-4a37-b5e9-53602578d30a\") " pod="openstack/glance-default-internal-api-0" Sep 30 12:38:57 crc kubenswrapper[4672]: I0930 12:38:57.221638 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7fc454b69-wncsb"] Sep 30 12:38:57 crc kubenswrapper[4672]: I0930 12:38:57.222844 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Sep 30 12:38:57 crc kubenswrapper[4672]: I0930 12:38:57.308718 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c515ed89-0f3e-4a37-b5e9-53602578d30a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"c515ed89-0f3e-4a37-b5e9-53602578d30a\") " pod="openstack/glance-default-internal-api-0" Sep 30 12:38:57 crc kubenswrapper[4672]: I0930 12:38:57.308779 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c515ed89-0f3e-4a37-b5e9-53602578d30a-config-data\") pod \"glance-default-internal-api-0\" (UID: \"c515ed89-0f3e-4a37-b5e9-53602578d30a\") " pod="openstack/glance-default-internal-api-0" Sep 30 12:38:57 crc kubenswrapper[4672]: I0930 12:38:57.308816 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"c515ed89-0f3e-4a37-b5e9-53602578d30a\") " pod="openstack/glance-default-internal-api-0" Sep 30 12:38:57 crc kubenswrapper[4672]: I0930 12:38:57.308845 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-96bqx\" (UniqueName: \"kubernetes.io/projected/c515ed89-0f3e-4a37-b5e9-53602578d30a-kube-api-access-96bqx\") pod \"glance-default-internal-api-0\" (UID: \"c515ed89-0f3e-4a37-b5e9-53602578d30a\") " pod="openstack/glance-default-internal-api-0" Sep 30 12:38:57 crc kubenswrapper[4672]: I0930 12:38:57.308882 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c515ed89-0f3e-4a37-b5e9-53602578d30a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"c515ed89-0f3e-4a37-b5e9-53602578d30a\") " pod="openstack/glance-default-internal-api-0" Sep 30 12:38:57 crc kubenswrapper[4672]: I0930 12:38:57.308943 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c515ed89-0f3e-4a37-b5e9-53602578d30a-logs\") pod \"glance-default-internal-api-0\" (UID: \"c515ed89-0f3e-4a37-b5e9-53602578d30a\") " pod="openstack/glance-default-internal-api-0" Sep 30 12:38:57 crc kubenswrapper[4672]: I0930 12:38:57.308977 4672 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c515ed89-0f3e-4a37-b5e9-53602578d30a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"c515ed89-0f3e-4a37-b5e9-53602578d30a\") " pod="openstack/glance-default-internal-api-0" Sep 30 12:38:57 crc kubenswrapper[4672]: I0930 12:38:57.309443 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c515ed89-0f3e-4a37-b5e9-53602578d30a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"c515ed89-0f3e-4a37-b5e9-53602578d30a\") " pod="openstack/glance-default-internal-api-0" Sep 30 12:38:57 crc kubenswrapper[4672]: I0930 12:38:57.312870 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c515ed89-0f3e-4a37-b5e9-53602578d30a-logs\") pod \"glance-default-internal-api-0\" (UID: \"c515ed89-0f3e-4a37-b5e9-53602578d30a\") " pod="openstack/glance-default-internal-api-0" Sep 30 12:38:57 crc kubenswrapper[4672]: I0930 12:38:57.313132 4672 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"c515ed89-0f3e-4a37-b5e9-53602578d30a\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/glance-default-internal-api-0" Sep 30 12:38:57 crc kubenswrapper[4672]: I0930 12:38:57.319610 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c515ed89-0f3e-4a37-b5e9-53602578d30a-config-data\") pod \"glance-default-internal-api-0\" (UID: \"c515ed89-0f3e-4a37-b5e9-53602578d30a\") " pod="openstack/glance-default-internal-api-0" Sep 30 12:38:57 crc kubenswrapper[4672]: I0930 12:38:57.320722 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c515ed89-0f3e-4a37-b5e9-53602578d30a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"c515ed89-0f3e-4a37-b5e9-53602578d30a\") " pod="openstack/glance-default-internal-api-0" Sep 30 12:38:57 crc kubenswrapper[4672]: I0930 12:38:57.333390 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c515ed89-0f3e-4a37-b5e9-53602578d30a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"c515ed89-0f3e-4a37-b5e9-53602578d30a\") " pod="openstack/glance-default-internal-api-0" Sep 30 12:38:57 crc kubenswrapper[4672]: I0930 12:38:57.357153 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-96bqx\" (UniqueName: \"kubernetes.io/projected/c515ed89-0f3e-4a37-b5e9-53602578d30a-kube-api-access-96bqx\") pod \"glance-default-internal-api-0\" (UID: \"c515ed89-0f3e-4a37-b5e9-53602578d30a\") " pod="openstack/glance-default-internal-api-0" Sep 30 12:38:57 crc kubenswrapper[4672]: I0930 12:38:57.404989 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"c515ed89-0f3e-4a37-b5e9-53602578d30a\") " pod="openstack/glance-default-internal-api-0" Sep 30 12:38:57 crc kubenswrapper[4672]: I0930 12:38:57.497988 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Sep 30 12:38:57 crc kubenswrapper[4672]: I0930 12:38:57.658209 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Sep 30 12:38:57 crc kubenswrapper[4672]: I0930 12:38:57.658253 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-548589fd89-v67ld"] Sep 30 12:38:57 crc kubenswrapper[4672]: I0930 12:38:57.658278 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-567d7b57-5g9zw"] Sep 30 12:38:57 crc kubenswrapper[4672]: I0930 12:38:57.659921 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-567d7b57-5g9zw"] Sep 30 12:38:57 crc kubenswrapper[4672]: I0930 12:38:57.659939 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 30 12:38:57 crc kubenswrapper[4672]: I0930 12:38:57.660028 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-567d7b57-5g9zw" Sep 30 12:38:57 crc kubenswrapper[4672]: I0930 12:38:57.721022 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Sep 30 12:38:57 crc kubenswrapper[4672]: I0930 12:38:57.724401 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/96f92d5d-ba87-4de0-955d-998845ea9010-config-data\") pod \"horizon-567d7b57-5g9zw\" (UID: \"96f92d5d-ba87-4de0-955d-998845ea9010\") " pod="openstack/horizon-567d7b57-5g9zw" Sep 30 12:38:57 crc kubenswrapper[4672]: I0930 12:38:57.724486 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/96f92d5d-ba87-4de0-955d-998845ea9010-horizon-secret-key\") pod \"horizon-567d7b57-5g9zw\" (UID: \"96f92d5d-ba87-4de0-955d-998845ea9010\") " pod="openstack/horizon-567d7b57-5g9zw" Sep 30 12:38:57 crc kubenswrapper[4672]: I0930 12:38:57.724618 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/96f92d5d-ba87-4de0-955d-998845ea9010-scripts\") pod \"horizon-567d7b57-5g9zw\" (UID: \"96f92d5d-ba87-4de0-955d-998845ea9010\") " pod="openstack/horizon-567d7b57-5g9zw" Sep 30 12:38:57 crc kubenswrapper[4672]: I0930 12:38:57.724677 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/96f92d5d-ba87-4de0-955d-998845ea9010-logs\") pod \"horizon-567d7b57-5g9zw\" (UID: \"96f92d5d-ba87-4de0-955d-998845ea9010\") " pod="openstack/horizon-567d7b57-5g9zw" Sep 30 12:38:57 crc kubenswrapper[4672]: I0930 12:38:57.724717 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ptk7m\" (UniqueName: \"kubernetes.io/projected/96f92d5d-ba87-4de0-955d-998845ea9010-kube-api-access-ptk7m\") pod \"horizon-567d7b57-5g9zw\" (UID: \"96f92d5d-ba87-4de0-955d-998845ea9010\") " pod="openstack/horizon-567d7b57-5g9zw" Sep 30 12:38:57 crc kubenswrapper[4672]: I0930 12:38:57.825037 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a813f7c2-e727-42a0-8779-8619fe6e8165","Type":"ContainerStarted","Data":"3e73427b7aa04fefba3f1788211511545670e1e6c8065a95bfc261c48926476f"} Sep 30 12:38:57 crc kubenswrapper[4672]: I0930 12:38:57.828428 4672 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-ptk7m\" (UniqueName: \"kubernetes.io/projected/96f92d5d-ba87-4de0-955d-998845ea9010-kube-api-access-ptk7m\") pod \"horizon-567d7b57-5g9zw\" (UID: \"96f92d5d-ba87-4de0-955d-998845ea9010\") " pod="openstack/horizon-567d7b57-5g9zw" Sep 30 12:38:57 crc kubenswrapper[4672]: I0930 12:38:57.828493 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/96f92d5d-ba87-4de0-955d-998845ea9010-config-data\") pod \"horizon-567d7b57-5g9zw\" (UID: \"96f92d5d-ba87-4de0-955d-998845ea9010\") " pod="openstack/horizon-567d7b57-5g9zw" Sep 30 12:38:57 crc kubenswrapper[4672]: I0930 12:38:57.828563 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/96f92d5d-ba87-4de0-955d-998845ea9010-horizon-secret-key\") pod \"horizon-567d7b57-5g9zw\" (UID: \"96f92d5d-ba87-4de0-955d-998845ea9010\") " pod="openstack/horizon-567d7b57-5g9zw" Sep 30 12:38:57 crc kubenswrapper[4672]: I0930 12:38:57.828798 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/96f92d5d-ba87-4de0-955d-998845ea9010-scripts\") pod \"horizon-567d7b57-5g9zw\" (UID: \"96f92d5d-ba87-4de0-955d-998845ea9010\") " pod="openstack/horizon-567d7b57-5g9zw" Sep 30 12:38:57 crc kubenswrapper[4672]: I0930 12:38:57.828925 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/96f92d5d-ba87-4de0-955d-998845ea9010-logs\") pod \"horizon-567d7b57-5g9zw\" (UID: \"96f92d5d-ba87-4de0-955d-998845ea9010\") " pod="openstack/horizon-567d7b57-5g9zw" Sep 30 12:38:57 crc kubenswrapper[4672]: I0930 12:38:57.829606 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/96f92d5d-ba87-4de0-955d-998845ea9010-logs\") pod \"horizon-567d7b57-5g9zw\" (UID: \"96f92d5d-ba87-4de0-955d-998845ea9010\") " pod="openstack/horizon-567d7b57-5g9zw" Sep 30 12:38:57 crc kubenswrapper[4672]: I0930 12:38:57.830218 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/96f92d5d-ba87-4de0-955d-998845ea9010-scripts\") pod \"horizon-567d7b57-5g9zw\" (UID: \"96f92d5d-ba87-4de0-955d-998845ea9010\") " pod="openstack/horizon-567d7b57-5g9zw" Sep 30 12:38:57 crc kubenswrapper[4672]: I0930 12:38:57.831426 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-ss5sv" event={"ID":"a3ef8453-d983-4c2a-94b0-c25645f83c23","Type":"ContainerStarted","Data":"3a6ec9c5597a94e6f7c8b9a5c79734eaf39c98e33f98fbae6974f40305f31127"} Sep 30 12:38:57 crc kubenswrapper[4672]: I0930 12:38:57.834383 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/96f92d5d-ba87-4de0-955d-998845ea9010-config-data\") pod \"horizon-567d7b57-5g9zw\" (UID: \"96f92d5d-ba87-4de0-955d-998845ea9010\") " pod="openstack/horizon-567d7b57-5g9zw" Sep 30 12:38:57 crc kubenswrapper[4672]: I0930 12:38:57.835460 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/96f92d5d-ba87-4de0-955d-998845ea9010-horizon-secret-key\") pod \"horizon-567d7b57-5g9zw\" (UID: \"96f92d5d-ba87-4de0-955d-998845ea9010\") " pod="openstack/horizon-567d7b57-5g9zw" Sep 30 12:38:57 crc 
kubenswrapper[4672]: I0930 12:38:57.862473 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6f749887b9-hl4rd" event={"ID":"4dcfda15-b815-4733-b3c7-0312473f7355","Type":"ContainerStarted","Data":"c1064bedad9058b66ba7ec1d2dc92c472f49a11e5389258162c6744694c866f6"} Sep 30 12:38:57 crc kubenswrapper[4672]: I0930 12:38:57.863145 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ptk7m\" (UniqueName: \"kubernetes.io/projected/96f92d5d-ba87-4de0-955d-998845ea9010-kube-api-access-ptk7m\") pod \"horizon-567d7b57-5g9zw\" (UID: \"96f92d5d-ba87-4de0-955d-998845ea9010\") " pod="openstack/horizon-567d7b57-5g9zw" Sep 30 12:38:57 crc kubenswrapper[4672]: I0930 12:38:57.864812 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-548589fd89-v67ld" event={"ID":"8dd596d1-129e-4bb2-9e9b-dac1d09323d2","Type":"ContainerStarted","Data":"cbabc5d93198a8af257188261dd554e49bb4683b41e53b0d825d378f3a35bd21"} Sep 30 12:38:57 crc kubenswrapper[4672]: I0930 12:38:57.866607 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fc454b69-wncsb" event={"ID":"984ee7c7-9f78-45ea-876e-82c967e7c4fc","Type":"ContainerStarted","Data":"40c1f7df4db717bc0d14d8a9378d6da9d0c8039cca40c36dade8a689226d98a3"} Sep 30 12:38:57 crc kubenswrapper[4672]: I0930 12:38:57.868196 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-9kp7h" event={"ID":"c87bb11c-a04e-4018-ba41-d628795a926e","Type":"ContainerStarted","Data":"0bbc42f823b080f05094ace0f4292c8c11d29c562e8dffad8638e4296e2a9b1d"} Sep 30 12:38:57 crc kubenswrapper[4672]: I0930 12:38:57.869815 4672 generic.go:334] "Generic (PLEG): container finished" podID="d8f00365-71a5-4e33-bb1a-be3c3ff6ac1f" containerID="a14751f83a2d53081ef68658412d7dcb2e5b9006ea5f0fb2dce24b8ae388ba6e" exitCode=0 Sep 30 12:38:57 crc kubenswrapper[4672]: I0930 12:38:57.869893 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-599d8d65c-6sv7v" event={"ID":"d8f00365-71a5-4e33-bb1a-be3c3ff6ac1f","Type":"ContainerDied","Data":"a14751f83a2d53081ef68658412d7dcb2e5b9006ea5f0fb2dce24b8ae388ba6e"} Sep 30 12:38:57 crc kubenswrapper[4672]: I0930 12:38:57.869913 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-599d8d65c-6sv7v" event={"ID":"d8f00365-71a5-4e33-bb1a-be3c3ff6ac1f","Type":"ContainerStarted","Data":"984187a6d437ddde28810599e2b5818eddb2114d78ba8c05d528391084a954bf"} Sep 30 12:38:57 crc kubenswrapper[4672]: I0930 12:38:57.873230 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59d6fdf78c-8x78t" event={"ID":"26e78aa5-04e3-40fe-839b-b753ff7b6e77","Type":"ContainerDied","Data":"97e33f045d25c87838a129fc782f3d713e1d0519eacc21acd89efd57b4f99063"} Sep 30 12:38:57 crc kubenswrapper[4672]: I0930 12:38:57.873257 4672 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="97e33f045d25c87838a129fc782f3d713e1d0519eacc21acd89efd57b4f99063" Sep 30 12:38:57 crc kubenswrapper[4672]: I0930 12:38:57.964159 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-ss5sv" podStartSLOduration=3.964133287 podStartE2EDuration="3.964133287s" podCreationTimestamp="2025-09-30 12:38:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 12:38:57.886623468 +0000 UTC m=+1029.155861124" watchObservedRunningTime="2025-09-30 
12:38:57.964133287 +0000 UTC m=+1029.233370933" Sep 30 12:38:58 crc kubenswrapper[4672]: I0930 12:38:58.042201 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-59d6fdf78c-8x78t" Sep 30 12:38:58 crc kubenswrapper[4672]: I0930 12:38:58.060547 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-567d7b57-5g9zw" Sep 30 12:38:58 crc kubenswrapper[4672]: I0930 12:38:58.089765 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-47f4-account-create-cw9r4" Sep 30 12:38:58 crc kubenswrapper[4672]: I0930 12:38:58.135251 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/26e78aa5-04e3-40fe-839b-b753ff7b6e77-ovsdbserver-sb\") pod \"26e78aa5-04e3-40fe-839b-b753ff7b6e77\" (UID: \"26e78aa5-04e3-40fe-839b-b753ff7b6e77\") " Sep 30 12:38:58 crc kubenswrapper[4672]: I0930 12:38:58.135313 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/26e78aa5-04e3-40fe-839b-b753ff7b6e77-ovsdbserver-nb\") pod \"26e78aa5-04e3-40fe-839b-b753ff7b6e77\" (UID: \"26e78aa5-04e3-40fe-839b-b753ff7b6e77\") " Sep 30 12:38:58 crc kubenswrapper[4672]: I0930 12:38:58.135376 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/26e78aa5-04e3-40fe-839b-b753ff7b6e77-dns-swift-storage-0\") pod \"26e78aa5-04e3-40fe-839b-b753ff7b6e77\" (UID: \"26e78aa5-04e3-40fe-839b-b753ff7b6e77\") " Sep 30 12:38:58 crc kubenswrapper[4672]: I0930 12:38:58.135400 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/26e78aa5-04e3-40fe-839b-b753ff7b6e77-dns-svc\") pod \"26e78aa5-04e3-40fe-839b-b753ff7b6e77\" (UID: \"26e78aa5-04e3-40fe-839b-b753ff7b6e77\") " Sep 30 12:38:58 crc kubenswrapper[4672]: I0930 12:38:58.135485 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/26e78aa5-04e3-40fe-839b-b753ff7b6e77-config\") pod \"26e78aa5-04e3-40fe-839b-b753ff7b6e77\" (UID: \"26e78aa5-04e3-40fe-839b-b753ff7b6e77\") " Sep 30 12:38:58 crc kubenswrapper[4672]: I0930 12:38:58.135516 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kbjlz\" (UniqueName: \"kubernetes.io/projected/d372bb6e-78aa-4329-b0f7-f2685c342bd3-kube-api-access-kbjlz\") pod \"d372bb6e-78aa-4329-b0f7-f2685c342bd3\" (UID: \"d372bb6e-78aa-4329-b0f7-f2685c342bd3\") " Sep 30 12:38:58 crc kubenswrapper[4672]: I0930 12:38:58.135654 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5t229\" (UniqueName: \"kubernetes.io/projected/26e78aa5-04e3-40fe-839b-b753ff7b6e77-kube-api-access-5t229\") pod \"26e78aa5-04e3-40fe-839b-b753ff7b6e77\" (UID: \"26e78aa5-04e3-40fe-839b-b753ff7b6e77\") " Sep 30 12:38:58 crc kubenswrapper[4672]: I0930 12:38:58.147914 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d372bb6e-78aa-4329-b0f7-f2685c342bd3-kube-api-access-kbjlz" (OuterVolumeSpecName: "kube-api-access-kbjlz") pod "d372bb6e-78aa-4329-b0f7-f2685c342bd3" (UID: "d372bb6e-78aa-4329-b0f7-f2685c342bd3"). InnerVolumeSpecName "kube-api-access-kbjlz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 12:38:58 crc kubenswrapper[4672]: I0930 12:38:58.149291 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/26e78aa5-04e3-40fe-839b-b753ff7b6e77-kube-api-access-5t229" (OuterVolumeSpecName: "kube-api-access-5t229") pod "26e78aa5-04e3-40fe-839b-b753ff7b6e77" (UID: "26e78aa5-04e3-40fe-839b-b753ff7b6e77"). InnerVolumeSpecName "kube-api-access-5t229". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 12:38:58 crc kubenswrapper[4672]: I0930 12:38:58.188632 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/26e78aa5-04e3-40fe-839b-b753ff7b6e77-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "26e78aa5-04e3-40fe-839b-b753ff7b6e77" (UID: "26e78aa5-04e3-40fe-839b-b753ff7b6e77"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 12:38:58 crc kubenswrapper[4672]: I0930 12:38:58.192079 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/26e78aa5-04e3-40fe-839b-b753ff7b6e77-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "26e78aa5-04e3-40fe-839b-b753ff7b6e77" (UID: "26e78aa5-04e3-40fe-839b-b753ff7b6e77"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 12:38:58 crc kubenswrapper[4672]: I0930 12:38:58.198885 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/26e78aa5-04e3-40fe-839b-b753ff7b6e77-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "26e78aa5-04e3-40fe-839b-b753ff7b6e77" (UID: "26e78aa5-04e3-40fe-839b-b753ff7b6e77"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 12:38:58 crc kubenswrapper[4672]: I0930 12:38:58.199677 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/26e78aa5-04e3-40fe-839b-b753ff7b6e77-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "26e78aa5-04e3-40fe-839b-b753ff7b6e77" (UID: "26e78aa5-04e3-40fe-839b-b753ff7b6e77"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 12:38:58 crc kubenswrapper[4672]: I0930 12:38:58.205840 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/26e78aa5-04e3-40fe-839b-b753ff7b6e77-config" (OuterVolumeSpecName: "config") pod "26e78aa5-04e3-40fe-839b-b753ff7b6e77" (UID: "26e78aa5-04e3-40fe-839b-b753ff7b6e77"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 12:38:58 crc kubenswrapper[4672]: I0930 12:38:58.242748 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5t229\" (UniqueName: \"kubernetes.io/projected/26e78aa5-04e3-40fe-839b-b753ff7b6e77-kube-api-access-5t229\") on node \"crc\" DevicePath \"\"" Sep 30 12:38:58 crc kubenswrapper[4672]: I0930 12:38:58.242785 4672 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/26e78aa5-04e3-40fe-839b-b753ff7b6e77-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Sep 30 12:38:58 crc kubenswrapper[4672]: I0930 12:38:58.242796 4672 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/26e78aa5-04e3-40fe-839b-b753ff7b6e77-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Sep 30 12:38:58 crc kubenswrapper[4672]: I0930 12:38:58.242806 4672 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/26e78aa5-04e3-40fe-839b-b753ff7b6e77-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Sep 30 12:38:58 crc kubenswrapper[4672]: I0930 12:38:58.242815 4672 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/26e78aa5-04e3-40fe-839b-b753ff7b6e77-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 30 12:38:58 crc kubenswrapper[4672]: I0930 12:38:58.242826 4672 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/26e78aa5-04e3-40fe-839b-b753ff7b6e77-config\") on node \"crc\" DevicePath \"\"" Sep 30 12:38:58 crc kubenswrapper[4672]: I0930 12:38:58.242837 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kbjlz\" (UniqueName: \"kubernetes.io/projected/d372bb6e-78aa-4329-b0f7-f2685c342bd3-kube-api-access-kbjlz\") on node \"crc\" DevicePath \"\"" Sep 30 12:38:58 crc kubenswrapper[4672]: I0930 12:38:58.866946 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Sep 30 12:38:58 crc kubenswrapper[4672]: W0930 12:38:58.885146 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod69e757cd_3187_40f9_828a_cec497ae50dd.slice/crio-0b89405dab5502bb0806392b897994c53d588e27db7c354bc5d04573781464f1 WatchSource:0}: Error finding container 0b89405dab5502bb0806392b897994c53d588e27db7c354bc5d04573781464f1: Status 404 returned error can't find the container with id 0b89405dab5502bb0806392b897994c53d588e27db7c354bc5d04573781464f1 Sep 30 12:38:58 crc kubenswrapper[4672]: I0930 12:38:58.918991 4672 generic.go:334] "Generic (PLEG): container finished" podID="984ee7c7-9f78-45ea-876e-82c967e7c4fc" containerID="5ca1a950e67c2b92179249f2326b865b66d17b8817b63536e97df2ac1c3f9297" exitCode=0 Sep 30 12:38:58 crc kubenswrapper[4672]: I0930 12:38:58.919054 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fc454b69-wncsb" event={"ID":"984ee7c7-9f78-45ea-876e-82c967e7c4fc","Type":"ContainerDied","Data":"5ca1a950e67c2b92179249f2326b865b66d17b8817b63536e97df2ac1c3f9297"} Sep 30 12:38:58 crc kubenswrapper[4672]: I0930 12:38:58.922440 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-47f4-account-create-cw9r4" Sep 30 12:38:58 crc kubenswrapper[4672]: I0930 12:38:58.946351 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-47f4-account-create-cw9r4" event={"ID":"d372bb6e-78aa-4329-b0f7-f2685c342bd3","Type":"ContainerDied","Data":"9e4ecea1848257fd478ca457d9ee54e9052aaee5054e9268f80da68c9171fb51"} Sep 30 12:38:58 crc kubenswrapper[4672]: I0930 12:38:58.946396 4672 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9e4ecea1848257fd478ca457d9ee54e9052aaee5054e9268f80da68c9171fb51" Sep 30 12:38:58 crc kubenswrapper[4672]: I0930 12:38:58.946457 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-59d6fdf78c-8x78t" Sep 30 12:38:58 crc kubenswrapper[4672]: I0930 12:38:58.958273 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-567d7b57-5g9zw"] Sep 30 12:38:59 crc kubenswrapper[4672]: W0930 12:38:59.016509 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc515ed89_0f3e_4a37_b5e9_53602578d30a.slice/crio-c7bea0ac4dcd141881ba7b6c29e5ed3daf48b2aac302acee9c45eb7ca2e44c2b WatchSource:0}: Error finding container c7bea0ac4dcd141881ba7b6c29e5ed3daf48b2aac302acee9c45eb7ca2e44c2b: Status 404 returned error can't find the container with id c7bea0ac4dcd141881ba7b6c29e5ed3daf48b2aac302acee9c45eb7ca2e44c2b Sep 30 12:38:59 crc kubenswrapper[4672]: I0930 12:38:59.035295 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 30 12:38:59 crc kubenswrapper[4672]: I0930 12:38:59.143902 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-599d8d65c-6sv7v" Sep 30 12:38:59 crc kubenswrapper[4672]: I0930 12:38:59.160616 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-55f8-account-create-j8z8l" Sep 30 12:38:59 crc kubenswrapper[4672]: I0930 12:38:59.163805 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-fe85-account-create-vdvbq" Sep 30 12:38:59 crc kubenswrapper[4672]: I0930 12:38:59.209799 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-59d6fdf78c-8x78t"] Sep 30 12:38:59 crc kubenswrapper[4672]: I0930 12:38:59.224416 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-59d6fdf78c-8x78t"] Sep 30 12:38:59 crc kubenswrapper[4672]: I0930 12:38:59.265657 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d8f00365-71a5-4e33-bb1a-be3c3ff6ac1f-config\") pod \"d8f00365-71a5-4e33-bb1a-be3c3ff6ac1f\" (UID: \"d8f00365-71a5-4e33-bb1a-be3c3ff6ac1f\") " Sep 30 12:38:59 crc kubenswrapper[4672]: I0930 12:38:59.265719 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bqsk6\" (UniqueName: \"kubernetes.io/projected/e7b77dbe-0a6b-4bef-a55d-3de78b2c25f8-kube-api-access-bqsk6\") pod \"e7b77dbe-0a6b-4bef-a55d-3de78b2c25f8\" (UID: \"e7b77dbe-0a6b-4bef-a55d-3de78b2c25f8\") " Sep 30 12:38:59 crc kubenswrapper[4672]: I0930 12:38:59.265755 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d8f00365-71a5-4e33-bb1a-be3c3ff6ac1f-dns-svc\") pod \"d8f00365-71a5-4e33-bb1a-be3c3ff6ac1f\" (UID: \"d8f00365-71a5-4e33-bb1a-be3c3ff6ac1f\") " Sep 30 12:38:59 crc kubenswrapper[4672]: I0930 12:38:59.265789 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d8f00365-71a5-4e33-bb1a-be3c3ff6ac1f-dns-swift-storage-0\") pod \"d8f00365-71a5-4e33-bb1a-be3c3ff6ac1f\" (UID: \"d8f00365-71a5-4e33-bb1a-be3c3ff6ac1f\") " Sep 30 12:38:59 crc kubenswrapper[4672]: I0930 12:38:59.265840 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d8f00365-71a5-4e33-bb1a-be3c3ff6ac1f-ovsdbserver-sb\") pod \"d8f00365-71a5-4e33-bb1a-be3c3ff6ac1f\" (UID: \"d8f00365-71a5-4e33-bb1a-be3c3ff6ac1f\") " Sep 30 12:38:59 crc kubenswrapper[4672]: I0930 12:38:59.265864 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d8f00365-71a5-4e33-bb1a-be3c3ff6ac1f-ovsdbserver-nb\") pod \"d8f00365-71a5-4e33-bb1a-be3c3ff6ac1f\" (UID: \"d8f00365-71a5-4e33-bb1a-be3c3ff6ac1f\") " Sep 30 12:38:59 crc kubenswrapper[4672]: I0930 12:38:59.265907 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4r6km\" (UniqueName: \"kubernetes.io/projected/d8f00365-71a5-4e33-bb1a-be3c3ff6ac1f-kube-api-access-4r6km\") pod \"d8f00365-71a5-4e33-bb1a-be3c3ff6ac1f\" (UID: \"d8f00365-71a5-4e33-bb1a-be3c3ff6ac1f\") " Sep 30 12:38:59 crc kubenswrapper[4672]: I0930 12:38:59.265997 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-82f5h\" (UniqueName: \"kubernetes.io/projected/43cef9cd-4246-4707-93c8-2d1e0b49a99c-kube-api-access-82f5h\") pod \"43cef9cd-4246-4707-93c8-2d1e0b49a99c\" (UID: \"43cef9cd-4246-4707-93c8-2d1e0b49a99c\") " Sep 30 12:38:59 crc kubenswrapper[4672]: I0930 12:38:59.326526 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8f00365-71a5-4e33-bb1a-be3c3ff6ac1f-kube-api-access-4r6km" (OuterVolumeSpecName: "kube-api-access-4r6km") pod 
"d8f00365-71a5-4e33-bb1a-be3c3ff6ac1f" (UID: "d8f00365-71a5-4e33-bb1a-be3c3ff6ac1f"). InnerVolumeSpecName "kube-api-access-4r6km". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 12:38:59 crc kubenswrapper[4672]: I0930 12:38:59.326597 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43cef9cd-4246-4707-93c8-2d1e0b49a99c-kube-api-access-82f5h" (OuterVolumeSpecName: "kube-api-access-82f5h") pod "43cef9cd-4246-4707-93c8-2d1e0b49a99c" (UID: "43cef9cd-4246-4707-93c8-2d1e0b49a99c"). InnerVolumeSpecName "kube-api-access-82f5h". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 12:38:59 crc kubenswrapper[4672]: I0930 12:38:59.326605 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7b77dbe-0a6b-4bef-a55d-3de78b2c25f8-kube-api-access-bqsk6" (OuterVolumeSpecName: "kube-api-access-bqsk6") pod "e7b77dbe-0a6b-4bef-a55d-3de78b2c25f8" (UID: "e7b77dbe-0a6b-4bef-a55d-3de78b2c25f8"). InnerVolumeSpecName "kube-api-access-bqsk6". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 12:38:59 crc kubenswrapper[4672]: I0930 12:38:59.353952 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d8f00365-71a5-4e33-bb1a-be3c3ff6ac1f-config" (OuterVolumeSpecName: "config") pod "d8f00365-71a5-4e33-bb1a-be3c3ff6ac1f" (UID: "d8f00365-71a5-4e33-bb1a-be3c3ff6ac1f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 12:38:59 crc kubenswrapper[4672]: I0930 12:38:59.367753 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d8f00365-71a5-4e33-bb1a-be3c3ff6ac1f-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "d8f00365-71a5-4e33-bb1a-be3c3ff6ac1f" (UID: "d8f00365-71a5-4e33-bb1a-be3c3ff6ac1f"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 12:38:59 crc kubenswrapper[4672]: I0930 12:38:59.370691 4672 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d8f00365-71a5-4e33-bb1a-be3c3ff6ac1f-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Sep 30 12:38:59 crc kubenswrapper[4672]: I0930 12:38:59.370860 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4r6km\" (UniqueName: \"kubernetes.io/projected/d8f00365-71a5-4e33-bb1a-be3c3ff6ac1f-kube-api-access-4r6km\") on node \"crc\" DevicePath \"\"" Sep 30 12:38:59 crc kubenswrapper[4672]: I0930 12:38:59.370924 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-82f5h\" (UniqueName: \"kubernetes.io/projected/43cef9cd-4246-4707-93c8-2d1e0b49a99c-kube-api-access-82f5h\") on node \"crc\" DevicePath \"\"" Sep 30 12:38:59 crc kubenswrapper[4672]: I0930 12:38:59.370994 4672 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d8f00365-71a5-4e33-bb1a-be3c3ff6ac1f-config\") on node \"crc\" DevicePath \"\"" Sep 30 12:38:59 crc kubenswrapper[4672]: I0930 12:38:59.371049 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bqsk6\" (UniqueName: \"kubernetes.io/projected/e7b77dbe-0a6b-4bef-a55d-3de78b2c25f8-kube-api-access-bqsk6\") on node \"crc\" DevicePath \"\"" Sep 30 12:38:59 crc kubenswrapper[4672]: I0930 12:38:59.392712 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d8f00365-71a5-4e33-bb1a-be3c3ff6ac1f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d8f00365-71a5-4e33-bb1a-be3c3ff6ac1f" (UID: "d8f00365-71a5-4e33-bb1a-be3c3ff6ac1f"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 12:38:59 crc kubenswrapper[4672]: I0930 12:38:59.402377 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d8f00365-71a5-4e33-bb1a-be3c3ff6ac1f-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "d8f00365-71a5-4e33-bb1a-be3c3ff6ac1f" (UID: "d8f00365-71a5-4e33-bb1a-be3c3ff6ac1f"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 12:38:59 crc kubenswrapper[4672]: I0930 12:38:59.402911 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d8f00365-71a5-4e33-bb1a-be3c3ff6ac1f-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "d8f00365-71a5-4e33-bb1a-be3c3ff6ac1f" (UID: "d8f00365-71a5-4e33-bb1a-be3c3ff6ac1f"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 12:38:59 crc kubenswrapper[4672]: I0930 12:38:59.443888 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="26e78aa5-04e3-40fe-839b-b753ff7b6e77" path="/var/lib/kubelet/pods/26e78aa5-04e3-40fe-839b-b753ff7b6e77/volumes" Sep 30 12:38:59 crc kubenswrapper[4672]: I0930 12:38:59.473953 4672 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d8f00365-71a5-4e33-bb1a-be3c3ff6ac1f-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 30 12:38:59 crc kubenswrapper[4672]: I0930 12:38:59.477176 4672 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d8f00365-71a5-4e33-bb1a-be3c3ff6ac1f-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Sep 30 12:38:59 crc kubenswrapper[4672]: I0930 12:38:59.477792 4672 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d8f00365-71a5-4e33-bb1a-be3c3ff6ac1f-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Sep 30 12:38:59 crc kubenswrapper[4672]: I0930 12:38:59.965485 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"69e757cd-3187-40f9-828a-cec497ae50dd","Type":"ContainerStarted","Data":"0b89405dab5502bb0806392b897994c53d588e27db7c354bc5d04573781464f1"} Sep 30 12:38:59 crc kubenswrapper[4672]: I0930 12:38:59.968051 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-567d7b57-5g9zw" event={"ID":"96f92d5d-ba87-4de0-955d-998845ea9010","Type":"ContainerStarted","Data":"80e78aba304d365dd4159b7a468e860978b870437cdfa0fdc8b8f1b55919c926"} Sep 30 12:38:59 crc kubenswrapper[4672]: I0930 12:38:59.972510 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fc454b69-wncsb" event={"ID":"984ee7c7-9f78-45ea-876e-82c967e7c4fc","Type":"ContainerStarted","Data":"8693125bc289223634c777a33aeafbcd9eb9a16affbcccfb8d0b7915623cce33"} Sep 30 12:38:59 crc kubenswrapper[4672]: I0930 12:38:59.972619 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7fc454b69-wncsb" Sep 30 12:38:59 crc kubenswrapper[4672]: I0930 12:38:59.975356 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-55f8-account-create-j8z8l" event={"ID":"43cef9cd-4246-4707-93c8-2d1e0b49a99c","Type":"ContainerDied","Data":"09540849a103063070de27f7dec98a34c14e4311a187872d4ff85c9db4c0427e"} Sep 30 12:38:59 crc kubenswrapper[4672]: I0930 12:38:59.975399 4672 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="09540849a103063070de27f7dec98a34c14e4311a187872d4ff85c9db4c0427e" Sep 30 12:38:59 crc kubenswrapper[4672]: I0930 12:38:59.975454 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-55f8-account-create-j8z8l" Sep 30 12:38:59 crc kubenswrapper[4672]: I0930 12:38:59.983254 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-599d8d65c-6sv7v" Sep 30 12:38:59 crc kubenswrapper[4672]: I0930 12:38:59.983258 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-599d8d65c-6sv7v" event={"ID":"d8f00365-71a5-4e33-bb1a-be3c3ff6ac1f","Type":"ContainerDied","Data":"984187a6d437ddde28810599e2b5818eddb2114d78ba8c05d528391084a954bf"} Sep 30 12:38:59 crc kubenswrapper[4672]: I0930 12:38:59.983410 4672 scope.go:117] "RemoveContainer" containerID="a14751f83a2d53081ef68658412d7dcb2e5b9006ea5f0fb2dce24b8ae388ba6e" Sep 30 12:39:00 crc kubenswrapper[4672]: I0930 12:39:00.005009 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c515ed89-0f3e-4a37-b5e9-53602578d30a","Type":"ContainerStarted","Data":"c7bea0ac4dcd141881ba7b6c29e5ed3daf48b2aac302acee9c45eb7ca2e44c2b"} Sep 30 12:39:00 crc kubenswrapper[4672]: I0930 12:39:00.009033 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-fe85-account-create-vdvbq" event={"ID":"e7b77dbe-0a6b-4bef-a55d-3de78b2c25f8","Type":"ContainerDied","Data":"386667b88e4d0eb7a828c77b62754b3a3b6427f2c88c8a53e47af261ada0199a"} Sep 30 12:39:00 crc kubenswrapper[4672]: I0930 12:39:00.009072 4672 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="386667b88e4d0eb7a828c77b62754b3a3b6427f2c88c8a53e47af261ada0199a" Sep 30 12:39:00 crc kubenswrapper[4672]: I0930 12:39:00.009196 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-fe85-account-create-vdvbq" Sep 30 12:39:00 crc kubenswrapper[4672]: I0930 12:39:00.013049 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7fc454b69-wncsb" podStartSLOduration=5.013029166 podStartE2EDuration="5.013029166s" podCreationTimestamp="2025-09-30 12:38:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 12:39:00.000176978 +0000 UTC m=+1031.269414644" watchObservedRunningTime="2025-09-30 12:39:00.013029166 +0000 UTC m=+1031.282266812" Sep 30 12:39:00 crc kubenswrapper[4672]: I0930 12:39:00.076088 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-599d8d65c-6sv7v"] Sep 30 12:39:00 crc kubenswrapper[4672]: I0930 12:39:00.086128 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-599d8d65c-6sv7v"] Sep 30 12:39:01 crc kubenswrapper[4672]: I0930 12:39:01.039622 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c515ed89-0f3e-4a37-b5e9-53602578d30a","Type":"ContainerStarted","Data":"50186f10fd640297f77522112529fd398a4bee106ed4a1f07bb8fbe619850b5c"} Sep 30 12:39:01 crc kubenswrapper[4672]: I0930 12:39:01.045778 4672 generic.go:334] "Generic (PLEG): container finished" podID="728a5c57-a6b8-4201-bbff-34f2eee54b1a" containerID="0371a9eaaabf25ee5b460fadf31d64d04b11ff141406d3f2d9776b271f6432d1" exitCode=0 Sep 30 12:39:01 crc kubenswrapper[4672]: I0930 12:39:01.045854 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-sync-2w94k" event={"ID":"728a5c57-a6b8-4201-bbff-34f2eee54b1a","Type":"ContainerDied","Data":"0371a9eaaabf25ee5b460fadf31d64d04b11ff141406d3f2d9776b271f6432d1"} Sep 30 12:39:01 crc kubenswrapper[4672]: I0930 12:39:01.051059 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"69e757cd-3187-40f9-828a-cec497ae50dd","Type":"ContainerStarted","Data":"ffa109781cc7820232abe1a36d8cace242c9308a6cb74b5bcd957de3302677bd"} Sep 30 12:39:01 crc kubenswrapper[4672]: I0930 12:39:01.444243 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d8f00365-71a5-4e33-bb1a-be3c3ff6ac1f" path="/var/lib/kubelet/pods/d8f00365-71a5-4e33-bb1a-be3c3ff6ac1f/volumes" Sep 30 12:39:02 crc kubenswrapper[4672]: I0930 12:39:02.074094 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"69e757cd-3187-40f9-828a-cec497ae50dd","Type":"ContainerStarted","Data":"7f1ae11fb0ca93ce8fe36b1259620ff1daf25dc16d12766832eb4aed17a8bd37"} Sep 30 12:39:02 crc kubenswrapper[4672]: I0930 12:39:02.074338 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="69e757cd-3187-40f9-828a-cec497ae50dd" containerName="glance-log" containerID="cri-o://ffa109781cc7820232abe1a36d8cace242c9308a6cb74b5bcd957de3302677bd" gracePeriod=30 Sep 30 12:39:02 crc kubenswrapper[4672]: I0930 12:39:02.075053 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="69e757cd-3187-40f9-828a-cec497ae50dd" containerName="glance-httpd" containerID="cri-o://7f1ae11fb0ca93ce8fe36b1259620ff1daf25dc16d12766832eb4aed17a8bd37" gracePeriod=30 Sep 30 12:39:02 crc kubenswrapper[4672]: I0930 12:39:02.082416 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="c515ed89-0f3e-4a37-b5e9-53602578d30a" containerName="glance-log" containerID="cri-o://50186f10fd640297f77522112529fd398a4bee106ed4a1f07bb8fbe619850b5c" gracePeriod=30 Sep 30 12:39:02 crc kubenswrapper[4672]: I0930 12:39:02.082647 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c515ed89-0f3e-4a37-b5e9-53602578d30a","Type":"ContainerStarted","Data":"727f47daecc45501a4a6d01cae978e9dbf2ec76d6ccf8d08f200c2bc625c6c12"} Sep 30 12:39:02 crc kubenswrapper[4672]: I0930 12:39:02.082748 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="c515ed89-0f3e-4a37-b5e9-53602578d30a" containerName="glance-httpd" containerID="cri-o://727f47daecc45501a4a6d01cae978e9dbf2ec76d6ccf8d08f200c2bc625c6c12" gracePeriod=30 Sep 30 12:39:02 crc kubenswrapper[4672]: I0930 12:39:02.098218 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=7.098201612 podStartE2EDuration="7.098201612s" podCreationTimestamp="2025-09-30 12:38:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 12:39:02.091889771 +0000 UTC m=+1033.361127437" watchObservedRunningTime="2025-09-30 12:39:02.098201612 +0000 UTC m=+1033.367439258" Sep 30 12:39:02 crc kubenswrapper[4672]: I0930 12:39:02.121086 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=7.121072256 podStartE2EDuration="7.121072256s" podCreationTimestamp="2025-09-30 12:38:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 12:39:02.114806856 +0000 UTC m=+1033.384044502" 
watchObservedRunningTime="2025-09-30 12:39:02.121072256 +0000 UTC m=+1033.390309902" Sep 30 12:39:03 crc kubenswrapper[4672]: I0930 12:39:03.095814 4672 generic.go:334] "Generic (PLEG): container finished" podID="c515ed89-0f3e-4a37-b5e9-53602578d30a" containerID="727f47daecc45501a4a6d01cae978e9dbf2ec76d6ccf8d08f200c2bc625c6c12" exitCode=0 Sep 30 12:39:03 crc kubenswrapper[4672]: I0930 12:39:03.096106 4672 generic.go:334] "Generic (PLEG): container finished" podID="c515ed89-0f3e-4a37-b5e9-53602578d30a" containerID="50186f10fd640297f77522112529fd398a4bee106ed4a1f07bb8fbe619850b5c" exitCode=143 Sep 30 12:39:03 crc kubenswrapper[4672]: I0930 12:39:03.095904 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c515ed89-0f3e-4a37-b5e9-53602578d30a","Type":"ContainerDied","Data":"727f47daecc45501a4a6d01cae978e9dbf2ec76d6ccf8d08f200c2bc625c6c12"} Sep 30 12:39:03 crc kubenswrapper[4672]: I0930 12:39:03.096228 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c515ed89-0f3e-4a37-b5e9-53602578d30a","Type":"ContainerDied","Data":"50186f10fd640297f77522112529fd398a4bee106ed4a1f07bb8fbe619850b5c"} Sep 30 12:39:03 crc kubenswrapper[4672]: I0930 12:39:03.098313 4672 generic.go:334] "Generic (PLEG): container finished" podID="a3ef8453-d983-4c2a-94b0-c25645f83c23" containerID="3a6ec9c5597a94e6f7c8b9a5c79734eaf39c98e33f98fbae6974f40305f31127" exitCode=0 Sep 30 12:39:03 crc kubenswrapper[4672]: I0930 12:39:03.098370 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-ss5sv" event={"ID":"a3ef8453-d983-4c2a-94b0-c25645f83c23","Type":"ContainerDied","Data":"3a6ec9c5597a94e6f7c8b9a5c79734eaf39c98e33f98fbae6974f40305f31127"} Sep 30 12:39:03 crc kubenswrapper[4672]: I0930 12:39:03.100665 4672 generic.go:334] "Generic (PLEG): container finished" podID="69e757cd-3187-40f9-828a-cec497ae50dd" containerID="7f1ae11fb0ca93ce8fe36b1259620ff1daf25dc16d12766832eb4aed17a8bd37" exitCode=0 Sep 30 12:39:03 crc kubenswrapper[4672]: I0930 12:39:03.100697 4672 generic.go:334] "Generic (PLEG): container finished" podID="69e757cd-3187-40f9-828a-cec497ae50dd" containerID="ffa109781cc7820232abe1a36d8cace242c9308a6cb74b5bcd957de3302677bd" exitCode=143 Sep 30 12:39:03 crc kubenswrapper[4672]: I0930 12:39:03.100728 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"69e757cd-3187-40f9-828a-cec497ae50dd","Type":"ContainerDied","Data":"7f1ae11fb0ca93ce8fe36b1259620ff1daf25dc16d12766832eb4aed17a8bd37"} Sep 30 12:39:03 crc kubenswrapper[4672]: I0930 12:39:03.100758 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"69e757cd-3187-40f9-828a-cec497ae50dd","Type":"ContainerDied","Data":"ffa109781cc7820232abe1a36d8cace242c9308a6cb74b5bcd957de3302677bd"} Sep 30 12:39:03 crc kubenswrapper[4672]: I0930 12:39:03.624309 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-db-sync-2w94k" Sep 30 12:39:03 crc kubenswrapper[4672]: I0930 12:39:03.703514 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c7sfb\" (UniqueName: \"kubernetes.io/projected/728a5c57-a6b8-4201-bbff-34f2eee54b1a-kube-api-access-c7sfb\") pod \"728a5c57-a6b8-4201-bbff-34f2eee54b1a\" (UID: \"728a5c57-a6b8-4201-bbff-34f2eee54b1a\") " Sep 30 12:39:03 crc kubenswrapper[4672]: I0930 12:39:03.703587 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/728a5c57-a6b8-4201-bbff-34f2eee54b1a-config-data\") pod \"728a5c57-a6b8-4201-bbff-34f2eee54b1a\" (UID: \"728a5c57-a6b8-4201-bbff-34f2eee54b1a\") " Sep 30 12:39:03 crc kubenswrapper[4672]: I0930 12:39:03.703659 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/728a5c57-a6b8-4201-bbff-34f2eee54b1a-db-sync-config-data\") pod \"728a5c57-a6b8-4201-bbff-34f2eee54b1a\" (UID: \"728a5c57-a6b8-4201-bbff-34f2eee54b1a\") " Sep 30 12:39:03 crc kubenswrapper[4672]: I0930 12:39:03.703701 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/728a5c57-a6b8-4201-bbff-34f2eee54b1a-combined-ca-bundle\") pod \"728a5c57-a6b8-4201-bbff-34f2eee54b1a\" (UID: \"728a5c57-a6b8-4201-bbff-34f2eee54b1a\") " Sep 30 12:39:03 crc kubenswrapper[4672]: I0930 12:39:03.710113 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/728a5c57-a6b8-4201-bbff-34f2eee54b1a-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "728a5c57-a6b8-4201-bbff-34f2eee54b1a" (UID: "728a5c57-a6b8-4201-bbff-34f2eee54b1a"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:39:03 crc kubenswrapper[4672]: I0930 12:39:03.714707 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/728a5c57-a6b8-4201-bbff-34f2eee54b1a-kube-api-access-c7sfb" (OuterVolumeSpecName: "kube-api-access-c7sfb") pod "728a5c57-a6b8-4201-bbff-34f2eee54b1a" (UID: "728a5c57-a6b8-4201-bbff-34f2eee54b1a"). InnerVolumeSpecName "kube-api-access-c7sfb". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 12:39:03 crc kubenswrapper[4672]: I0930 12:39:03.748628 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/728a5c57-a6b8-4201-bbff-34f2eee54b1a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "728a5c57-a6b8-4201-bbff-34f2eee54b1a" (UID: "728a5c57-a6b8-4201-bbff-34f2eee54b1a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:39:03 crc kubenswrapper[4672]: I0930 12:39:03.764972 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/728a5c57-a6b8-4201-bbff-34f2eee54b1a-config-data" (OuterVolumeSpecName: "config-data") pod "728a5c57-a6b8-4201-bbff-34f2eee54b1a" (UID: "728a5c57-a6b8-4201-bbff-34f2eee54b1a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:39:03 crc kubenswrapper[4672]: I0930 12:39:03.805509 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c7sfb\" (UniqueName: \"kubernetes.io/projected/728a5c57-a6b8-4201-bbff-34f2eee54b1a-kube-api-access-c7sfb\") on node \"crc\" DevicePath \"\"" Sep 30 12:39:03 crc kubenswrapper[4672]: I0930 12:39:03.805555 4672 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/728a5c57-a6b8-4201-bbff-34f2eee54b1a-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 12:39:03 crc kubenswrapper[4672]: I0930 12:39:03.805568 4672 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/728a5c57-a6b8-4201-bbff-34f2eee54b1a-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 12:39:03 crc kubenswrapper[4672]: I0930 12:39:03.805576 4672 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/728a5c57-a6b8-4201-bbff-34f2eee54b1a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 12:39:04 crc kubenswrapper[4672]: I0930 12:39:04.034657 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-llp7f"] Sep 30 12:39:04 crc kubenswrapper[4672]: E0930 12:39:04.039794 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26e78aa5-04e3-40fe-839b-b753ff7b6e77" containerName="init" Sep 30 12:39:04 crc kubenswrapper[4672]: I0930 12:39:04.039832 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="26e78aa5-04e3-40fe-839b-b753ff7b6e77" containerName="init" Sep 30 12:39:04 crc kubenswrapper[4672]: E0930 12:39:04.039855 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43cef9cd-4246-4707-93c8-2d1e0b49a99c" containerName="mariadb-account-create" Sep 30 12:39:04 crc kubenswrapper[4672]: I0930 12:39:04.039864 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="43cef9cd-4246-4707-93c8-2d1e0b49a99c" containerName="mariadb-account-create" Sep 30 12:39:04 crc kubenswrapper[4672]: E0930 12:39:04.039890 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8f00365-71a5-4e33-bb1a-be3c3ff6ac1f" containerName="init" Sep 30 12:39:04 crc kubenswrapper[4672]: I0930 12:39:04.039898 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8f00365-71a5-4e33-bb1a-be3c3ff6ac1f" containerName="init" Sep 30 12:39:04 crc kubenswrapper[4672]: E0930 12:39:04.039912 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="728a5c57-a6b8-4201-bbff-34f2eee54b1a" containerName="watcher-db-sync" Sep 30 12:39:04 crc kubenswrapper[4672]: I0930 12:39:04.039918 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="728a5c57-a6b8-4201-bbff-34f2eee54b1a" containerName="watcher-db-sync" Sep 30 12:39:04 crc kubenswrapper[4672]: E0930 12:39:04.039926 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d372bb6e-78aa-4329-b0f7-f2685c342bd3" containerName="mariadb-account-create" Sep 30 12:39:04 crc kubenswrapper[4672]: I0930 12:39:04.039932 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="d372bb6e-78aa-4329-b0f7-f2685c342bd3" containerName="mariadb-account-create" Sep 30 12:39:04 crc kubenswrapper[4672]: E0930 12:39:04.039950 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7b77dbe-0a6b-4bef-a55d-3de78b2c25f8" containerName="mariadb-account-create" Sep 30 12:39:04 crc kubenswrapper[4672]: I0930 12:39:04.039956 4672 
state_mem.go:107] "Deleted CPUSet assignment" podUID="e7b77dbe-0a6b-4bef-a55d-3de78b2c25f8" containerName="mariadb-account-create" Sep 30 12:39:04 crc kubenswrapper[4672]: I0930 12:39:04.040153 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7b77dbe-0a6b-4bef-a55d-3de78b2c25f8" containerName="mariadb-account-create" Sep 30 12:39:04 crc kubenswrapper[4672]: I0930 12:39:04.040167 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="728a5c57-a6b8-4201-bbff-34f2eee54b1a" containerName="watcher-db-sync" Sep 30 12:39:04 crc kubenswrapper[4672]: I0930 12:39:04.040180 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8f00365-71a5-4e33-bb1a-be3c3ff6ac1f" containerName="init" Sep 30 12:39:04 crc kubenswrapper[4672]: I0930 12:39:04.040191 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="26e78aa5-04e3-40fe-839b-b753ff7b6e77" containerName="init" Sep 30 12:39:04 crc kubenswrapper[4672]: I0930 12:39:04.040201 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="43cef9cd-4246-4707-93c8-2d1e0b49a99c" containerName="mariadb-account-create" Sep 30 12:39:04 crc kubenswrapper[4672]: I0930 12:39:04.040211 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="d372bb6e-78aa-4329-b0f7-f2685c342bd3" containerName="mariadb-account-create" Sep 30 12:39:04 crc kubenswrapper[4672]: I0930 12:39:04.044009 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-llp7f" Sep 30 12:39:04 crc kubenswrapper[4672]: I0930 12:39:04.044903 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-llp7f"] Sep 30 12:39:04 crc kubenswrapper[4672]: I0930 12:39:04.046407 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-m4shp" Sep 30 12:39:04 crc kubenswrapper[4672]: I0930 12:39:04.046559 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Sep 30 12:39:04 crc kubenswrapper[4672]: I0930 12:39:04.046706 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Sep 30 12:39:04 crc kubenswrapper[4672]: I0930 12:39:04.131879 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-db-sync-2w94k" Sep 30 12:39:04 crc kubenswrapper[4672]: I0930 12:39:04.131931 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-sync-2w94k" event={"ID":"728a5c57-a6b8-4201-bbff-34f2eee54b1a","Type":"ContainerDied","Data":"6a34361c29c7ed1a9de47c597e1477654b2bec034ecd553ff7c11e76a9657185"} Sep 30 12:39:04 crc kubenswrapper[4672]: I0930 12:39:04.131960 4672 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6a34361c29c7ed1a9de47c597e1477654b2bec034ecd553ff7c11e76a9657185" Sep 30 12:39:04 crc kubenswrapper[4672]: I0930 12:39:04.245743 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b56cfc5-1c9c-41f5-91ac-b0fbd79c7b6a-combined-ca-bundle\") pod \"cinder-db-sync-llp7f\" (UID: \"9b56cfc5-1c9c-41f5-91ac-b0fbd79c7b6a\") " pod="openstack/cinder-db-sync-llp7f" Sep 30 12:39:04 crc kubenswrapper[4672]: I0930 12:39:04.254069 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9b56cfc5-1c9c-41f5-91ac-b0fbd79c7b6a-etc-machine-id\") pod \"cinder-db-sync-llp7f\" (UID: \"9b56cfc5-1c9c-41f5-91ac-b0fbd79c7b6a\") " pod="openstack/cinder-db-sync-llp7f" Sep 30 12:39:04 crc kubenswrapper[4672]: I0930 12:39:04.254178 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/9b56cfc5-1c9c-41f5-91ac-b0fbd79c7b6a-db-sync-config-data\") pod \"cinder-db-sync-llp7f\" (UID: \"9b56cfc5-1c9c-41f5-91ac-b0fbd79c7b6a\") " pod="openstack/cinder-db-sync-llp7f" Sep 30 12:39:04 crc kubenswrapper[4672]: I0930 12:39:04.254334 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tzgds\" (UniqueName: \"kubernetes.io/projected/9b56cfc5-1c9c-41f5-91ac-b0fbd79c7b6a-kube-api-access-tzgds\") pod \"cinder-db-sync-llp7f\" (UID: \"9b56cfc5-1c9c-41f5-91ac-b0fbd79c7b6a\") " pod="openstack/cinder-db-sync-llp7f" Sep 30 12:39:04 crc kubenswrapper[4672]: I0930 12:39:04.254403 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9b56cfc5-1c9c-41f5-91ac-b0fbd79c7b6a-scripts\") pod \"cinder-db-sync-llp7f\" (UID: \"9b56cfc5-1c9c-41f5-91ac-b0fbd79c7b6a\") " pod="openstack/cinder-db-sync-llp7f" Sep 30 12:39:04 crc kubenswrapper[4672]: I0930 12:39:04.254437 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b56cfc5-1c9c-41f5-91ac-b0fbd79c7b6a-config-data\") pod \"cinder-db-sync-llp7f\" (UID: \"9b56cfc5-1c9c-41f5-91ac-b0fbd79c7b6a\") " pod="openstack/cinder-db-sync-llp7f" Sep 30 12:39:04 crc kubenswrapper[4672]: I0930 12:39:04.301241 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-9cnkc"] Sep 30 12:39:04 crc kubenswrapper[4672]: I0930 12:39:04.302423 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-9cnkc" Sep 30 12:39:04 crc kubenswrapper[4672]: I0930 12:39:04.308658 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Sep 30 12:39:04 crc kubenswrapper[4672]: I0930 12:39:04.308693 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-w4tr6" Sep 30 12:39:04 crc kubenswrapper[4672]: I0930 12:39:04.312489 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-9cnkc"] Sep 30 12:39:04 crc kubenswrapper[4672]: I0930 12:39:04.355827 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b56cfc5-1c9c-41f5-91ac-b0fbd79c7b6a-combined-ca-bundle\") pod \"cinder-db-sync-llp7f\" (UID: \"9b56cfc5-1c9c-41f5-91ac-b0fbd79c7b6a\") " pod="openstack/cinder-db-sync-llp7f" Sep 30 12:39:04 crc kubenswrapper[4672]: I0930 12:39:04.355904 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9b56cfc5-1c9c-41f5-91ac-b0fbd79c7b6a-etc-machine-id\") pod \"cinder-db-sync-llp7f\" (UID: \"9b56cfc5-1c9c-41f5-91ac-b0fbd79c7b6a\") " pod="openstack/cinder-db-sync-llp7f" Sep 30 12:39:04 crc kubenswrapper[4672]: I0930 12:39:04.355938 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-57c2w\" (UniqueName: \"kubernetes.io/projected/d637de0b-aed1-45ef-9d86-c6f7c2f188e1-kube-api-access-57c2w\") pod \"barbican-db-sync-9cnkc\" (UID: \"d637de0b-aed1-45ef-9d86-c6f7c2f188e1\") " pod="openstack/barbican-db-sync-9cnkc" Sep 30 12:39:04 crc kubenswrapper[4672]: I0930 12:39:04.355970 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/9b56cfc5-1c9c-41f5-91ac-b0fbd79c7b6a-db-sync-config-data\") pod \"cinder-db-sync-llp7f\" (UID: \"9b56cfc5-1c9c-41f5-91ac-b0fbd79c7b6a\") " pod="openstack/cinder-db-sync-llp7f" Sep 30 12:39:04 crc kubenswrapper[4672]: I0930 12:39:04.356001 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d637de0b-aed1-45ef-9d86-c6f7c2f188e1-db-sync-config-data\") pod \"barbican-db-sync-9cnkc\" (UID: \"d637de0b-aed1-45ef-9d86-c6f7c2f188e1\") " pod="openstack/barbican-db-sync-9cnkc" Sep 30 12:39:04 crc kubenswrapper[4672]: I0930 12:39:04.356059 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tzgds\" (UniqueName: \"kubernetes.io/projected/9b56cfc5-1c9c-41f5-91ac-b0fbd79c7b6a-kube-api-access-tzgds\") pod \"cinder-db-sync-llp7f\" (UID: \"9b56cfc5-1c9c-41f5-91ac-b0fbd79c7b6a\") " pod="openstack/cinder-db-sync-llp7f" Sep 30 12:39:04 crc kubenswrapper[4672]: I0930 12:39:04.356098 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d637de0b-aed1-45ef-9d86-c6f7c2f188e1-combined-ca-bundle\") pod \"barbican-db-sync-9cnkc\" (UID: \"d637de0b-aed1-45ef-9d86-c6f7c2f188e1\") " pod="openstack/barbican-db-sync-9cnkc" Sep 30 12:39:04 crc kubenswrapper[4672]: I0930 12:39:04.356115 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9b56cfc5-1c9c-41f5-91ac-b0fbd79c7b6a-scripts\") pod 
\"cinder-db-sync-llp7f\" (UID: \"9b56cfc5-1c9c-41f5-91ac-b0fbd79c7b6a\") " pod="openstack/cinder-db-sync-llp7f" Sep 30 12:39:04 crc kubenswrapper[4672]: I0930 12:39:04.356138 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b56cfc5-1c9c-41f5-91ac-b0fbd79c7b6a-config-data\") pod \"cinder-db-sync-llp7f\" (UID: \"9b56cfc5-1c9c-41f5-91ac-b0fbd79c7b6a\") " pod="openstack/cinder-db-sync-llp7f" Sep 30 12:39:04 crc kubenswrapper[4672]: I0930 12:39:04.360567 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9b56cfc5-1c9c-41f5-91ac-b0fbd79c7b6a-etc-machine-id\") pod \"cinder-db-sync-llp7f\" (UID: \"9b56cfc5-1c9c-41f5-91ac-b0fbd79c7b6a\") " pod="openstack/cinder-db-sync-llp7f" Sep 30 12:39:04 crc kubenswrapper[4672]: I0930 12:39:04.364523 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9b56cfc5-1c9c-41f5-91ac-b0fbd79c7b6a-scripts\") pod \"cinder-db-sync-llp7f\" (UID: \"9b56cfc5-1c9c-41f5-91ac-b0fbd79c7b6a\") " pod="openstack/cinder-db-sync-llp7f" Sep 30 12:39:04 crc kubenswrapper[4672]: I0930 12:39:04.373158 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/9b56cfc5-1c9c-41f5-91ac-b0fbd79c7b6a-db-sync-config-data\") pod \"cinder-db-sync-llp7f\" (UID: \"9b56cfc5-1c9c-41f5-91ac-b0fbd79c7b6a\") " pod="openstack/cinder-db-sync-llp7f" Sep 30 12:39:04 crc kubenswrapper[4672]: I0930 12:39:04.374013 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b56cfc5-1c9c-41f5-91ac-b0fbd79c7b6a-combined-ca-bundle\") pod \"cinder-db-sync-llp7f\" (UID: \"9b56cfc5-1c9c-41f5-91ac-b0fbd79c7b6a\") " pod="openstack/cinder-db-sync-llp7f" Sep 30 12:39:04 crc kubenswrapper[4672]: I0930 12:39:04.380808 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b56cfc5-1c9c-41f5-91ac-b0fbd79c7b6a-config-data\") pod \"cinder-db-sync-llp7f\" (UID: \"9b56cfc5-1c9c-41f5-91ac-b0fbd79c7b6a\") " pod="openstack/cinder-db-sync-llp7f" Sep 30 12:39:04 crc kubenswrapper[4672]: I0930 12:39:04.385778 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tzgds\" (UniqueName: \"kubernetes.io/projected/9b56cfc5-1c9c-41f5-91ac-b0fbd79c7b6a-kube-api-access-tzgds\") pod \"cinder-db-sync-llp7f\" (UID: \"9b56cfc5-1c9c-41f5-91ac-b0fbd79c7b6a\") " pod="openstack/cinder-db-sync-llp7f" Sep 30 12:39:04 crc kubenswrapper[4672]: I0930 12:39:04.461257 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-rd9t8"] Sep 30 12:39:04 crc kubenswrapper[4672]: I0930 12:39:04.462586 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-rd9t8" Sep 30 12:39:04 crc kubenswrapper[4672]: I0930 12:39:04.463844 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d637de0b-aed1-45ef-9d86-c6f7c2f188e1-combined-ca-bundle\") pod \"barbican-db-sync-9cnkc\" (UID: \"d637de0b-aed1-45ef-9d86-c6f7c2f188e1\") " pod="openstack/barbican-db-sync-9cnkc" Sep 30 12:39:04 crc kubenswrapper[4672]: I0930 12:39:04.464160 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-57c2w\" (UniqueName: \"kubernetes.io/projected/d637de0b-aed1-45ef-9d86-c6f7c2f188e1-kube-api-access-57c2w\") pod \"barbican-db-sync-9cnkc\" (UID: \"d637de0b-aed1-45ef-9d86-c6f7c2f188e1\") " pod="openstack/barbican-db-sync-9cnkc" Sep 30 12:39:04 crc kubenswrapper[4672]: I0930 12:39:04.464351 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d637de0b-aed1-45ef-9d86-c6f7c2f188e1-db-sync-config-data\") pod \"barbican-db-sync-9cnkc\" (UID: \"d637de0b-aed1-45ef-9d86-c6f7c2f188e1\") " pod="openstack/barbican-db-sync-9cnkc" Sep 30 12:39:04 crc kubenswrapper[4672]: I0930 12:39:04.465631 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-xnjsm" Sep 30 12:39:04 crc kubenswrapper[4672]: I0930 12:39:04.469773 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d637de0b-aed1-45ef-9d86-c6f7c2f188e1-combined-ca-bundle\") pod \"barbican-db-sync-9cnkc\" (UID: \"d637de0b-aed1-45ef-9d86-c6f7c2f188e1\") " pod="openstack/barbican-db-sync-9cnkc" Sep 30 12:39:04 crc kubenswrapper[4672]: I0930 12:39:04.470406 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Sep 30 12:39:04 crc kubenswrapper[4672]: I0930 12:39:04.470430 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d637de0b-aed1-45ef-9d86-c6f7c2f188e1-db-sync-config-data\") pod \"barbican-db-sync-9cnkc\" (UID: \"d637de0b-aed1-45ef-9d86-c6f7c2f188e1\") " pod="openstack/barbican-db-sync-9cnkc" Sep 30 12:39:04 crc kubenswrapper[4672]: I0930 12:39:04.470472 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Sep 30 12:39:04 crc kubenswrapper[4672]: I0930 12:39:04.489059 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-rd9t8"] Sep 30 12:39:04 crc kubenswrapper[4672]: I0930 12:39:04.491858 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-57c2w\" (UniqueName: \"kubernetes.io/projected/d637de0b-aed1-45ef-9d86-c6f7c2f188e1-kube-api-access-57c2w\") pod \"barbican-db-sync-9cnkc\" (UID: \"d637de0b-aed1-45ef-9d86-c6f7c2f188e1\") " pod="openstack/barbican-db-sync-9cnkc" Sep 30 12:39:04 crc kubenswrapper[4672]: I0930 12:39:04.566240 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db83994c-a577-4d20-a544-3950abb7273b-combined-ca-bundle\") pod \"neutron-db-sync-rd9t8\" (UID: \"db83994c-a577-4d20-a544-3950abb7273b\") " pod="openstack/neutron-db-sync-rd9t8" Sep 30 12:39:04 crc kubenswrapper[4672]: I0930 12:39:04.566318 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-49b65\" (UniqueName: \"kubernetes.io/projected/db83994c-a577-4d20-a544-3950abb7273b-kube-api-access-49b65\") pod \"neutron-db-sync-rd9t8\" (UID: \"db83994c-a577-4d20-a544-3950abb7273b\") " pod="openstack/neutron-db-sync-rd9t8" Sep 30 12:39:04 crc kubenswrapper[4672]: I0930 12:39:04.566334 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/db83994c-a577-4d20-a544-3950abb7273b-config\") pod \"neutron-db-sync-rd9t8\" (UID: \"db83994c-a577-4d20-a544-3950abb7273b\") " pod="openstack/neutron-db-sync-rd9t8" Sep 30 12:39:04 crc kubenswrapper[4672]: I0930 12:39:04.624408 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-9cnkc" Sep 30 12:39:04 crc kubenswrapper[4672]: I0930 12:39:04.660793 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-llp7f" Sep 30 12:39:04 crc kubenswrapper[4672]: I0930 12:39:04.667865 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db83994c-a577-4d20-a544-3950abb7273b-combined-ca-bundle\") pod \"neutron-db-sync-rd9t8\" (UID: \"db83994c-a577-4d20-a544-3950abb7273b\") " pod="openstack/neutron-db-sync-rd9t8" Sep 30 12:39:04 crc kubenswrapper[4672]: I0930 12:39:04.667927 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-49b65\" (UniqueName: \"kubernetes.io/projected/db83994c-a577-4d20-a544-3950abb7273b-kube-api-access-49b65\") pod \"neutron-db-sync-rd9t8\" (UID: \"db83994c-a577-4d20-a544-3950abb7273b\") " pod="openstack/neutron-db-sync-rd9t8" Sep 30 12:39:04 crc kubenswrapper[4672]: I0930 12:39:04.667953 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/db83994c-a577-4d20-a544-3950abb7273b-config\") pod \"neutron-db-sync-rd9t8\" (UID: \"db83994c-a577-4d20-a544-3950abb7273b\") " pod="openstack/neutron-db-sync-rd9t8" Sep 30 12:39:04 crc kubenswrapper[4672]: I0930 12:39:04.683083 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db83994c-a577-4d20-a544-3950abb7273b-combined-ca-bundle\") pod \"neutron-db-sync-rd9t8\" (UID: \"db83994c-a577-4d20-a544-3950abb7273b\") " pod="openstack/neutron-db-sync-rd9t8" Sep 30 12:39:04 crc kubenswrapper[4672]: I0930 12:39:04.689412 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/db83994c-a577-4d20-a544-3950abb7273b-config\") pod \"neutron-db-sync-rd9t8\" (UID: \"db83994c-a577-4d20-a544-3950abb7273b\") " pod="openstack/neutron-db-sync-rd9t8" Sep 30 12:39:04 crc kubenswrapper[4672]: I0930 12:39:04.696220 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-49b65\" (UniqueName: \"kubernetes.io/projected/db83994c-a577-4d20-a544-3950abb7273b-kube-api-access-49b65\") pod \"neutron-db-sync-rd9t8\" (UID: \"db83994c-a577-4d20-a544-3950abb7273b\") " pod="openstack/neutron-db-sync-rd9t8" Sep 30 12:39:04 crc kubenswrapper[4672]: I0930 12:39:04.793335 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-rd9t8" Sep 30 12:39:04 crc kubenswrapper[4672]: I0930 12:39:04.936668 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-api-0"] Sep 30 12:39:04 crc kubenswrapper[4672]: I0930 12:39:04.938530 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0" Sep 30 12:39:04 crc kubenswrapper[4672]: I0930 12:39:04.946971 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-watcher-dockercfg-cndwx" Sep 30 12:39:04 crc kubenswrapper[4672]: I0930 12:39:04.951212 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-api-0"] Sep 30 12:39:04 crc kubenswrapper[4672]: I0930 12:39:04.966060 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-api-config-data" Sep 30 12:39:04 crc kubenswrapper[4672]: I0930 12:39:04.986586 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5nw2h\" (UniqueName: \"kubernetes.io/projected/916c2fc8-25b0-4a10-82d3-6b3e51785690-kube-api-access-5nw2h\") pod \"watcher-api-0\" (UID: \"916c2fc8-25b0-4a10-82d3-6b3e51785690\") " pod="openstack/watcher-api-0" Sep 30 12:39:04 crc kubenswrapper[4672]: I0930 12:39:04.986664 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/916c2fc8-25b0-4a10-82d3-6b3e51785690-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"916c2fc8-25b0-4a10-82d3-6b3e51785690\") " pod="openstack/watcher-api-0" Sep 30 12:39:04 crc kubenswrapper[4672]: I0930 12:39:04.986683 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/916c2fc8-25b0-4a10-82d3-6b3e51785690-logs\") pod \"watcher-api-0\" (UID: \"916c2fc8-25b0-4a10-82d3-6b3e51785690\") " pod="openstack/watcher-api-0" Sep 30 12:39:04 crc kubenswrapper[4672]: I0930 12:39:04.986868 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/916c2fc8-25b0-4a10-82d3-6b3e51785690-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"916c2fc8-25b0-4a10-82d3-6b3e51785690\") " pod="openstack/watcher-api-0" Sep 30 12:39:04 crc kubenswrapper[4672]: I0930 12:39:04.986907 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/916c2fc8-25b0-4a10-82d3-6b3e51785690-config-data\") pod \"watcher-api-0\" (UID: \"916c2fc8-25b0-4a10-82d3-6b3e51785690\") " pod="openstack/watcher-api-0" Sep 30 12:39:05 crc kubenswrapper[4672]: I0930 12:39:05.007823 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-applier-0"] Sep 30 12:39:05 crc kubenswrapper[4672]: I0930 12:39:05.009279 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-applier-0" Sep 30 12:39:05 crc kubenswrapper[4672]: I0930 12:39:05.011357 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-applier-config-data" Sep 30 12:39:05 crc kubenswrapper[4672]: I0930 12:39:05.025285 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-decision-engine-0"] Sep 30 12:39:05 crc kubenswrapper[4672]: I0930 12:39:05.035656 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-applier-0"] Sep 30 12:39:05 crc kubenswrapper[4672]: I0930 12:39:05.041177 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-decision-engine-0" Sep 30 12:39:05 crc kubenswrapper[4672]: I0930 12:39:05.049384 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-decision-engine-config-data" Sep 30 12:39:05 crc kubenswrapper[4672]: I0930 12:39:05.085655 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-decision-engine-0"] Sep 30 12:39:05 crc kubenswrapper[4672]: I0930 12:39:05.092703 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ssjg6\" (UniqueName: \"kubernetes.io/projected/0944504c-77dc-42f3-a981-723fea76118c-kube-api-access-ssjg6\") pod \"watcher-applier-0\" (UID: \"0944504c-77dc-42f3-a981-723fea76118c\") " pod="openstack/watcher-applier-0" Sep 30 12:39:05 crc kubenswrapper[4672]: I0930 12:39:05.092803 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f8j6t\" (UniqueName: \"kubernetes.io/projected/4f1bee84-650b-4f0b-a657-e6701ee51823-kube-api-access-f8j6t\") pod \"watcher-decision-engine-0\" (UID: \"4f1bee84-650b-4f0b-a657-e6701ee51823\") " pod="openstack/watcher-decision-engine-0" Sep 30 12:39:05 crc kubenswrapper[4672]: I0930 12:39:05.092852 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f1bee84-650b-4f0b-a657-e6701ee51823-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"4f1bee84-650b-4f0b-a657-e6701ee51823\") " pod="openstack/watcher-decision-engine-0" Sep 30 12:39:05 crc kubenswrapper[4672]: I0930 12:39:05.092929 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/4f1bee84-650b-4f0b-a657-e6701ee51823-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"4f1bee84-650b-4f0b-a657-e6701ee51823\") " pod="openstack/watcher-decision-engine-0" Sep 30 12:39:05 crc kubenswrapper[4672]: I0930 12:39:05.093014 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5nw2h\" (UniqueName: \"kubernetes.io/projected/916c2fc8-25b0-4a10-82d3-6b3e51785690-kube-api-access-5nw2h\") pod \"watcher-api-0\" (UID: \"916c2fc8-25b0-4a10-82d3-6b3e51785690\") " pod="openstack/watcher-api-0" Sep 30 12:39:05 crc kubenswrapper[4672]: I0930 12:39:05.093479 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/916c2fc8-25b0-4a10-82d3-6b3e51785690-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"916c2fc8-25b0-4a10-82d3-6b3e51785690\") " pod="openstack/watcher-api-0" Sep 30 12:39:05 crc kubenswrapper[4672]: I0930 12:39:05.093535 4672 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/916c2fc8-25b0-4a10-82d3-6b3e51785690-logs\") pod \"watcher-api-0\" (UID: \"916c2fc8-25b0-4a10-82d3-6b3e51785690\") " pod="openstack/watcher-api-0" Sep 30 12:39:05 crc kubenswrapper[4672]: I0930 12:39:05.093617 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0944504c-77dc-42f3-a981-723fea76118c-combined-ca-bundle\") pod \"watcher-applier-0\" (UID: \"0944504c-77dc-42f3-a981-723fea76118c\") " pod="openstack/watcher-applier-0" Sep 30 12:39:05 crc kubenswrapper[4672]: I0930 12:39:05.094210 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/916c2fc8-25b0-4a10-82d3-6b3e51785690-logs\") pod \"watcher-api-0\" (UID: \"916c2fc8-25b0-4a10-82d3-6b3e51785690\") " pod="openstack/watcher-api-0" Sep 30 12:39:05 crc kubenswrapper[4672]: I0930 12:39:05.094487 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/916c2fc8-25b0-4a10-82d3-6b3e51785690-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"916c2fc8-25b0-4a10-82d3-6b3e51785690\") " pod="openstack/watcher-api-0" Sep 30 12:39:05 crc kubenswrapper[4672]: I0930 12:39:05.094545 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0944504c-77dc-42f3-a981-723fea76118c-logs\") pod \"watcher-applier-0\" (UID: \"0944504c-77dc-42f3-a981-723fea76118c\") " pod="openstack/watcher-applier-0" Sep 30 12:39:05 crc kubenswrapper[4672]: I0930 12:39:05.094573 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0944504c-77dc-42f3-a981-723fea76118c-config-data\") pod \"watcher-applier-0\" (UID: \"0944504c-77dc-42f3-a981-723fea76118c\") " pod="openstack/watcher-applier-0" Sep 30 12:39:05 crc kubenswrapper[4672]: I0930 12:39:05.094632 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/916c2fc8-25b0-4a10-82d3-6b3e51785690-config-data\") pod \"watcher-api-0\" (UID: \"916c2fc8-25b0-4a10-82d3-6b3e51785690\") " pod="openstack/watcher-api-0" Sep 30 12:39:05 crc kubenswrapper[4672]: I0930 12:39:05.094665 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4f1bee84-650b-4f0b-a657-e6701ee51823-logs\") pod \"watcher-decision-engine-0\" (UID: \"4f1bee84-650b-4f0b-a657-e6701ee51823\") " pod="openstack/watcher-decision-engine-0" Sep 30 12:39:05 crc kubenswrapper[4672]: I0930 12:39:05.094783 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f1bee84-650b-4f0b-a657-e6701ee51823-config-data\") pod \"watcher-decision-engine-0\" (UID: \"4f1bee84-650b-4f0b-a657-e6701ee51823\") " pod="openstack/watcher-decision-engine-0" Sep 30 12:39:05 crc kubenswrapper[4672]: I0930 12:39:05.100226 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/916c2fc8-25b0-4a10-82d3-6b3e51785690-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"916c2fc8-25b0-4a10-82d3-6b3e51785690\") " 
pod="openstack/watcher-api-0" Sep 30 12:39:05 crc kubenswrapper[4672]: I0930 12:39:05.100869 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/916c2fc8-25b0-4a10-82d3-6b3e51785690-config-data\") pod \"watcher-api-0\" (UID: \"916c2fc8-25b0-4a10-82d3-6b3e51785690\") " pod="openstack/watcher-api-0" Sep 30 12:39:05 crc kubenswrapper[4672]: I0930 12:39:05.101291 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/916c2fc8-25b0-4a10-82d3-6b3e51785690-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"916c2fc8-25b0-4a10-82d3-6b3e51785690\") " pod="openstack/watcher-api-0" Sep 30 12:39:05 crc kubenswrapper[4672]: I0930 12:39:05.111438 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5nw2h\" (UniqueName: \"kubernetes.io/projected/916c2fc8-25b0-4a10-82d3-6b3e51785690-kube-api-access-5nw2h\") pod \"watcher-api-0\" (UID: \"916c2fc8-25b0-4a10-82d3-6b3e51785690\") " pod="openstack/watcher-api-0" Sep 30 12:39:05 crc kubenswrapper[4672]: I0930 12:39:05.196559 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0944504c-77dc-42f3-a981-723fea76118c-logs\") pod \"watcher-applier-0\" (UID: \"0944504c-77dc-42f3-a981-723fea76118c\") " pod="openstack/watcher-applier-0" Sep 30 12:39:05 crc kubenswrapper[4672]: I0930 12:39:05.197137 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0944504c-77dc-42f3-a981-723fea76118c-config-data\") pod \"watcher-applier-0\" (UID: \"0944504c-77dc-42f3-a981-723fea76118c\") " pod="openstack/watcher-applier-0" Sep 30 12:39:05 crc kubenswrapper[4672]: I0930 12:39:05.197188 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4f1bee84-650b-4f0b-a657-e6701ee51823-logs\") pod \"watcher-decision-engine-0\" (UID: \"4f1bee84-650b-4f0b-a657-e6701ee51823\") " pod="openstack/watcher-decision-engine-0" Sep 30 12:39:05 crc kubenswrapper[4672]: I0930 12:39:05.197247 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f1bee84-650b-4f0b-a657-e6701ee51823-config-data\") pod \"watcher-decision-engine-0\" (UID: \"4f1bee84-650b-4f0b-a657-e6701ee51823\") " pod="openstack/watcher-decision-engine-0" Sep 30 12:39:05 crc kubenswrapper[4672]: I0930 12:39:05.197321 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ssjg6\" (UniqueName: \"kubernetes.io/projected/0944504c-77dc-42f3-a981-723fea76118c-kube-api-access-ssjg6\") pod \"watcher-applier-0\" (UID: \"0944504c-77dc-42f3-a981-723fea76118c\") " pod="openstack/watcher-applier-0" Sep 30 12:39:05 crc kubenswrapper[4672]: I0930 12:39:05.197358 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f8j6t\" (UniqueName: \"kubernetes.io/projected/4f1bee84-650b-4f0b-a657-e6701ee51823-kube-api-access-f8j6t\") pod \"watcher-decision-engine-0\" (UID: \"4f1bee84-650b-4f0b-a657-e6701ee51823\") " pod="openstack/watcher-decision-engine-0" Sep 30 12:39:05 crc kubenswrapper[4672]: I0930 12:39:05.197386 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/4f1bee84-650b-4f0b-a657-e6701ee51823-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"4f1bee84-650b-4f0b-a657-e6701ee51823\") " pod="openstack/watcher-decision-engine-0" Sep 30 12:39:05 crc kubenswrapper[4672]: I0930 12:39:05.197426 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/4f1bee84-650b-4f0b-a657-e6701ee51823-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"4f1bee84-650b-4f0b-a657-e6701ee51823\") " pod="openstack/watcher-decision-engine-0" Sep 30 12:39:05 crc kubenswrapper[4672]: I0930 12:39:05.197654 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0944504c-77dc-42f3-a981-723fea76118c-combined-ca-bundle\") pod \"watcher-applier-0\" (UID: \"0944504c-77dc-42f3-a981-723fea76118c\") " pod="openstack/watcher-applier-0" Sep 30 12:39:05 crc kubenswrapper[4672]: I0930 12:39:05.198925 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0944504c-77dc-42f3-a981-723fea76118c-logs\") pod \"watcher-applier-0\" (UID: \"0944504c-77dc-42f3-a981-723fea76118c\") " pod="openstack/watcher-applier-0" Sep 30 12:39:05 crc kubenswrapper[4672]: I0930 12:39:05.199275 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4f1bee84-650b-4f0b-a657-e6701ee51823-logs\") pod \"watcher-decision-engine-0\" (UID: \"4f1bee84-650b-4f0b-a657-e6701ee51823\") " pod="openstack/watcher-decision-engine-0" Sep 30 12:39:05 crc kubenswrapper[4672]: I0930 12:39:05.203022 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0944504c-77dc-42f3-a981-723fea76118c-combined-ca-bundle\") pod \"watcher-applier-0\" (UID: \"0944504c-77dc-42f3-a981-723fea76118c\") " pod="openstack/watcher-applier-0" Sep 30 12:39:05 crc kubenswrapper[4672]: I0930 12:39:05.203195 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f1bee84-650b-4f0b-a657-e6701ee51823-config-data\") pod \"watcher-decision-engine-0\" (UID: \"4f1bee84-650b-4f0b-a657-e6701ee51823\") " pod="openstack/watcher-decision-engine-0" Sep 30 12:39:05 crc kubenswrapper[4672]: I0930 12:39:05.208524 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0944504c-77dc-42f3-a981-723fea76118c-config-data\") pod \"watcher-applier-0\" (UID: \"0944504c-77dc-42f3-a981-723fea76118c\") " pod="openstack/watcher-applier-0" Sep 30 12:39:05 crc kubenswrapper[4672]: I0930 12:39:05.220479 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ssjg6\" (UniqueName: \"kubernetes.io/projected/0944504c-77dc-42f3-a981-723fea76118c-kube-api-access-ssjg6\") pod \"watcher-applier-0\" (UID: \"0944504c-77dc-42f3-a981-723fea76118c\") " pod="openstack/watcher-applier-0" Sep 30 12:39:05 crc kubenswrapper[4672]: I0930 12:39:05.223134 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/4f1bee84-650b-4f0b-a657-e6701ee51823-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"4f1bee84-650b-4f0b-a657-e6701ee51823\") " pod="openstack/watcher-decision-engine-0" Sep 30 12:39:05 crc kubenswrapper[4672]: I0930 
12:39:05.233140 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f1bee84-650b-4f0b-a657-e6701ee51823-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"4f1bee84-650b-4f0b-a657-e6701ee51823\") " pod="openstack/watcher-decision-engine-0" Sep 30 12:39:05 crc kubenswrapper[4672]: I0930 12:39:05.233523 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f8j6t\" (UniqueName: \"kubernetes.io/projected/4f1bee84-650b-4f0b-a657-e6701ee51823-kube-api-access-f8j6t\") pod \"watcher-decision-engine-0\" (UID: \"4f1bee84-650b-4f0b-a657-e6701ee51823\") " pod="openstack/watcher-decision-engine-0" Sep 30 12:39:05 crc kubenswrapper[4672]: I0930 12:39:05.265066 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0" Sep 30 12:39:05 crc kubenswrapper[4672]: I0930 12:39:05.326490 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-applier-0" Sep 30 12:39:05 crc kubenswrapper[4672]: I0930 12:39:05.381425 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-decision-engine-0" Sep 30 12:39:06 crc kubenswrapper[4672]: I0930 12:39:06.439520 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7fc454b69-wncsb" Sep 30 12:39:06 crc kubenswrapper[4672]: I0930 12:39:06.541707 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6fcd88f6c5-l7hjg"] Sep 30 12:39:06 crc kubenswrapper[4672]: I0930 12:39:06.541987 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6fcd88f6c5-l7hjg" podUID="a2fff124-9dc4-46ee-a5fa-9cef98b3eed9" containerName="dnsmasq-dns" containerID="cri-o://48aa12cd375c302528a88918d0cb09ef65b3c49af123b8758dbbb074de3bae10" gracePeriod=10 Sep 30 12:39:06 crc kubenswrapper[4672]: I0930 12:39:06.561741 4672 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-6fcd88f6c5-l7hjg" podUID="a2fff124-9dc4-46ee-a5fa-9cef98b3eed9" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.137:5353: connect: connection refused" Sep 30 12:39:07 crc kubenswrapper[4672]: I0930 12:39:07.175861 4672 generic.go:334] "Generic (PLEG): container finished" podID="a2fff124-9dc4-46ee-a5fa-9cef98b3eed9" containerID="48aa12cd375c302528a88918d0cb09ef65b3c49af123b8758dbbb074de3bae10" exitCode=0 Sep 30 12:39:07 crc kubenswrapper[4672]: I0930 12:39:07.176166 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6fcd88f6c5-l7hjg" event={"ID":"a2fff124-9dc4-46ee-a5fa-9cef98b3eed9","Type":"ContainerDied","Data":"48aa12cd375c302528a88918d0cb09ef65b3c49af123b8758dbbb074de3bae10"} Sep 30 12:39:07 crc kubenswrapper[4672]: I0930 12:39:07.403983 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6f749887b9-hl4rd"] Sep 30 12:39:07 crc kubenswrapper[4672]: I0930 12:39:07.439176 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-8bdf69cc8-lsxz6"] Sep 30 12:39:07 crc kubenswrapper[4672]: I0930 12:39:07.448013 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-8bdf69cc8-lsxz6" Sep 30 12:39:07 crc kubenswrapper[4672]: I0930 12:39:07.453834 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-8bdf69cc8-lsxz6"] Sep 30 12:39:07 crc kubenswrapper[4672]: I0930 12:39:07.453920 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-horizon-svc" Sep 30 12:39:07 crc kubenswrapper[4672]: I0930 12:39:07.528646 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-567d7b57-5g9zw"] Sep 30 12:39:07 crc kubenswrapper[4672]: I0930 12:39:07.549722 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/4e153eb6-5f25-4214-8e8a-14c37a36fc06-horizon-tls-certs\") pod \"horizon-8bdf69cc8-lsxz6\" (UID: \"4e153eb6-5f25-4214-8e8a-14c37a36fc06\") " pod="openstack/horizon-8bdf69cc8-lsxz6" Sep 30 12:39:07 crc kubenswrapper[4672]: I0930 12:39:07.549809 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/4e153eb6-5f25-4214-8e8a-14c37a36fc06-horizon-secret-key\") pod \"horizon-8bdf69cc8-lsxz6\" (UID: \"4e153eb6-5f25-4214-8e8a-14c37a36fc06\") " pod="openstack/horizon-8bdf69cc8-lsxz6" Sep 30 12:39:07 crc kubenswrapper[4672]: I0930 12:39:07.549855 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4e153eb6-5f25-4214-8e8a-14c37a36fc06-logs\") pod \"horizon-8bdf69cc8-lsxz6\" (UID: \"4e153eb6-5f25-4214-8e8a-14c37a36fc06\") " pod="openstack/horizon-8bdf69cc8-lsxz6" Sep 30 12:39:07 crc kubenswrapper[4672]: I0930 12:39:07.549884 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e153eb6-5f25-4214-8e8a-14c37a36fc06-combined-ca-bundle\") pod \"horizon-8bdf69cc8-lsxz6\" (UID: \"4e153eb6-5f25-4214-8e8a-14c37a36fc06\") " pod="openstack/horizon-8bdf69cc8-lsxz6" Sep 30 12:39:07 crc kubenswrapper[4672]: I0930 12:39:07.549923 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4e153eb6-5f25-4214-8e8a-14c37a36fc06-scripts\") pod \"horizon-8bdf69cc8-lsxz6\" (UID: \"4e153eb6-5f25-4214-8e8a-14c37a36fc06\") " pod="openstack/horizon-8bdf69cc8-lsxz6" Sep 30 12:39:07 crc kubenswrapper[4672]: I0930 12:39:07.549982 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7jf5w\" (UniqueName: \"kubernetes.io/projected/4e153eb6-5f25-4214-8e8a-14c37a36fc06-kube-api-access-7jf5w\") pod \"horizon-8bdf69cc8-lsxz6\" (UID: \"4e153eb6-5f25-4214-8e8a-14c37a36fc06\") " pod="openstack/horizon-8bdf69cc8-lsxz6" Sep 30 12:39:07 crc kubenswrapper[4672]: I0930 12:39:07.550021 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4e153eb6-5f25-4214-8e8a-14c37a36fc06-config-data\") pod \"horizon-8bdf69cc8-lsxz6\" (UID: \"4e153eb6-5f25-4214-8e8a-14c37a36fc06\") " pod="openstack/horizon-8bdf69cc8-lsxz6" Sep 30 12:39:07 crc kubenswrapper[4672]: I0930 12:39:07.580183 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-844b6c9474-6tpzt"] Sep 30 12:39:07 crc kubenswrapper[4672]: I0930 12:39:07.582106 4672 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-844b6c9474-6tpzt" Sep 30 12:39:07 crc kubenswrapper[4672]: I0930 12:39:07.598910 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-844b6c9474-6tpzt"] Sep 30 12:39:07 crc kubenswrapper[4672]: I0930 12:39:07.651337 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2659b35e-ecb1-416b-8a94-690759645536-combined-ca-bundle\") pod \"horizon-844b6c9474-6tpzt\" (UID: \"2659b35e-ecb1-416b-8a94-690759645536\") " pod="openstack/horizon-844b6c9474-6tpzt" Sep 30 12:39:07 crc kubenswrapper[4672]: I0930 12:39:07.651396 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/2659b35e-ecb1-416b-8a94-690759645536-horizon-tls-certs\") pod \"horizon-844b6c9474-6tpzt\" (UID: \"2659b35e-ecb1-416b-8a94-690759645536\") " pod="openstack/horizon-844b6c9474-6tpzt" Sep 30 12:39:07 crc kubenswrapper[4672]: I0930 12:39:07.651427 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/4e153eb6-5f25-4214-8e8a-14c37a36fc06-horizon-tls-certs\") pod \"horizon-8bdf69cc8-lsxz6\" (UID: \"4e153eb6-5f25-4214-8e8a-14c37a36fc06\") " pod="openstack/horizon-8bdf69cc8-lsxz6" Sep 30 12:39:07 crc kubenswrapper[4672]: I0930 12:39:07.651458 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wj24w\" (UniqueName: \"kubernetes.io/projected/2659b35e-ecb1-416b-8a94-690759645536-kube-api-access-wj24w\") pod \"horizon-844b6c9474-6tpzt\" (UID: \"2659b35e-ecb1-416b-8a94-690759645536\") " pod="openstack/horizon-844b6c9474-6tpzt" Sep 30 12:39:07 crc kubenswrapper[4672]: I0930 12:39:07.651484 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/4e153eb6-5f25-4214-8e8a-14c37a36fc06-horizon-secret-key\") pod \"horizon-8bdf69cc8-lsxz6\" (UID: \"4e153eb6-5f25-4214-8e8a-14c37a36fc06\") " pod="openstack/horizon-8bdf69cc8-lsxz6" Sep 30 12:39:07 crc kubenswrapper[4672]: I0930 12:39:07.651514 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4e153eb6-5f25-4214-8e8a-14c37a36fc06-logs\") pod \"horizon-8bdf69cc8-lsxz6\" (UID: \"4e153eb6-5f25-4214-8e8a-14c37a36fc06\") " pod="openstack/horizon-8bdf69cc8-lsxz6" Sep 30 12:39:07 crc kubenswrapper[4672]: I0930 12:39:07.651531 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e153eb6-5f25-4214-8e8a-14c37a36fc06-combined-ca-bundle\") pod \"horizon-8bdf69cc8-lsxz6\" (UID: \"4e153eb6-5f25-4214-8e8a-14c37a36fc06\") " pod="openstack/horizon-8bdf69cc8-lsxz6" Sep 30 12:39:07 crc kubenswrapper[4672]: I0930 12:39:07.651558 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/2659b35e-ecb1-416b-8a94-690759645536-horizon-secret-key\") pod \"horizon-844b6c9474-6tpzt\" (UID: \"2659b35e-ecb1-416b-8a94-690759645536\") " pod="openstack/horizon-844b6c9474-6tpzt" Sep 30 12:39:07 crc kubenswrapper[4672]: I0930 12:39:07.651578 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2659b35e-ecb1-416b-8a94-690759645536-logs\") pod \"horizon-844b6c9474-6tpzt\" (UID: \"2659b35e-ecb1-416b-8a94-690759645536\") " pod="openstack/horizon-844b6c9474-6tpzt" Sep 30 12:39:07 crc kubenswrapper[4672]: I0930 12:39:07.651593 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4e153eb6-5f25-4214-8e8a-14c37a36fc06-scripts\") pod \"horizon-8bdf69cc8-lsxz6\" (UID: \"4e153eb6-5f25-4214-8e8a-14c37a36fc06\") " pod="openstack/horizon-8bdf69cc8-lsxz6" Sep 30 12:39:07 crc kubenswrapper[4672]: I0930 12:39:07.651635 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2659b35e-ecb1-416b-8a94-690759645536-config-data\") pod \"horizon-844b6c9474-6tpzt\" (UID: \"2659b35e-ecb1-416b-8a94-690759645536\") " pod="openstack/horizon-844b6c9474-6tpzt" Sep 30 12:39:07 crc kubenswrapper[4672]: I0930 12:39:07.651655 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7jf5w\" (UniqueName: \"kubernetes.io/projected/4e153eb6-5f25-4214-8e8a-14c37a36fc06-kube-api-access-7jf5w\") pod \"horizon-8bdf69cc8-lsxz6\" (UID: \"4e153eb6-5f25-4214-8e8a-14c37a36fc06\") " pod="openstack/horizon-8bdf69cc8-lsxz6" Sep 30 12:39:07 crc kubenswrapper[4672]: I0930 12:39:07.651678 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4e153eb6-5f25-4214-8e8a-14c37a36fc06-config-data\") pod \"horizon-8bdf69cc8-lsxz6\" (UID: \"4e153eb6-5f25-4214-8e8a-14c37a36fc06\") " pod="openstack/horizon-8bdf69cc8-lsxz6" Sep 30 12:39:07 crc kubenswrapper[4672]: I0930 12:39:07.651696 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2659b35e-ecb1-416b-8a94-690759645536-scripts\") pod \"horizon-844b6c9474-6tpzt\" (UID: \"2659b35e-ecb1-416b-8a94-690759645536\") " pod="openstack/horizon-844b6c9474-6tpzt" Sep 30 12:39:07 crc kubenswrapper[4672]: I0930 12:39:07.652963 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4e153eb6-5f25-4214-8e8a-14c37a36fc06-logs\") pod \"horizon-8bdf69cc8-lsxz6\" (UID: \"4e153eb6-5f25-4214-8e8a-14c37a36fc06\") " pod="openstack/horizon-8bdf69cc8-lsxz6" Sep 30 12:39:07 crc kubenswrapper[4672]: I0930 12:39:07.656188 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4e153eb6-5f25-4214-8e8a-14c37a36fc06-scripts\") pod \"horizon-8bdf69cc8-lsxz6\" (UID: \"4e153eb6-5f25-4214-8e8a-14c37a36fc06\") " pod="openstack/horizon-8bdf69cc8-lsxz6" Sep 30 12:39:07 crc kubenswrapper[4672]: I0930 12:39:07.659296 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/4e153eb6-5f25-4214-8e8a-14c37a36fc06-horizon-tls-certs\") pod \"horizon-8bdf69cc8-lsxz6\" (UID: \"4e153eb6-5f25-4214-8e8a-14c37a36fc06\") " pod="openstack/horizon-8bdf69cc8-lsxz6" Sep 30 12:39:07 crc kubenswrapper[4672]: I0930 12:39:07.659962 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4e153eb6-5f25-4214-8e8a-14c37a36fc06-config-data\") pod \"horizon-8bdf69cc8-lsxz6\" (UID: \"4e153eb6-5f25-4214-8e8a-14c37a36fc06\") " 
pod="openstack/horizon-8bdf69cc8-lsxz6" Sep 30 12:39:07 crc kubenswrapper[4672]: I0930 12:39:07.675665 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e153eb6-5f25-4214-8e8a-14c37a36fc06-combined-ca-bundle\") pod \"horizon-8bdf69cc8-lsxz6\" (UID: \"4e153eb6-5f25-4214-8e8a-14c37a36fc06\") " pod="openstack/horizon-8bdf69cc8-lsxz6" Sep 30 12:39:07 crc kubenswrapper[4672]: I0930 12:39:07.680879 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/4e153eb6-5f25-4214-8e8a-14c37a36fc06-horizon-secret-key\") pod \"horizon-8bdf69cc8-lsxz6\" (UID: \"4e153eb6-5f25-4214-8e8a-14c37a36fc06\") " pod="openstack/horizon-8bdf69cc8-lsxz6" Sep 30 12:39:07 crc kubenswrapper[4672]: I0930 12:39:07.688651 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7jf5w\" (UniqueName: \"kubernetes.io/projected/4e153eb6-5f25-4214-8e8a-14c37a36fc06-kube-api-access-7jf5w\") pod \"horizon-8bdf69cc8-lsxz6\" (UID: \"4e153eb6-5f25-4214-8e8a-14c37a36fc06\") " pod="openstack/horizon-8bdf69cc8-lsxz6" Sep 30 12:39:07 crc kubenswrapper[4672]: I0930 12:39:07.753295 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2659b35e-ecb1-416b-8a94-690759645536-config-data\") pod \"horizon-844b6c9474-6tpzt\" (UID: \"2659b35e-ecb1-416b-8a94-690759645536\") " pod="openstack/horizon-844b6c9474-6tpzt" Sep 30 12:39:07 crc kubenswrapper[4672]: I0930 12:39:07.753351 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2659b35e-ecb1-416b-8a94-690759645536-scripts\") pod \"horizon-844b6c9474-6tpzt\" (UID: \"2659b35e-ecb1-416b-8a94-690759645536\") " pod="openstack/horizon-844b6c9474-6tpzt" Sep 30 12:39:07 crc kubenswrapper[4672]: I0930 12:39:07.753405 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2659b35e-ecb1-416b-8a94-690759645536-combined-ca-bundle\") pod \"horizon-844b6c9474-6tpzt\" (UID: \"2659b35e-ecb1-416b-8a94-690759645536\") " pod="openstack/horizon-844b6c9474-6tpzt" Sep 30 12:39:07 crc kubenswrapper[4672]: I0930 12:39:07.753435 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/2659b35e-ecb1-416b-8a94-690759645536-horizon-tls-certs\") pod \"horizon-844b6c9474-6tpzt\" (UID: \"2659b35e-ecb1-416b-8a94-690759645536\") " pod="openstack/horizon-844b6c9474-6tpzt" Sep 30 12:39:07 crc kubenswrapper[4672]: I0930 12:39:07.753472 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wj24w\" (UniqueName: \"kubernetes.io/projected/2659b35e-ecb1-416b-8a94-690759645536-kube-api-access-wj24w\") pod \"horizon-844b6c9474-6tpzt\" (UID: \"2659b35e-ecb1-416b-8a94-690759645536\") " pod="openstack/horizon-844b6c9474-6tpzt" Sep 30 12:39:07 crc kubenswrapper[4672]: I0930 12:39:07.753572 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/2659b35e-ecb1-416b-8a94-690759645536-horizon-secret-key\") pod \"horizon-844b6c9474-6tpzt\" (UID: \"2659b35e-ecb1-416b-8a94-690759645536\") " pod="openstack/horizon-844b6c9474-6tpzt" Sep 30 12:39:07 crc kubenswrapper[4672]: I0930 12:39:07.753595 4672 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2659b35e-ecb1-416b-8a94-690759645536-logs\") pod \"horizon-844b6c9474-6tpzt\" (UID: \"2659b35e-ecb1-416b-8a94-690759645536\") " pod="openstack/horizon-844b6c9474-6tpzt" Sep 30 12:39:07 crc kubenswrapper[4672]: I0930 12:39:07.753912 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2659b35e-ecb1-416b-8a94-690759645536-logs\") pod \"horizon-844b6c9474-6tpzt\" (UID: \"2659b35e-ecb1-416b-8a94-690759645536\") " pod="openstack/horizon-844b6c9474-6tpzt" Sep 30 12:39:07 crc kubenswrapper[4672]: I0930 12:39:07.754737 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2659b35e-ecb1-416b-8a94-690759645536-scripts\") pod \"horizon-844b6c9474-6tpzt\" (UID: \"2659b35e-ecb1-416b-8a94-690759645536\") " pod="openstack/horizon-844b6c9474-6tpzt" Sep 30 12:39:07 crc kubenswrapper[4672]: I0930 12:39:07.758089 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2659b35e-ecb1-416b-8a94-690759645536-config-data\") pod \"horizon-844b6c9474-6tpzt\" (UID: \"2659b35e-ecb1-416b-8a94-690759645536\") " pod="openstack/horizon-844b6c9474-6tpzt" Sep 30 12:39:07 crc kubenswrapper[4672]: I0930 12:39:07.758323 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2659b35e-ecb1-416b-8a94-690759645536-combined-ca-bundle\") pod \"horizon-844b6c9474-6tpzt\" (UID: \"2659b35e-ecb1-416b-8a94-690759645536\") " pod="openstack/horizon-844b6c9474-6tpzt" Sep 30 12:39:07 crc kubenswrapper[4672]: I0930 12:39:07.758552 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/2659b35e-ecb1-416b-8a94-690759645536-horizon-tls-certs\") pod \"horizon-844b6c9474-6tpzt\" (UID: \"2659b35e-ecb1-416b-8a94-690759645536\") " pod="openstack/horizon-844b6c9474-6tpzt" Sep 30 12:39:07 crc kubenswrapper[4672]: I0930 12:39:07.759183 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/2659b35e-ecb1-416b-8a94-690759645536-horizon-secret-key\") pod \"horizon-844b6c9474-6tpzt\" (UID: \"2659b35e-ecb1-416b-8a94-690759645536\") " pod="openstack/horizon-844b6c9474-6tpzt" Sep 30 12:39:07 crc kubenswrapper[4672]: I0930 12:39:07.772212 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wj24w\" (UniqueName: \"kubernetes.io/projected/2659b35e-ecb1-416b-8a94-690759645536-kube-api-access-wj24w\") pod \"horizon-844b6c9474-6tpzt\" (UID: \"2659b35e-ecb1-416b-8a94-690759645536\") " pod="openstack/horizon-844b6c9474-6tpzt" Sep 30 12:39:07 crc kubenswrapper[4672]: I0930 12:39:07.775489 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-8bdf69cc8-lsxz6" Sep 30 12:39:08 crc kubenswrapper[4672]: I0930 12:39:08.044060 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-844b6c9474-6tpzt" Sep 30 12:39:11 crc kubenswrapper[4672]: I0930 12:39:11.032962 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-ss5sv" Sep 30 12:39:11 crc kubenswrapper[4672]: I0930 12:39:11.223048 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bpv54\" (UniqueName: \"kubernetes.io/projected/a3ef8453-d983-4c2a-94b0-c25645f83c23-kube-api-access-bpv54\") pod \"a3ef8453-d983-4c2a-94b0-c25645f83c23\" (UID: \"a3ef8453-d983-4c2a-94b0-c25645f83c23\") " Sep 30 12:39:11 crc kubenswrapper[4672]: I0930 12:39:11.223121 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a3ef8453-d983-4c2a-94b0-c25645f83c23-credential-keys\") pod \"a3ef8453-d983-4c2a-94b0-c25645f83c23\" (UID: \"a3ef8453-d983-4c2a-94b0-c25645f83c23\") " Sep 30 12:39:11 crc kubenswrapper[4672]: I0930 12:39:11.223203 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3ef8453-d983-4c2a-94b0-c25645f83c23-combined-ca-bundle\") pod \"a3ef8453-d983-4c2a-94b0-c25645f83c23\" (UID: \"a3ef8453-d983-4c2a-94b0-c25645f83c23\") " Sep 30 12:39:11 crc kubenswrapper[4672]: I0930 12:39:11.223248 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a3ef8453-d983-4c2a-94b0-c25645f83c23-fernet-keys\") pod \"a3ef8453-d983-4c2a-94b0-c25645f83c23\" (UID: \"a3ef8453-d983-4c2a-94b0-c25645f83c23\") " Sep 30 12:39:11 crc kubenswrapper[4672]: I0930 12:39:11.223337 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a3ef8453-d983-4c2a-94b0-c25645f83c23-scripts\") pod \"a3ef8453-d983-4c2a-94b0-c25645f83c23\" (UID: \"a3ef8453-d983-4c2a-94b0-c25645f83c23\") " Sep 30 12:39:11 crc kubenswrapper[4672]: I0930 12:39:11.223441 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3ef8453-d983-4c2a-94b0-c25645f83c23-config-data\") pod \"a3ef8453-d983-4c2a-94b0-c25645f83c23\" (UID: \"a3ef8453-d983-4c2a-94b0-c25645f83c23\") " Sep 30 12:39:11 crc kubenswrapper[4672]: I0930 12:39:11.231179 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a3ef8453-d983-4c2a-94b0-c25645f83c23-scripts" (OuterVolumeSpecName: "scripts") pod "a3ef8453-d983-4c2a-94b0-c25645f83c23" (UID: "a3ef8453-d983-4c2a-94b0-c25645f83c23"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:39:11 crc kubenswrapper[4672]: I0930 12:39:11.232493 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a3ef8453-d983-4c2a-94b0-c25645f83c23-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "a3ef8453-d983-4c2a-94b0-c25645f83c23" (UID: "a3ef8453-d983-4c2a-94b0-c25645f83c23"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:39:11 crc kubenswrapper[4672]: I0930 12:39:11.235286 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a3ef8453-d983-4c2a-94b0-c25645f83c23-kube-api-access-bpv54" (OuterVolumeSpecName: "kube-api-access-bpv54") pod "a3ef8453-d983-4c2a-94b0-c25645f83c23" (UID: "a3ef8453-d983-4c2a-94b0-c25645f83c23"). InnerVolumeSpecName "kube-api-access-bpv54". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 12:39:11 crc kubenswrapper[4672]: I0930 12:39:11.235540 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-ss5sv" event={"ID":"a3ef8453-d983-4c2a-94b0-c25645f83c23","Type":"ContainerDied","Data":"a10376c8bde6edabec9e5e9593fdc1fce5625725f0014366dd8c6a1481a2e899"} Sep 30 12:39:11 crc kubenswrapper[4672]: I0930 12:39:11.235645 4672 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a10376c8bde6edabec9e5e9593fdc1fce5625725f0014366dd8c6a1481a2e899" Sep 30 12:39:11 crc kubenswrapper[4672]: I0930 12:39:11.235703 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-ss5sv" Sep 30 12:39:11 crc kubenswrapper[4672]: I0930 12:39:11.247463 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a3ef8453-d983-4c2a-94b0-c25645f83c23-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "a3ef8453-d983-4c2a-94b0-c25645f83c23" (UID: "a3ef8453-d983-4c2a-94b0-c25645f83c23"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:39:11 crc kubenswrapper[4672]: I0930 12:39:11.257110 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a3ef8453-d983-4c2a-94b0-c25645f83c23-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a3ef8453-d983-4c2a-94b0-c25645f83c23" (UID: "a3ef8453-d983-4c2a-94b0-c25645f83c23"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:39:11 crc kubenswrapper[4672]: I0930 12:39:11.262683 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a3ef8453-d983-4c2a-94b0-c25645f83c23-config-data" (OuterVolumeSpecName: "config-data") pod "a3ef8453-d983-4c2a-94b0-c25645f83c23" (UID: "a3ef8453-d983-4c2a-94b0-c25645f83c23"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:39:11 crc kubenswrapper[4672]: I0930 12:39:11.325658 4672 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3ef8453-d983-4c2a-94b0-c25645f83c23-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 12:39:11 crc kubenswrapper[4672]: I0930 12:39:11.325692 4672 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a3ef8453-d983-4c2a-94b0-c25645f83c23-fernet-keys\") on node \"crc\" DevicePath \"\"" Sep 30 12:39:11 crc kubenswrapper[4672]: I0930 12:39:11.325701 4672 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a3ef8453-d983-4c2a-94b0-c25645f83c23-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 12:39:11 crc kubenswrapper[4672]: I0930 12:39:11.325709 4672 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3ef8453-d983-4c2a-94b0-c25645f83c23-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 12:39:11 crc kubenswrapper[4672]: I0930 12:39:11.325717 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bpv54\" (UniqueName: \"kubernetes.io/projected/a3ef8453-d983-4c2a-94b0-c25645f83c23-kube-api-access-bpv54\") on node \"crc\" DevicePath \"\"" Sep 30 12:39:11 crc kubenswrapper[4672]: I0930 12:39:11.325727 4672 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a3ef8453-d983-4c2a-94b0-c25645f83c23-credential-keys\") on node \"crc\" DevicePath \"\"" Sep 30 12:39:12 crc kubenswrapper[4672]: I0930 12:39:12.107878 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-ss5sv"] Sep 30 12:39:12 crc kubenswrapper[4672]: I0930 12:39:12.119727 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-ss5sv"] Sep 30 12:39:12 crc kubenswrapper[4672]: I0930 12:39:12.209769 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-stw4v"] Sep 30 12:39:12 crc kubenswrapper[4672]: E0930 12:39:12.210224 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3ef8453-d983-4c2a-94b0-c25645f83c23" containerName="keystone-bootstrap" Sep 30 12:39:12 crc kubenswrapper[4672]: I0930 12:39:12.210237 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3ef8453-d983-4c2a-94b0-c25645f83c23" containerName="keystone-bootstrap" Sep 30 12:39:12 crc kubenswrapper[4672]: I0930 12:39:12.210424 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3ef8453-d983-4c2a-94b0-c25645f83c23" containerName="keystone-bootstrap" Sep 30 12:39:12 crc kubenswrapper[4672]: I0930 12:39:12.211062 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-stw4v" Sep 30 12:39:12 crc kubenswrapper[4672]: I0930 12:39:12.214594 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Sep 30 12:39:12 crc kubenswrapper[4672]: I0930 12:39:12.214741 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-4nbwb" Sep 30 12:39:12 crc kubenswrapper[4672]: I0930 12:39:12.214880 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Sep 30 12:39:12 crc kubenswrapper[4672]: I0930 12:39:12.215006 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Sep 30 12:39:12 crc kubenswrapper[4672]: I0930 12:39:12.215779 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-stw4v"] Sep 30 12:39:12 crc kubenswrapper[4672]: I0930 12:39:12.349509 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/8c431961-0987-401e-8486-c2dd0887721b-credential-keys\") pod \"keystone-bootstrap-stw4v\" (UID: \"8c431961-0987-401e-8486-c2dd0887721b\") " pod="openstack/keystone-bootstrap-stw4v" Sep 30 12:39:12 crc kubenswrapper[4672]: I0930 12:39:12.349585 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c431961-0987-401e-8486-c2dd0887721b-config-data\") pod \"keystone-bootstrap-stw4v\" (UID: \"8c431961-0987-401e-8486-c2dd0887721b\") " pod="openstack/keystone-bootstrap-stw4v" Sep 30 12:39:12 crc kubenswrapper[4672]: I0930 12:39:12.349654 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c431961-0987-401e-8486-c2dd0887721b-combined-ca-bundle\") pod \"keystone-bootstrap-stw4v\" (UID: \"8c431961-0987-401e-8486-c2dd0887721b\") " pod="openstack/keystone-bootstrap-stw4v" Sep 30 12:39:12 crc kubenswrapper[4672]: I0930 12:39:12.349681 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rmbh6\" (UniqueName: \"kubernetes.io/projected/8c431961-0987-401e-8486-c2dd0887721b-kube-api-access-rmbh6\") pod \"keystone-bootstrap-stw4v\" (UID: \"8c431961-0987-401e-8486-c2dd0887721b\") " pod="openstack/keystone-bootstrap-stw4v" Sep 30 12:39:12 crc kubenswrapper[4672]: I0930 12:39:12.349725 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8c431961-0987-401e-8486-c2dd0887721b-fernet-keys\") pod \"keystone-bootstrap-stw4v\" (UID: \"8c431961-0987-401e-8486-c2dd0887721b\") " pod="openstack/keystone-bootstrap-stw4v" Sep 30 12:39:12 crc kubenswrapper[4672]: I0930 12:39:12.349780 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8c431961-0987-401e-8486-c2dd0887721b-scripts\") pod \"keystone-bootstrap-stw4v\" (UID: \"8c431961-0987-401e-8486-c2dd0887721b\") " pod="openstack/keystone-bootstrap-stw4v" Sep 30 12:39:12 crc kubenswrapper[4672]: I0930 12:39:12.450950 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c431961-0987-401e-8486-c2dd0887721b-config-data\") pod \"keystone-bootstrap-stw4v\" (UID: 
\"8c431961-0987-401e-8486-c2dd0887721b\") " pod="openstack/keystone-bootstrap-stw4v" Sep 30 12:39:12 crc kubenswrapper[4672]: I0930 12:39:12.451011 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c431961-0987-401e-8486-c2dd0887721b-combined-ca-bundle\") pod \"keystone-bootstrap-stw4v\" (UID: \"8c431961-0987-401e-8486-c2dd0887721b\") " pod="openstack/keystone-bootstrap-stw4v" Sep 30 12:39:12 crc kubenswrapper[4672]: I0930 12:39:12.451030 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rmbh6\" (UniqueName: \"kubernetes.io/projected/8c431961-0987-401e-8486-c2dd0887721b-kube-api-access-rmbh6\") pod \"keystone-bootstrap-stw4v\" (UID: \"8c431961-0987-401e-8486-c2dd0887721b\") " pod="openstack/keystone-bootstrap-stw4v" Sep 30 12:39:12 crc kubenswrapper[4672]: I0930 12:39:12.451060 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8c431961-0987-401e-8486-c2dd0887721b-fernet-keys\") pod \"keystone-bootstrap-stw4v\" (UID: \"8c431961-0987-401e-8486-c2dd0887721b\") " pod="openstack/keystone-bootstrap-stw4v" Sep 30 12:39:12 crc kubenswrapper[4672]: I0930 12:39:12.451097 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8c431961-0987-401e-8486-c2dd0887721b-scripts\") pod \"keystone-bootstrap-stw4v\" (UID: \"8c431961-0987-401e-8486-c2dd0887721b\") " pod="openstack/keystone-bootstrap-stw4v" Sep 30 12:39:12 crc kubenswrapper[4672]: I0930 12:39:12.451159 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/8c431961-0987-401e-8486-c2dd0887721b-credential-keys\") pod \"keystone-bootstrap-stw4v\" (UID: \"8c431961-0987-401e-8486-c2dd0887721b\") " pod="openstack/keystone-bootstrap-stw4v" Sep 30 12:39:12 crc kubenswrapper[4672]: I0930 12:39:12.456025 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8c431961-0987-401e-8486-c2dd0887721b-scripts\") pod \"keystone-bootstrap-stw4v\" (UID: \"8c431961-0987-401e-8486-c2dd0887721b\") " pod="openstack/keystone-bootstrap-stw4v" Sep 30 12:39:12 crc kubenswrapper[4672]: I0930 12:39:12.456100 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c431961-0987-401e-8486-c2dd0887721b-combined-ca-bundle\") pod \"keystone-bootstrap-stw4v\" (UID: \"8c431961-0987-401e-8486-c2dd0887721b\") " pod="openstack/keystone-bootstrap-stw4v" Sep 30 12:39:12 crc kubenswrapper[4672]: I0930 12:39:12.457568 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/8c431961-0987-401e-8486-c2dd0887721b-credential-keys\") pod \"keystone-bootstrap-stw4v\" (UID: \"8c431961-0987-401e-8486-c2dd0887721b\") " pod="openstack/keystone-bootstrap-stw4v" Sep 30 12:39:12 crc kubenswrapper[4672]: I0930 12:39:12.459830 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8c431961-0987-401e-8486-c2dd0887721b-fernet-keys\") pod \"keystone-bootstrap-stw4v\" (UID: \"8c431961-0987-401e-8486-c2dd0887721b\") " pod="openstack/keystone-bootstrap-stw4v" Sep 30 12:39:12 crc kubenswrapper[4672]: I0930 12:39:12.469655 4672 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c431961-0987-401e-8486-c2dd0887721b-config-data\") pod \"keystone-bootstrap-stw4v\" (UID: \"8c431961-0987-401e-8486-c2dd0887721b\") " pod="openstack/keystone-bootstrap-stw4v" Sep 30 12:39:12 crc kubenswrapper[4672]: I0930 12:39:12.486924 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rmbh6\" (UniqueName: \"kubernetes.io/projected/8c431961-0987-401e-8486-c2dd0887721b-kube-api-access-rmbh6\") pod \"keystone-bootstrap-stw4v\" (UID: \"8c431961-0987-401e-8486-c2dd0887721b\") " pod="openstack/keystone-bootstrap-stw4v" Sep 30 12:39:12 crc kubenswrapper[4672]: I0930 12:39:12.554400 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-stw4v" Sep 30 12:39:13 crc kubenswrapper[4672]: I0930 12:39:13.428609 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a3ef8453-d983-4c2a-94b0-c25645f83c23" path="/var/lib/kubelet/pods/a3ef8453-d983-4c2a-94b0-c25645f83c23/volumes" Sep 30 12:39:16 crc kubenswrapper[4672]: E0930 12:39:16.032470 4672 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.83:5001/podified-master-centos10/openstack-placement-api:watcher_latest" Sep 30 12:39:16 crc kubenswrapper[4672]: E0930 12:39:16.036464 4672 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.83:5001/podified-master-centos10/openstack-placement-api:watcher_latest" Sep 30 12:39:16 crc kubenswrapper[4672]: E0930 12:39:16.036622 4672 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:placement-db-sync,Image:38.102.83.83:5001/podified-master-centos10/openstack-placement-api:watcher_latest,Command:[/bin/bash],Args:[-c 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/placement,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:placement-dbsync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-t2sb2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42482,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-db-sync-9kp7h_openstack(c87bb11c-a04e-4018-ba41-d628795a926e): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Sep 30 12:39:16 crc kubenswrapper[4672]: E0930 12:39:16.037793 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"placement-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/placement-db-sync-9kp7h" podUID="c87bb11c-a04e-4018-ba41-d628795a926e" Sep 30 12:39:16 crc kubenswrapper[4672]: I0930 12:39:16.323689 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c515ed89-0f3e-4a37-b5e9-53602578d30a","Type":"ContainerDied","Data":"c7bea0ac4dcd141881ba7b6c29e5ed3daf48b2aac302acee9c45eb7ca2e44c2b"} Sep 30 12:39:16 crc kubenswrapper[4672]: I0930 12:39:16.323958 4672 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c7bea0ac4dcd141881ba7b6c29e5ed3daf48b2aac302acee9c45eb7ca2e44c2b" Sep 30 12:39:16 crc kubenswrapper[4672]: I0930 12:39:16.331912 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"69e757cd-3187-40f9-828a-cec497ae50dd","Type":"ContainerDied","Data":"0b89405dab5502bb0806392b897994c53d588e27db7c354bc5d04573781464f1"} Sep 30 12:39:16 crc kubenswrapper[4672]: I0930 12:39:16.331969 4672 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="0b89405dab5502bb0806392b897994c53d588e27db7c354bc5d04573781464f1" Sep 30 12:39:16 crc kubenswrapper[4672]: I0930 12:39:16.333597 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6fcd88f6c5-l7hjg" event={"ID":"a2fff124-9dc4-46ee-a5fa-9cef98b3eed9","Type":"ContainerDied","Data":"92aaeccd6de8cec79d0e74405638807fc0a2003a87299f42d3e40a1c18614d6d"} Sep 30 12:39:16 crc kubenswrapper[4672]: I0930 12:39:16.333644 4672 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="92aaeccd6de8cec79d0e74405638807fc0a2003a87299f42d3e40a1c18614d6d" Sep 30 12:39:16 crc kubenswrapper[4672]: E0930 12:39:16.370583 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"placement-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.83:5001/podified-master-centos10/openstack-placement-api:watcher_latest\\\"\"" pod="openstack/placement-db-sync-9kp7h" podUID="c87bb11c-a04e-4018-ba41-d628795a926e" Sep 30 12:39:16 crc kubenswrapper[4672]: I0930 12:39:16.392626 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6fcd88f6c5-l7hjg" Sep 30 12:39:16 crc kubenswrapper[4672]: I0930 12:39:16.402256 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Sep 30 12:39:16 crc kubenswrapper[4672]: I0930 12:39:16.422296 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Sep 30 12:39:16 crc kubenswrapper[4672]: I0930 12:39:16.427840 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jppd5\" (UniqueName: \"kubernetes.io/projected/69e757cd-3187-40f9-828a-cec497ae50dd-kube-api-access-jppd5\") pod \"69e757cd-3187-40f9-828a-cec497ae50dd\" (UID: \"69e757cd-3187-40f9-828a-cec497ae50dd\") " Sep 30 12:39:16 crc kubenswrapper[4672]: I0930 12:39:16.427885 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c515ed89-0f3e-4a37-b5e9-53602578d30a-logs\") pod \"c515ed89-0f3e-4a37-b5e9-53602578d30a\" (UID: \"c515ed89-0f3e-4a37-b5e9-53602578d30a\") " Sep 30 12:39:16 crc kubenswrapper[4672]: I0930 12:39:16.427969 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"69e757cd-3187-40f9-828a-cec497ae50dd\" (UID: \"69e757cd-3187-40f9-828a-cec497ae50dd\") " Sep 30 12:39:16 crc kubenswrapper[4672]: I0930 12:39:16.427998 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c515ed89-0f3e-4a37-b5e9-53602578d30a-combined-ca-bundle\") pod \"c515ed89-0f3e-4a37-b5e9-53602578d30a\" (UID: \"c515ed89-0f3e-4a37-b5e9-53602578d30a\") " Sep 30 12:39:16 crc kubenswrapper[4672]: I0930 12:39:16.428043 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a2fff124-9dc4-46ee-a5fa-9cef98b3eed9-dns-swift-storage-0\") pod \"a2fff124-9dc4-46ee-a5fa-9cef98b3eed9\" (UID: \"a2fff124-9dc4-46ee-a5fa-9cef98b3eed9\") " Sep 30 12:39:16 crc kubenswrapper[4672]: I0930 12:39:16.428065 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/69e757cd-3187-40f9-828a-cec497ae50dd-combined-ca-bundle\") pod \"69e757cd-3187-40f9-828a-cec497ae50dd\" (UID: \"69e757cd-3187-40f9-828a-cec497ae50dd\") " Sep 30 12:39:16 crc kubenswrapper[4672]: I0930 12:39:16.428086 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-96bqx\" (UniqueName: \"kubernetes.io/projected/c515ed89-0f3e-4a37-b5e9-53602578d30a-kube-api-access-96bqx\") pod \"c515ed89-0f3e-4a37-b5e9-53602578d30a\" (UID: \"c515ed89-0f3e-4a37-b5e9-53602578d30a\") " Sep 30 12:39:16 crc kubenswrapper[4672]: I0930 12:39:16.428114 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a2fff124-9dc4-46ee-a5fa-9cef98b3eed9-dns-svc\") pod \"a2fff124-9dc4-46ee-a5fa-9cef98b3eed9\" (UID: \"a2fff124-9dc4-46ee-a5fa-9cef98b3eed9\") " Sep 30 12:39:16 crc kubenswrapper[4672]: I0930 12:39:16.428135 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/69e757cd-3187-40f9-828a-cec497ae50dd-httpd-run\") pod \"69e757cd-3187-40f9-828a-cec497ae50dd\" (UID: \"69e757cd-3187-40f9-828a-cec497ae50dd\") " Sep 30 12:39:16 crc kubenswrapper[4672]: I0930 12:39:16.429315 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69e757cd-3187-40f9-828a-cec497ae50dd-config-data\") pod \"69e757cd-3187-40f9-828a-cec497ae50dd\" (UID: \"69e757cd-3187-40f9-828a-cec497ae50dd\") " Sep 30 12:39:16 crc kubenswrapper[4672]: I0930 12:39:16.429369 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a2fff124-9dc4-46ee-a5fa-9cef98b3eed9-config\") pod \"a2fff124-9dc4-46ee-a5fa-9cef98b3eed9\" (UID: \"a2fff124-9dc4-46ee-a5fa-9cef98b3eed9\") " Sep 30 12:39:16 crc kubenswrapper[4672]: I0930 12:39:16.429387 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tpntp\" (UniqueName: \"kubernetes.io/projected/a2fff124-9dc4-46ee-a5fa-9cef98b3eed9-kube-api-access-tpntp\") pod \"a2fff124-9dc4-46ee-a5fa-9cef98b3eed9\" (UID: \"a2fff124-9dc4-46ee-a5fa-9cef98b3eed9\") " Sep 30 12:39:16 crc kubenswrapper[4672]: I0930 12:39:16.429410 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"c515ed89-0f3e-4a37-b5e9-53602578d30a\" (UID: \"c515ed89-0f3e-4a37-b5e9-53602578d30a\") " Sep 30 12:39:16 crc kubenswrapper[4672]: I0930 12:39:16.429426 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c515ed89-0f3e-4a37-b5e9-53602578d30a-httpd-run\") pod \"c515ed89-0f3e-4a37-b5e9-53602578d30a\" (UID: \"c515ed89-0f3e-4a37-b5e9-53602578d30a\") " Sep 30 12:39:16 crc kubenswrapper[4672]: I0930 12:39:16.429477 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c515ed89-0f3e-4a37-b5e9-53602578d30a-scripts\") pod \"c515ed89-0f3e-4a37-b5e9-53602578d30a\" (UID: \"c515ed89-0f3e-4a37-b5e9-53602578d30a\") " Sep 30 12:39:16 crc kubenswrapper[4672]: I0930 12:39:16.429494 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a2fff124-9dc4-46ee-a5fa-9cef98b3eed9-ovsdbserver-sb\") pod 
\"a2fff124-9dc4-46ee-a5fa-9cef98b3eed9\" (UID: \"a2fff124-9dc4-46ee-a5fa-9cef98b3eed9\") " Sep 30 12:39:16 crc kubenswrapper[4672]: I0930 12:39:16.429511 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a2fff124-9dc4-46ee-a5fa-9cef98b3eed9-ovsdbserver-nb\") pod \"a2fff124-9dc4-46ee-a5fa-9cef98b3eed9\" (UID: \"a2fff124-9dc4-46ee-a5fa-9cef98b3eed9\") " Sep 30 12:39:16 crc kubenswrapper[4672]: I0930 12:39:16.429538 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/69e757cd-3187-40f9-828a-cec497ae50dd-scripts\") pod \"69e757cd-3187-40f9-828a-cec497ae50dd\" (UID: \"69e757cd-3187-40f9-828a-cec497ae50dd\") " Sep 30 12:39:16 crc kubenswrapper[4672]: I0930 12:39:16.429560 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c515ed89-0f3e-4a37-b5e9-53602578d30a-config-data\") pod \"c515ed89-0f3e-4a37-b5e9-53602578d30a\" (UID: \"c515ed89-0f3e-4a37-b5e9-53602578d30a\") " Sep 30 12:39:16 crc kubenswrapper[4672]: I0930 12:39:16.429590 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/69e757cd-3187-40f9-828a-cec497ae50dd-logs\") pod \"69e757cd-3187-40f9-828a-cec497ae50dd\" (UID: \"69e757cd-3187-40f9-828a-cec497ae50dd\") " Sep 30 12:39:16 crc kubenswrapper[4672]: I0930 12:39:16.428733 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c515ed89-0f3e-4a37-b5e9-53602578d30a-logs" (OuterVolumeSpecName: "logs") pod "c515ed89-0f3e-4a37-b5e9-53602578d30a" (UID: "c515ed89-0f3e-4a37-b5e9-53602578d30a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 12:39:16 crc kubenswrapper[4672]: I0930 12:39:16.433924 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c515ed89-0f3e-4a37-b5e9-53602578d30a-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "c515ed89-0f3e-4a37-b5e9-53602578d30a" (UID: "c515ed89-0f3e-4a37-b5e9-53602578d30a"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 12:39:16 crc kubenswrapper[4672]: I0930 12:39:16.436860 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/69e757cd-3187-40f9-828a-cec497ae50dd-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "69e757cd-3187-40f9-828a-cec497ae50dd" (UID: "69e757cd-3187-40f9-828a-cec497ae50dd"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 12:39:16 crc kubenswrapper[4672]: I0930 12:39:16.450987 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/69e757cd-3187-40f9-828a-cec497ae50dd-logs" (OuterVolumeSpecName: "logs") pod "69e757cd-3187-40f9-828a-cec497ae50dd" (UID: "69e757cd-3187-40f9-828a-cec497ae50dd"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 12:39:16 crc kubenswrapper[4672]: I0930 12:39:16.451989 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a2fff124-9dc4-46ee-a5fa-9cef98b3eed9-kube-api-access-tpntp" (OuterVolumeSpecName: "kube-api-access-tpntp") pod "a2fff124-9dc4-46ee-a5fa-9cef98b3eed9" (UID: "a2fff124-9dc4-46ee-a5fa-9cef98b3eed9"). InnerVolumeSpecName "kube-api-access-tpntp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 12:39:16 crc kubenswrapper[4672]: I0930 12:39:16.452495 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c515ed89-0f3e-4a37-b5e9-53602578d30a-scripts" (OuterVolumeSpecName: "scripts") pod "c515ed89-0f3e-4a37-b5e9-53602578d30a" (UID: "c515ed89-0f3e-4a37-b5e9-53602578d30a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:39:16 crc kubenswrapper[4672]: I0930 12:39:16.455209 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "glance") pod "69e757cd-3187-40f9-828a-cec497ae50dd" (UID: "69e757cd-3187-40f9-828a-cec497ae50dd"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Sep 30 12:39:16 crc kubenswrapper[4672]: I0930 12:39:16.461190 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/69e757cd-3187-40f9-828a-cec497ae50dd-kube-api-access-jppd5" (OuterVolumeSpecName: "kube-api-access-jppd5") pod "69e757cd-3187-40f9-828a-cec497ae50dd" (UID: "69e757cd-3187-40f9-828a-cec497ae50dd"). InnerVolumeSpecName "kube-api-access-jppd5". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 12:39:16 crc kubenswrapper[4672]: I0930 12:39:16.462776 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c515ed89-0f3e-4a37-b5e9-53602578d30a-kube-api-access-96bqx" (OuterVolumeSpecName: "kube-api-access-96bqx") pod "c515ed89-0f3e-4a37-b5e9-53602578d30a" (UID: "c515ed89-0f3e-4a37-b5e9-53602578d30a"). InnerVolumeSpecName "kube-api-access-96bqx". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 12:39:16 crc kubenswrapper[4672]: I0930 12:39:16.465969 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69e757cd-3187-40f9-828a-cec497ae50dd-scripts" (OuterVolumeSpecName: "scripts") pod "69e757cd-3187-40f9-828a-cec497ae50dd" (UID: "69e757cd-3187-40f9-828a-cec497ae50dd"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:39:16 crc kubenswrapper[4672]: I0930 12:39:16.496338 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "glance") pod "c515ed89-0f3e-4a37-b5e9-53602578d30a" (UID: "c515ed89-0f3e-4a37-b5e9-53602578d30a"). InnerVolumeSpecName "local-storage08-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Sep 30 12:39:16 crc kubenswrapper[4672]: I0930 12:39:16.540066 4672 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c515ed89-0f3e-4a37-b5e9-53602578d30a-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 12:39:16 crc kubenswrapper[4672]: I0930 12:39:16.540118 4672 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/69e757cd-3187-40f9-828a-cec497ae50dd-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 12:39:16 crc kubenswrapper[4672]: I0930 12:39:16.540131 4672 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/69e757cd-3187-40f9-828a-cec497ae50dd-logs\") on node \"crc\" DevicePath \"\"" Sep 30 12:39:16 crc kubenswrapper[4672]: I0930 12:39:16.540149 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jppd5\" (UniqueName: \"kubernetes.io/projected/69e757cd-3187-40f9-828a-cec497ae50dd-kube-api-access-jppd5\") on node \"crc\" DevicePath \"\"" Sep 30 12:39:16 crc kubenswrapper[4672]: I0930 12:39:16.540163 4672 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c515ed89-0f3e-4a37-b5e9-53602578d30a-logs\") on node \"crc\" DevicePath \"\"" Sep 30 12:39:16 crc kubenswrapper[4672]: I0930 12:39:16.540199 4672 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Sep 30 12:39:16 crc kubenswrapper[4672]: I0930 12:39:16.540218 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-96bqx\" (UniqueName: \"kubernetes.io/projected/c515ed89-0f3e-4a37-b5e9-53602578d30a-kube-api-access-96bqx\") on node \"crc\" DevicePath \"\"" Sep 30 12:39:16 crc kubenswrapper[4672]: I0930 12:39:16.540231 4672 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/69e757cd-3187-40f9-828a-cec497ae50dd-httpd-run\") on node \"crc\" DevicePath \"\"" Sep 30 12:39:16 crc kubenswrapper[4672]: I0930 12:39:16.540247 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tpntp\" (UniqueName: \"kubernetes.io/projected/a2fff124-9dc4-46ee-a5fa-9cef98b3eed9-kube-api-access-tpntp\") on node \"crc\" DevicePath \"\"" Sep 30 12:39:16 crc kubenswrapper[4672]: I0930 12:39:16.540280 4672 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" " Sep 30 12:39:16 crc kubenswrapper[4672]: I0930 12:39:16.540292 4672 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c515ed89-0f3e-4a37-b5e9-53602578d30a-httpd-run\") on node \"crc\" DevicePath \"\"" Sep 30 12:39:16 crc kubenswrapper[4672]: I0930 12:39:16.562543 4672 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-6fcd88f6c5-l7hjg" podUID="a2fff124-9dc4-46ee-a5fa-9cef98b3eed9" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.137:5353: i/o timeout" Sep 30 12:39:16 crc kubenswrapper[4672]: I0930 12:39:16.565626 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69e757cd-3187-40f9-828a-cec497ae50dd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "69e757cd-3187-40f9-828a-cec497ae50dd" (UID: 
"69e757cd-3187-40f9-828a-cec497ae50dd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:39:16 crc kubenswrapper[4672]: I0930 12:39:16.569570 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c515ed89-0f3e-4a37-b5e9-53602578d30a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c515ed89-0f3e-4a37-b5e9-53602578d30a" (UID: "c515ed89-0f3e-4a37-b5e9-53602578d30a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:39:16 crc kubenswrapper[4672]: I0930 12:39:16.573674 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-rd9t8"] Sep 30 12:39:16 crc kubenswrapper[4672]: I0930 12:39:16.585222 4672 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Sep 30 12:39:16 crc kubenswrapper[4672]: I0930 12:39:16.588431 4672 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc" Sep 30 12:39:16 crc kubenswrapper[4672]: I0930 12:39:16.617681 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69e757cd-3187-40f9-828a-cec497ae50dd-config-data" (OuterVolumeSpecName: "config-data") pod "69e757cd-3187-40f9-828a-cec497ae50dd" (UID: "69e757cd-3187-40f9-828a-cec497ae50dd"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:39:16 crc kubenswrapper[4672]: I0930 12:39:16.621400 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a2fff124-9dc4-46ee-a5fa-9cef98b3eed9-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a2fff124-9dc4-46ee-a5fa-9cef98b3eed9" (UID: "a2fff124-9dc4-46ee-a5fa-9cef98b3eed9"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 12:39:16 crc kubenswrapper[4672]: I0930 12:39:16.627602 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a2fff124-9dc4-46ee-a5fa-9cef98b3eed9-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "a2fff124-9dc4-46ee-a5fa-9cef98b3eed9" (UID: "a2fff124-9dc4-46ee-a5fa-9cef98b3eed9"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 12:39:16 crc kubenswrapper[4672]: I0930 12:39:16.647646 4672 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69e757cd-3187-40f9-828a-cec497ae50dd-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 12:39:16 crc kubenswrapper[4672]: I0930 12:39:16.647711 4672 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\"" Sep 30 12:39:16 crc kubenswrapper[4672]: I0930 12:39:16.647726 4672 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Sep 30 12:39:16 crc kubenswrapper[4672]: I0930 12:39:16.647737 4672 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c515ed89-0f3e-4a37-b5e9-53602578d30a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 12:39:16 crc kubenswrapper[4672]: I0930 12:39:16.647750 4672 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a2fff124-9dc4-46ee-a5fa-9cef98b3eed9-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Sep 30 12:39:16 crc kubenswrapper[4672]: I0930 12:39:16.647765 4672 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69e757cd-3187-40f9-828a-cec497ae50dd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 12:39:16 crc kubenswrapper[4672]: I0930 12:39:16.647775 4672 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a2fff124-9dc4-46ee-a5fa-9cef98b3eed9-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 30 12:39:16 crc kubenswrapper[4672]: I0930 12:39:16.649204 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a2fff124-9dc4-46ee-a5fa-9cef98b3eed9-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "a2fff124-9dc4-46ee-a5fa-9cef98b3eed9" (UID: "a2fff124-9dc4-46ee-a5fa-9cef98b3eed9"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 12:39:16 crc kubenswrapper[4672]: I0930 12:39:16.664967 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c515ed89-0f3e-4a37-b5e9-53602578d30a-config-data" (OuterVolumeSpecName: "config-data") pod "c515ed89-0f3e-4a37-b5e9-53602578d30a" (UID: "c515ed89-0f3e-4a37-b5e9-53602578d30a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:39:16 crc kubenswrapper[4672]: I0930 12:39:16.667534 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a2fff124-9dc4-46ee-a5fa-9cef98b3eed9-config" (OuterVolumeSpecName: "config") pod "a2fff124-9dc4-46ee-a5fa-9cef98b3eed9" (UID: "a2fff124-9dc4-46ee-a5fa-9cef98b3eed9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 12:39:16 crc kubenswrapper[4672]: I0930 12:39:16.682805 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a2fff124-9dc4-46ee-a5fa-9cef98b3eed9-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "a2fff124-9dc4-46ee-a5fa-9cef98b3eed9" (UID: "a2fff124-9dc4-46ee-a5fa-9cef98b3eed9"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 12:39:16 crc kubenswrapper[4672]: I0930 12:39:16.749399 4672 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a2fff124-9dc4-46ee-a5fa-9cef98b3eed9-config\") on node \"crc\" DevicePath \"\"" Sep 30 12:39:16 crc kubenswrapper[4672]: I0930 12:39:16.749432 4672 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a2fff124-9dc4-46ee-a5fa-9cef98b3eed9-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Sep 30 12:39:16 crc kubenswrapper[4672]: I0930 12:39:16.749442 4672 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a2fff124-9dc4-46ee-a5fa-9cef98b3eed9-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Sep 30 12:39:16 crc kubenswrapper[4672]: I0930 12:39:16.749451 4672 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c515ed89-0f3e-4a37-b5e9-53602578d30a-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 12:39:16 crc kubenswrapper[4672]: I0930 12:39:16.859837 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-9cnkc"] Sep 30 12:39:16 crc kubenswrapper[4672]: I0930 12:39:16.868639 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-applier-0"] Sep 30 12:39:16 crc kubenswrapper[4672]: W0930 12:39:16.870703 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd637de0b_aed1_45ef_9d86_c6f7c2f188e1.slice/crio-7f0da9ba6f558c390034153bf121c1b68733ce24abfd4c3da5aa859fff0b2b5d WatchSource:0}: Error finding container 7f0da9ba6f558c390034153bf121c1b68733ce24abfd4c3da5aa859fff0b2b5d: Status 404 returned error can't find the container with id 7f0da9ba6f558c390034153bf121c1b68733ce24abfd4c3da5aa859fff0b2b5d Sep 30 12:39:17 crc kubenswrapper[4672]: I0930 12:39:17.139192 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-api-0"] Sep 30 12:39:17 crc kubenswrapper[4672]: I0930 12:39:17.150905 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-decision-engine-0"] Sep 30 12:39:17 crc kubenswrapper[4672]: W0930 12:39:17.157483 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod916c2fc8_25b0_4a10_82d3_6b3e51785690.slice/crio-05afdccf7785943f6cb29d0d4db57b287a304f04daccab6e6e575da3ea1622f7 WatchSource:0}: Error finding container 05afdccf7785943f6cb29d0d4db57b287a304f04daccab6e6e575da3ea1622f7: Status 404 returned error can't find the container with id 05afdccf7785943f6cb29d0d4db57b287a304f04daccab6e6e575da3ea1622f7 Sep 30 12:39:17 crc kubenswrapper[4672]: I0930 12:39:17.158992 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-8bdf69cc8-lsxz6"] Sep 30 12:39:17 crc kubenswrapper[4672]: W0930 12:39:17.162910 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4f1bee84_650b_4f0b_a657_e6701ee51823.slice/crio-6a386f8b0dd02ccce4598196ef517f41c5d961ad1243b1858ab6d701a41be982 WatchSource:0}: Error finding container 6a386f8b0dd02ccce4598196ef517f41c5d961ad1243b1858ab6d701a41be982: Status 404 returned error can't find the container with id 6a386f8b0dd02ccce4598196ef517f41c5d961ad1243b1858ab6d701a41be982 Sep 30 12:39:17 crc kubenswrapper[4672]: W0930 
12:39:17.204814 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4e153eb6_5f25_4214_8e8a_14c37a36fc06.slice/crio-5d17aeee535d5ed1e97a0ffba135cdb2d653d3c559c871ce2f79cc6be6849d75 WatchSource:0}: Error finding container 5d17aeee535d5ed1e97a0ffba135cdb2d653d3c559c871ce2f79cc6be6849d75: Status 404 returned error can't find the container with id 5d17aeee535d5ed1e97a0ffba135cdb2d653d3c559c871ce2f79cc6be6849d75 Sep 30 12:39:17 crc kubenswrapper[4672]: I0930 12:39:17.357191 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a813f7c2-e727-42a0-8779-8619fe6e8165","Type":"ContainerStarted","Data":"af6331668bdba254199db134ae8885d681bf81141ac6733dbfa4d255be8e9c0e"} Sep 30 12:39:17 crc kubenswrapper[4672]: I0930 12:39:17.362096 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-567d7b57-5g9zw" event={"ID":"96f92d5d-ba87-4de0-955d-998845ea9010","Type":"ContainerStarted","Data":"586c3bebe202fe92c0504e86b49cf04794ba5b560a39275854f3a1033b6a055d"} Sep 30 12:39:17 crc kubenswrapper[4672]: I0930 12:39:17.362138 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-567d7b57-5g9zw" event={"ID":"96f92d5d-ba87-4de0-955d-998845ea9010","Type":"ContainerStarted","Data":"6fe46b3cd368d55265ed69ecd7caa903a2d78380fec92da12e77686aa3dddc3a"} Sep 30 12:39:17 crc kubenswrapper[4672]: I0930 12:39:17.362310 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-567d7b57-5g9zw" podUID="96f92d5d-ba87-4de0-955d-998845ea9010" containerName="horizon-log" containerID="cri-o://6fe46b3cd368d55265ed69ecd7caa903a2d78380fec92da12e77686aa3dddc3a" gracePeriod=30 Sep 30 12:39:17 crc kubenswrapper[4672]: I0930 12:39:17.362779 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-567d7b57-5g9zw" podUID="96f92d5d-ba87-4de0-955d-998845ea9010" containerName="horizon" containerID="cri-o://586c3bebe202fe92c0504e86b49cf04794ba5b560a39275854f3a1033b6a055d" gracePeriod=30 Sep 30 12:39:17 crc kubenswrapper[4672]: I0930 12:39:17.370797 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-548589fd89-v67ld" event={"ID":"8dd596d1-129e-4bb2-9e9b-dac1d09323d2","Type":"ContainerStarted","Data":"9b9337d34ba5a7760d1895b7d4882e842cb61d0a824ae87f39ace0e82aaf87af"} Sep 30 12:39:17 crc kubenswrapper[4672]: I0930 12:39:17.370839 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-548589fd89-v67ld" event={"ID":"8dd596d1-129e-4bb2-9e9b-dac1d09323d2","Type":"ContainerStarted","Data":"dce429d2825b6bacf88cae8bbe4c173141d791716ef64eb0fe317d4e29e4bcc1"} Sep 30 12:39:17 crc kubenswrapper[4672]: I0930 12:39:17.370946 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-548589fd89-v67ld" podUID="8dd596d1-129e-4bb2-9e9b-dac1d09323d2" containerName="horizon-log" containerID="cri-o://dce429d2825b6bacf88cae8bbe4c173141d791716ef64eb0fe317d4e29e4bcc1" gracePeriod=30 Sep 30 12:39:17 crc kubenswrapper[4672]: I0930 12:39:17.371030 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-548589fd89-v67ld" podUID="8dd596d1-129e-4bb2-9e9b-dac1d09323d2" containerName="horizon" containerID="cri-o://9b9337d34ba5a7760d1895b7d4882e842cb61d0a824ae87f39ace0e82aaf87af" gracePeriod=30 Sep 30 12:39:17 crc kubenswrapper[4672]: I0930 12:39:17.375063 4672 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/horizon-8bdf69cc8-lsxz6" event={"ID":"4e153eb6-5f25-4214-8e8a-14c37a36fc06","Type":"ContainerStarted","Data":"5d17aeee535d5ed1e97a0ffba135cdb2d653d3c559c871ce2f79cc6be6849d75"} Sep 30 12:39:17 crc kubenswrapper[4672]: I0930 12:39:17.386292 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-rd9t8" event={"ID":"db83994c-a577-4d20-a544-3950abb7273b","Type":"ContainerStarted","Data":"2ff2e0394a73d16946196cb54745bc30f70ac4238c5fc730f22b5d9ec507751f"} Sep 30 12:39:17 crc kubenswrapper[4672]: I0930 12:39:17.386509 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-rd9t8" event={"ID":"db83994c-a577-4d20-a544-3950abb7273b","Type":"ContainerStarted","Data":"37b877bf0bb61bf00bbd38e8ea18b6d0da92c4aee9595c3e5d776fcecd51cf63"} Sep 30 12:39:17 crc kubenswrapper[4672]: I0930 12:39:17.394688 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6f749887b9-hl4rd" event={"ID":"4dcfda15-b815-4733-b3c7-0312473f7355","Type":"ContainerStarted","Data":"2ca40cc4750bf50f77b8f00e761b61a2c75ac13af658d7a70bb22c83ab627d31"} Sep 30 12:39:17 crc kubenswrapper[4672]: I0930 12:39:17.394741 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6f749887b9-hl4rd" event={"ID":"4dcfda15-b815-4733-b3c7-0312473f7355","Type":"ContainerStarted","Data":"501b2136d15d3cf240eff2b922fac22c611eb5f0c9b5a54fd75a073a669049c1"} Sep 30 12:39:17 crc kubenswrapper[4672]: I0930 12:39:17.394781 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-6f749887b9-hl4rd" podUID="4dcfda15-b815-4733-b3c7-0312473f7355" containerName="horizon-log" containerID="cri-o://501b2136d15d3cf240eff2b922fac22c611eb5f0c9b5a54fd75a073a669049c1" gracePeriod=30 Sep 30 12:39:17 crc kubenswrapper[4672]: I0930 12:39:17.394809 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-6f749887b9-hl4rd" podUID="4dcfda15-b815-4733-b3c7-0312473f7355" containerName="horizon" containerID="cri-o://2ca40cc4750bf50f77b8f00e761b61a2c75ac13af658d7a70bb22c83ab627d31" gracePeriod=30 Sep 30 12:39:17 crc kubenswrapper[4672]: I0930 12:39:17.394970 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-567d7b57-5g9zw" podStartSLOduration=3.042411787 podStartE2EDuration="20.394951529s" podCreationTimestamp="2025-09-30 12:38:57 +0000 UTC" firstStartedPulling="2025-09-30 12:38:59.048913642 +0000 UTC m=+1030.318151288" lastFinishedPulling="2025-09-30 12:39:16.401453384 +0000 UTC m=+1047.670691030" observedRunningTime="2025-09-30 12:39:17.382339927 +0000 UTC m=+1048.651577593" watchObservedRunningTime="2025-09-30 12:39:17.394951529 +0000 UTC m=+1048.664189175" Sep 30 12:39:17 crc kubenswrapper[4672]: I0930 12:39:17.405699 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-9cnkc" event={"ID":"d637de0b-aed1-45ef-9d86-c6f7c2f188e1","Type":"ContainerStarted","Data":"7f0da9ba6f558c390034153bf121c1b68733ce24abfd4c3da5aa859fff0b2b5d"} Sep 30 12:39:17 crc kubenswrapper[4672]: I0930 12:39:17.417372 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-548589fd89-v67ld" podStartSLOduration=3.642342331 podStartE2EDuration="23.417346611s" podCreationTimestamp="2025-09-30 12:38:54 +0000 UTC" firstStartedPulling="2025-09-30 12:38:56.625850579 +0000 UTC m=+1027.895088225" lastFinishedPulling="2025-09-30 12:39:16.400854859 +0000 UTC m=+1047.670092505" 
observedRunningTime="2025-09-30 12:39:17.410391444 +0000 UTC m=+1048.679629110" watchObservedRunningTime="2025-09-30 12:39:17.417346611 +0000 UTC m=+1048.686584257" Sep 30 12:39:17 crc kubenswrapper[4672]: I0930 12:39:17.434231 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"916c2fc8-25b0-4a10-82d3-6b3e51785690","Type":"ContainerStarted","Data":"05afdccf7785943f6cb29d0d4db57b287a304f04daccab6e6e575da3ea1622f7"} Sep 30 12:39:17 crc kubenswrapper[4672]: I0930 12:39:17.434301 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"4f1bee84-650b-4f0b-a657-e6701ee51823","Type":"ContainerStarted","Data":"6a386f8b0dd02ccce4598196ef517f41c5d961ad1243b1858ab6d701a41be982"} Sep 30 12:39:17 crc kubenswrapper[4672]: I0930 12:39:17.436179 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-6f749887b9-hl4rd" podStartSLOduration=4.179627349 podStartE2EDuration="23.436156621s" podCreationTimestamp="2025-09-30 12:38:54 +0000 UTC" firstStartedPulling="2025-09-30 12:38:56.921479637 +0000 UTC m=+1028.190717283" lastFinishedPulling="2025-09-30 12:39:16.178008909 +0000 UTC m=+1047.447246555" observedRunningTime="2025-09-30 12:39:17.434015787 +0000 UTC m=+1048.703253433" watchObservedRunningTime="2025-09-30 12:39:17.436156621 +0000 UTC m=+1048.705394277" Sep 30 12:39:17 crc kubenswrapper[4672]: I0930 12:39:17.441504 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6fcd88f6c5-l7hjg" Sep 30 12:39:17 crc kubenswrapper[4672]: I0930 12:39:17.442648 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-applier-0" event={"ID":"0944504c-77dc-42f3-a981-723fea76118c","Type":"ContainerStarted","Data":"ebb168f661c8037e73dfa40b2a876d1ab7092d5af1c8d736031a3386f8e85354"} Sep 30 12:39:17 crc kubenswrapper[4672]: I0930 12:39:17.442751 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Sep 30 12:39:17 crc kubenswrapper[4672]: I0930 12:39:17.443448 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Sep 30 12:39:17 crc kubenswrapper[4672]: I0930 12:39:17.526487 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-rd9t8" podStartSLOduration=13.526460947 podStartE2EDuration="13.526460947s" podCreationTimestamp="2025-09-30 12:39:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 12:39:17.451206386 +0000 UTC m=+1048.720444022" watchObservedRunningTime="2025-09-30 12:39:17.526460947 +0000 UTC m=+1048.795698593" Sep 30 12:39:17 crc kubenswrapper[4672]: I0930 12:39:17.606254 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-844b6c9474-6tpzt"] Sep 30 12:39:17 crc kubenswrapper[4672]: I0930 12:39:17.623405 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-llp7f"] Sep 30 12:39:17 crc kubenswrapper[4672]: I0930 12:39:17.648373 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-stw4v"] Sep 30 12:39:17 crc kubenswrapper[4672]: I0930 12:39:17.656117 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Sep 30 12:39:17 crc kubenswrapper[4672]: I0930 12:39:17.670884 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Sep 30 12:39:17 crc kubenswrapper[4672]: I0930 12:39:17.687562 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 30 12:39:17 crc kubenswrapper[4672]: I0930 12:39:17.698888 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 30 12:39:17 crc kubenswrapper[4672]: I0930 12:39:17.709225 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Sep 30 12:39:17 crc kubenswrapper[4672]: E0930 12:39:17.709865 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c515ed89-0f3e-4a37-b5e9-53602578d30a" containerName="glance-log" Sep 30 12:39:17 crc kubenswrapper[4672]: I0930 12:39:17.709895 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="c515ed89-0f3e-4a37-b5e9-53602578d30a" containerName="glance-log" Sep 30 12:39:17 crc kubenswrapper[4672]: E0930 12:39:17.710617 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2fff124-9dc4-46ee-a5fa-9cef98b3eed9" containerName="dnsmasq-dns" Sep 30 12:39:17 crc kubenswrapper[4672]: I0930 12:39:17.710655 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2fff124-9dc4-46ee-a5fa-9cef98b3eed9" containerName="dnsmasq-dns" Sep 30 12:39:17 crc kubenswrapper[4672]: E0930 12:39:17.710668 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69e757cd-3187-40f9-828a-cec497ae50dd" containerName="glance-httpd" Sep 30 12:39:17 crc kubenswrapper[4672]: I0930 12:39:17.710675 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="69e757cd-3187-40f9-828a-cec497ae50dd" containerName="glance-httpd" Sep 30 12:39:17 crc kubenswrapper[4672]: E0930 12:39:17.710701 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2fff124-9dc4-46ee-a5fa-9cef98b3eed9" containerName="init" Sep 30 12:39:17 crc kubenswrapper[4672]: I0930 12:39:17.710708 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2fff124-9dc4-46ee-a5fa-9cef98b3eed9" containerName="init" Sep 30 12:39:17 crc kubenswrapper[4672]: E0930 12:39:17.710742 4672 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69e757cd-3187-40f9-828a-cec497ae50dd" containerName="glance-log" Sep 30 12:39:17 crc kubenswrapper[4672]: I0930 12:39:17.710751 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="69e757cd-3187-40f9-828a-cec497ae50dd" containerName="glance-log" Sep 30 12:39:17 crc kubenswrapper[4672]: E0930 12:39:17.710770 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c515ed89-0f3e-4a37-b5e9-53602578d30a" containerName="glance-httpd" Sep 30 12:39:17 crc kubenswrapper[4672]: I0930 12:39:17.710777 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="c515ed89-0f3e-4a37-b5e9-53602578d30a" containerName="glance-httpd" Sep 30 12:39:17 crc kubenswrapper[4672]: I0930 12:39:17.711036 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="c515ed89-0f3e-4a37-b5e9-53602578d30a" containerName="glance-log" Sep 30 12:39:17 crc kubenswrapper[4672]: I0930 12:39:17.711064 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="c515ed89-0f3e-4a37-b5e9-53602578d30a" containerName="glance-httpd" Sep 30 12:39:17 crc kubenswrapper[4672]: I0930 12:39:17.711754 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="69e757cd-3187-40f9-828a-cec497ae50dd" containerName="glance-httpd" Sep 30 12:39:17 crc kubenswrapper[4672]: I0930 12:39:17.711782 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="69e757cd-3187-40f9-828a-cec497ae50dd" containerName="glance-log" Sep 30 12:39:17 crc kubenswrapper[4672]: I0930 12:39:17.711799 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="a2fff124-9dc4-46ee-a5fa-9cef98b3eed9" containerName="dnsmasq-dns" Sep 30 12:39:17 crc kubenswrapper[4672]: I0930 12:39:17.713347 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Sep 30 12:39:17 crc kubenswrapper[4672]: I0930 12:39:17.721418 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Sep 30 12:39:17 crc kubenswrapper[4672]: I0930 12:39:17.721424 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Sep 30 12:39:17 crc kubenswrapper[4672]: I0930 12:39:17.722415 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-8npkv" Sep 30 12:39:17 crc kubenswrapper[4672]: I0930 12:39:17.723400 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Sep 30 12:39:17 crc kubenswrapper[4672]: I0930 12:39:17.724125 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Sep 30 12:39:17 crc kubenswrapper[4672]: I0930 12:39:17.734334 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 30 12:39:17 crc kubenswrapper[4672]: I0930 12:39:17.736011 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Sep 30 12:39:17 crc kubenswrapper[4672]: I0930 12:39:17.738854 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Sep 30 12:39:17 crc kubenswrapper[4672]: I0930 12:39:17.740401 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Sep 30 12:39:17 crc kubenswrapper[4672]: I0930 12:39:17.741795 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6fcd88f6c5-l7hjg"] Sep 30 12:39:17 crc kubenswrapper[4672]: I0930 12:39:17.751904 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 30 12:39:17 crc kubenswrapper[4672]: I0930 12:39:17.761347 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6fcd88f6c5-l7hjg"] Sep 30 12:39:17 crc kubenswrapper[4672]: I0930 12:39:17.787294 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e1e7299-4168-4aba-917d-dea25752e400-config-data\") pod \"glance-default-external-api-0\" (UID: \"0e1e7299-4168-4aba-917d-dea25752e400\") " pod="openstack/glance-default-external-api-0" Sep 30 12:39:17 crc kubenswrapper[4672]: I0930 12:39:17.787406 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0e1e7299-4168-4aba-917d-dea25752e400-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"0e1e7299-4168-4aba-917d-dea25752e400\") " pod="openstack/glance-default-external-api-0" Sep 30 12:39:17 crc kubenswrapper[4672]: I0930 12:39:17.788536 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0e1e7299-4168-4aba-917d-dea25752e400-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"0e1e7299-4168-4aba-917d-dea25752e400\") " pod="openstack/glance-default-external-api-0" Sep 30 12:39:17 crc kubenswrapper[4672]: I0930 12:39:17.788578 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"0e1e7299-4168-4aba-917d-dea25752e400\") " pod="openstack/glance-default-external-api-0" Sep 30 12:39:17 crc kubenswrapper[4672]: I0930 12:39:17.788598 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0e1e7299-4168-4aba-917d-dea25752e400-logs\") pod \"glance-default-external-api-0\" (UID: \"0e1e7299-4168-4aba-917d-dea25752e400\") " pod="openstack/glance-default-external-api-0" Sep 30 12:39:17 crc kubenswrapper[4672]: I0930 12:39:17.788697 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0e1e7299-4168-4aba-917d-dea25752e400-scripts\") pod \"glance-default-external-api-0\" (UID: \"0e1e7299-4168-4aba-917d-dea25752e400\") " pod="openstack/glance-default-external-api-0" Sep 30 12:39:17 crc kubenswrapper[4672]: I0930 12:39:17.788809 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bghck\" (UniqueName: 
\"kubernetes.io/projected/0e1e7299-4168-4aba-917d-dea25752e400-kube-api-access-bghck\") pod \"glance-default-external-api-0\" (UID: \"0e1e7299-4168-4aba-917d-dea25752e400\") " pod="openstack/glance-default-external-api-0" Sep 30 12:39:17 crc kubenswrapper[4672]: I0930 12:39:17.788874 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e1e7299-4168-4aba-917d-dea25752e400-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"0e1e7299-4168-4aba-917d-dea25752e400\") " pod="openstack/glance-default-external-api-0" Sep 30 12:39:17 crc kubenswrapper[4672]: I0930 12:39:17.894571 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/74285663-1d6c-4a5f-8fe4-5406010f7ad7-logs\") pod \"glance-default-internal-api-0\" (UID: \"74285663-1d6c-4a5f-8fe4-5406010f7ad7\") " pod="openstack/glance-default-internal-api-0" Sep 30 12:39:17 crc kubenswrapper[4672]: I0930 12:39:17.895244 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/74285663-1d6c-4a5f-8fe4-5406010f7ad7-scripts\") pod \"glance-default-internal-api-0\" (UID: \"74285663-1d6c-4a5f-8fe4-5406010f7ad7\") " pod="openstack/glance-default-internal-api-0" Sep 30 12:39:17 crc kubenswrapper[4672]: I0930 12:39:17.895383 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/74285663-1d6c-4a5f-8fe4-5406010f7ad7-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"74285663-1d6c-4a5f-8fe4-5406010f7ad7\") " pod="openstack/glance-default-internal-api-0" Sep 30 12:39:17 crc kubenswrapper[4672]: I0930 12:39:17.895551 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74285663-1d6c-4a5f-8fe4-5406010f7ad7-config-data\") pod \"glance-default-internal-api-0\" (UID: \"74285663-1d6c-4a5f-8fe4-5406010f7ad7\") " pod="openstack/glance-default-internal-api-0" Sep 30 12:39:17 crc kubenswrapper[4672]: I0930 12:39:17.897987 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0e1e7299-4168-4aba-917d-dea25752e400-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"0e1e7299-4168-4aba-917d-dea25752e400\") " pod="openstack/glance-default-external-api-0" Sep 30 12:39:17 crc kubenswrapper[4672]: I0930 12:39:17.898043 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"0e1e7299-4168-4aba-917d-dea25752e400\") " pod="openstack/glance-default-external-api-0" Sep 30 12:39:17 crc kubenswrapper[4672]: I0930 12:39:17.898062 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0e1e7299-4168-4aba-917d-dea25752e400-logs\") pod \"glance-default-external-api-0\" (UID: \"0e1e7299-4168-4aba-917d-dea25752e400\") " pod="openstack/glance-default-external-api-0" Sep 30 12:39:17 crc kubenswrapper[4672]: I0930 12:39:17.898178 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/74285663-1d6c-4a5f-8fe4-5406010f7ad7-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"74285663-1d6c-4a5f-8fe4-5406010f7ad7\") " pod="openstack/glance-default-internal-api-0" Sep 30 12:39:17 crc kubenswrapper[4672]: I0930 12:39:17.898212 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0e1e7299-4168-4aba-917d-dea25752e400-scripts\") pod \"glance-default-external-api-0\" (UID: \"0e1e7299-4168-4aba-917d-dea25752e400\") " pod="openstack/glance-default-external-api-0" Sep 30 12:39:17 crc kubenswrapper[4672]: I0930 12:39:17.898239 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bghck\" (UniqueName: \"kubernetes.io/projected/0e1e7299-4168-4aba-917d-dea25752e400-kube-api-access-bghck\") pod \"glance-default-external-api-0\" (UID: \"0e1e7299-4168-4aba-917d-dea25752e400\") " pod="openstack/glance-default-external-api-0" Sep 30 12:39:17 crc kubenswrapper[4672]: I0930 12:39:17.898305 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"74285663-1d6c-4a5f-8fe4-5406010f7ad7\") " pod="openstack/glance-default-internal-api-0" Sep 30 12:39:17 crc kubenswrapper[4672]: I0930 12:39:17.898338 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e1e7299-4168-4aba-917d-dea25752e400-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"0e1e7299-4168-4aba-917d-dea25752e400\") " pod="openstack/glance-default-external-api-0" Sep 30 12:39:17 crc kubenswrapper[4672]: I0930 12:39:17.898396 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tkwxb\" (UniqueName: \"kubernetes.io/projected/74285663-1d6c-4a5f-8fe4-5406010f7ad7-kube-api-access-tkwxb\") pod \"glance-default-internal-api-0\" (UID: \"74285663-1d6c-4a5f-8fe4-5406010f7ad7\") " pod="openstack/glance-default-internal-api-0" Sep 30 12:39:17 crc kubenswrapper[4672]: I0930 12:39:17.898442 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e1e7299-4168-4aba-917d-dea25752e400-config-data\") pod \"glance-default-external-api-0\" (UID: \"0e1e7299-4168-4aba-917d-dea25752e400\") " pod="openstack/glance-default-external-api-0" Sep 30 12:39:17 crc kubenswrapper[4672]: I0930 12:39:17.898693 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0e1e7299-4168-4aba-917d-dea25752e400-logs\") pod \"glance-default-external-api-0\" (UID: \"0e1e7299-4168-4aba-917d-dea25752e400\") " pod="openstack/glance-default-external-api-0" Sep 30 12:39:17 crc kubenswrapper[4672]: I0930 12:39:17.898724 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0e1e7299-4168-4aba-917d-dea25752e400-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"0e1e7299-4168-4aba-917d-dea25752e400\") " pod="openstack/glance-default-external-api-0" Sep 30 12:39:17 crc kubenswrapper[4672]: I0930 12:39:17.898777 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/74285663-1d6c-4a5f-8fe4-5406010f7ad7-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"74285663-1d6c-4a5f-8fe4-5406010f7ad7\") " pod="openstack/glance-default-internal-api-0" Sep 30 12:39:17 crc kubenswrapper[4672]: I0930 12:39:17.899127 4672 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"0e1e7299-4168-4aba-917d-dea25752e400\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/glance-default-external-api-0" Sep 30 12:39:17 crc kubenswrapper[4672]: I0930 12:39:17.900200 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0e1e7299-4168-4aba-917d-dea25752e400-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"0e1e7299-4168-4aba-917d-dea25752e400\") " pod="openstack/glance-default-external-api-0" Sep 30 12:39:17 crc kubenswrapper[4672]: I0930 12:39:17.918124 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bghck\" (UniqueName: \"kubernetes.io/projected/0e1e7299-4168-4aba-917d-dea25752e400-kube-api-access-bghck\") pod \"glance-default-external-api-0\" (UID: \"0e1e7299-4168-4aba-917d-dea25752e400\") " pod="openstack/glance-default-external-api-0" Sep 30 12:39:17 crc kubenswrapper[4672]: I0930 12:39:17.919016 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e1e7299-4168-4aba-917d-dea25752e400-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"0e1e7299-4168-4aba-917d-dea25752e400\") " pod="openstack/glance-default-external-api-0" Sep 30 12:39:17 crc kubenswrapper[4672]: I0930 12:39:17.919497 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e1e7299-4168-4aba-917d-dea25752e400-config-data\") pod \"glance-default-external-api-0\" (UID: \"0e1e7299-4168-4aba-917d-dea25752e400\") " pod="openstack/glance-default-external-api-0" Sep 30 12:39:17 crc kubenswrapper[4672]: I0930 12:39:17.923960 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0e1e7299-4168-4aba-917d-dea25752e400-scripts\") pod \"glance-default-external-api-0\" (UID: \"0e1e7299-4168-4aba-917d-dea25752e400\") " pod="openstack/glance-default-external-api-0" Sep 30 12:39:17 crc kubenswrapper[4672]: I0930 12:39:17.930986 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0e1e7299-4168-4aba-917d-dea25752e400-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"0e1e7299-4168-4aba-917d-dea25752e400\") " pod="openstack/glance-default-external-api-0" Sep 30 12:39:18 crc kubenswrapper[4672]: I0930 12:39:18.002037 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/74285663-1d6c-4a5f-8fe4-5406010f7ad7-logs\") pod \"glance-default-internal-api-0\" (UID: \"74285663-1d6c-4a5f-8fe4-5406010f7ad7\") " pod="openstack/glance-default-internal-api-0" Sep 30 12:39:18 crc kubenswrapper[4672]: I0930 12:39:18.002663 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/74285663-1d6c-4a5f-8fe4-5406010f7ad7-logs\") pod \"glance-default-internal-api-0\" (UID: 
\"74285663-1d6c-4a5f-8fe4-5406010f7ad7\") " pod="openstack/glance-default-internal-api-0" Sep 30 12:39:18 crc kubenswrapper[4672]: I0930 12:39:18.002672 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/74285663-1d6c-4a5f-8fe4-5406010f7ad7-scripts\") pod \"glance-default-internal-api-0\" (UID: \"74285663-1d6c-4a5f-8fe4-5406010f7ad7\") " pod="openstack/glance-default-internal-api-0" Sep 30 12:39:18 crc kubenswrapper[4672]: I0930 12:39:18.002729 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/74285663-1d6c-4a5f-8fe4-5406010f7ad7-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"74285663-1d6c-4a5f-8fe4-5406010f7ad7\") " pod="openstack/glance-default-internal-api-0" Sep 30 12:39:18 crc kubenswrapper[4672]: I0930 12:39:18.002754 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74285663-1d6c-4a5f-8fe4-5406010f7ad7-config-data\") pod \"glance-default-internal-api-0\" (UID: \"74285663-1d6c-4a5f-8fe4-5406010f7ad7\") " pod="openstack/glance-default-internal-api-0" Sep 30 12:39:18 crc kubenswrapper[4672]: I0930 12:39:18.002837 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74285663-1d6c-4a5f-8fe4-5406010f7ad7-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"74285663-1d6c-4a5f-8fe4-5406010f7ad7\") " pod="openstack/glance-default-internal-api-0" Sep 30 12:39:18 crc kubenswrapper[4672]: I0930 12:39:18.002869 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"74285663-1d6c-4a5f-8fe4-5406010f7ad7\") " pod="openstack/glance-default-internal-api-0" Sep 30 12:39:18 crc kubenswrapper[4672]: I0930 12:39:18.002914 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tkwxb\" (UniqueName: \"kubernetes.io/projected/74285663-1d6c-4a5f-8fe4-5406010f7ad7-kube-api-access-tkwxb\") pod \"glance-default-internal-api-0\" (UID: \"74285663-1d6c-4a5f-8fe4-5406010f7ad7\") " pod="openstack/glance-default-internal-api-0" Sep 30 12:39:18 crc kubenswrapper[4672]: I0930 12:39:18.003084 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/74285663-1d6c-4a5f-8fe4-5406010f7ad7-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"74285663-1d6c-4a5f-8fe4-5406010f7ad7\") " pod="openstack/glance-default-internal-api-0" Sep 30 12:39:18 crc kubenswrapper[4672]: I0930 12:39:18.003833 4672 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"74285663-1d6c-4a5f-8fe4-5406010f7ad7\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/glance-default-internal-api-0" Sep 30 12:39:18 crc kubenswrapper[4672]: I0930 12:39:18.006976 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/74285663-1d6c-4a5f-8fe4-5406010f7ad7-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"74285663-1d6c-4a5f-8fe4-5406010f7ad7\") " 
pod="openstack/glance-default-internal-api-0" Sep 30 12:39:18 crc kubenswrapper[4672]: I0930 12:39:18.013427 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74285663-1d6c-4a5f-8fe4-5406010f7ad7-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"74285663-1d6c-4a5f-8fe4-5406010f7ad7\") " pod="openstack/glance-default-internal-api-0" Sep 30 12:39:18 crc kubenswrapper[4672]: I0930 12:39:18.015987 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/74285663-1d6c-4a5f-8fe4-5406010f7ad7-scripts\") pod \"glance-default-internal-api-0\" (UID: \"74285663-1d6c-4a5f-8fe4-5406010f7ad7\") " pod="openstack/glance-default-internal-api-0" Sep 30 12:39:18 crc kubenswrapper[4672]: I0930 12:39:18.022654 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74285663-1d6c-4a5f-8fe4-5406010f7ad7-config-data\") pod \"glance-default-internal-api-0\" (UID: \"74285663-1d6c-4a5f-8fe4-5406010f7ad7\") " pod="openstack/glance-default-internal-api-0" Sep 30 12:39:18 crc kubenswrapper[4672]: I0930 12:39:18.030335 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tkwxb\" (UniqueName: \"kubernetes.io/projected/74285663-1d6c-4a5f-8fe4-5406010f7ad7-kube-api-access-tkwxb\") pod \"glance-default-internal-api-0\" (UID: \"74285663-1d6c-4a5f-8fe4-5406010f7ad7\") " pod="openstack/glance-default-internal-api-0" Sep 30 12:39:18 crc kubenswrapper[4672]: I0930 12:39:18.033384 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/74285663-1d6c-4a5f-8fe4-5406010f7ad7-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"74285663-1d6c-4a5f-8fe4-5406010f7ad7\") " pod="openstack/glance-default-internal-api-0" Sep 30 12:39:18 crc kubenswrapper[4672]: I0930 12:39:18.061061 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-567d7b57-5g9zw" Sep 30 12:39:18 crc kubenswrapper[4672]: I0930 12:39:18.107303 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"0e1e7299-4168-4aba-917d-dea25752e400\") " pod="openstack/glance-default-external-api-0" Sep 30 12:39:18 crc kubenswrapper[4672]: I0930 12:39:18.191885 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"74285663-1d6c-4a5f-8fe4-5406010f7ad7\") " pod="openstack/glance-default-internal-api-0" Sep 30 12:39:18 crc kubenswrapper[4672]: I0930 12:39:18.373785 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Sep 30 12:39:18 crc kubenswrapper[4672]: I0930 12:39:18.415724 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Sep 30 12:39:18 crc kubenswrapper[4672]: I0930 12:39:18.469654 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-llp7f" event={"ID":"9b56cfc5-1c9c-41f5-91ac-b0fbd79c7b6a","Type":"ContainerStarted","Data":"6259c9fe43696eb4e5fa556d709e9d949210c01c9d2a0f15decec09cb95b30da"} Sep 30 12:39:18 crc kubenswrapper[4672]: I0930 12:39:18.471914 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"916c2fc8-25b0-4a10-82d3-6b3e51785690","Type":"ContainerStarted","Data":"51b2ccd237cd08254100818a64ed4d4867f98c2027d73c962717bddcf02ea835"} Sep 30 12:39:18 crc kubenswrapper[4672]: I0930 12:39:18.471984 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"916c2fc8-25b0-4a10-82d3-6b3e51785690","Type":"ContainerStarted","Data":"cd85d5296d5ee4279afa60f9335efff440ce347bd25c290bad82a98794365e84"} Sep 30 12:39:18 crc kubenswrapper[4672]: I0930 12:39:18.473598 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-api-0" Sep 30 12:39:18 crc kubenswrapper[4672]: I0930 12:39:18.477395 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-8bdf69cc8-lsxz6" event={"ID":"4e153eb6-5f25-4214-8e8a-14c37a36fc06","Type":"ContainerStarted","Data":"66b370cdbebbcc3bf0abe4218b26bfdbdc8a2747d1cda48afa909303b72ead33"} Sep 30 12:39:18 crc kubenswrapper[4672]: I0930 12:39:18.477450 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-8bdf69cc8-lsxz6" event={"ID":"4e153eb6-5f25-4214-8e8a-14c37a36fc06","Type":"ContainerStarted","Data":"b9bc69dd2b46d0c9dba498dbc9b87ca058be1c0ccc1ce3bb2e23ba57697d55b3"} Sep 30 12:39:18 crc kubenswrapper[4672]: I0930 12:39:18.488636 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-844b6c9474-6tpzt" event={"ID":"2659b35e-ecb1-416b-8a94-690759645536","Type":"ContainerStarted","Data":"57634cc4becabc470e3cc945d087760e176e7094e6cdba6d3bfc79a113bcbadc"} Sep 30 12:39:18 crc kubenswrapper[4672]: I0930 12:39:18.488688 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-844b6c9474-6tpzt" event={"ID":"2659b35e-ecb1-416b-8a94-690759645536","Type":"ContainerStarted","Data":"3a90a49f2d7cb645f2dbc3926c3473b406a0af08fa38ebb26861cc8c4362af4f"} Sep 30 12:39:18 crc kubenswrapper[4672]: I0930 12:39:18.500916 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-api-0" podStartSLOduration=14.500898384 podStartE2EDuration="14.500898384s" podCreationTimestamp="2025-09-30 12:39:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 12:39:18.495576078 +0000 UTC m=+1049.764813724" watchObservedRunningTime="2025-09-30 12:39:18.500898384 +0000 UTC m=+1049.770136030" Sep 30 12:39:18 crc kubenswrapper[4672]: I0930 12:39:18.509241 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-stw4v" event={"ID":"8c431961-0987-401e-8486-c2dd0887721b","Type":"ContainerStarted","Data":"de810467bd2c80cebb514631aa663893dcb863c7f06ea2fde6b0efb04a9da6bf"} Sep 30 12:39:18 crc kubenswrapper[4672]: I0930 12:39:18.509289 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-stw4v" 
event={"ID":"8c431961-0987-401e-8486-c2dd0887721b","Type":"ContainerStarted","Data":"75ad1ebd64826d5673bd55b2276ddf8dbe9a5c8e246026c01acc0d882feded16"} Sep 30 12:39:18 crc kubenswrapper[4672]: I0930 12:39:18.516139 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-8bdf69cc8-lsxz6" podStartSLOduration=11.516125563 podStartE2EDuration="11.516125563s" podCreationTimestamp="2025-09-30 12:39:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 12:39:18.515428465 +0000 UTC m=+1049.784666111" watchObservedRunningTime="2025-09-30 12:39:18.516125563 +0000 UTC m=+1049.785363209" Sep 30 12:39:18 crc kubenswrapper[4672]: I0930 12:39:18.571432 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-stw4v" podStartSLOduration=6.571415475 podStartE2EDuration="6.571415475s" podCreationTimestamp="2025-09-30 12:39:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 12:39:18.570870841 +0000 UTC m=+1049.840108487" watchObservedRunningTime="2025-09-30 12:39:18.571415475 +0000 UTC m=+1049.840653121" Sep 30 12:39:19 crc kubenswrapper[4672]: I0930 12:39:19.447870 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="69e757cd-3187-40f9-828a-cec497ae50dd" path="/var/lib/kubelet/pods/69e757cd-3187-40f9-828a-cec497ae50dd/volumes" Sep 30 12:39:19 crc kubenswrapper[4672]: I0930 12:39:19.451825 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a2fff124-9dc4-46ee-a5fa-9cef98b3eed9" path="/var/lib/kubelet/pods/a2fff124-9dc4-46ee-a5fa-9cef98b3eed9/volumes" Sep 30 12:39:19 crc kubenswrapper[4672]: I0930 12:39:19.452792 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c515ed89-0f3e-4a37-b5e9-53602578d30a" path="/var/lib/kubelet/pods/c515ed89-0f3e-4a37-b5e9-53602578d30a/volumes" Sep 30 12:39:20 crc kubenswrapper[4672]: I0930 12:39:20.266959 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-api-0" Sep 30 12:39:20 crc kubenswrapper[4672]: I0930 12:39:20.538553 4672 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 30 12:39:21 crc kubenswrapper[4672]: I0930 12:39:21.518846 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-api-0" Sep 30 12:39:23 crc kubenswrapper[4672]: I0930 12:39:23.582611 4672 generic.go:334] "Generic (PLEG): container finished" podID="8c431961-0987-401e-8486-c2dd0887721b" containerID="de810467bd2c80cebb514631aa663893dcb863c7f06ea2fde6b0efb04a9da6bf" exitCode=0 Sep 30 12:39:23 crc kubenswrapper[4672]: I0930 12:39:23.582671 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-stw4v" event={"ID":"8c431961-0987-401e-8486-c2dd0887721b","Type":"ContainerDied","Data":"de810467bd2c80cebb514631aa663893dcb863c7f06ea2fde6b0efb04a9da6bf"} Sep 30 12:39:25 crc kubenswrapper[4672]: I0930 12:39:25.267332 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-api-0" Sep 30 12:39:25 crc kubenswrapper[4672]: I0930 12:39:25.273572 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-api-0" Sep 30 12:39:25 crc kubenswrapper[4672]: I0930 12:39:25.275785 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/horizon-548589fd89-v67ld" Sep 30 12:39:25 crc kubenswrapper[4672]: I0930 12:39:25.511623 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-6f749887b9-hl4rd" Sep 30 12:39:25 crc kubenswrapper[4672]: I0930 12:39:25.604941 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-api-0" Sep 30 12:39:25 crc kubenswrapper[4672]: I0930 12:39:25.850763 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-stw4v" Sep 30 12:39:25 crc kubenswrapper[4672]: I0930 12:39:25.970093 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rmbh6\" (UniqueName: \"kubernetes.io/projected/8c431961-0987-401e-8486-c2dd0887721b-kube-api-access-rmbh6\") pod \"8c431961-0987-401e-8486-c2dd0887721b\" (UID: \"8c431961-0987-401e-8486-c2dd0887721b\") " Sep 30 12:39:25 crc kubenswrapper[4672]: I0930 12:39:25.970215 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c431961-0987-401e-8486-c2dd0887721b-config-data\") pod \"8c431961-0987-401e-8486-c2dd0887721b\" (UID: \"8c431961-0987-401e-8486-c2dd0887721b\") " Sep 30 12:39:25 crc kubenswrapper[4672]: I0930 12:39:25.970290 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/8c431961-0987-401e-8486-c2dd0887721b-credential-keys\") pod \"8c431961-0987-401e-8486-c2dd0887721b\" (UID: \"8c431961-0987-401e-8486-c2dd0887721b\") " Sep 30 12:39:25 crc kubenswrapper[4672]: I0930 12:39:25.970311 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8c431961-0987-401e-8486-c2dd0887721b-fernet-keys\") pod \"8c431961-0987-401e-8486-c2dd0887721b\" (UID: \"8c431961-0987-401e-8486-c2dd0887721b\") " Sep 30 12:39:25 crc kubenswrapper[4672]: I0930 12:39:25.970348 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c431961-0987-401e-8486-c2dd0887721b-combined-ca-bundle\") pod \"8c431961-0987-401e-8486-c2dd0887721b\" (UID: \"8c431961-0987-401e-8486-c2dd0887721b\") " Sep 30 12:39:25 crc kubenswrapper[4672]: I0930 12:39:25.970381 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8c431961-0987-401e-8486-c2dd0887721b-scripts\") pod \"8c431961-0987-401e-8486-c2dd0887721b\" (UID: \"8c431961-0987-401e-8486-c2dd0887721b\") " Sep 30 12:39:25 crc kubenswrapper[4672]: I0930 12:39:25.976613 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c431961-0987-401e-8486-c2dd0887721b-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "8c431961-0987-401e-8486-c2dd0887721b" (UID: "8c431961-0987-401e-8486-c2dd0887721b"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:39:25 crc kubenswrapper[4672]: I0930 12:39:25.976989 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c431961-0987-401e-8486-c2dd0887721b-scripts" (OuterVolumeSpecName: "scripts") pod "8c431961-0987-401e-8486-c2dd0887721b" (UID: "8c431961-0987-401e-8486-c2dd0887721b"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:39:25 crc kubenswrapper[4672]: I0930 12:39:25.977132 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c431961-0987-401e-8486-c2dd0887721b-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "8c431961-0987-401e-8486-c2dd0887721b" (UID: "8c431961-0987-401e-8486-c2dd0887721b"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:39:25 crc kubenswrapper[4672]: I0930 12:39:25.977432 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c431961-0987-401e-8486-c2dd0887721b-kube-api-access-rmbh6" (OuterVolumeSpecName: "kube-api-access-rmbh6") pod "8c431961-0987-401e-8486-c2dd0887721b" (UID: "8c431961-0987-401e-8486-c2dd0887721b"). InnerVolumeSpecName "kube-api-access-rmbh6". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 12:39:26 crc kubenswrapper[4672]: I0930 12:39:26.007029 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c431961-0987-401e-8486-c2dd0887721b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8c431961-0987-401e-8486-c2dd0887721b" (UID: "8c431961-0987-401e-8486-c2dd0887721b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:39:26 crc kubenswrapper[4672]: I0930 12:39:26.011466 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c431961-0987-401e-8486-c2dd0887721b-config-data" (OuterVolumeSpecName: "config-data") pod "8c431961-0987-401e-8486-c2dd0887721b" (UID: "8c431961-0987-401e-8486-c2dd0887721b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:39:26 crc kubenswrapper[4672]: I0930 12:39:26.074714 4672 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c431961-0987-401e-8486-c2dd0887721b-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 12:39:26 crc kubenswrapper[4672]: I0930 12:39:26.074763 4672 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/8c431961-0987-401e-8486-c2dd0887721b-credential-keys\") on node \"crc\" DevicePath \"\"" Sep 30 12:39:26 crc kubenswrapper[4672]: I0930 12:39:26.074778 4672 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8c431961-0987-401e-8486-c2dd0887721b-fernet-keys\") on node \"crc\" DevicePath \"\"" Sep 30 12:39:26 crc kubenswrapper[4672]: I0930 12:39:26.074793 4672 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c431961-0987-401e-8486-c2dd0887721b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 12:39:26 crc kubenswrapper[4672]: I0930 12:39:26.074805 4672 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8c431961-0987-401e-8486-c2dd0887721b-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 12:39:26 crc kubenswrapper[4672]: I0930 12:39:26.074845 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rmbh6\" (UniqueName: \"kubernetes.io/projected/8c431961-0987-401e-8486-c2dd0887721b-kube-api-access-rmbh6\") on node \"crc\" DevicePath \"\"" Sep 30 12:39:26 crc kubenswrapper[4672]: I0930 12:39:26.615740 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/keystone-bootstrap-stw4v" event={"ID":"8c431961-0987-401e-8486-c2dd0887721b","Type":"ContainerDied","Data":"75ad1ebd64826d5673bd55b2276ddf8dbe9a5c8e246026c01acc0d882feded16"} Sep 30 12:39:26 crc kubenswrapper[4672]: I0930 12:39:26.615982 4672 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="75ad1ebd64826d5673bd55b2276ddf8dbe9a5c8e246026c01acc0d882feded16" Sep 30 12:39:26 crc kubenswrapper[4672]: I0930 12:39:26.615802 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-stw4v" Sep 30 12:39:27 crc kubenswrapper[4672]: I0930 12:39:27.021213 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-c5bf9886d-9nhb9"] Sep 30 12:39:27 crc kubenswrapper[4672]: E0930 12:39:27.021718 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c431961-0987-401e-8486-c2dd0887721b" containerName="keystone-bootstrap" Sep 30 12:39:27 crc kubenswrapper[4672]: I0930 12:39:27.021745 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c431961-0987-401e-8486-c2dd0887721b" containerName="keystone-bootstrap" Sep 30 12:39:27 crc kubenswrapper[4672]: I0930 12:39:27.021992 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c431961-0987-401e-8486-c2dd0887721b" containerName="keystone-bootstrap" Sep 30 12:39:27 crc kubenswrapper[4672]: I0930 12:39:27.025938 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-c5bf9886d-9nhb9" Sep 30 12:39:27 crc kubenswrapper[4672]: I0930 12:39:27.029819 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Sep 30 12:39:27 crc kubenswrapper[4672]: I0930 12:39:27.030096 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Sep 30 12:39:27 crc kubenswrapper[4672]: I0930 12:39:27.031220 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-4nbwb" Sep 30 12:39:27 crc kubenswrapper[4672]: I0930 12:39:27.031792 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Sep 30 12:39:27 crc kubenswrapper[4672]: I0930 12:39:27.031962 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Sep 30 12:39:27 crc kubenswrapper[4672]: I0930 12:39:27.032087 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Sep 30 12:39:27 crc kubenswrapper[4672]: I0930 12:39:27.043045 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-c5bf9886d-9nhb9"] Sep 30 12:39:27 crc kubenswrapper[4672]: I0930 12:39:27.093586 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1871c14e-9602-478e-888f-31d273376456-internal-tls-certs\") pod \"keystone-c5bf9886d-9nhb9\" (UID: \"1871c14e-9602-478e-888f-31d273376456\") " pod="openstack/keystone-c5bf9886d-9nhb9" Sep 30 12:39:27 crc kubenswrapper[4672]: I0930 12:39:27.093643 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1871c14e-9602-478e-888f-31d273376456-combined-ca-bundle\") pod \"keystone-c5bf9886d-9nhb9\" (UID: \"1871c14e-9602-478e-888f-31d273376456\") " pod="openstack/keystone-c5bf9886d-9nhb9" Sep 30 12:39:27 crc kubenswrapper[4672]: I0930 
12:39:27.093680 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1871c14e-9602-478e-888f-31d273376456-public-tls-certs\") pod \"keystone-c5bf9886d-9nhb9\" (UID: \"1871c14e-9602-478e-888f-31d273376456\") " pod="openstack/keystone-c5bf9886d-9nhb9" Sep 30 12:39:27 crc kubenswrapper[4672]: I0930 12:39:27.093706 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vt5rd\" (UniqueName: \"kubernetes.io/projected/1871c14e-9602-478e-888f-31d273376456-kube-api-access-vt5rd\") pod \"keystone-c5bf9886d-9nhb9\" (UID: \"1871c14e-9602-478e-888f-31d273376456\") " pod="openstack/keystone-c5bf9886d-9nhb9" Sep 30 12:39:27 crc kubenswrapper[4672]: I0930 12:39:27.093771 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1871c14e-9602-478e-888f-31d273376456-config-data\") pod \"keystone-c5bf9886d-9nhb9\" (UID: \"1871c14e-9602-478e-888f-31d273376456\") " pod="openstack/keystone-c5bf9886d-9nhb9" Sep 30 12:39:27 crc kubenswrapper[4672]: I0930 12:39:27.093787 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1871c14e-9602-478e-888f-31d273376456-fernet-keys\") pod \"keystone-c5bf9886d-9nhb9\" (UID: \"1871c14e-9602-478e-888f-31d273376456\") " pod="openstack/keystone-c5bf9886d-9nhb9" Sep 30 12:39:27 crc kubenswrapper[4672]: I0930 12:39:27.093872 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1871c14e-9602-478e-888f-31d273376456-scripts\") pod \"keystone-c5bf9886d-9nhb9\" (UID: \"1871c14e-9602-478e-888f-31d273376456\") " pod="openstack/keystone-c5bf9886d-9nhb9" Sep 30 12:39:27 crc kubenswrapper[4672]: I0930 12:39:27.093890 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/1871c14e-9602-478e-888f-31d273376456-credential-keys\") pod \"keystone-c5bf9886d-9nhb9\" (UID: \"1871c14e-9602-478e-888f-31d273376456\") " pod="openstack/keystone-c5bf9886d-9nhb9" Sep 30 12:39:27 crc kubenswrapper[4672]: I0930 12:39:27.195233 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1871c14e-9602-478e-888f-31d273376456-scripts\") pod \"keystone-c5bf9886d-9nhb9\" (UID: \"1871c14e-9602-478e-888f-31d273376456\") " pod="openstack/keystone-c5bf9886d-9nhb9" Sep 30 12:39:27 crc kubenswrapper[4672]: I0930 12:39:27.195287 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/1871c14e-9602-478e-888f-31d273376456-credential-keys\") pod \"keystone-c5bf9886d-9nhb9\" (UID: \"1871c14e-9602-478e-888f-31d273376456\") " pod="openstack/keystone-c5bf9886d-9nhb9" Sep 30 12:39:27 crc kubenswrapper[4672]: I0930 12:39:27.195325 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1871c14e-9602-478e-888f-31d273376456-internal-tls-certs\") pod \"keystone-c5bf9886d-9nhb9\" (UID: \"1871c14e-9602-478e-888f-31d273376456\") " pod="openstack/keystone-c5bf9886d-9nhb9" Sep 30 12:39:27 crc kubenswrapper[4672]: I0930 12:39:27.195355 4672 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1871c14e-9602-478e-888f-31d273376456-combined-ca-bundle\") pod \"keystone-c5bf9886d-9nhb9\" (UID: \"1871c14e-9602-478e-888f-31d273376456\") " pod="openstack/keystone-c5bf9886d-9nhb9" Sep 30 12:39:27 crc kubenswrapper[4672]: I0930 12:39:27.195378 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1871c14e-9602-478e-888f-31d273376456-public-tls-certs\") pod \"keystone-c5bf9886d-9nhb9\" (UID: \"1871c14e-9602-478e-888f-31d273376456\") " pod="openstack/keystone-c5bf9886d-9nhb9" Sep 30 12:39:27 crc kubenswrapper[4672]: I0930 12:39:27.195404 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vt5rd\" (UniqueName: \"kubernetes.io/projected/1871c14e-9602-478e-888f-31d273376456-kube-api-access-vt5rd\") pod \"keystone-c5bf9886d-9nhb9\" (UID: \"1871c14e-9602-478e-888f-31d273376456\") " pod="openstack/keystone-c5bf9886d-9nhb9" Sep 30 12:39:27 crc kubenswrapper[4672]: I0930 12:39:27.195457 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1871c14e-9602-478e-888f-31d273376456-config-data\") pod \"keystone-c5bf9886d-9nhb9\" (UID: \"1871c14e-9602-478e-888f-31d273376456\") " pod="openstack/keystone-c5bf9886d-9nhb9" Sep 30 12:39:27 crc kubenswrapper[4672]: I0930 12:39:27.195478 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1871c14e-9602-478e-888f-31d273376456-fernet-keys\") pod \"keystone-c5bf9886d-9nhb9\" (UID: \"1871c14e-9602-478e-888f-31d273376456\") " pod="openstack/keystone-c5bf9886d-9nhb9" Sep 30 12:39:27 crc kubenswrapper[4672]: I0930 12:39:27.200173 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1871c14e-9602-478e-888f-31d273376456-config-data\") pod \"keystone-c5bf9886d-9nhb9\" (UID: \"1871c14e-9602-478e-888f-31d273376456\") " pod="openstack/keystone-c5bf9886d-9nhb9" Sep 30 12:39:27 crc kubenswrapper[4672]: I0930 12:39:27.200334 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1871c14e-9602-478e-888f-31d273376456-combined-ca-bundle\") pod \"keystone-c5bf9886d-9nhb9\" (UID: \"1871c14e-9602-478e-888f-31d273376456\") " pod="openstack/keystone-c5bf9886d-9nhb9" Sep 30 12:39:27 crc kubenswrapper[4672]: I0930 12:39:27.203567 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1871c14e-9602-478e-888f-31d273376456-scripts\") pod \"keystone-c5bf9886d-9nhb9\" (UID: \"1871c14e-9602-478e-888f-31d273376456\") " pod="openstack/keystone-c5bf9886d-9nhb9" Sep 30 12:39:27 crc kubenswrapper[4672]: I0930 12:39:27.204015 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1871c14e-9602-478e-888f-31d273376456-public-tls-certs\") pod \"keystone-c5bf9886d-9nhb9\" (UID: \"1871c14e-9602-478e-888f-31d273376456\") " pod="openstack/keystone-c5bf9886d-9nhb9" Sep 30 12:39:27 crc kubenswrapper[4672]: I0930 12:39:27.205485 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: 
\"kubernetes.io/secret/1871c14e-9602-478e-888f-31d273376456-credential-keys\") pod \"keystone-c5bf9886d-9nhb9\" (UID: \"1871c14e-9602-478e-888f-31d273376456\") " pod="openstack/keystone-c5bf9886d-9nhb9" Sep 30 12:39:27 crc kubenswrapper[4672]: I0930 12:39:27.205538 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1871c14e-9602-478e-888f-31d273376456-fernet-keys\") pod \"keystone-c5bf9886d-9nhb9\" (UID: \"1871c14e-9602-478e-888f-31d273376456\") " pod="openstack/keystone-c5bf9886d-9nhb9" Sep 30 12:39:27 crc kubenswrapper[4672]: I0930 12:39:27.213854 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1871c14e-9602-478e-888f-31d273376456-internal-tls-certs\") pod \"keystone-c5bf9886d-9nhb9\" (UID: \"1871c14e-9602-478e-888f-31d273376456\") " pod="openstack/keystone-c5bf9886d-9nhb9" Sep 30 12:39:27 crc kubenswrapper[4672]: I0930 12:39:27.233918 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vt5rd\" (UniqueName: \"kubernetes.io/projected/1871c14e-9602-478e-888f-31d273376456-kube-api-access-vt5rd\") pod \"keystone-c5bf9886d-9nhb9\" (UID: \"1871c14e-9602-478e-888f-31d273376456\") " pod="openstack/keystone-c5bf9886d-9nhb9" Sep 30 12:39:27 crc kubenswrapper[4672]: I0930 12:39:27.350774 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-c5bf9886d-9nhb9" Sep 30 12:39:27 crc kubenswrapper[4672]: I0930 12:39:27.776745 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-8bdf69cc8-lsxz6" Sep 30 12:39:27 crc kubenswrapper[4672]: I0930 12:39:27.776873 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-8bdf69cc8-lsxz6" Sep 30 12:39:28 crc kubenswrapper[4672]: I0930 12:39:28.731911 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-api-0"] Sep 30 12:39:28 crc kubenswrapper[4672]: I0930 12:39:28.732619 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/watcher-api-0" podUID="916c2fc8-25b0-4a10-82d3-6b3e51785690" containerName="watcher-api-log" containerID="cri-o://cd85d5296d5ee4279afa60f9335efff440ce347bd25c290bad82a98794365e84" gracePeriod=30 Sep 30 12:39:28 crc kubenswrapper[4672]: I0930 12:39:28.732712 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/watcher-api-0" podUID="916c2fc8-25b0-4a10-82d3-6b3e51785690" containerName="watcher-api" containerID="cri-o://51b2ccd237cd08254100818a64ed4d4867f98c2027d73c962717bddcf02ea835" gracePeriod=30 Sep 30 12:39:29 crc kubenswrapper[4672]: I0930 12:39:29.693727 4672 generic.go:334] "Generic (PLEG): container finished" podID="916c2fc8-25b0-4a10-82d3-6b3e51785690" containerID="51b2ccd237cd08254100818a64ed4d4867f98c2027d73c962717bddcf02ea835" exitCode=0 Sep 30 12:39:29 crc kubenswrapper[4672]: I0930 12:39:29.694042 4672 generic.go:334] "Generic (PLEG): container finished" podID="916c2fc8-25b0-4a10-82d3-6b3e51785690" containerID="cd85d5296d5ee4279afa60f9335efff440ce347bd25c290bad82a98794365e84" exitCode=143 Sep 30 12:39:29 crc kubenswrapper[4672]: I0930 12:39:29.694067 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"916c2fc8-25b0-4a10-82d3-6b3e51785690","Type":"ContainerDied","Data":"51b2ccd237cd08254100818a64ed4d4867f98c2027d73c962717bddcf02ea835"} Sep 30 12:39:29 crc 
kubenswrapper[4672]: I0930 12:39:29.694099 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"916c2fc8-25b0-4a10-82d3-6b3e51785690","Type":"ContainerDied","Data":"cd85d5296d5ee4279afa60f9335efff440ce347bd25c290bad82a98794365e84"} Sep 30 12:39:30 crc kubenswrapper[4672]: I0930 12:39:30.266962 4672 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="916c2fc8-25b0-4a10-82d3-6b3e51785690" containerName="watcher-api" probeResult="failure" output="Get \"http://10.217.0.160:9322/\": dial tcp 10.217.0.160:9322: connect: connection refused" Sep 30 12:39:30 crc kubenswrapper[4672]: I0930 12:39:30.267002 4672 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="916c2fc8-25b0-4a10-82d3-6b3e51785690" containerName="watcher-api-log" probeResult="failure" output="Get \"http://10.217.0.160:9322/\": dial tcp 10.217.0.160:9322: connect: connection refused" Sep 30 12:39:35 crc kubenswrapper[4672]: I0930 12:39:35.267434 4672 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="916c2fc8-25b0-4a10-82d3-6b3e51785690" containerName="watcher-api-log" probeResult="failure" output="Get \"http://10.217.0.160:9322/\": dial tcp 10.217.0.160:9322: connect: connection refused" Sep 30 12:39:35 crc kubenswrapper[4672]: I0930 12:39:35.267485 4672 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="916c2fc8-25b0-4a10-82d3-6b3e51785690" containerName="watcher-api" probeResult="failure" output="Get \"http://10.217.0.160:9322/\": dial tcp 10.217.0.160:9322: connect: connection refused" Sep 30 12:39:37 crc kubenswrapper[4672]: I0930 12:39:37.282549 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Sep 30 12:39:37 crc kubenswrapper[4672]: E0930 12:39:37.774340 4672 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.83:5001/podified-master-centos10/openstack-cinder-api:watcher_latest" Sep 30 12:39:37 crc kubenswrapper[4672]: E0930 12:39:37.774434 4672 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.83:5001/podified-master-centos10/openstack-cinder-api:watcher_latest" Sep 30 12:39:37 crc kubenswrapper[4672]: E0930 12:39:37.774707 4672 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:38.102.83.83:5001/podified-master-centos10/openstack-cinder-api:watcher_latest,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-tzgds,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-llp7f_openstack(9b56cfc5-1c9c-41f5-91ac-b0fbd79c7b6a): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Sep 30 12:39:37 crc kubenswrapper[4672]: E0930 12:39:37.776620 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-llp7f" podUID="9b56cfc5-1c9c-41f5-91ac-b0fbd79c7b6a" Sep 30 12:39:37 crc kubenswrapper[4672]: I0930 12:39:37.787370 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"0e1e7299-4168-4aba-917d-dea25752e400","Type":"ContainerStarted","Data":"3461f6d18872c01899f6e3b26a4fe22b34d4302715ddc7b35eee6a4dc685be53"} Sep 30 12:39:38 crc kubenswrapper[4672]: I0930 12:39:38.152811 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-api-0" Sep 30 12:39:38 crc kubenswrapper[4672]: I0930 12:39:38.230404 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/916c2fc8-25b0-4a10-82d3-6b3e51785690-logs\") pod \"916c2fc8-25b0-4a10-82d3-6b3e51785690\" (UID: \"916c2fc8-25b0-4a10-82d3-6b3e51785690\") " Sep 30 12:39:38 crc kubenswrapper[4672]: I0930 12:39:38.230558 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/916c2fc8-25b0-4a10-82d3-6b3e51785690-custom-prometheus-ca\") pod \"916c2fc8-25b0-4a10-82d3-6b3e51785690\" (UID: \"916c2fc8-25b0-4a10-82d3-6b3e51785690\") " Sep 30 12:39:38 crc kubenswrapper[4672]: I0930 12:39:38.230592 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/916c2fc8-25b0-4a10-82d3-6b3e51785690-config-data\") pod \"916c2fc8-25b0-4a10-82d3-6b3e51785690\" (UID: \"916c2fc8-25b0-4a10-82d3-6b3e51785690\") " Sep 30 12:39:38 crc kubenswrapper[4672]: I0930 12:39:38.231814 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/916c2fc8-25b0-4a10-82d3-6b3e51785690-logs" (OuterVolumeSpecName: "logs") pod "916c2fc8-25b0-4a10-82d3-6b3e51785690" (UID: "916c2fc8-25b0-4a10-82d3-6b3e51785690"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 12:39:38 crc kubenswrapper[4672]: I0930 12:39:38.234898 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5nw2h\" (UniqueName: \"kubernetes.io/projected/916c2fc8-25b0-4a10-82d3-6b3e51785690-kube-api-access-5nw2h\") pod \"916c2fc8-25b0-4a10-82d3-6b3e51785690\" (UID: \"916c2fc8-25b0-4a10-82d3-6b3e51785690\") " Sep 30 12:39:38 crc kubenswrapper[4672]: I0930 12:39:38.234966 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/916c2fc8-25b0-4a10-82d3-6b3e51785690-combined-ca-bundle\") pod \"916c2fc8-25b0-4a10-82d3-6b3e51785690\" (UID: \"916c2fc8-25b0-4a10-82d3-6b3e51785690\") " Sep 30 12:39:38 crc kubenswrapper[4672]: I0930 12:39:38.238160 4672 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/916c2fc8-25b0-4a10-82d3-6b3e51785690-logs\") on node \"crc\" DevicePath \"\"" Sep 30 12:39:38 crc kubenswrapper[4672]: I0930 12:39:38.267840 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/916c2fc8-25b0-4a10-82d3-6b3e51785690-kube-api-access-5nw2h" (OuterVolumeSpecName: "kube-api-access-5nw2h") pod "916c2fc8-25b0-4a10-82d3-6b3e51785690" (UID: "916c2fc8-25b0-4a10-82d3-6b3e51785690"). InnerVolumeSpecName "kube-api-access-5nw2h". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 12:39:38 crc kubenswrapper[4672]: I0930 12:39:38.339737 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5nw2h\" (UniqueName: \"kubernetes.io/projected/916c2fc8-25b0-4a10-82d3-6b3e51785690-kube-api-access-5nw2h\") on node \"crc\" DevicePath \"\"" Sep 30 12:39:38 crc kubenswrapper[4672]: I0930 12:39:38.345518 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 30 12:39:38 crc kubenswrapper[4672]: I0930 12:39:38.348410 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/916c2fc8-25b0-4a10-82d3-6b3e51785690-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "916c2fc8-25b0-4a10-82d3-6b3e51785690" (UID: "916c2fc8-25b0-4a10-82d3-6b3e51785690"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:39:38 crc kubenswrapper[4672]: I0930 12:39:38.366938 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/916c2fc8-25b0-4a10-82d3-6b3e51785690-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "916c2fc8-25b0-4a10-82d3-6b3e51785690" (UID: "916c2fc8-25b0-4a10-82d3-6b3e51785690"). InnerVolumeSpecName "custom-prometheus-ca". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:39:38 crc kubenswrapper[4672]: I0930 12:39:38.397240 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-c5bf9886d-9nhb9"] Sep 30 12:39:38 crc kubenswrapper[4672]: I0930 12:39:38.434575 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/916c2fc8-25b0-4a10-82d3-6b3e51785690-config-data" (OuterVolumeSpecName: "config-data") pod "916c2fc8-25b0-4a10-82d3-6b3e51785690" (UID: "916c2fc8-25b0-4a10-82d3-6b3e51785690"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:39:38 crc kubenswrapper[4672]: I0930 12:39:38.441443 4672 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/916c2fc8-25b0-4a10-82d3-6b3e51785690-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Sep 30 12:39:38 crc kubenswrapper[4672]: I0930 12:39:38.441489 4672 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/916c2fc8-25b0-4a10-82d3-6b3e51785690-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 12:39:38 crc kubenswrapper[4672]: I0930 12:39:38.441502 4672 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/916c2fc8-25b0-4a10-82d3-6b3e51785690-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 12:39:38 crc kubenswrapper[4672]: I0930 12:39:38.808188 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"0e1e7299-4168-4aba-917d-dea25752e400","Type":"ContainerStarted","Data":"cf68c535814b18f82b62c7e70459b1f0472be9e598870bcc2076ead00dc819b4"} Sep 30 12:39:38 crc kubenswrapper[4672]: I0930 12:39:38.810449 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a813f7c2-e727-42a0-8779-8619fe6e8165","Type":"ContainerStarted","Data":"8d223d3a6e9db292e6fb7098fd6f146e3fc9591954560d7a29c4a778db39c109"} Sep 30 12:39:38 crc kubenswrapper[4672]: I0930 12:39:38.822506 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-9cnkc" event={"ID":"d637de0b-aed1-45ef-9d86-c6f7c2f188e1","Type":"ContainerStarted","Data":"c67a7804189bf25b06b8879f6f5ca3c83481b53a850e69c71c8db005c920aca8"} Sep 30 12:39:38 crc kubenswrapper[4672]: I0930 12:39:38.825829 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"74285663-1d6c-4a5f-8fe4-5406010f7ad7","Type":"ContainerStarted","Data":"c4fa4b4697a7d4e559c9dfefb63c7925003f2cf08d57e152c3d0ba680a3a9338"} Sep 30 12:39:38 crc kubenswrapper[4672]: I0930 12:39:38.827602 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"4f1bee84-650b-4f0b-a657-e6701ee51823","Type":"ContainerStarted","Data":"0cd2bc9ec87f2b1fac46321bc8d94fa135c803ac11bdb697cd95415925106220"} Sep 30 12:39:38 crc kubenswrapper[4672]: I0930 12:39:38.839518 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-applier-0" event={"ID":"0944504c-77dc-42f3-a981-723fea76118c","Type":"ContainerStarted","Data":"e1d5993741c08378b527d7c1386a812cecb358565d264ec57c09b0e3595dbd8d"} Sep 30 12:39:38 crc kubenswrapper[4672]: I0930 12:39:38.845906 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-844b6c9474-6tpzt" event={"ID":"2659b35e-ecb1-416b-8a94-690759645536","Type":"ContainerStarted","Data":"e390d0504a05dfd0a23382163bf0dbc45fb2a2479814872e2d7520e098e139de"} Sep 30 12:39:38 crc kubenswrapper[4672]: I0930 12:39:38.848236 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-c5bf9886d-9nhb9" event={"ID":"1871c14e-9602-478e-888f-31d273376456","Type":"ContainerStarted","Data":"9f409441aded2d70b9dc25fdf0eaa08f59c89c5d36c096ba55ae067e6b5e4ec8"} Sep 30 12:39:38 crc kubenswrapper[4672]: I0930 12:39:38.848311 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-c5bf9886d-9nhb9" 
event={"ID":"1871c14e-9602-478e-888f-31d273376456","Type":"ContainerStarted","Data":"1ef0021c766704b56dbc997c57c7d27feb7f0e8fefdd66b9abd14b987445443b"} Sep 30 12:39:38 crc kubenswrapper[4672]: I0930 12:39:38.848569 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-c5bf9886d-9nhb9" Sep 30 12:39:38 crc kubenswrapper[4672]: I0930 12:39:38.851788 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-9kp7h" event={"ID":"c87bb11c-a04e-4018-ba41-d628795a926e","Type":"ContainerStarted","Data":"1f3d33c5e0e339b002c349b5c492a03d7dbd38148a6d0a3d902d7a6298e9d6fc"} Sep 30 12:39:38 crc kubenswrapper[4672]: I0930 12:39:38.855351 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0" Sep 30 12:39:38 crc kubenswrapper[4672]: I0930 12:39:38.860508 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"916c2fc8-25b0-4a10-82d3-6b3e51785690","Type":"ContainerDied","Data":"05afdccf7785943f6cb29d0d4db57b287a304f04daccab6e6e575da3ea1622f7"} Sep 30 12:39:38 crc kubenswrapper[4672]: I0930 12:39:38.860594 4672 scope.go:117] "RemoveContainer" containerID="51b2ccd237cd08254100818a64ed4d4867f98c2027d73c962717bddcf02ea835" Sep 30 12:39:38 crc kubenswrapper[4672]: E0930 12:39:38.865313 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.83:5001/podified-master-centos10/openstack-cinder-api:watcher_latest\\\"\"" pod="openstack/cinder-db-sync-llp7f" podUID="9b56cfc5-1c9c-41f5-91ac-b0fbd79c7b6a" Sep 30 12:39:38 crc kubenswrapper[4672]: I0930 12:39:38.899305 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-decision-engine-0" podStartSLOduration=14.354131828 podStartE2EDuration="34.899286113s" podCreationTimestamp="2025-09-30 12:39:04 +0000 UTC" firstStartedPulling="2025-09-30 12:39:17.178289968 +0000 UTC m=+1048.447527614" lastFinishedPulling="2025-09-30 12:39:37.723444253 +0000 UTC m=+1068.992681899" observedRunningTime="2025-09-30 12:39:38.89485484 +0000 UTC m=+1070.164092486" watchObservedRunningTime="2025-09-30 12:39:38.899286113 +0000 UTC m=+1070.168523759" Sep 30 12:39:38 crc kubenswrapper[4672]: I0930 12:39:38.908166 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-9cnkc" podStartSLOduration=13.944656544 podStartE2EDuration="34.908144859s" podCreationTimestamp="2025-09-30 12:39:04 +0000 UTC" firstStartedPulling="2025-09-30 12:39:16.883487631 +0000 UTC m=+1048.152725277" lastFinishedPulling="2025-09-30 12:39:37.846975946 +0000 UTC m=+1069.116213592" observedRunningTime="2025-09-30 12:39:38.865684745 +0000 UTC m=+1070.134922391" watchObservedRunningTime="2025-09-30 12:39:38.908144859 +0000 UTC m=+1070.177382495" Sep 30 12:39:38 crc kubenswrapper[4672]: I0930 12:39:38.958091 4672 scope.go:117] "RemoveContainer" containerID="cd85d5296d5ee4279afa60f9335efff440ce347bd25c290bad82a98794365e84" Sep 30 12:39:38 crc kubenswrapper[4672]: I0930 12:39:38.961966 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-c5bf9886d-9nhb9" podStartSLOduration=11.961950673 podStartE2EDuration="11.961950673s" podCreationTimestamp="2025-09-30 12:39:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 
12:39:38.956887794 +0000 UTC m=+1070.226125440" watchObservedRunningTime="2025-09-30 12:39:38.961950673 +0000 UTC m=+1070.231188319" Sep 30 12:39:38 crc kubenswrapper[4672]: I0930 12:39:38.963860 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-844b6c9474-6tpzt" podStartSLOduration=31.963852911 podStartE2EDuration="31.963852911s" podCreationTimestamp="2025-09-30 12:39:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 12:39:38.929699939 +0000 UTC m=+1070.198937585" watchObservedRunningTime="2025-09-30 12:39:38.963852911 +0000 UTC m=+1070.233090557" Sep 30 12:39:39 crc kubenswrapper[4672]: I0930 12:39:39.023604 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-applier-0" podStartSLOduration=14.179302825 podStartE2EDuration="35.023578036s" podCreationTimestamp="2025-09-30 12:39:04 +0000 UTC" firstStartedPulling="2025-09-30 12:39:16.879168111 +0000 UTC m=+1048.148405757" lastFinishedPulling="2025-09-30 12:39:37.723443322 +0000 UTC m=+1068.992680968" observedRunningTime="2025-09-30 12:39:39.010889582 +0000 UTC m=+1070.280127238" watchObservedRunningTime="2025-09-30 12:39:39.023578036 +0000 UTC m=+1070.292815682" Sep 30 12:39:39 crc kubenswrapper[4672]: I0930 12:39:39.030845 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-9kp7h" podStartSLOduration=3.884043471 podStartE2EDuration="45.030832321s" podCreationTimestamp="2025-09-30 12:38:54 +0000 UTC" firstStartedPulling="2025-09-30 12:38:56.719085709 +0000 UTC m=+1027.988323355" lastFinishedPulling="2025-09-30 12:39:37.865874559 +0000 UTC m=+1069.135112205" observedRunningTime="2025-09-30 12:39:39.025119116 +0000 UTC m=+1070.294356762" watchObservedRunningTime="2025-09-30 12:39:39.030832321 +0000 UTC m=+1070.300069967" Sep 30 12:39:39 crc kubenswrapper[4672]: I0930 12:39:39.051441 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-api-0"] Sep 30 12:39:39 crc kubenswrapper[4672]: I0930 12:39:39.069755 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-api-0"] Sep 30 12:39:39 crc kubenswrapper[4672]: I0930 12:39:39.082014 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-api-0"] Sep 30 12:39:39 crc kubenswrapper[4672]: E0930 12:39:39.082401 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="916c2fc8-25b0-4a10-82d3-6b3e51785690" containerName="watcher-api" Sep 30 12:39:39 crc kubenswrapper[4672]: I0930 12:39:39.082417 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="916c2fc8-25b0-4a10-82d3-6b3e51785690" containerName="watcher-api" Sep 30 12:39:39 crc kubenswrapper[4672]: E0930 12:39:39.082441 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="916c2fc8-25b0-4a10-82d3-6b3e51785690" containerName="watcher-api-log" Sep 30 12:39:39 crc kubenswrapper[4672]: I0930 12:39:39.082447 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="916c2fc8-25b0-4a10-82d3-6b3e51785690" containerName="watcher-api-log" Sep 30 12:39:39 crc kubenswrapper[4672]: I0930 12:39:39.082658 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="916c2fc8-25b0-4a10-82d3-6b3e51785690" containerName="watcher-api-log" Sep 30 12:39:39 crc kubenswrapper[4672]: I0930 12:39:39.082685 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="916c2fc8-25b0-4a10-82d3-6b3e51785690" 
containerName="watcher-api" Sep 30 12:39:39 crc kubenswrapper[4672]: I0930 12:39:39.083685 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0" Sep 30 12:39:39 crc kubenswrapper[4672]: I0930 12:39:39.109480 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-api-0"] Sep 30 12:39:39 crc kubenswrapper[4672]: I0930 12:39:39.149054 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-watcher-internal-svc" Sep 30 12:39:39 crc kubenswrapper[4672]: I0930 12:39:39.149247 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-api-config-data" Sep 30 12:39:39 crc kubenswrapper[4672]: I0930 12:39:39.149395 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-watcher-public-svc" Sep 30 12:39:39 crc kubenswrapper[4672]: I0930 12:39:39.155196 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/179ce1e6-946c-4e3c-97d6-38764daf4214-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"179ce1e6-946c-4e3c-97d6-38764daf4214\") " pod="openstack/watcher-api-0" Sep 30 12:39:39 crc kubenswrapper[4672]: I0930 12:39:39.155289 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/179ce1e6-946c-4e3c-97d6-38764daf4214-config-data\") pod \"watcher-api-0\" (UID: \"179ce1e6-946c-4e3c-97d6-38764daf4214\") " pod="openstack/watcher-api-0" Sep 30 12:39:39 crc kubenswrapper[4672]: I0930 12:39:39.155368 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c2j62\" (UniqueName: \"kubernetes.io/projected/179ce1e6-946c-4e3c-97d6-38764daf4214-kube-api-access-c2j62\") pod \"watcher-api-0\" (UID: \"179ce1e6-946c-4e3c-97d6-38764daf4214\") " pod="openstack/watcher-api-0" Sep 30 12:39:39 crc kubenswrapper[4672]: I0930 12:39:39.155387 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/179ce1e6-946c-4e3c-97d6-38764daf4214-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"179ce1e6-946c-4e3c-97d6-38764daf4214\") " pod="openstack/watcher-api-0" Sep 30 12:39:39 crc kubenswrapper[4672]: I0930 12:39:39.155419 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/179ce1e6-946c-4e3c-97d6-38764daf4214-internal-tls-certs\") pod \"watcher-api-0\" (UID: \"179ce1e6-946c-4e3c-97d6-38764daf4214\") " pod="openstack/watcher-api-0" Sep 30 12:39:39 crc kubenswrapper[4672]: I0930 12:39:39.155501 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/179ce1e6-946c-4e3c-97d6-38764daf4214-public-tls-certs\") pod \"watcher-api-0\" (UID: \"179ce1e6-946c-4e3c-97d6-38764daf4214\") " pod="openstack/watcher-api-0" Sep 30 12:39:39 crc kubenswrapper[4672]: I0930 12:39:39.155568 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/179ce1e6-946c-4e3c-97d6-38764daf4214-logs\") pod \"watcher-api-0\" (UID: \"179ce1e6-946c-4e3c-97d6-38764daf4214\") " pod="openstack/watcher-api-0" Sep 30 12:39:39 crc kubenswrapper[4672]: I0930 
12:39:39.257274 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/179ce1e6-946c-4e3c-97d6-38764daf4214-internal-tls-certs\") pod \"watcher-api-0\" (UID: \"179ce1e6-946c-4e3c-97d6-38764daf4214\") " pod="openstack/watcher-api-0" Sep 30 12:39:39 crc kubenswrapper[4672]: I0930 12:39:39.257371 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/179ce1e6-946c-4e3c-97d6-38764daf4214-public-tls-certs\") pod \"watcher-api-0\" (UID: \"179ce1e6-946c-4e3c-97d6-38764daf4214\") " pod="openstack/watcher-api-0" Sep 30 12:39:39 crc kubenswrapper[4672]: I0930 12:39:39.257407 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/179ce1e6-946c-4e3c-97d6-38764daf4214-logs\") pod \"watcher-api-0\" (UID: \"179ce1e6-946c-4e3c-97d6-38764daf4214\") " pod="openstack/watcher-api-0" Sep 30 12:39:39 crc kubenswrapper[4672]: I0930 12:39:39.257463 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/179ce1e6-946c-4e3c-97d6-38764daf4214-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"179ce1e6-946c-4e3c-97d6-38764daf4214\") " pod="openstack/watcher-api-0" Sep 30 12:39:39 crc kubenswrapper[4672]: I0930 12:39:39.257512 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/179ce1e6-946c-4e3c-97d6-38764daf4214-config-data\") pod \"watcher-api-0\" (UID: \"179ce1e6-946c-4e3c-97d6-38764daf4214\") " pod="openstack/watcher-api-0" Sep 30 12:39:39 crc kubenswrapper[4672]: I0930 12:39:39.257547 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c2j62\" (UniqueName: \"kubernetes.io/projected/179ce1e6-946c-4e3c-97d6-38764daf4214-kube-api-access-c2j62\") pod \"watcher-api-0\" (UID: \"179ce1e6-946c-4e3c-97d6-38764daf4214\") " pod="openstack/watcher-api-0" Sep 30 12:39:39 crc kubenswrapper[4672]: I0930 12:39:39.257565 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/179ce1e6-946c-4e3c-97d6-38764daf4214-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"179ce1e6-946c-4e3c-97d6-38764daf4214\") " pod="openstack/watcher-api-0" Sep 30 12:39:39 crc kubenswrapper[4672]: I0930 12:39:39.273335 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/179ce1e6-946c-4e3c-97d6-38764daf4214-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"179ce1e6-946c-4e3c-97d6-38764daf4214\") " pod="openstack/watcher-api-0" Sep 30 12:39:39 crc kubenswrapper[4672]: I0930 12:39:39.273577 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/179ce1e6-946c-4e3c-97d6-38764daf4214-logs\") pod \"watcher-api-0\" (UID: \"179ce1e6-946c-4e3c-97d6-38764daf4214\") " pod="openstack/watcher-api-0" Sep 30 12:39:39 crc kubenswrapper[4672]: I0930 12:39:39.280244 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/179ce1e6-946c-4e3c-97d6-38764daf4214-public-tls-certs\") pod \"watcher-api-0\" (UID: \"179ce1e6-946c-4e3c-97d6-38764daf4214\") " pod="openstack/watcher-api-0" Sep 30 12:39:39 crc kubenswrapper[4672]: 
I0930 12:39:39.297017 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/179ce1e6-946c-4e3c-97d6-38764daf4214-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"179ce1e6-946c-4e3c-97d6-38764daf4214\") " pod="openstack/watcher-api-0" Sep 30 12:39:39 crc kubenswrapper[4672]: I0930 12:39:39.297851 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/179ce1e6-946c-4e3c-97d6-38764daf4214-config-data\") pod \"watcher-api-0\" (UID: \"179ce1e6-946c-4e3c-97d6-38764daf4214\") " pod="openstack/watcher-api-0" Sep 30 12:39:39 crc kubenswrapper[4672]: I0930 12:39:39.299926 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c2j62\" (UniqueName: \"kubernetes.io/projected/179ce1e6-946c-4e3c-97d6-38764daf4214-kube-api-access-c2j62\") pod \"watcher-api-0\" (UID: \"179ce1e6-946c-4e3c-97d6-38764daf4214\") " pod="openstack/watcher-api-0" Sep 30 12:39:39 crc kubenswrapper[4672]: I0930 12:39:39.333095 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/179ce1e6-946c-4e3c-97d6-38764daf4214-internal-tls-certs\") pod \"watcher-api-0\" (UID: \"179ce1e6-946c-4e3c-97d6-38764daf4214\") " pod="openstack/watcher-api-0" Sep 30 12:39:39 crc kubenswrapper[4672]: I0930 12:39:39.435576 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="916c2fc8-25b0-4a10-82d3-6b3e51785690" path="/var/lib/kubelet/pods/916c2fc8-25b0-4a10-82d3-6b3e51785690/volumes" Sep 30 12:39:39 crc kubenswrapper[4672]: I0930 12:39:39.532629 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0" Sep 30 12:39:39 crc kubenswrapper[4672]: I0930 12:39:39.889374 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"74285663-1d6c-4a5f-8fe4-5406010f7ad7","Type":"ContainerStarted","Data":"d74e8f999400e418647fed7fb090090cc62f2a029e4d018ea2767558c4255785"} Sep 30 12:39:40 crc kubenswrapper[4672]: I0930 12:39:40.180153 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-api-0"] Sep 30 12:39:40 crc kubenswrapper[4672]: W0930 12:39:40.213113 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod179ce1e6_946c_4e3c_97d6_38764daf4214.slice/crio-8b1cc11c22253b7e22caa8518a841e35326cc3e3f6c08da67d7a24d23ed7f78f WatchSource:0}: Error finding container 8b1cc11c22253b7e22caa8518a841e35326cc3e3f6c08da67d7a24d23ed7f78f: Status 404 returned error can't find the container with id 8b1cc11c22253b7e22caa8518a841e35326cc3e3f6c08da67d7a24d23ed7f78f Sep 30 12:39:40 crc kubenswrapper[4672]: I0930 12:39:40.327411 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-applier-0" Sep 30 12:39:40 crc kubenswrapper[4672]: I0930 12:39:40.907425 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"179ce1e6-946c-4e3c-97d6-38764daf4214","Type":"ContainerStarted","Data":"76d8e21e8d5e4dbbbdba60533e17b7e57aa88aa0da8561558cb9039fe5d99e0f"} Sep 30 12:39:40 crc kubenswrapper[4672]: I0930 12:39:40.908046 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" 
event={"ID":"179ce1e6-946c-4e3c-97d6-38764daf4214","Type":"ContainerStarted","Data":"5ff98e37025afb797844fd4489c7250f5689b162c285e93ebc636af40f09a847"} Sep 30 12:39:40 crc kubenswrapper[4672]: I0930 12:39:40.908062 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"179ce1e6-946c-4e3c-97d6-38764daf4214","Type":"ContainerStarted","Data":"8b1cc11c22253b7e22caa8518a841e35326cc3e3f6c08da67d7a24d23ed7f78f"} Sep 30 12:39:40 crc kubenswrapper[4672]: I0930 12:39:40.909673 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-api-0" Sep 30 12:39:40 crc kubenswrapper[4672]: I0930 12:39:40.912106 4672 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="179ce1e6-946c-4e3c-97d6-38764daf4214" containerName="watcher-api" probeResult="failure" output="Get \"https://10.217.0.169:9322/\": dial tcp 10.217.0.169:9322: connect: connection refused" Sep 30 12:39:40 crc kubenswrapper[4672]: I0930 12:39:40.913377 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"74285663-1d6c-4a5f-8fe4-5406010f7ad7","Type":"ContainerStarted","Data":"fc33fa8c4b6ade8c830adbda8d589972a4d69532494b8c12e5a44526c2b83487"} Sep 30 12:39:40 crc kubenswrapper[4672]: I0930 12:39:40.926384 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"0e1e7299-4168-4aba-917d-dea25752e400","Type":"ContainerStarted","Data":"47016035c37933ccc06c63f168a8b55b6a70d23103e1681faca987a5aaac055d"} Sep 30 12:39:40 crc kubenswrapper[4672]: I0930 12:39:40.934786 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-api-0" podStartSLOduration=1.93477075 podStartE2EDuration="1.93477075s" podCreationTimestamp="2025-09-30 12:39:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 12:39:40.932499502 +0000 UTC m=+1072.201737148" watchObservedRunningTime="2025-09-30 12:39:40.93477075 +0000 UTC m=+1072.204008396" Sep 30 12:39:40 crc kubenswrapper[4672]: I0930 12:39:40.967466 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=23.967444324 podStartE2EDuration="23.967444324s" podCreationTimestamp="2025-09-30 12:39:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 12:39:40.958610758 +0000 UTC m=+1072.227848424" watchObservedRunningTime="2025-09-30 12:39:40.967444324 +0000 UTC m=+1072.236681970" Sep 30 12:39:40 crc kubenswrapper[4672]: I0930 12:39:40.996326 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=23.996311121 podStartE2EDuration="23.996311121s" podCreationTimestamp="2025-09-30 12:39:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 12:39:40.993253063 +0000 UTC m=+1072.262490719" watchObservedRunningTime="2025-09-30 12:39:40.996311121 +0000 UTC m=+1072.265548767" Sep 30 12:39:41 crc kubenswrapper[4672]: I0930 12:39:41.047792 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-8bdf69cc8-lsxz6" Sep 30 12:39:42 crc kubenswrapper[4672]: I0930 12:39:42.879919 4672 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-8bdf69cc8-lsxz6" Sep 30 12:39:43 crc kubenswrapper[4672]: I0930 12:39:43.955672 4672 generic.go:334] "Generic (PLEG): container finished" podID="c87bb11c-a04e-4018-ba41-d628795a926e" containerID="1f3d33c5e0e339b002c349b5c492a03d7dbd38148a6d0a3d902d7a6298e9d6fc" exitCode=0 Sep 30 12:39:43 crc kubenswrapper[4672]: I0930 12:39:43.955710 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-9kp7h" event={"ID":"c87bb11c-a04e-4018-ba41-d628795a926e","Type":"ContainerDied","Data":"1f3d33c5e0e339b002c349b5c492a03d7dbd38148a6d0a3d902d7a6298e9d6fc"} Sep 30 12:39:44 crc kubenswrapper[4672]: I0930 12:39:44.099342 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-api-0" Sep 30 12:39:44 crc kubenswrapper[4672]: I0930 12:39:44.533380 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-api-0" Sep 30 12:39:44 crc kubenswrapper[4672]: I0930 12:39:44.971786 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a813f7c2-e727-42a0-8779-8619fe6e8165","Type":"ContainerStarted","Data":"aff9da5c297f27e5f62f1e7cceb06e20b320e3340926ac4477a12a95bb4bf9ed"} Sep 30 12:39:44 crc kubenswrapper[4672]: I0930 12:39:44.973622 4672 generic.go:334] "Generic (PLEG): container finished" podID="d637de0b-aed1-45ef-9d86-c6f7c2f188e1" containerID="c67a7804189bf25b06b8879f6f5ca3c83481b53a850e69c71c8db005c920aca8" exitCode=0 Sep 30 12:39:44 crc kubenswrapper[4672]: I0930 12:39:44.973694 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-9cnkc" event={"ID":"d637de0b-aed1-45ef-9d86-c6f7c2f188e1","Type":"ContainerDied","Data":"c67a7804189bf25b06b8879f6f5ca3c83481b53a850e69c71c8db005c920aca8"} Sep 30 12:39:44 crc kubenswrapper[4672]: I0930 12:39:44.977374 4672 generic.go:334] "Generic (PLEG): container finished" podID="4f1bee84-650b-4f0b-a657-e6701ee51823" containerID="0cd2bc9ec87f2b1fac46321bc8d94fa135c803ac11bdb697cd95415925106220" exitCode=1 Sep 30 12:39:44 crc kubenswrapper[4672]: I0930 12:39:44.977461 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"4f1bee84-650b-4f0b-a657-e6701ee51823","Type":"ContainerDied","Data":"0cd2bc9ec87f2b1fac46321bc8d94fa135c803ac11bdb697cd95415925106220"} Sep 30 12:39:44 crc kubenswrapper[4672]: I0930 12:39:44.978185 4672 scope.go:117] "RemoveContainer" containerID="0cd2bc9ec87f2b1fac46321bc8d94fa135c803ac11bdb697cd95415925106220" Sep 30 12:39:45 crc kubenswrapper[4672]: I0930 12:39:45.327297 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-applier-0" Sep 30 12:39:45 crc kubenswrapper[4672]: I0930 12:39:45.331038 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-9kp7h" Sep 30 12:39:45 crc kubenswrapper[4672]: I0930 12:39:45.375307 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-applier-0" Sep 30 12:39:45 crc kubenswrapper[4672]: I0930 12:39:45.382371 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0" Sep 30 12:39:45 crc kubenswrapper[4672]: I0930 12:39:45.382425 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0" Sep 30 12:39:45 crc kubenswrapper[4672]: I0930 12:39:45.481429 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t2sb2\" (UniqueName: \"kubernetes.io/projected/c87bb11c-a04e-4018-ba41-d628795a926e-kube-api-access-t2sb2\") pod \"c87bb11c-a04e-4018-ba41-d628795a926e\" (UID: \"c87bb11c-a04e-4018-ba41-d628795a926e\") " Sep 30 12:39:45 crc kubenswrapper[4672]: I0930 12:39:45.482038 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c87bb11c-a04e-4018-ba41-d628795a926e-config-data\") pod \"c87bb11c-a04e-4018-ba41-d628795a926e\" (UID: \"c87bb11c-a04e-4018-ba41-d628795a926e\") " Sep 30 12:39:45 crc kubenswrapper[4672]: I0930 12:39:45.482088 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c87bb11c-a04e-4018-ba41-d628795a926e-scripts\") pod \"c87bb11c-a04e-4018-ba41-d628795a926e\" (UID: \"c87bb11c-a04e-4018-ba41-d628795a926e\") " Sep 30 12:39:45 crc kubenswrapper[4672]: I0930 12:39:45.482113 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c87bb11c-a04e-4018-ba41-d628795a926e-combined-ca-bundle\") pod \"c87bb11c-a04e-4018-ba41-d628795a926e\" (UID: \"c87bb11c-a04e-4018-ba41-d628795a926e\") " Sep 30 12:39:45 crc kubenswrapper[4672]: I0930 12:39:45.482251 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c87bb11c-a04e-4018-ba41-d628795a926e-logs\") pod \"c87bb11c-a04e-4018-ba41-d628795a926e\" (UID: \"c87bb11c-a04e-4018-ba41-d628795a926e\") " Sep 30 12:39:45 crc kubenswrapper[4672]: I0930 12:39:45.482574 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c87bb11c-a04e-4018-ba41-d628795a926e-logs" (OuterVolumeSpecName: "logs") pod "c87bb11c-a04e-4018-ba41-d628795a926e" (UID: "c87bb11c-a04e-4018-ba41-d628795a926e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 12:39:45 crc kubenswrapper[4672]: I0930 12:39:45.484022 4672 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c87bb11c-a04e-4018-ba41-d628795a926e-logs\") on node \"crc\" DevicePath \"\"" Sep 30 12:39:45 crc kubenswrapper[4672]: I0930 12:39:45.505036 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c87bb11c-a04e-4018-ba41-d628795a926e-kube-api-access-t2sb2" (OuterVolumeSpecName: "kube-api-access-t2sb2") pod "c87bb11c-a04e-4018-ba41-d628795a926e" (UID: "c87bb11c-a04e-4018-ba41-d628795a926e"). InnerVolumeSpecName "kube-api-access-t2sb2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 12:39:45 crc kubenswrapper[4672]: I0930 12:39:45.505463 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c87bb11c-a04e-4018-ba41-d628795a926e-scripts" (OuterVolumeSpecName: "scripts") pod "c87bb11c-a04e-4018-ba41-d628795a926e" (UID: "c87bb11c-a04e-4018-ba41-d628795a926e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:39:45 crc kubenswrapper[4672]: I0930 12:39:45.523168 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c87bb11c-a04e-4018-ba41-d628795a926e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c87bb11c-a04e-4018-ba41-d628795a926e" (UID: "c87bb11c-a04e-4018-ba41-d628795a926e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:39:45 crc kubenswrapper[4672]: I0930 12:39:45.523633 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c87bb11c-a04e-4018-ba41-d628795a926e-config-data" (OuterVolumeSpecName: "config-data") pod "c87bb11c-a04e-4018-ba41-d628795a926e" (UID: "c87bb11c-a04e-4018-ba41-d628795a926e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:39:45 crc kubenswrapper[4672]: I0930 12:39:45.586822 4672 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c87bb11c-a04e-4018-ba41-d628795a926e-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 12:39:45 crc kubenswrapper[4672]: I0930 12:39:45.586866 4672 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c87bb11c-a04e-4018-ba41-d628795a926e-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 12:39:45 crc kubenswrapper[4672]: I0930 12:39:45.586876 4672 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c87bb11c-a04e-4018-ba41-d628795a926e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 12:39:45 crc kubenswrapper[4672]: I0930 12:39:45.586887 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t2sb2\" (UniqueName: \"kubernetes.io/projected/c87bb11c-a04e-4018-ba41-d628795a926e-kube-api-access-t2sb2\") on node \"crc\" DevicePath \"\"" Sep 30 12:39:45 crc kubenswrapper[4672]: I0930 12:39:45.992706 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-9kp7h" event={"ID":"c87bb11c-a04e-4018-ba41-d628795a926e","Type":"ContainerDied","Data":"0bbc42f823b080f05094ace0f4292c8c11d29c562e8dffad8638e4296e2a9b1d"} Sep 30 12:39:45 crc kubenswrapper[4672]: I0930 12:39:45.992744 4672 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0bbc42f823b080f05094ace0f4292c8c11d29c562e8dffad8638e4296e2a9b1d" Sep 30 12:39:45 crc kubenswrapper[4672]: I0930 12:39:45.992807 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-9kp7h" Sep 30 12:39:46 crc kubenswrapper[4672]: I0930 12:39:46.013537 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"4f1bee84-650b-4f0b-a657-e6701ee51823","Type":"ContainerStarted","Data":"50db62107d03a042bff857d072808d9278376d38aa180a36c8e188d5d240b6c0"} Sep 30 12:39:46 crc kubenswrapper[4672]: I0930 12:39:46.050828 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-applier-0" Sep 30 12:39:46 crc kubenswrapper[4672]: I0930 12:39:46.124902 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-6b77669d6-hlcjq"] Sep 30 12:39:46 crc kubenswrapper[4672]: E0930 12:39:46.125363 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c87bb11c-a04e-4018-ba41-d628795a926e" containerName="placement-db-sync" Sep 30 12:39:46 crc kubenswrapper[4672]: I0930 12:39:46.125387 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="c87bb11c-a04e-4018-ba41-d628795a926e" containerName="placement-db-sync" Sep 30 12:39:46 crc kubenswrapper[4672]: I0930 12:39:46.125609 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="c87bb11c-a04e-4018-ba41-d628795a926e" containerName="placement-db-sync" Sep 30 12:39:46 crc kubenswrapper[4672]: I0930 12:39:46.126725 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-6b77669d6-hlcjq" Sep 30 12:39:46 crc kubenswrapper[4672]: I0930 12:39:46.129933 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Sep 30 12:39:46 crc kubenswrapper[4672]: I0930 12:39:46.130096 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Sep 30 12:39:46 crc kubenswrapper[4672]: I0930 12:39:46.130231 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Sep 30 12:39:46 crc kubenswrapper[4672]: I0930 12:39:46.130346 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-zqw22" Sep 30 12:39:46 crc kubenswrapper[4672]: I0930 12:39:46.130679 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Sep 30 12:39:46 crc kubenswrapper[4672]: I0930 12:39:46.143469 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-6b77669d6-hlcjq"] Sep 30 12:39:46 crc kubenswrapper[4672]: I0930 12:39:46.206245 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jfplk\" (UniqueName: \"kubernetes.io/projected/1834109c-113d-4231-94e6-0796ef06015d-kube-api-access-jfplk\") pod \"placement-6b77669d6-hlcjq\" (UID: \"1834109c-113d-4231-94e6-0796ef06015d\") " pod="openstack/placement-6b77669d6-hlcjq" Sep 30 12:39:46 crc kubenswrapper[4672]: I0930 12:39:46.206581 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1834109c-113d-4231-94e6-0796ef06015d-scripts\") pod \"placement-6b77669d6-hlcjq\" (UID: \"1834109c-113d-4231-94e6-0796ef06015d\") " pod="openstack/placement-6b77669d6-hlcjq" Sep 30 12:39:46 crc kubenswrapper[4672]: I0930 12:39:46.206599 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/1834109c-113d-4231-94e6-0796ef06015d-combined-ca-bundle\") pod \"placement-6b77669d6-hlcjq\" (UID: \"1834109c-113d-4231-94e6-0796ef06015d\") " pod="openstack/placement-6b77669d6-hlcjq" Sep 30 12:39:46 crc kubenswrapper[4672]: I0930 12:39:46.206686 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1834109c-113d-4231-94e6-0796ef06015d-logs\") pod \"placement-6b77669d6-hlcjq\" (UID: \"1834109c-113d-4231-94e6-0796ef06015d\") " pod="openstack/placement-6b77669d6-hlcjq" Sep 30 12:39:46 crc kubenswrapper[4672]: I0930 12:39:46.206704 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1834109c-113d-4231-94e6-0796ef06015d-public-tls-certs\") pod \"placement-6b77669d6-hlcjq\" (UID: \"1834109c-113d-4231-94e6-0796ef06015d\") " pod="openstack/placement-6b77669d6-hlcjq" Sep 30 12:39:46 crc kubenswrapper[4672]: I0930 12:39:46.206829 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1834109c-113d-4231-94e6-0796ef06015d-internal-tls-certs\") pod \"placement-6b77669d6-hlcjq\" (UID: \"1834109c-113d-4231-94e6-0796ef06015d\") " pod="openstack/placement-6b77669d6-hlcjq" Sep 30 12:39:46 crc kubenswrapper[4672]: I0930 12:39:46.206871 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1834109c-113d-4231-94e6-0796ef06015d-config-data\") pod \"placement-6b77669d6-hlcjq\" (UID: \"1834109c-113d-4231-94e6-0796ef06015d\") " pod="openstack/placement-6b77669d6-hlcjq" Sep 30 12:39:46 crc kubenswrapper[4672]: I0930 12:39:46.308545 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1834109c-113d-4231-94e6-0796ef06015d-internal-tls-certs\") pod \"placement-6b77669d6-hlcjq\" (UID: \"1834109c-113d-4231-94e6-0796ef06015d\") " pod="openstack/placement-6b77669d6-hlcjq" Sep 30 12:39:46 crc kubenswrapper[4672]: I0930 12:39:46.308615 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1834109c-113d-4231-94e6-0796ef06015d-config-data\") pod \"placement-6b77669d6-hlcjq\" (UID: \"1834109c-113d-4231-94e6-0796ef06015d\") " pod="openstack/placement-6b77669d6-hlcjq" Sep 30 12:39:46 crc kubenswrapper[4672]: I0930 12:39:46.308662 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jfplk\" (UniqueName: \"kubernetes.io/projected/1834109c-113d-4231-94e6-0796ef06015d-kube-api-access-jfplk\") pod \"placement-6b77669d6-hlcjq\" (UID: \"1834109c-113d-4231-94e6-0796ef06015d\") " pod="openstack/placement-6b77669d6-hlcjq" Sep 30 12:39:46 crc kubenswrapper[4672]: I0930 12:39:46.308683 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1834109c-113d-4231-94e6-0796ef06015d-scripts\") pod \"placement-6b77669d6-hlcjq\" (UID: \"1834109c-113d-4231-94e6-0796ef06015d\") " pod="openstack/placement-6b77669d6-hlcjq" Sep 30 12:39:46 crc kubenswrapper[4672]: I0930 12:39:46.308697 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/1834109c-113d-4231-94e6-0796ef06015d-combined-ca-bundle\") pod \"placement-6b77669d6-hlcjq\" (UID: \"1834109c-113d-4231-94e6-0796ef06015d\") " pod="openstack/placement-6b77669d6-hlcjq" Sep 30 12:39:46 crc kubenswrapper[4672]: I0930 12:39:46.308756 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1834109c-113d-4231-94e6-0796ef06015d-logs\") pod \"placement-6b77669d6-hlcjq\" (UID: \"1834109c-113d-4231-94e6-0796ef06015d\") " pod="openstack/placement-6b77669d6-hlcjq" Sep 30 12:39:46 crc kubenswrapper[4672]: I0930 12:39:46.308773 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1834109c-113d-4231-94e6-0796ef06015d-public-tls-certs\") pod \"placement-6b77669d6-hlcjq\" (UID: \"1834109c-113d-4231-94e6-0796ef06015d\") " pod="openstack/placement-6b77669d6-hlcjq" Sep 30 12:39:46 crc kubenswrapper[4672]: I0930 12:39:46.312805 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1834109c-113d-4231-94e6-0796ef06015d-logs\") pod \"placement-6b77669d6-hlcjq\" (UID: \"1834109c-113d-4231-94e6-0796ef06015d\") " pod="openstack/placement-6b77669d6-hlcjq" Sep 30 12:39:46 crc kubenswrapper[4672]: I0930 12:39:46.316728 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1834109c-113d-4231-94e6-0796ef06015d-config-data\") pod \"placement-6b77669d6-hlcjq\" (UID: \"1834109c-113d-4231-94e6-0796ef06015d\") " pod="openstack/placement-6b77669d6-hlcjq" Sep 30 12:39:46 crc kubenswrapper[4672]: I0930 12:39:46.318044 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1834109c-113d-4231-94e6-0796ef06015d-combined-ca-bundle\") pod \"placement-6b77669d6-hlcjq\" (UID: \"1834109c-113d-4231-94e6-0796ef06015d\") " pod="openstack/placement-6b77669d6-hlcjq" Sep 30 12:39:46 crc kubenswrapper[4672]: I0930 12:39:46.318350 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1834109c-113d-4231-94e6-0796ef06015d-public-tls-certs\") pod \"placement-6b77669d6-hlcjq\" (UID: \"1834109c-113d-4231-94e6-0796ef06015d\") " pod="openstack/placement-6b77669d6-hlcjq" Sep 30 12:39:46 crc kubenswrapper[4672]: I0930 12:39:46.319633 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1834109c-113d-4231-94e6-0796ef06015d-scripts\") pod \"placement-6b77669d6-hlcjq\" (UID: \"1834109c-113d-4231-94e6-0796ef06015d\") " pod="openstack/placement-6b77669d6-hlcjq" Sep 30 12:39:46 crc kubenswrapper[4672]: I0930 12:39:46.320864 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1834109c-113d-4231-94e6-0796ef06015d-internal-tls-certs\") pod \"placement-6b77669d6-hlcjq\" (UID: \"1834109c-113d-4231-94e6-0796ef06015d\") " pod="openstack/placement-6b77669d6-hlcjq" Sep 30 12:39:46 crc kubenswrapper[4672]: I0930 12:39:46.329109 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jfplk\" (UniqueName: \"kubernetes.io/projected/1834109c-113d-4231-94e6-0796ef06015d-kube-api-access-jfplk\") pod \"placement-6b77669d6-hlcjq\" (UID: \"1834109c-113d-4231-94e6-0796ef06015d\") " 
pod="openstack/placement-6b77669d6-hlcjq" Sep 30 12:39:46 crc kubenswrapper[4672]: I0930 12:39:46.424874 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-9cnkc" Sep 30 12:39:46 crc kubenswrapper[4672]: I0930 12:39:46.485939 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-6b77669d6-hlcjq" Sep 30 12:39:46 crc kubenswrapper[4672]: I0930 12:39:46.513086 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d637de0b-aed1-45ef-9d86-c6f7c2f188e1-db-sync-config-data\") pod \"d637de0b-aed1-45ef-9d86-c6f7c2f188e1\" (UID: \"d637de0b-aed1-45ef-9d86-c6f7c2f188e1\") " Sep 30 12:39:46 crc kubenswrapper[4672]: I0930 12:39:46.513176 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-57c2w\" (UniqueName: \"kubernetes.io/projected/d637de0b-aed1-45ef-9d86-c6f7c2f188e1-kube-api-access-57c2w\") pod \"d637de0b-aed1-45ef-9d86-c6f7c2f188e1\" (UID: \"d637de0b-aed1-45ef-9d86-c6f7c2f188e1\") " Sep 30 12:39:46 crc kubenswrapper[4672]: I0930 12:39:46.513299 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d637de0b-aed1-45ef-9d86-c6f7c2f188e1-combined-ca-bundle\") pod \"d637de0b-aed1-45ef-9d86-c6f7c2f188e1\" (UID: \"d637de0b-aed1-45ef-9d86-c6f7c2f188e1\") " Sep 30 12:39:46 crc kubenswrapper[4672]: I0930 12:39:46.516765 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d637de0b-aed1-45ef-9d86-c6f7c2f188e1-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "d637de0b-aed1-45ef-9d86-c6f7c2f188e1" (UID: "d637de0b-aed1-45ef-9d86-c6f7c2f188e1"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:39:46 crc kubenswrapper[4672]: I0930 12:39:46.518255 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d637de0b-aed1-45ef-9d86-c6f7c2f188e1-kube-api-access-57c2w" (OuterVolumeSpecName: "kube-api-access-57c2w") pod "d637de0b-aed1-45ef-9d86-c6f7c2f188e1" (UID: "d637de0b-aed1-45ef-9d86-c6f7c2f188e1"). InnerVolumeSpecName "kube-api-access-57c2w". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 12:39:46 crc kubenswrapper[4672]: I0930 12:39:46.568398 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d637de0b-aed1-45ef-9d86-c6f7c2f188e1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d637de0b-aed1-45ef-9d86-c6f7c2f188e1" (UID: "d637de0b-aed1-45ef-9d86-c6f7c2f188e1"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:39:46 crc kubenswrapper[4672]: I0930 12:39:46.615599 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-57c2w\" (UniqueName: \"kubernetes.io/projected/d637de0b-aed1-45ef-9d86-c6f7c2f188e1-kube-api-access-57c2w\") on node \"crc\" DevicePath \"\"" Sep 30 12:39:46 crc kubenswrapper[4672]: I0930 12:39:46.615865 4672 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d637de0b-aed1-45ef-9d86-c6f7c2f188e1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 12:39:46 crc kubenswrapper[4672]: I0930 12:39:46.615878 4672 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d637de0b-aed1-45ef-9d86-c6f7c2f188e1-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 12:39:46 crc kubenswrapper[4672]: I0930 12:39:46.996907 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-6b77669d6-hlcjq"] Sep 30 12:39:47 crc kubenswrapper[4672]: I0930 12:39:47.038791 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6b77669d6-hlcjq" event={"ID":"1834109c-113d-4231-94e6-0796ef06015d","Type":"ContainerStarted","Data":"9c78043930a8e0e6c42da4709b7879699e3f392702d36f6b351c673573a485e5"} Sep 30 12:39:47 crc kubenswrapper[4672]: I0930 12:39:47.041324 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-9cnkc" Sep 30 12:39:47 crc kubenswrapper[4672]: I0930 12:39:47.042373 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-9cnkc" event={"ID":"d637de0b-aed1-45ef-9d86-c6f7c2f188e1","Type":"ContainerDied","Data":"7f0da9ba6f558c390034153bf121c1b68733ce24abfd4c3da5aa859fff0b2b5d"} Sep 30 12:39:47 crc kubenswrapper[4672]: I0930 12:39:47.042408 4672 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7f0da9ba6f558c390034153bf121c1b68733ce24abfd4c3da5aa859fff0b2b5d" Sep 30 12:39:47 crc kubenswrapper[4672]: I0930 12:39:47.252413 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-6bd7cdf79c-ddf9q"] Sep 30 12:39:47 crc kubenswrapper[4672]: E0930 12:39:47.253324 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d637de0b-aed1-45ef-9d86-c6f7c2f188e1" containerName="barbican-db-sync" Sep 30 12:39:47 crc kubenswrapper[4672]: I0930 12:39:47.253351 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="d637de0b-aed1-45ef-9d86-c6f7c2f188e1" containerName="barbican-db-sync" Sep 30 12:39:47 crc kubenswrapper[4672]: I0930 12:39:47.253576 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="d637de0b-aed1-45ef-9d86-c6f7c2f188e1" containerName="barbican-db-sync" Sep 30 12:39:47 crc kubenswrapper[4672]: I0930 12:39:47.254813 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-6bd7cdf79c-ddf9q" Sep 30 12:39:47 crc kubenswrapper[4672]: I0930 12:39:47.260018 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-w4tr6" Sep 30 12:39:47 crc kubenswrapper[4672]: I0930 12:39:47.265660 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Sep 30 12:39:47 crc kubenswrapper[4672]: I0930 12:39:47.265660 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Sep 30 12:39:47 crc kubenswrapper[4672]: I0930 12:39:47.274512 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-654fbcfdf6-vvhwm"] Sep 30 12:39:47 crc kubenswrapper[4672]: I0930 12:39:47.276537 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-654fbcfdf6-vvhwm" Sep 30 12:39:47 crc kubenswrapper[4672]: I0930 12:39:47.305648 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Sep 30 12:39:47 crc kubenswrapper[4672]: I0930 12:39:47.314841 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-6bd7cdf79c-ddf9q"] Sep 30 12:39:47 crc kubenswrapper[4672]: I0930 12:39:47.330678 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7212b048-8992-4678-b985-05a5c1fc8818-config-data-custom\") pod \"barbican-keystone-listener-654fbcfdf6-vvhwm\" (UID: \"7212b048-8992-4678-b985-05a5c1fc8818\") " pod="openstack/barbican-keystone-listener-654fbcfdf6-vvhwm" Sep 30 12:39:47 crc kubenswrapper[4672]: I0930 12:39:47.331005 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e338321e-04aa-4ed3-8b3c-0baf6888f64f-logs\") pod \"barbican-worker-6bd7cdf79c-ddf9q\" (UID: \"e338321e-04aa-4ed3-8b3c-0baf6888f64f\") " pod="openstack/barbican-worker-6bd7cdf79c-ddf9q" Sep 30 12:39:47 crc kubenswrapper[4672]: I0930 12:39:47.331085 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7212b048-8992-4678-b985-05a5c1fc8818-combined-ca-bundle\") pod \"barbican-keystone-listener-654fbcfdf6-vvhwm\" (UID: \"7212b048-8992-4678-b985-05a5c1fc8818\") " pod="openstack/barbican-keystone-listener-654fbcfdf6-vvhwm" Sep 30 12:39:47 crc kubenswrapper[4672]: I0930 12:39:47.331162 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e338321e-04aa-4ed3-8b3c-0baf6888f64f-combined-ca-bundle\") pod \"barbican-worker-6bd7cdf79c-ddf9q\" (UID: \"e338321e-04aa-4ed3-8b3c-0baf6888f64f\") " pod="openstack/barbican-worker-6bd7cdf79c-ddf9q" Sep 30 12:39:47 crc kubenswrapper[4672]: I0930 12:39:47.331249 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7212b048-8992-4678-b985-05a5c1fc8818-config-data\") pod \"barbican-keystone-listener-654fbcfdf6-vvhwm\" (UID: \"7212b048-8992-4678-b985-05a5c1fc8818\") " pod="openstack/barbican-keystone-listener-654fbcfdf6-vvhwm" Sep 30 12:39:47 crc kubenswrapper[4672]: I0930 12:39:47.331392 4672 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e338321e-04aa-4ed3-8b3c-0baf6888f64f-config-data-custom\") pod \"barbican-worker-6bd7cdf79c-ddf9q\" (UID: \"e338321e-04aa-4ed3-8b3c-0baf6888f64f\") " pod="openstack/barbican-worker-6bd7cdf79c-ddf9q" Sep 30 12:39:47 crc kubenswrapper[4672]: I0930 12:39:47.331491 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fvrvl\" (UniqueName: \"kubernetes.io/projected/7212b048-8992-4678-b985-05a5c1fc8818-kube-api-access-fvrvl\") pod \"barbican-keystone-listener-654fbcfdf6-vvhwm\" (UID: \"7212b048-8992-4678-b985-05a5c1fc8818\") " pod="openstack/barbican-keystone-listener-654fbcfdf6-vvhwm" Sep 30 12:39:47 crc kubenswrapper[4672]: I0930 12:39:47.331589 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e338321e-04aa-4ed3-8b3c-0baf6888f64f-config-data\") pod \"barbican-worker-6bd7cdf79c-ddf9q\" (UID: \"e338321e-04aa-4ed3-8b3c-0baf6888f64f\") " pod="openstack/barbican-worker-6bd7cdf79c-ddf9q" Sep 30 12:39:47 crc kubenswrapper[4672]: I0930 12:39:47.331665 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7212b048-8992-4678-b985-05a5c1fc8818-logs\") pod \"barbican-keystone-listener-654fbcfdf6-vvhwm\" (UID: \"7212b048-8992-4678-b985-05a5c1fc8818\") " pod="openstack/barbican-keystone-listener-654fbcfdf6-vvhwm" Sep 30 12:39:47 crc kubenswrapper[4672]: I0930 12:39:47.331747 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g248x\" (UniqueName: \"kubernetes.io/projected/e338321e-04aa-4ed3-8b3c-0baf6888f64f-kube-api-access-g248x\") pod \"barbican-worker-6bd7cdf79c-ddf9q\" (UID: \"e338321e-04aa-4ed3-8b3c-0baf6888f64f\") " pod="openstack/barbican-worker-6bd7cdf79c-ddf9q" Sep 30 12:39:47 crc kubenswrapper[4672]: I0930 12:39:47.335393 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-654fbcfdf6-vvhwm"] Sep 30 12:39:47 crc kubenswrapper[4672]: I0930 12:39:47.417070 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-87bdb45dc-rd86m"] Sep 30 12:39:47 crc kubenswrapper[4672]: I0930 12:39:47.428004 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-87bdb45dc-rd86m" Sep 30 12:39:47 crc kubenswrapper[4672]: I0930 12:39:47.436043 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7212b048-8992-4678-b985-05a5c1fc8818-config-data\") pod \"barbican-keystone-listener-654fbcfdf6-vvhwm\" (UID: \"7212b048-8992-4678-b985-05a5c1fc8818\") " pod="openstack/barbican-keystone-listener-654fbcfdf6-vvhwm" Sep 30 12:39:47 crc kubenswrapper[4672]: I0930 12:39:47.437735 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e338321e-04aa-4ed3-8b3c-0baf6888f64f-config-data-custom\") pod \"barbican-worker-6bd7cdf79c-ddf9q\" (UID: \"e338321e-04aa-4ed3-8b3c-0baf6888f64f\") " pod="openstack/barbican-worker-6bd7cdf79c-ddf9q" Sep 30 12:39:47 crc kubenswrapper[4672]: I0930 12:39:47.437807 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fvrvl\" (UniqueName: \"kubernetes.io/projected/7212b048-8992-4678-b985-05a5c1fc8818-kube-api-access-fvrvl\") pod \"barbican-keystone-listener-654fbcfdf6-vvhwm\" (UID: \"7212b048-8992-4678-b985-05a5c1fc8818\") " pod="openstack/barbican-keystone-listener-654fbcfdf6-vvhwm" Sep 30 12:39:47 crc kubenswrapper[4672]: I0930 12:39:47.437851 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e338321e-04aa-4ed3-8b3c-0baf6888f64f-config-data\") pod \"barbican-worker-6bd7cdf79c-ddf9q\" (UID: \"e338321e-04aa-4ed3-8b3c-0baf6888f64f\") " pod="openstack/barbican-worker-6bd7cdf79c-ddf9q" Sep 30 12:39:47 crc kubenswrapper[4672]: I0930 12:39:47.437875 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7212b048-8992-4678-b985-05a5c1fc8818-logs\") pod \"barbican-keystone-listener-654fbcfdf6-vvhwm\" (UID: \"7212b048-8992-4678-b985-05a5c1fc8818\") " pod="openstack/barbican-keystone-listener-654fbcfdf6-vvhwm" Sep 30 12:39:47 crc kubenswrapper[4672]: I0930 12:39:47.437899 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g248x\" (UniqueName: \"kubernetes.io/projected/e338321e-04aa-4ed3-8b3c-0baf6888f64f-kube-api-access-g248x\") pod \"barbican-worker-6bd7cdf79c-ddf9q\" (UID: \"e338321e-04aa-4ed3-8b3c-0baf6888f64f\") " pod="openstack/barbican-worker-6bd7cdf79c-ddf9q" Sep 30 12:39:47 crc kubenswrapper[4672]: I0930 12:39:47.437959 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7212b048-8992-4678-b985-05a5c1fc8818-config-data-custom\") pod \"barbican-keystone-listener-654fbcfdf6-vvhwm\" (UID: \"7212b048-8992-4678-b985-05a5c1fc8818\") " pod="openstack/barbican-keystone-listener-654fbcfdf6-vvhwm" Sep 30 12:39:47 crc kubenswrapper[4672]: I0930 12:39:47.438014 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e338321e-04aa-4ed3-8b3c-0baf6888f64f-logs\") pod \"barbican-worker-6bd7cdf79c-ddf9q\" (UID: \"e338321e-04aa-4ed3-8b3c-0baf6888f64f\") " pod="openstack/barbican-worker-6bd7cdf79c-ddf9q" Sep 30 12:39:47 crc kubenswrapper[4672]: I0930 12:39:47.438041 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/7212b048-8992-4678-b985-05a5c1fc8818-combined-ca-bundle\") pod \"barbican-keystone-listener-654fbcfdf6-vvhwm\" (UID: \"7212b048-8992-4678-b985-05a5c1fc8818\") " pod="openstack/barbican-keystone-listener-654fbcfdf6-vvhwm" Sep 30 12:39:47 crc kubenswrapper[4672]: I0930 12:39:47.438058 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e338321e-04aa-4ed3-8b3c-0baf6888f64f-combined-ca-bundle\") pod \"barbican-worker-6bd7cdf79c-ddf9q\" (UID: \"e338321e-04aa-4ed3-8b3c-0baf6888f64f\") " pod="openstack/barbican-worker-6bd7cdf79c-ddf9q" Sep 30 12:39:47 crc kubenswrapper[4672]: I0930 12:39:47.438631 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7212b048-8992-4678-b985-05a5c1fc8818-logs\") pod \"barbican-keystone-listener-654fbcfdf6-vvhwm\" (UID: \"7212b048-8992-4678-b985-05a5c1fc8818\") " pod="openstack/barbican-keystone-listener-654fbcfdf6-vvhwm" Sep 30 12:39:47 crc kubenswrapper[4672]: I0930 12:39:47.439069 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e338321e-04aa-4ed3-8b3c-0baf6888f64f-logs\") pod \"barbican-worker-6bd7cdf79c-ddf9q\" (UID: \"e338321e-04aa-4ed3-8b3c-0baf6888f64f\") " pod="openstack/barbican-worker-6bd7cdf79c-ddf9q" Sep 30 12:39:47 crc kubenswrapper[4672]: I0930 12:39:47.450470 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e338321e-04aa-4ed3-8b3c-0baf6888f64f-combined-ca-bundle\") pod \"barbican-worker-6bd7cdf79c-ddf9q\" (UID: \"e338321e-04aa-4ed3-8b3c-0baf6888f64f\") " pod="openstack/barbican-worker-6bd7cdf79c-ddf9q" Sep 30 12:39:47 crc kubenswrapper[4672]: I0930 12:39:47.455411 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7212b048-8992-4678-b985-05a5c1fc8818-config-data\") pod \"barbican-keystone-listener-654fbcfdf6-vvhwm\" (UID: \"7212b048-8992-4678-b985-05a5c1fc8818\") " pod="openstack/barbican-keystone-listener-654fbcfdf6-vvhwm" Sep 30 12:39:47 crc kubenswrapper[4672]: I0930 12:39:47.456359 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e338321e-04aa-4ed3-8b3c-0baf6888f64f-config-data\") pod \"barbican-worker-6bd7cdf79c-ddf9q\" (UID: \"e338321e-04aa-4ed3-8b3c-0baf6888f64f\") " pod="openstack/barbican-worker-6bd7cdf79c-ddf9q" Sep 30 12:39:47 crc kubenswrapper[4672]: I0930 12:39:47.457894 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e338321e-04aa-4ed3-8b3c-0baf6888f64f-config-data-custom\") pod \"barbican-worker-6bd7cdf79c-ddf9q\" (UID: \"e338321e-04aa-4ed3-8b3c-0baf6888f64f\") " pod="openstack/barbican-worker-6bd7cdf79c-ddf9q" Sep 30 12:39:47 crc kubenswrapper[4672]: I0930 12:39:47.461314 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7212b048-8992-4678-b985-05a5c1fc8818-config-data-custom\") pod \"barbican-keystone-listener-654fbcfdf6-vvhwm\" (UID: \"7212b048-8992-4678-b985-05a5c1fc8818\") " pod="openstack/barbican-keystone-listener-654fbcfdf6-vvhwm" Sep 30 12:39:47 crc kubenswrapper[4672]: I0930 12:39:47.463110 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/7212b048-8992-4678-b985-05a5c1fc8818-combined-ca-bundle\") pod \"barbican-keystone-listener-654fbcfdf6-vvhwm\" (UID: \"7212b048-8992-4678-b985-05a5c1fc8818\") " pod="openstack/barbican-keystone-listener-654fbcfdf6-vvhwm" Sep 30 12:39:47 crc kubenswrapper[4672]: I0930 12:39:47.466567 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fvrvl\" (UniqueName: \"kubernetes.io/projected/7212b048-8992-4678-b985-05a5c1fc8818-kube-api-access-fvrvl\") pod \"barbican-keystone-listener-654fbcfdf6-vvhwm\" (UID: \"7212b048-8992-4678-b985-05a5c1fc8818\") " pod="openstack/barbican-keystone-listener-654fbcfdf6-vvhwm" Sep 30 12:39:47 crc kubenswrapper[4672]: I0930 12:39:47.475829 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g248x\" (UniqueName: \"kubernetes.io/projected/e338321e-04aa-4ed3-8b3c-0baf6888f64f-kube-api-access-g248x\") pod \"barbican-worker-6bd7cdf79c-ddf9q\" (UID: \"e338321e-04aa-4ed3-8b3c-0baf6888f64f\") " pod="openstack/barbican-worker-6bd7cdf79c-ddf9q" Sep 30 12:39:47 crc kubenswrapper[4672]: I0930 12:39:47.511614 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-87bdb45dc-rd86m"] Sep 30 12:39:47 crc kubenswrapper[4672]: I0930 12:39:47.511666 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-8679776b6d-5ltf9"] Sep 30 12:39:47 crc kubenswrapper[4672]: I0930 12:39:47.516586 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-8679776b6d-5ltf9" Sep 30 12:39:47 crc kubenswrapper[4672]: I0930 12:39:47.520475 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Sep 30 12:39:47 crc kubenswrapper[4672]: I0930 12:39:47.529513 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-8679776b6d-5ltf9"] Sep 30 12:39:47 crc kubenswrapper[4672]: I0930 12:39:47.540012 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b4c9f76e-4eaf-4d2c-a47c-a2051ed93ba2-ovsdbserver-nb\") pod \"dnsmasq-dns-87bdb45dc-rd86m\" (UID: \"b4c9f76e-4eaf-4d2c-a47c-a2051ed93ba2\") " pod="openstack/dnsmasq-dns-87bdb45dc-rd86m" Sep 30 12:39:47 crc kubenswrapper[4672]: I0930 12:39:47.540064 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lsb8k\" (UniqueName: \"kubernetes.io/projected/b4c9f76e-4eaf-4d2c-a47c-a2051ed93ba2-kube-api-access-lsb8k\") pod \"dnsmasq-dns-87bdb45dc-rd86m\" (UID: \"b4c9f76e-4eaf-4d2c-a47c-a2051ed93ba2\") " pod="openstack/dnsmasq-dns-87bdb45dc-rd86m" Sep 30 12:39:47 crc kubenswrapper[4672]: I0930 12:39:47.540093 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b4c9f76e-4eaf-4d2c-a47c-a2051ed93ba2-ovsdbserver-sb\") pod \"dnsmasq-dns-87bdb45dc-rd86m\" (UID: \"b4c9f76e-4eaf-4d2c-a47c-a2051ed93ba2\") " pod="openstack/dnsmasq-dns-87bdb45dc-rd86m" Sep 30 12:39:47 crc kubenswrapper[4672]: I0930 12:39:47.540112 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b4c9f76e-4eaf-4d2c-a47c-a2051ed93ba2-dns-swift-storage-0\") pod \"dnsmasq-dns-87bdb45dc-rd86m\" (UID: \"b4c9f76e-4eaf-4d2c-a47c-a2051ed93ba2\") " 
pod="openstack/dnsmasq-dns-87bdb45dc-rd86m" Sep 30 12:39:47 crc kubenswrapper[4672]: I0930 12:39:47.540158 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b4c9f76e-4eaf-4d2c-a47c-a2051ed93ba2-config\") pod \"dnsmasq-dns-87bdb45dc-rd86m\" (UID: \"b4c9f76e-4eaf-4d2c-a47c-a2051ed93ba2\") " pod="openstack/dnsmasq-dns-87bdb45dc-rd86m" Sep 30 12:39:47 crc kubenswrapper[4672]: I0930 12:39:47.540183 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b4c9f76e-4eaf-4d2c-a47c-a2051ed93ba2-dns-svc\") pod \"dnsmasq-dns-87bdb45dc-rd86m\" (UID: \"b4c9f76e-4eaf-4d2c-a47c-a2051ed93ba2\") " pod="openstack/dnsmasq-dns-87bdb45dc-rd86m" Sep 30 12:39:47 crc kubenswrapper[4672]: I0930 12:39:47.616394 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-6bd7cdf79c-ddf9q" Sep 30 12:39:47 crc kubenswrapper[4672]: I0930 12:39:47.641663 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b4c9f76e-4eaf-4d2c-a47c-a2051ed93ba2-config\") pod \"dnsmasq-dns-87bdb45dc-rd86m\" (UID: \"b4c9f76e-4eaf-4d2c-a47c-a2051ed93ba2\") " pod="openstack/dnsmasq-dns-87bdb45dc-rd86m" Sep 30 12:39:47 crc kubenswrapper[4672]: I0930 12:39:47.641711 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/91b5c532-f855-46f2-967b-c53fc7f0ffee-logs\") pod \"barbican-api-8679776b6d-5ltf9\" (UID: \"91b5c532-f855-46f2-967b-c53fc7f0ffee\") " pod="openstack/barbican-api-8679776b6d-5ltf9" Sep 30 12:39:47 crc kubenswrapper[4672]: I0930 12:39:47.641742 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b4c9f76e-4eaf-4d2c-a47c-a2051ed93ba2-dns-svc\") pod \"dnsmasq-dns-87bdb45dc-rd86m\" (UID: \"b4c9f76e-4eaf-4d2c-a47c-a2051ed93ba2\") " pod="openstack/dnsmasq-dns-87bdb45dc-rd86m" Sep 30 12:39:47 crc kubenswrapper[4672]: I0930 12:39:47.641826 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b8tzs\" (UniqueName: \"kubernetes.io/projected/91b5c532-f855-46f2-967b-c53fc7f0ffee-kube-api-access-b8tzs\") pod \"barbican-api-8679776b6d-5ltf9\" (UID: \"91b5c532-f855-46f2-967b-c53fc7f0ffee\") " pod="openstack/barbican-api-8679776b6d-5ltf9" Sep 30 12:39:47 crc kubenswrapper[4672]: I0930 12:39:47.642457 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/91b5c532-f855-46f2-967b-c53fc7f0ffee-config-data-custom\") pod \"barbican-api-8679776b6d-5ltf9\" (UID: \"91b5c532-f855-46f2-967b-c53fc7f0ffee\") " pod="openstack/barbican-api-8679776b6d-5ltf9" Sep 30 12:39:47 crc kubenswrapper[4672]: I0930 12:39:47.642547 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b4c9f76e-4eaf-4d2c-a47c-a2051ed93ba2-ovsdbserver-nb\") pod \"dnsmasq-dns-87bdb45dc-rd86m\" (UID: \"b4c9f76e-4eaf-4d2c-a47c-a2051ed93ba2\") " pod="openstack/dnsmasq-dns-87bdb45dc-rd86m" Sep 30 12:39:47 crc kubenswrapper[4672]: I0930 12:39:47.642632 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lsb8k\" 
(UniqueName: \"kubernetes.io/projected/b4c9f76e-4eaf-4d2c-a47c-a2051ed93ba2-kube-api-access-lsb8k\") pod \"dnsmasq-dns-87bdb45dc-rd86m\" (UID: \"b4c9f76e-4eaf-4d2c-a47c-a2051ed93ba2\") " pod="openstack/dnsmasq-dns-87bdb45dc-rd86m" Sep 30 12:39:47 crc kubenswrapper[4672]: I0930 12:39:47.642665 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91b5c532-f855-46f2-967b-c53fc7f0ffee-combined-ca-bundle\") pod \"barbican-api-8679776b6d-5ltf9\" (UID: \"91b5c532-f855-46f2-967b-c53fc7f0ffee\") " pod="openstack/barbican-api-8679776b6d-5ltf9" Sep 30 12:39:47 crc kubenswrapper[4672]: I0930 12:39:47.642740 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b4c9f76e-4eaf-4d2c-a47c-a2051ed93ba2-ovsdbserver-sb\") pod \"dnsmasq-dns-87bdb45dc-rd86m\" (UID: \"b4c9f76e-4eaf-4d2c-a47c-a2051ed93ba2\") " pod="openstack/dnsmasq-dns-87bdb45dc-rd86m" Sep 30 12:39:47 crc kubenswrapper[4672]: I0930 12:39:47.642822 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b4c9f76e-4eaf-4d2c-a47c-a2051ed93ba2-dns-swift-storage-0\") pod \"dnsmasq-dns-87bdb45dc-rd86m\" (UID: \"b4c9f76e-4eaf-4d2c-a47c-a2051ed93ba2\") " pod="openstack/dnsmasq-dns-87bdb45dc-rd86m" Sep 30 12:39:47 crc kubenswrapper[4672]: I0930 12:39:47.642908 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/91b5c532-f855-46f2-967b-c53fc7f0ffee-config-data\") pod \"barbican-api-8679776b6d-5ltf9\" (UID: \"91b5c532-f855-46f2-967b-c53fc7f0ffee\") " pod="openstack/barbican-api-8679776b6d-5ltf9" Sep 30 12:39:47 crc kubenswrapper[4672]: I0930 12:39:47.648709 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b4c9f76e-4eaf-4d2c-a47c-a2051ed93ba2-config\") pod \"dnsmasq-dns-87bdb45dc-rd86m\" (UID: \"b4c9f76e-4eaf-4d2c-a47c-a2051ed93ba2\") " pod="openstack/dnsmasq-dns-87bdb45dc-rd86m" Sep 30 12:39:47 crc kubenswrapper[4672]: I0930 12:39:47.649590 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b4c9f76e-4eaf-4d2c-a47c-a2051ed93ba2-ovsdbserver-sb\") pod \"dnsmasq-dns-87bdb45dc-rd86m\" (UID: \"b4c9f76e-4eaf-4d2c-a47c-a2051ed93ba2\") " pod="openstack/dnsmasq-dns-87bdb45dc-rd86m" Sep 30 12:39:47 crc kubenswrapper[4672]: I0930 12:39:47.650128 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b4c9f76e-4eaf-4d2c-a47c-a2051ed93ba2-dns-swift-storage-0\") pod \"dnsmasq-dns-87bdb45dc-rd86m\" (UID: \"b4c9f76e-4eaf-4d2c-a47c-a2051ed93ba2\") " pod="openstack/dnsmasq-dns-87bdb45dc-rd86m" Sep 30 12:39:47 crc kubenswrapper[4672]: I0930 12:39:47.655109 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-654fbcfdf6-vvhwm" Sep 30 12:39:47 crc kubenswrapper[4672]: I0930 12:39:47.657432 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b4c9f76e-4eaf-4d2c-a47c-a2051ed93ba2-dns-svc\") pod \"dnsmasq-dns-87bdb45dc-rd86m\" (UID: \"b4c9f76e-4eaf-4d2c-a47c-a2051ed93ba2\") " pod="openstack/dnsmasq-dns-87bdb45dc-rd86m" Sep 30 12:39:47 crc kubenswrapper[4672]: I0930 12:39:47.657594 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b4c9f76e-4eaf-4d2c-a47c-a2051ed93ba2-ovsdbserver-nb\") pod \"dnsmasq-dns-87bdb45dc-rd86m\" (UID: \"b4c9f76e-4eaf-4d2c-a47c-a2051ed93ba2\") " pod="openstack/dnsmasq-dns-87bdb45dc-rd86m" Sep 30 12:39:47 crc kubenswrapper[4672]: I0930 12:39:47.685543 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lsb8k\" (UniqueName: \"kubernetes.io/projected/b4c9f76e-4eaf-4d2c-a47c-a2051ed93ba2-kube-api-access-lsb8k\") pod \"dnsmasq-dns-87bdb45dc-rd86m\" (UID: \"b4c9f76e-4eaf-4d2c-a47c-a2051ed93ba2\") " pod="openstack/dnsmasq-dns-87bdb45dc-rd86m" Sep 30 12:39:47 crc kubenswrapper[4672]: I0930 12:39:47.749359 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91b5c532-f855-46f2-967b-c53fc7f0ffee-combined-ca-bundle\") pod \"barbican-api-8679776b6d-5ltf9\" (UID: \"91b5c532-f855-46f2-967b-c53fc7f0ffee\") " pod="openstack/barbican-api-8679776b6d-5ltf9" Sep 30 12:39:47 crc kubenswrapper[4672]: I0930 12:39:47.749431 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/91b5c532-f855-46f2-967b-c53fc7f0ffee-config-data\") pod \"barbican-api-8679776b6d-5ltf9\" (UID: \"91b5c532-f855-46f2-967b-c53fc7f0ffee\") " pod="openstack/barbican-api-8679776b6d-5ltf9" Sep 30 12:39:47 crc kubenswrapper[4672]: I0930 12:39:47.749469 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/91b5c532-f855-46f2-967b-c53fc7f0ffee-logs\") pod \"barbican-api-8679776b6d-5ltf9\" (UID: \"91b5c532-f855-46f2-967b-c53fc7f0ffee\") " pod="openstack/barbican-api-8679776b6d-5ltf9" Sep 30 12:39:47 crc kubenswrapper[4672]: I0930 12:39:47.749496 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b8tzs\" (UniqueName: \"kubernetes.io/projected/91b5c532-f855-46f2-967b-c53fc7f0ffee-kube-api-access-b8tzs\") pod \"barbican-api-8679776b6d-5ltf9\" (UID: \"91b5c532-f855-46f2-967b-c53fc7f0ffee\") " pod="openstack/barbican-api-8679776b6d-5ltf9" Sep 30 12:39:47 crc kubenswrapper[4672]: I0930 12:39:47.749572 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/91b5c532-f855-46f2-967b-c53fc7f0ffee-config-data-custom\") pod \"barbican-api-8679776b6d-5ltf9\" (UID: \"91b5c532-f855-46f2-967b-c53fc7f0ffee\") " pod="openstack/barbican-api-8679776b6d-5ltf9" Sep 30 12:39:47 crc kubenswrapper[4672]: I0930 12:39:47.757590 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/91b5c532-f855-46f2-967b-c53fc7f0ffee-logs\") pod \"barbican-api-8679776b6d-5ltf9\" (UID: \"91b5c532-f855-46f2-967b-c53fc7f0ffee\") " pod="openstack/barbican-api-8679776b6d-5ltf9" Sep 30 12:39:47 crc 
kubenswrapper[4672]: I0930 12:39:47.766129 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91b5c532-f855-46f2-967b-c53fc7f0ffee-combined-ca-bundle\") pod \"barbican-api-8679776b6d-5ltf9\" (UID: \"91b5c532-f855-46f2-967b-c53fc7f0ffee\") " pod="openstack/barbican-api-8679776b6d-5ltf9" Sep 30 12:39:47 crc kubenswrapper[4672]: I0930 12:39:47.767752 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/91b5c532-f855-46f2-967b-c53fc7f0ffee-config-data\") pod \"barbican-api-8679776b6d-5ltf9\" (UID: \"91b5c532-f855-46f2-967b-c53fc7f0ffee\") " pod="openstack/barbican-api-8679776b6d-5ltf9" Sep 30 12:39:47 crc kubenswrapper[4672]: I0930 12:39:47.780092 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-87bdb45dc-rd86m" Sep 30 12:39:47 crc kubenswrapper[4672]: I0930 12:39:47.781085 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/91b5c532-f855-46f2-967b-c53fc7f0ffee-config-data-custom\") pod \"barbican-api-8679776b6d-5ltf9\" (UID: \"91b5c532-f855-46f2-967b-c53fc7f0ffee\") " pod="openstack/barbican-api-8679776b6d-5ltf9" Sep 30 12:39:47 crc kubenswrapper[4672]: I0930 12:39:47.807425 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b8tzs\" (UniqueName: \"kubernetes.io/projected/91b5c532-f855-46f2-967b-c53fc7f0ffee-kube-api-access-b8tzs\") pod \"barbican-api-8679776b6d-5ltf9\" (UID: \"91b5c532-f855-46f2-967b-c53fc7f0ffee\") " pod="openstack/barbican-api-8679776b6d-5ltf9" Sep 30 12:39:47 crc kubenswrapper[4672]: I0930 12:39:47.906078 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-8679776b6d-5ltf9" Sep 30 12:39:48 crc kubenswrapper[4672]: I0930 12:39:48.054398 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-844b6c9474-6tpzt" Sep 30 12:39:48 crc kubenswrapper[4672]: I0930 12:39:48.054753 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-844b6c9474-6tpzt" Sep 30 12:39:48 crc kubenswrapper[4672]: I0930 12:39:48.057664 4672 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-844b6c9474-6tpzt" podUID="2659b35e-ecb1-416b-8a94-690759645536" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.164:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.164:8443: connect: connection refused" Sep 30 12:39:48 crc kubenswrapper[4672]: I0930 12:39:48.099025 4672 generic.go:334] "Generic (PLEG): container finished" podID="4dcfda15-b815-4733-b3c7-0312473f7355" containerID="2ca40cc4750bf50f77b8f00e761b61a2c75ac13af658d7a70bb22c83ab627d31" exitCode=137 Sep 30 12:39:48 crc kubenswrapper[4672]: I0930 12:39:48.099072 4672 generic.go:334] "Generic (PLEG): container finished" podID="4dcfda15-b815-4733-b3c7-0312473f7355" containerID="501b2136d15d3cf240eff2b922fac22c611eb5f0c9b5a54fd75a073a669049c1" exitCode=137 Sep 30 12:39:48 crc kubenswrapper[4672]: I0930 12:39:48.099146 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6f749887b9-hl4rd" event={"ID":"4dcfda15-b815-4733-b3c7-0312473f7355","Type":"ContainerDied","Data":"2ca40cc4750bf50f77b8f00e761b61a2c75ac13af658d7a70bb22c83ab627d31"} Sep 30 12:39:48 crc kubenswrapper[4672]: I0930 12:39:48.099214 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6f749887b9-hl4rd" event={"ID":"4dcfda15-b815-4733-b3c7-0312473f7355","Type":"ContainerDied","Data":"501b2136d15d3cf240eff2b922fac22c611eb5f0c9b5a54fd75a073a669049c1"} Sep 30 12:39:48 crc kubenswrapper[4672]: I0930 12:39:48.110862 4672 generic.go:334] "Generic (PLEG): container finished" podID="96f92d5d-ba87-4de0-955d-998845ea9010" containerID="586c3bebe202fe92c0504e86b49cf04794ba5b560a39275854f3a1033b6a055d" exitCode=137 Sep 30 12:39:48 crc kubenswrapper[4672]: I0930 12:39:48.110896 4672 generic.go:334] "Generic (PLEG): container finished" podID="96f92d5d-ba87-4de0-955d-998845ea9010" containerID="6fe46b3cd368d55265ed69ecd7caa903a2d78380fec92da12e77686aa3dddc3a" exitCode=137 Sep 30 12:39:48 crc kubenswrapper[4672]: I0930 12:39:48.110991 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-567d7b57-5g9zw" event={"ID":"96f92d5d-ba87-4de0-955d-998845ea9010","Type":"ContainerDied","Data":"586c3bebe202fe92c0504e86b49cf04794ba5b560a39275854f3a1033b6a055d"} Sep 30 12:39:48 crc kubenswrapper[4672]: I0930 12:39:48.111034 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-567d7b57-5g9zw" event={"ID":"96f92d5d-ba87-4de0-955d-998845ea9010","Type":"ContainerDied","Data":"6fe46b3cd368d55265ed69ecd7caa903a2d78380fec92da12e77686aa3dddc3a"} Sep 30 12:39:48 crc kubenswrapper[4672]: I0930 12:39:48.155312 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-548589fd89-v67ld" Sep 30 12:39:48 crc kubenswrapper[4672]: I0930 12:39:48.172872 4672 generic.go:334] "Generic (PLEG): container finished" podID="8dd596d1-129e-4bb2-9e9b-dac1d09323d2" containerID="9b9337d34ba5a7760d1895b7d4882e842cb61d0a824ae87f39ace0e82aaf87af" exitCode=137 Sep 30 12:39:48 crc kubenswrapper[4672]: I0930 12:39:48.172909 4672 generic.go:334] "Generic (PLEG): container finished" podID="8dd596d1-129e-4bb2-9e9b-dac1d09323d2" containerID="dce429d2825b6bacf88cae8bbe4c173141d791716ef64eb0fe317d4e29e4bcc1" exitCode=137 Sep 30 12:39:48 crc kubenswrapper[4672]: I0930 12:39:48.172983 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-548589fd89-v67ld" event={"ID":"8dd596d1-129e-4bb2-9e9b-dac1d09323d2","Type":"ContainerDied","Data":"9b9337d34ba5a7760d1895b7d4882e842cb61d0a824ae87f39ace0e82aaf87af"} Sep 30 12:39:48 crc kubenswrapper[4672]: I0930 12:39:48.173011 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-548589fd89-v67ld" event={"ID":"8dd596d1-129e-4bb2-9e9b-dac1d09323d2","Type":"ContainerDied","Data":"dce429d2825b6bacf88cae8bbe4c173141d791716ef64eb0fe317d4e29e4bcc1"} Sep 30 12:39:48 crc kubenswrapper[4672]: I0930 12:39:48.173027 4672 scope.go:117] "RemoveContainer" containerID="9b9337d34ba5a7760d1895b7d4882e842cb61d0a824ae87f39ace0e82aaf87af" Sep 30 12:39:48 crc kubenswrapper[4672]: I0930 12:39:48.208290 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6b77669d6-hlcjq" event={"ID":"1834109c-113d-4231-94e6-0796ef06015d","Type":"ContainerStarted","Data":"ee3569739dc3b94a2cec9b207903f763c543e8ab919c9111f33153168e9aec6d"} Sep 30 12:39:48 crc kubenswrapper[4672]: I0930 12:39:48.285597 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8dd596d1-129e-4bb2-9e9b-dac1d09323d2-config-data\") pod \"8dd596d1-129e-4bb2-9e9b-dac1d09323d2\" (UID: \"8dd596d1-129e-4bb2-9e9b-dac1d09323d2\") " Sep 30 12:39:48 crc kubenswrapper[4672]: I0930 12:39:48.285690 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8dd596d1-129e-4bb2-9e9b-dac1d09323d2-scripts\") pod \"8dd596d1-129e-4bb2-9e9b-dac1d09323d2\" (UID: \"8dd596d1-129e-4bb2-9e9b-dac1d09323d2\") " Sep 30 12:39:48 crc kubenswrapper[4672]: I0930 12:39:48.285726 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/8dd596d1-129e-4bb2-9e9b-dac1d09323d2-horizon-secret-key\") pod \"8dd596d1-129e-4bb2-9e9b-dac1d09323d2\" (UID: \"8dd596d1-129e-4bb2-9e9b-dac1d09323d2\") " Sep 30 12:39:48 crc kubenswrapper[4672]: I0930 12:39:48.285821 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8dd596d1-129e-4bb2-9e9b-dac1d09323d2-logs\") pod \"8dd596d1-129e-4bb2-9e9b-dac1d09323d2\" (UID: \"8dd596d1-129e-4bb2-9e9b-dac1d09323d2\") " Sep 30 12:39:48 crc kubenswrapper[4672]: I0930 12:39:48.285898 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9l9cb\" (UniqueName: \"kubernetes.io/projected/8dd596d1-129e-4bb2-9e9b-dac1d09323d2-kube-api-access-9l9cb\") pod \"8dd596d1-129e-4bb2-9e9b-dac1d09323d2\" (UID: \"8dd596d1-129e-4bb2-9e9b-dac1d09323d2\") " Sep 30 12:39:48 crc kubenswrapper[4672]: I0930 12:39:48.296490 4672 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8dd596d1-129e-4bb2-9e9b-dac1d09323d2-kube-api-access-9l9cb" (OuterVolumeSpecName: "kube-api-access-9l9cb") pod "8dd596d1-129e-4bb2-9e9b-dac1d09323d2" (UID: "8dd596d1-129e-4bb2-9e9b-dac1d09323d2"). InnerVolumeSpecName "kube-api-access-9l9cb". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 12:39:48 crc kubenswrapper[4672]: I0930 12:39:48.299655 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8dd596d1-129e-4bb2-9e9b-dac1d09323d2-logs" (OuterVolumeSpecName: "logs") pod "8dd596d1-129e-4bb2-9e9b-dac1d09323d2" (UID: "8dd596d1-129e-4bb2-9e9b-dac1d09323d2"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 12:39:48 crc kubenswrapper[4672]: I0930 12:39:48.333453 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8dd596d1-129e-4bb2-9e9b-dac1d09323d2-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "8dd596d1-129e-4bb2-9e9b-dac1d09323d2" (UID: "8dd596d1-129e-4bb2-9e9b-dac1d09323d2"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:39:48 crc kubenswrapper[4672]: I0930 12:39:48.336480 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8dd596d1-129e-4bb2-9e9b-dac1d09323d2-scripts" (OuterVolumeSpecName: "scripts") pod "8dd596d1-129e-4bb2-9e9b-dac1d09323d2" (UID: "8dd596d1-129e-4bb2-9e9b-dac1d09323d2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 12:39:48 crc kubenswrapper[4672]: I0930 12:39:48.344803 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8dd596d1-129e-4bb2-9e9b-dac1d09323d2-config-data" (OuterVolumeSpecName: "config-data") pod "8dd596d1-129e-4bb2-9e9b-dac1d09323d2" (UID: "8dd596d1-129e-4bb2-9e9b-dac1d09323d2"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 12:39:48 crc kubenswrapper[4672]: I0930 12:39:48.375231 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Sep 30 12:39:48 crc kubenswrapper[4672]: I0930 12:39:48.375847 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Sep 30 12:39:48 crc kubenswrapper[4672]: I0930 12:39:48.376501 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Sep 30 12:39:48 crc kubenswrapper[4672]: I0930 12:39:48.376759 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Sep 30 12:39:48 crc kubenswrapper[4672]: I0930 12:39:48.391644 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9l9cb\" (UniqueName: \"kubernetes.io/projected/8dd596d1-129e-4bb2-9e9b-dac1d09323d2-kube-api-access-9l9cb\") on node \"crc\" DevicePath \"\"" Sep 30 12:39:48 crc kubenswrapper[4672]: I0930 12:39:48.391685 4672 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8dd596d1-129e-4bb2-9e9b-dac1d09323d2-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 12:39:48 crc kubenswrapper[4672]: I0930 12:39:48.391698 4672 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8dd596d1-129e-4bb2-9e9b-dac1d09323d2-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 12:39:48 crc kubenswrapper[4672]: I0930 12:39:48.391709 4672 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/8dd596d1-129e-4bb2-9e9b-dac1d09323d2-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Sep 30 12:39:48 crc kubenswrapper[4672]: I0930 12:39:48.391719 4672 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8dd596d1-129e-4bb2-9e9b-dac1d09323d2-logs\") on node \"crc\" DevicePath \"\"" Sep 30 12:39:48 crc kubenswrapper[4672]: I0930 12:39:48.424821 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Sep 30 12:39:48 crc kubenswrapper[4672]: I0930 12:39:48.424848 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Sep 30 12:39:48 crc kubenswrapper[4672]: I0930 12:39:48.424858 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Sep 30 12:39:48 crc kubenswrapper[4672]: I0930 12:39:48.424865 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Sep 30 12:39:48 crc kubenswrapper[4672]: I0930 12:39:48.424874 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Sep 30 12:39:48 crc kubenswrapper[4672]: I0930 12:39:48.428995 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Sep 30 12:39:48 crc kubenswrapper[4672]: I0930 12:39:48.512693 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Sep 30 12:39:48 crc kubenswrapper[4672]: I0930 12:39:48.540786 4672 scope.go:117] "RemoveContainer" containerID="dce429d2825b6bacf88cae8bbe4c173141d791716ef64eb0fe317d4e29e4bcc1" 
Sep 30 12:39:48 crc kubenswrapper[4672]: I0930 12:39:48.556330 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0"
Sep 30 12:39:48 crc kubenswrapper[4672]: I0930 12:39:48.670369 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-6bd7cdf79c-ddf9q"]
Sep 30 12:39:48 crc kubenswrapper[4672]: I0930 12:39:48.785709 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-567d7b57-5g9zw"
Sep 30 12:39:48 crc kubenswrapper[4672]: I0930 12:39:48.812024 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6f749887b9-hl4rd"
Sep 30 12:39:48 crc kubenswrapper[4672]: I0930 12:39:48.856726 4672 scope.go:117] "RemoveContainer" containerID="9b9337d34ba5a7760d1895b7d4882e842cb61d0a824ae87f39ace0e82aaf87af"
Sep 30 12:39:48 crc kubenswrapper[4672]: E0930 12:39:48.857537 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9b9337d34ba5a7760d1895b7d4882e842cb61d0a824ae87f39ace0e82aaf87af\": container with ID starting with 9b9337d34ba5a7760d1895b7d4882e842cb61d0a824ae87f39ace0e82aaf87af not found: ID does not exist" containerID="9b9337d34ba5a7760d1895b7d4882e842cb61d0a824ae87f39ace0e82aaf87af"
Sep 30 12:39:48 crc kubenswrapper[4672]: I0930 12:39:48.857579 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b9337d34ba5a7760d1895b7d4882e842cb61d0a824ae87f39ace0e82aaf87af"} err="failed to get container status \"9b9337d34ba5a7760d1895b7d4882e842cb61d0a824ae87f39ace0e82aaf87af\": rpc error: code = NotFound desc = could not find container \"9b9337d34ba5a7760d1895b7d4882e842cb61d0a824ae87f39ace0e82aaf87af\": container with ID starting with 9b9337d34ba5a7760d1895b7d4882e842cb61d0a824ae87f39ace0e82aaf87af not found: ID does not exist"
Sep 30 12:39:48 crc kubenswrapper[4672]: I0930 12:39:48.857608 4672 scope.go:117] "RemoveContainer" containerID="dce429d2825b6bacf88cae8bbe4c173141d791716ef64eb0fe317d4e29e4bcc1"
Sep 30 12:39:48 crc kubenswrapper[4672]: E0930 12:39:48.857841 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dce429d2825b6bacf88cae8bbe4c173141d791716ef64eb0fe317d4e29e4bcc1\": container with ID starting with dce429d2825b6bacf88cae8bbe4c173141d791716ef64eb0fe317d4e29e4bcc1 not found: ID does not exist" containerID="dce429d2825b6bacf88cae8bbe4c173141d791716ef64eb0fe317d4e29e4bcc1"
Sep 30 12:39:48 crc kubenswrapper[4672]: I0930 12:39:48.857874 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dce429d2825b6bacf88cae8bbe4c173141d791716ef64eb0fe317d4e29e4bcc1"} err="failed to get container status \"dce429d2825b6bacf88cae8bbe4c173141d791716ef64eb0fe317d4e29e4bcc1\": rpc error: code = NotFound desc = could not find container \"dce429d2825b6bacf88cae8bbe4c173141d791716ef64eb0fe317d4e29e4bcc1\": container with ID starting with dce429d2825b6bacf88cae8bbe4c173141d791716ef64eb0fe317d4e29e4bcc1 not found: ID does not exist"
Sep 30 12:39:48 crc kubenswrapper[4672]: I0930 12:39:48.882508 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-654fbcfdf6-vvhwm"]
Sep 30 12:39:48 crc kubenswrapper[4672]: I0930 12:39:48.907805 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-87bdb45dc-rd86m"]
Sep 30 12:39:48 crc kubenswrapper[4672]: I0930 12:39:48.915039 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4dcfda15-b815-4733-b3c7-0312473f7355-scripts\") pod \"4dcfda15-b815-4733-b3c7-0312473f7355\" (UID: \"4dcfda15-b815-4733-b3c7-0312473f7355\") "
Sep 30 12:39:48 crc kubenswrapper[4672]: I0930 12:39:48.915146 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ptk7m\" (UniqueName: \"kubernetes.io/projected/96f92d5d-ba87-4de0-955d-998845ea9010-kube-api-access-ptk7m\") pod \"96f92d5d-ba87-4de0-955d-998845ea9010\" (UID: \"96f92d5d-ba87-4de0-955d-998845ea9010\") "
Sep 30 12:39:48 crc kubenswrapper[4672]: I0930 12:39:48.915177 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/4dcfda15-b815-4733-b3c7-0312473f7355-horizon-secret-key\") pod \"4dcfda15-b815-4733-b3c7-0312473f7355\" (UID: \"4dcfda15-b815-4733-b3c7-0312473f7355\") "
Sep 30 12:39:48 crc kubenswrapper[4672]: I0930 12:39:48.915214 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4dcfda15-b815-4733-b3c7-0312473f7355-logs\") pod \"4dcfda15-b815-4733-b3c7-0312473f7355\" (UID: \"4dcfda15-b815-4733-b3c7-0312473f7355\") "
Sep 30 12:39:48 crc kubenswrapper[4672]: I0930 12:39:48.915232 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6lpm6\" (UniqueName: \"kubernetes.io/projected/4dcfda15-b815-4733-b3c7-0312473f7355-kube-api-access-6lpm6\") pod \"4dcfda15-b815-4733-b3c7-0312473f7355\" (UID: \"4dcfda15-b815-4733-b3c7-0312473f7355\") "
Sep 30 12:39:48 crc kubenswrapper[4672]: I0930 12:39:48.915259 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/96f92d5d-ba87-4de0-955d-998845ea9010-logs\") pod \"96f92d5d-ba87-4de0-955d-998845ea9010\" (UID: \"96f92d5d-ba87-4de0-955d-998845ea9010\") "
Sep 30 12:39:48 crc kubenswrapper[4672]: I0930 12:39:48.915356 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/96f92d5d-ba87-4de0-955d-998845ea9010-config-data\") pod \"96f92d5d-ba87-4de0-955d-998845ea9010\" (UID: \"96f92d5d-ba87-4de0-955d-998845ea9010\") "
Sep 30 12:39:48 crc kubenswrapper[4672]: I0930 12:39:48.915389 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/96f92d5d-ba87-4de0-955d-998845ea9010-horizon-secret-key\") pod \"96f92d5d-ba87-4de0-955d-998845ea9010\" (UID: \"96f92d5d-ba87-4de0-955d-998845ea9010\") "
Sep 30 12:39:48 crc kubenswrapper[4672]: I0930 12:39:48.915414 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4dcfda15-b815-4733-b3c7-0312473f7355-config-data\") pod \"4dcfda15-b815-4733-b3c7-0312473f7355\" (UID: \"4dcfda15-b815-4733-b3c7-0312473f7355\") "
Sep 30 12:39:48 crc kubenswrapper[4672]: I0930 12:39:48.915482 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/96f92d5d-ba87-4de0-955d-998845ea9010-scripts\") pod \"96f92d5d-ba87-4de0-955d-998845ea9010\" (UID: \"96f92d5d-ba87-4de0-955d-998845ea9010\") "
Sep 30 12:39:48 crc kubenswrapper[4672]: I0930 12:39:48.917685 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/96f92d5d-ba87-4de0-955d-998845ea9010-logs" (OuterVolumeSpecName: "logs") pod "96f92d5d-ba87-4de0-955d-998845ea9010" (UID: "96f92d5d-ba87-4de0-955d-998845ea9010"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 12:39:48 crc kubenswrapper[4672]: I0930 12:39:48.924412 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96f92d5d-ba87-4de0-955d-998845ea9010-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "96f92d5d-ba87-4de0-955d-998845ea9010" (UID: "96f92d5d-ba87-4de0-955d-998845ea9010"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 12:39:48 crc kubenswrapper[4672]: I0930 12:39:48.925215 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4dcfda15-b815-4733-b3c7-0312473f7355-logs" (OuterVolumeSpecName: "logs") pod "4dcfda15-b815-4733-b3c7-0312473f7355" (UID: "4dcfda15-b815-4733-b3c7-0312473f7355"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 12:39:48 crc kubenswrapper[4672]: I0930 12:39:48.933884 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96f92d5d-ba87-4de0-955d-998845ea9010-kube-api-access-ptk7m" (OuterVolumeSpecName: "kube-api-access-ptk7m") pod "96f92d5d-ba87-4de0-955d-998845ea9010" (UID: "96f92d5d-ba87-4de0-955d-998845ea9010"). InnerVolumeSpecName "kube-api-access-ptk7m". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 12:39:48 crc kubenswrapper[4672]: I0930 12:39:48.954500 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4dcfda15-b815-4733-b3c7-0312473f7355-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "4dcfda15-b815-4733-b3c7-0312473f7355" (UID: "4dcfda15-b815-4733-b3c7-0312473f7355"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 12:39:48 crc kubenswrapper[4672]: I0930 12:39:48.956193 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4dcfda15-b815-4733-b3c7-0312473f7355-kube-api-access-6lpm6" (OuterVolumeSpecName: "kube-api-access-6lpm6") pod "4dcfda15-b815-4733-b3c7-0312473f7355" (UID: "4dcfda15-b815-4733-b3c7-0312473f7355"). InnerVolumeSpecName "kube-api-access-6lpm6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 12:39:48 crc kubenswrapper[4672]: I0930 12:39:48.982249 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4dcfda15-b815-4733-b3c7-0312473f7355-config-data" (OuterVolumeSpecName: "config-data") pod "4dcfda15-b815-4733-b3c7-0312473f7355" (UID: "4dcfda15-b815-4733-b3c7-0312473f7355"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 12:39:49 crc kubenswrapper[4672]: I0930 12:39:49.018536 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ptk7m\" (UniqueName: \"kubernetes.io/projected/96f92d5d-ba87-4de0-955d-998845ea9010-kube-api-access-ptk7m\") on node \"crc\" DevicePath \"\""
Sep 30 12:39:49 crc kubenswrapper[4672]: I0930 12:39:49.019125 4672 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/4dcfda15-b815-4733-b3c7-0312473f7355-horizon-secret-key\") on node \"crc\" DevicePath \"\""
Sep 30 12:39:49 crc kubenswrapper[4672]: I0930 12:39:49.019239 4672 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4dcfda15-b815-4733-b3c7-0312473f7355-logs\") on node \"crc\" DevicePath \"\""
Sep 30 12:39:49 crc kubenswrapper[4672]: I0930 12:39:49.020480 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6lpm6\" (UniqueName: \"kubernetes.io/projected/4dcfda15-b815-4733-b3c7-0312473f7355-kube-api-access-6lpm6\") on node \"crc\" DevicePath \"\""
Sep 30 12:39:49 crc kubenswrapper[4672]: I0930 12:39:49.020694 4672 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/96f92d5d-ba87-4de0-955d-998845ea9010-logs\") on node \"crc\" DevicePath \"\""
Sep 30 12:39:49 crc kubenswrapper[4672]: I0930 12:39:49.020945 4672 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/96f92d5d-ba87-4de0-955d-998845ea9010-horizon-secret-key\") on node \"crc\" DevicePath \"\""
Sep 30 12:39:49 crc kubenswrapper[4672]: I0930 12:39:49.021067 4672 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4dcfda15-b815-4733-b3c7-0312473f7355-config-data\") on node \"crc\" DevicePath \"\""
Sep 30 12:39:49 crc kubenswrapper[4672]: I0930 12:39:49.044748 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-8679776b6d-5ltf9"]
Sep 30 12:39:49 crc kubenswrapper[4672]: I0930 12:39:49.067962 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4dcfda15-b815-4733-b3c7-0312473f7355-scripts" (OuterVolumeSpecName: "scripts") pod "4dcfda15-b815-4733-b3c7-0312473f7355" (UID: "4dcfda15-b815-4733-b3c7-0312473f7355"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 12:39:49 crc kubenswrapper[4672]: I0930 12:39:49.090954 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/96f92d5d-ba87-4de0-955d-998845ea9010-config-data" (OuterVolumeSpecName: "config-data") pod "96f92d5d-ba87-4de0-955d-998845ea9010" (UID: "96f92d5d-ba87-4de0-955d-998845ea9010"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 12:39:49 crc kubenswrapper[4672]: I0930 12:39:49.104016 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/96f92d5d-ba87-4de0-955d-998845ea9010-scripts" (OuterVolumeSpecName: "scripts") pod "96f92d5d-ba87-4de0-955d-998845ea9010" (UID: "96f92d5d-ba87-4de0-955d-998845ea9010"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 12:39:49 crc kubenswrapper[4672]: I0930 12:39:49.123462 4672 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/96f92d5d-ba87-4de0-955d-998845ea9010-config-data\") on node \"crc\" DevicePath \"\""
Sep 30 12:39:49 crc kubenswrapper[4672]: I0930 12:39:49.123500 4672 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/96f92d5d-ba87-4de0-955d-998845ea9010-scripts\") on node \"crc\" DevicePath \"\""
Sep 30 12:39:49 crc kubenswrapper[4672]: I0930 12:39:49.123509 4672 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4dcfda15-b815-4733-b3c7-0312473f7355-scripts\") on node \"crc\" DevicePath \"\""
Sep 30 12:39:49 crc kubenswrapper[4672]: I0930 12:39:49.249156 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-654fbcfdf6-vvhwm" event={"ID":"7212b048-8992-4678-b985-05a5c1fc8818","Type":"ContainerStarted","Data":"f02c3765b7f7ae11f1cf0e9bae8c6e209f3239e1c88ce9a32b842d99bab90b99"}
Sep 30 12:39:49 crc kubenswrapper[4672]: I0930 12:39:49.252746 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6bd7cdf79c-ddf9q" event={"ID":"e338321e-04aa-4ed3-8b3c-0baf6888f64f","Type":"ContainerStarted","Data":"4671c529d0a2914ed70dc8fa44d4023d55a217a7350e081c82cdb450ea136c41"}
Sep 30 12:39:49 crc kubenswrapper[4672]: I0930 12:39:49.254759 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6f749887b9-hl4rd"
Sep 30 12:39:49 crc kubenswrapper[4672]: I0930 12:39:49.254748 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6f749887b9-hl4rd" event={"ID":"4dcfda15-b815-4733-b3c7-0312473f7355","Type":"ContainerDied","Data":"c1064bedad9058b66ba7ec1d2dc92c472f49a11e5389258162c6744694c866f6"}
Sep 30 12:39:49 crc kubenswrapper[4672]: I0930 12:39:49.254889 4672 scope.go:117] "RemoveContainer" containerID="2ca40cc4750bf50f77b8f00e761b61a2c75ac13af658d7a70bb22c83ab627d31"
Sep 30 12:39:49 crc kubenswrapper[4672]: I0930 12:39:49.259257 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-8679776b6d-5ltf9" event={"ID":"91b5c532-f855-46f2-967b-c53fc7f0ffee","Type":"ContainerStarted","Data":"13f071aa67adefe4e58df5faa85faf1804fffe7f9dd73acf07f6835d333ade1d"}
Sep 30 12:39:49 crc kubenswrapper[4672]: I0930 12:39:49.265220 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-87bdb45dc-rd86m" event={"ID":"b4c9f76e-4eaf-4d2c-a47c-a2051ed93ba2","Type":"ContainerStarted","Data":"01a1f00a22e6646c290e3d7cdfdee8d19cede08cc01e5f7b16f4ea5b8aebb2b4"}
Sep 30 12:39:49 crc kubenswrapper[4672]: I0930 12:39:49.269252 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-567d7b57-5g9zw" event={"ID":"96f92d5d-ba87-4de0-955d-998845ea9010","Type":"ContainerDied","Data":"80e78aba304d365dd4159b7a468e860978b870437cdfa0fdc8b8f1b55919c926"}
Sep 30 12:39:49 crc kubenswrapper[4672]: I0930 12:39:49.269415 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-567d7b57-5g9zw"
Sep 30 12:39:49 crc kubenswrapper[4672]: I0930 12:39:49.272019 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-548589fd89-v67ld" event={"ID":"8dd596d1-129e-4bb2-9e9b-dac1d09323d2","Type":"ContainerDied","Data":"cbabc5d93198a8af257188261dd554e49bb4683b41e53b0d825d378f3a35bd21"}
Sep 30 12:39:49 crc kubenswrapper[4672]: I0930 12:39:49.272248 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-548589fd89-v67ld"
Sep 30 12:39:49 crc kubenswrapper[4672]: I0930 12:39:49.281600 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6b77669d6-hlcjq" event={"ID":"1834109c-113d-4231-94e6-0796ef06015d","Type":"ContainerStarted","Data":"bf4778b7a301b099dc3fff9cdd77bc5c678bd387cb8d5e72fcbc8a905ca58f91"}
Sep 30 12:39:49 crc kubenswrapper[4672]: I0930 12:39:49.354706 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-6b77669d6-hlcjq" podStartSLOduration=3.354684636 podStartE2EDuration="3.354684636s" podCreationTimestamp="2025-09-30 12:39:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 12:39:49.322977097 +0000 UTC m=+1080.592214743" watchObservedRunningTime="2025-09-30 12:39:49.354684636 +0000 UTC m=+1080.623922282"
Sep 30 12:39:49 crc kubenswrapper[4672]: I0930 12:39:49.358382 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6f749887b9-hl4rd"]
Sep 30 12:39:49 crc kubenswrapper[4672]: I0930 12:39:49.397599 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-6f749887b9-hl4rd"]
Sep 30 12:39:49 crc kubenswrapper[4672]: I0930 12:39:49.494369 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4dcfda15-b815-4733-b3c7-0312473f7355" path="/var/lib/kubelet/pods/4dcfda15-b815-4733-b3c7-0312473f7355/volumes"
Sep 30 12:39:49 crc kubenswrapper[4672]: I0930 12:39:49.495269 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-548589fd89-v67ld"]
Sep 30 12:39:49 crc kubenswrapper[4672]: I0930 12:39:49.501071 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-548589fd89-v67ld"]
Sep 30 12:39:49 crc kubenswrapper[4672]: I0930 12:39:49.542602 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-api-0"
Sep 30 12:39:49 crc kubenswrapper[4672]: I0930 12:39:49.568669 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-api-0"
Sep 30 12:39:49 crc kubenswrapper[4672]: I0930 12:39:49.572939 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-567d7b57-5g9zw"]
Sep 30 12:39:49 crc kubenswrapper[4672]: I0930 12:39:49.603183 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-567d7b57-5g9zw"]
Sep 30 12:39:49 crc kubenswrapper[4672]: I0930 12:39:49.752397 4672 scope.go:117] "RemoveContainer" containerID="501b2136d15d3cf240eff2b922fac22c611eb5f0c9b5a54fd75a073a669049c1"
Sep 30 12:39:49 crc kubenswrapper[4672]: I0930 12:39:49.877795 4672 scope.go:117] "RemoveContainer" containerID="586c3bebe202fe92c0504e86b49cf04794ba5b560a39275854f3a1033b6a055d"
Sep 30 12:39:50 crc kubenswrapper[4672]: I0930 12:39:50.030987 4672 scope.go:117] "RemoveContainer" containerID="6fe46b3cd368d55265ed69ecd7caa903a2d78380fec92da12e77686aa3dddc3a"
Sep 30 12:39:50 crc kubenswrapper[4672]: I0930 12:39:50.320368 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-8679776b6d-5ltf9" event={"ID":"91b5c532-f855-46f2-967b-c53fc7f0ffee","Type":"ContainerStarted","Data":"942b2d771116a4d5f33d53d0e4d71b134ec103c8d95b81a353d7feec16663451"}
Sep 30 12:39:50 crc kubenswrapper[4672]: I0930 12:39:50.320671 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-8679776b6d-5ltf9" event={"ID":"91b5c532-f855-46f2-967b-c53fc7f0ffee","Type":"ContainerStarted","Data":"a7421a9ec7c3523789edfcf1025f60a25c066b03056cb50d0c1e11f088795c32"}
Sep 30 12:39:50 crc kubenswrapper[4672]: I0930 12:39:50.321046 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-8679776b6d-5ltf9"
Sep 30 12:39:50 crc kubenswrapper[4672]: I0930 12:39:50.321092 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-8679776b6d-5ltf9"
Sep 30 12:39:50 crc kubenswrapper[4672]: I0930 12:39:50.334836 4672 generic.go:334] "Generic (PLEG): container finished" podID="b4c9f76e-4eaf-4d2c-a47c-a2051ed93ba2" containerID="f47c4a088df60f928a4b40e38da0134120f0babe26ab003bd0faa04e95716642" exitCode=0
Sep 30 12:39:50 crc kubenswrapper[4672]: I0930 12:39:50.334978 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-87bdb45dc-rd86m" event={"ID":"b4c9f76e-4eaf-4d2c-a47c-a2051ed93ba2","Type":"ContainerDied","Data":"f47c4a088df60f928a4b40e38da0134120f0babe26ab003bd0faa04e95716642"}
Sep 30 12:39:50 crc kubenswrapper[4672]: I0930 12:39:50.349213 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-8679776b6d-5ltf9" podStartSLOduration=3.349194626 podStartE2EDuration="3.349194626s" podCreationTimestamp="2025-09-30 12:39:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 12:39:50.341540441 +0000 UTC m=+1081.610778087" watchObservedRunningTime="2025-09-30 12:39:50.349194626 +0000 UTC m=+1081.618432272"
Sep 30 12:39:50 crc kubenswrapper[4672]: I0930 12:39:50.358227 4672 generic.go:334] "Generic (PLEG): container finished" podID="4f1bee84-650b-4f0b-a657-e6701ee51823" containerID="50db62107d03a042bff857d072808d9278376d38aa180a36c8e188d5d240b6c0" exitCode=1
Sep 30 12:39:50 crc kubenswrapper[4672]: I0930 12:39:50.358361 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"4f1bee84-650b-4f0b-a657-e6701ee51823","Type":"ContainerDied","Data":"50db62107d03a042bff857d072808d9278376d38aa180a36c8e188d5d240b6c0"}
Sep 30 12:39:50 crc kubenswrapper[4672]: I0930 12:39:50.358429 4672 scope.go:117] "RemoveContainer" containerID="0cd2bc9ec87f2b1fac46321bc8d94fa135c803ac11bdb697cd95415925106220"
Sep 30 12:39:50 crc kubenswrapper[4672]: I0930 12:39:50.359153 4672 scope.go:117] "RemoveContainer" containerID="50db62107d03a042bff857d072808d9278376d38aa180a36c8e188d5d240b6c0"
Sep 30 12:39:50 crc kubenswrapper[4672]: E0930 12:39:50.359576 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 10s restarting failed container=watcher-decision-engine pod=watcher-decision-engine-0_openstack(4f1bee84-650b-4f0b-a657-e6701ee51823)\"" pod="openstack/watcher-decision-engine-0" podUID="4f1bee84-650b-4f0b-a657-e6701ee51823"
Sep 30 12:39:50 crc kubenswrapper[4672]: I0930 12:39:50.364039 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-6b77669d6-hlcjq"
Sep 30 12:39:50 crc kubenswrapper[4672]: I0930 12:39:50.364324 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-6b77669d6-hlcjq"
Sep 30 12:39:50 crc kubenswrapper[4672]: I0930 12:39:50.397941 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-api-0"
Sep 30 12:39:51 crc kubenswrapper[4672]: I0930 12:39:51.143994 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-7c888d4d9d-4nr45"]
Sep 30 12:39:51 crc kubenswrapper[4672]: E0930 12:39:51.152134 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8dd596d1-129e-4bb2-9e9b-dac1d09323d2" containerName="horizon"
Sep 30 12:39:51 crc kubenswrapper[4672]: I0930 12:39:51.152222 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="8dd596d1-129e-4bb2-9e9b-dac1d09323d2" containerName="horizon"
Sep 30 12:39:51 crc kubenswrapper[4672]: E0930 12:39:51.152298 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96f92d5d-ba87-4de0-955d-998845ea9010" containerName="horizon"
Sep 30 12:39:51 crc kubenswrapper[4672]: I0930 12:39:51.152348 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="96f92d5d-ba87-4de0-955d-998845ea9010" containerName="horizon"
Sep 30 12:39:51 crc kubenswrapper[4672]: E0930 12:39:51.152400 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4dcfda15-b815-4733-b3c7-0312473f7355" containerName="horizon-log"
Sep 30 12:39:51 crc kubenswrapper[4672]: I0930 12:39:51.152457 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="4dcfda15-b815-4733-b3c7-0312473f7355" containerName="horizon-log"
Sep 30 12:39:51 crc kubenswrapper[4672]: E0930 12:39:51.152510 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96f92d5d-ba87-4de0-955d-998845ea9010" containerName="horizon-log"
Sep 30 12:39:51 crc kubenswrapper[4672]: I0930 12:39:51.152557 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="96f92d5d-ba87-4de0-955d-998845ea9010" containerName="horizon-log"
Sep 30 12:39:51 crc kubenswrapper[4672]: E0930 12:39:51.152612 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8dd596d1-129e-4bb2-9e9b-dac1d09323d2" containerName="horizon-log"
Sep 30 12:39:51 crc kubenswrapper[4672]: I0930 12:39:51.152661 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="8dd596d1-129e-4bb2-9e9b-dac1d09323d2" containerName="horizon-log"
Sep 30 12:39:51 crc kubenswrapper[4672]: E0930 12:39:51.152719 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4dcfda15-b815-4733-b3c7-0312473f7355" containerName="horizon"
Sep 30 12:39:51 crc kubenswrapper[4672]: I0930 12:39:51.152771 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="4dcfda15-b815-4733-b3c7-0312473f7355" containerName="horizon"
Sep 30 12:39:51 crc kubenswrapper[4672]: I0930 12:39:51.152993 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="96f92d5d-ba87-4de0-955d-998845ea9010" containerName="horizon"
Sep 30 12:39:51 crc kubenswrapper[4672]: I0930 12:39:51.153047 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="4dcfda15-b815-4733-b3c7-0312473f7355" containerName="horizon-log"
Sep 30 12:39:51 crc kubenswrapper[4672]: I0930 12:39:51.153097 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="8dd596d1-129e-4bb2-9e9b-dac1d09323d2" containerName="horizon"
Sep 30 12:39:51 crc kubenswrapper[4672]: I0930 12:39:51.153149 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="4dcfda15-b815-4733-b3c7-0312473f7355" containerName="horizon"
Sep 30 12:39:51 crc kubenswrapper[4672]: I0930 12:39:51.153197 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="8dd596d1-129e-4bb2-9e9b-dac1d09323d2" containerName="horizon-log"
Sep 30 12:39:51 crc kubenswrapper[4672]: I0930 12:39:51.153251 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="96f92d5d-ba87-4de0-955d-998845ea9010" containerName="horizon-log"
Sep 30 12:39:51 crc kubenswrapper[4672]: I0930 12:39:51.154335 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-7c888d4d9d-4nr45"
Sep 30 12:39:51 crc kubenswrapper[4672]: I0930 12:39:51.166567 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc"
Sep 30 12:39:51 crc kubenswrapper[4672]: I0930 12:39:51.166849 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc"
Sep 30 12:39:51 crc kubenswrapper[4672]: I0930 12:39:51.203633 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7c888d4d9d-4nr45"]
Sep 30 12:39:51 crc kubenswrapper[4672]: I0930 12:39:51.288490 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b67355f-8081-4014-ad68-6e31faa794b1-combined-ca-bundle\") pod \"barbican-api-7c888d4d9d-4nr45\" (UID: \"7b67355f-8081-4014-ad68-6e31faa794b1\") " pod="openstack/barbican-api-7c888d4d9d-4nr45"
Sep 30 12:39:51 crc kubenswrapper[4672]: I0930 12:39:51.288549 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b67355f-8081-4014-ad68-6e31faa794b1-config-data\") pod \"barbican-api-7c888d4d9d-4nr45\" (UID: \"7b67355f-8081-4014-ad68-6e31faa794b1\") " pod="openstack/barbican-api-7c888d4d9d-4nr45"
Sep 30 12:39:51 crc kubenswrapper[4672]: I0930 12:39:51.288583 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7b67355f-8081-4014-ad68-6e31faa794b1-public-tls-certs\") pod \"barbican-api-7c888d4d9d-4nr45\" (UID: \"7b67355f-8081-4014-ad68-6e31faa794b1\") " pod="openstack/barbican-api-7c888d4d9d-4nr45"
Sep 30 12:39:51 crc kubenswrapper[4672]: I0930 12:39:51.288684 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z4s6m\" (UniqueName: \"kubernetes.io/projected/7b67355f-8081-4014-ad68-6e31faa794b1-kube-api-access-z4s6m\") pod \"barbican-api-7c888d4d9d-4nr45\" (UID: \"7b67355f-8081-4014-ad68-6e31faa794b1\") " pod="openstack/barbican-api-7c888d4d9d-4nr45"
Sep 30 12:39:51 crc kubenswrapper[4672]: I0930 12:39:51.288761 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7b67355f-8081-4014-ad68-6e31faa794b1-internal-tls-certs\") pod \"barbican-api-7c888d4d9d-4nr45\" (UID: \"7b67355f-8081-4014-ad68-6e31faa794b1\") " pod="openstack/barbican-api-7c888d4d9d-4nr45"
Sep 30 12:39:51 crc kubenswrapper[4672]: I0930 12:39:51.288789 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7b67355f-8081-4014-ad68-6e31faa794b1-config-data-custom\") pod \"barbican-api-7c888d4d9d-4nr45\" (UID: \"7b67355f-8081-4014-ad68-6e31faa794b1\") " pod="openstack/barbican-api-7c888d4d9d-4nr45"
Sep 30 12:39:51 crc kubenswrapper[4672]: I0930 12:39:51.288818 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7b67355f-8081-4014-ad68-6e31faa794b1-logs\") pod \"barbican-api-7c888d4d9d-4nr45\" (UID: \"7b67355f-8081-4014-ad68-6e31faa794b1\") " pod="openstack/barbican-api-7c888d4d9d-4nr45"
Sep 30 12:39:51 crc kubenswrapper[4672]: I0930 12:39:51.390211 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z4s6m\" (UniqueName: \"kubernetes.io/projected/7b67355f-8081-4014-ad68-6e31faa794b1-kube-api-access-z4s6m\") pod \"barbican-api-7c888d4d9d-4nr45\" (UID: \"7b67355f-8081-4014-ad68-6e31faa794b1\") " pod="openstack/barbican-api-7c888d4d9d-4nr45"
Sep 30 12:39:51 crc kubenswrapper[4672]: I0930 12:39:51.390286 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7b67355f-8081-4014-ad68-6e31faa794b1-internal-tls-certs\") pod \"barbican-api-7c888d4d9d-4nr45\" (UID: \"7b67355f-8081-4014-ad68-6e31faa794b1\") " pod="openstack/barbican-api-7c888d4d9d-4nr45"
Sep 30 12:39:51 crc kubenswrapper[4672]: I0930 12:39:51.390311 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7b67355f-8081-4014-ad68-6e31faa794b1-config-data-custom\") pod \"barbican-api-7c888d4d9d-4nr45\" (UID: \"7b67355f-8081-4014-ad68-6e31faa794b1\") " pod="openstack/barbican-api-7c888d4d9d-4nr45"
Sep 30 12:39:51 crc kubenswrapper[4672]: I0930 12:39:51.390334 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7b67355f-8081-4014-ad68-6e31faa794b1-logs\") pod \"barbican-api-7c888d4d9d-4nr45\" (UID: \"7b67355f-8081-4014-ad68-6e31faa794b1\") " pod="openstack/barbican-api-7c888d4d9d-4nr45"
Sep 30 12:39:51 crc kubenswrapper[4672]: I0930 12:39:51.390412 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b67355f-8081-4014-ad68-6e31faa794b1-combined-ca-bundle\") pod \"barbican-api-7c888d4d9d-4nr45\" (UID: \"7b67355f-8081-4014-ad68-6e31faa794b1\") " pod="openstack/barbican-api-7c888d4d9d-4nr45"
Sep 30 12:39:51 crc kubenswrapper[4672]: I0930 12:39:51.390430 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b67355f-8081-4014-ad68-6e31faa794b1-config-data\") pod \"barbican-api-7c888d4d9d-4nr45\" (UID: \"7b67355f-8081-4014-ad68-6e31faa794b1\") " pod="openstack/barbican-api-7c888d4d9d-4nr45"
Sep 30 12:39:51 crc kubenswrapper[4672]: I0930 12:39:51.390469 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7b67355f-8081-4014-ad68-6e31faa794b1-public-tls-certs\") pod \"barbican-api-7c888d4d9d-4nr45\" (UID: \"7b67355f-8081-4014-ad68-6e31faa794b1\") " pod="openstack/barbican-api-7c888d4d9d-4nr45"
Sep 30 12:39:51 crc kubenswrapper[4672]: I0930 12:39:51.393089 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7b67355f-8081-4014-ad68-6e31faa794b1-logs\") pod \"barbican-api-7c888d4d9d-4nr45\" (UID: \"7b67355f-8081-4014-ad68-6e31faa794b1\") " pod="openstack/barbican-api-7c888d4d9d-4nr45"
Sep 30 12:39:51 crc kubenswrapper[4672]: I0930 12:39:51.399314 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b67355f-8081-4014-ad68-6e31faa794b1-combined-ca-bundle\") pod \"barbican-api-7c888d4d9d-4nr45\" (UID: \"7b67355f-8081-4014-ad68-6e31faa794b1\") " pod="openstack/barbican-api-7c888d4d9d-4nr45"
Sep 30 12:39:51 crc kubenswrapper[4672]: I0930 12:39:51.399956 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7b67355f-8081-4014-ad68-6e31faa794b1-internal-tls-certs\") pod \"barbican-api-7c888d4d9d-4nr45\" (UID: \"7b67355f-8081-4014-ad68-6e31faa794b1\") " pod="openstack/barbican-api-7c888d4d9d-4nr45"
Sep 30 12:39:51 crc kubenswrapper[4672]: I0930 12:39:51.409844 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7b67355f-8081-4014-ad68-6e31faa794b1-config-data-custom\") pod \"barbican-api-7c888d4d9d-4nr45\" (UID: \"7b67355f-8081-4014-ad68-6e31faa794b1\") " pod="openstack/barbican-api-7c888d4d9d-4nr45"
Sep 30 12:39:51 crc kubenswrapper[4672]: I0930 12:39:51.411919 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b67355f-8081-4014-ad68-6e31faa794b1-config-data\") pod \"barbican-api-7c888d4d9d-4nr45\" (UID: \"7b67355f-8081-4014-ad68-6e31faa794b1\") " pod="openstack/barbican-api-7c888d4d9d-4nr45"
Sep 30 12:39:51 crc kubenswrapper[4672]: I0930 12:39:51.416690 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7b67355f-8081-4014-ad68-6e31faa794b1-public-tls-certs\") pod \"barbican-api-7c888d4d9d-4nr45\" (UID: \"7b67355f-8081-4014-ad68-6e31faa794b1\") " pod="openstack/barbican-api-7c888d4d9d-4nr45"
Sep 30 12:39:51 crc kubenswrapper[4672]: I0930 12:39:51.429078 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z4s6m\" (UniqueName: \"kubernetes.io/projected/7b67355f-8081-4014-ad68-6e31faa794b1-kube-api-access-z4s6m\") pod \"barbican-api-7c888d4d9d-4nr45\" (UID: \"7b67355f-8081-4014-ad68-6e31faa794b1\") " pod="openstack/barbican-api-7c888d4d9d-4nr45"
Sep 30 12:39:51 crc kubenswrapper[4672]: I0930 12:39:51.465744 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8dd596d1-129e-4bb2-9e9b-dac1d09323d2" path="/var/lib/kubelet/pods/8dd596d1-129e-4bb2-9e9b-dac1d09323d2/volumes"
Sep 30 12:39:51 crc kubenswrapper[4672]: I0930 12:39:51.466704 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96f92d5d-ba87-4de0-955d-998845ea9010" path="/var/lib/kubelet/pods/96f92d5d-ba87-4de0-955d-998845ea9010/volumes"
Sep 30 12:39:51 crc kubenswrapper[4672]: I0930 12:39:51.505773 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-7c888d4d9d-4nr45"
Sep 30 12:39:52 crc kubenswrapper[4672]: I0930 12:39:52.449545 4672 generic.go:334] "Generic (PLEG): container finished" podID="db83994c-a577-4d20-a544-3950abb7273b" containerID="2ff2e0394a73d16946196cb54745bc30f70ac4238c5fc730f22b5d9ec507751f" exitCode=0
Sep 30 12:39:52 crc kubenswrapper[4672]: I0930 12:39:52.450374 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-rd9t8" event={"ID":"db83994c-a577-4d20-a544-3950abb7273b","Type":"ContainerDied","Data":"2ff2e0394a73d16946196cb54745bc30f70ac4238c5fc730f22b5d9ec507751f"}
Sep 30 12:39:52 crc kubenswrapper[4672]: I0930 12:39:52.916330 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7c888d4d9d-4nr45"]
Sep 30 12:39:53 crc kubenswrapper[4672]: I0930 12:39:53.397537 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0"
Sep 30 12:39:53 crc kubenswrapper[4672]: I0930 12:39:53.397659 4672 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Sep 30 12:39:53 crc kubenswrapper[4672]: I0930 12:39:53.409963 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0"
Sep 30 12:39:53 crc kubenswrapper[4672]: I0930 12:39:53.445113 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0"
Sep 30 12:39:53 crc kubenswrapper[4672]: I0930 12:39:53.445174 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0"
Sep 30 12:39:53 crc kubenswrapper[4672]: I0930 12:39:53.468562 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-654fbcfdf6-vvhwm" event={"ID":"7212b048-8992-4678-b985-05a5c1fc8818","Type":"ContainerStarted","Data":"27140d7afd7ed240551dfc498b389a86652c9797b9857b1c0bf50084885d3a12"}
Sep 30 12:39:53 crc kubenswrapper[4672]: I0930 12:39:53.478586 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6bd7cdf79c-ddf9q" event={"ID":"e338321e-04aa-4ed3-8b3c-0baf6888f64f","Type":"ContainerStarted","Data":"2efbee88f951035290e58deb9cf85b6ad4ccad94df1e347bda90306f4ff93cbf"}
Sep 30 12:39:53 crc kubenswrapper[4672]: I0930 12:39:53.491416 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-87bdb45dc-rd86m" event={"ID":"b4c9f76e-4eaf-4d2c-a47c-a2051ed93ba2","Type":"ContainerStarted","Data":"a2dae8ca75c42aaca8cdefa9f55d04972e24db9dbd29681732d67195693be198"}
Sep 30 12:39:53 crc kubenswrapper[4672]: I0930 12:39:53.491566 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-87bdb45dc-rd86m"
Sep 30 12:39:53 crc kubenswrapper[4672]: I0930 12:39:53.518483 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-87bdb45dc-rd86m" podStartSLOduration=6.51846621 podStartE2EDuration="6.51846621s" podCreationTimestamp="2025-09-30 12:39:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 12:39:53.515901624 +0000 UTC m=+1084.785139270" watchObservedRunningTime="2025-09-30 12:39:53.51846621 +0000 UTC m=+1084.787703856"
Sep 30 12:39:55 crc kubenswrapper[4672]: I0930 12:39:55.382068 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0"
Sep 30 12:39:55 crc kubenswrapper[4672]: I0930 12:39:55.382427 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0"
Sep 30 12:39:55 crc kubenswrapper[4672]: I0930 12:39:55.383788 4672 scope.go:117] "RemoveContainer" containerID="50db62107d03a042bff857d072808d9278376d38aa180a36c8e188d5d240b6c0"
Sep 30 12:39:55 crc kubenswrapper[4672]: E0930 12:39:55.384078 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 10s restarting failed container=watcher-decision-engine pod=watcher-decision-engine-0_openstack(4f1bee84-650b-4f0b-a657-e6701ee51823)\"" pod="openstack/watcher-decision-engine-0" podUID="4f1bee84-650b-4f0b-a657-e6701ee51823"
Sep 30 12:39:56 crc kubenswrapper[4672]: W0930 12:39:56.497485 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7b67355f_8081_4014_ad68_6e31faa794b1.slice/crio-f15a84c7efaa7659224342114814b8aa5b1c280d8b22b63fec7d6ebf4dcf0133 WatchSource:0}: Error finding container f15a84c7efaa7659224342114814b8aa5b1c280d8b22b63fec7d6ebf4dcf0133: Status 404 returned error can't find the container with id f15a84c7efaa7659224342114814b8aa5b1c280d8b22b63fec7d6ebf4dcf0133
Sep 30 12:39:56 crc kubenswrapper[4672]: I0930 12:39:56.523932 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7c888d4d9d-4nr45" event={"ID":"7b67355f-8081-4014-ad68-6e31faa794b1","Type":"ContainerStarted","Data":"f15a84c7efaa7659224342114814b8aa5b1c280d8b22b63fec7d6ebf4dcf0133"}
Sep 30 12:39:56 crc kubenswrapper[4672]: I0930 12:39:56.525468 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-rd9t8" event={"ID":"db83994c-a577-4d20-a544-3950abb7273b","Type":"ContainerDied","Data":"37b877bf0bb61bf00bbd38e8ea18b6d0da92c4aee9595c3e5d776fcecd51cf63"}
Sep 30 12:39:56 crc kubenswrapper[4672]: I0930 12:39:56.525496 4672 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="37b877bf0bb61bf00bbd38e8ea18b6d0da92c4aee9595c3e5d776fcecd51cf63"
Sep 30 12:39:56 crc kubenswrapper[4672]: I0930 12:39:56.592816 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-rd9t8"
Sep 30 12:39:56 crc kubenswrapper[4672]: I0930 12:39:56.711575 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-49b65\" (UniqueName: \"kubernetes.io/projected/db83994c-a577-4d20-a544-3950abb7273b-kube-api-access-49b65\") pod \"db83994c-a577-4d20-a544-3950abb7273b\" (UID: \"db83994c-a577-4d20-a544-3950abb7273b\") "
Sep 30 12:39:56 crc kubenswrapper[4672]: I0930 12:39:56.711788 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db83994c-a577-4d20-a544-3950abb7273b-combined-ca-bundle\") pod \"db83994c-a577-4d20-a544-3950abb7273b\" (UID: \"db83994c-a577-4d20-a544-3950abb7273b\") "
Sep 30 12:39:56 crc kubenswrapper[4672]: I0930 12:39:56.711893 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/db83994c-a577-4d20-a544-3950abb7273b-config\") pod \"db83994c-a577-4d20-a544-3950abb7273b\" (UID: \"db83994c-a577-4d20-a544-3950abb7273b\") "
Sep 30 12:39:56 crc kubenswrapper[4672]: I0930 12:39:56.738304 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db83994c-a577-4d20-a544-3950abb7273b-kube-api-access-49b65" (OuterVolumeSpecName: "kube-api-access-49b65") pod "db83994c-a577-4d20-a544-3950abb7273b" (UID: "db83994c-a577-4d20-a544-3950abb7273b"). InnerVolumeSpecName "kube-api-access-49b65". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 12:39:56 crc kubenswrapper[4672]: I0930 12:39:56.748452 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db83994c-a577-4d20-a544-3950abb7273b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "db83994c-a577-4d20-a544-3950abb7273b" (UID: "db83994c-a577-4d20-a544-3950abb7273b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 12:39:56 crc kubenswrapper[4672]: I0930 12:39:56.770515 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db83994c-a577-4d20-a544-3950abb7273b-config" (OuterVolumeSpecName: "config") pod "db83994c-a577-4d20-a544-3950abb7273b" (UID: "db83994c-a577-4d20-a544-3950abb7273b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 12:39:56 crc kubenswrapper[4672]: I0930 12:39:56.813906 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-49b65\" (UniqueName: \"kubernetes.io/projected/db83994c-a577-4d20-a544-3950abb7273b-kube-api-access-49b65\") on node \"crc\" DevicePath \"\""
Sep 30 12:39:56 crc kubenswrapper[4672]: I0930 12:39:56.813938 4672 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db83994c-a577-4d20-a544-3950abb7273b-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Sep 30 12:39:56 crc kubenswrapper[4672]: I0930 12:39:56.813948 4672 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/db83994c-a577-4d20-a544-3950abb7273b-config\") on node \"crc\" DevicePath \"\""
Sep 30 12:39:57 crc kubenswrapper[4672]: I0930 12:39:57.562747 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6bd7cdf79c-ddf9q" event={"ID":"e338321e-04aa-4ed3-8b3c-0baf6888f64f","Type":"ContainerStarted","Data":"ece1e95525f2a774b472e1f448f723fe7ddd9963a8e08f90254cdf6f32cfa24c"}
Sep 30 12:39:57 crc kubenswrapper[4672]: I0930 12:39:57.570693 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-654fbcfdf6-vvhwm" event={"ID":"7212b048-8992-4678-b985-05a5c1fc8818","Type":"ContainerStarted","Data":"6b12ef24565a87caf9c1824f7f958a491d5a68489d0189f83ad5d7917c1b6a65"}
Sep 30 12:39:57 crc kubenswrapper[4672]: I0930 12:39:57.585981 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-6bd7cdf79c-ddf9q" podStartSLOduration=6.816121459 podStartE2EDuration="10.585964976s" podCreationTimestamp="2025-09-30 12:39:47 +0000 UTC" firstStartedPulling="2025-09-30 12:39:48.687449211 +0000 UTC m=+1079.956686857" lastFinishedPulling="2025-09-30 12:39:52.457292708 +0000 UTC m=+1083.726530374" observedRunningTime="2025-09-30 12:39:57.581653586 +0000 UTC m=+1088.850891232" watchObservedRunningTime="2025-09-30 12:39:57.585964976 +0000 UTC m=+1088.855202622"
Sep 30 12:39:57 crc kubenswrapper[4672]: I0930 12:39:57.607153 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-654fbcfdf6-vvhwm" podStartSLOduration=7.080994122 podStartE2EDuration="10.607139127s" podCreationTimestamp="2025-09-30 12:39:47 +0000 UTC" firstStartedPulling="2025-09-30 12:39:48.895491673 +0000 UTC m=+1080.164729319" lastFinishedPulling="2025-09-30 12:39:52.421636678 +0000 UTC m=+1083.690874324" observedRunningTime="2025-09-30 12:39:57.605726171 +0000 UTC m=+1088.874963817" watchObservedRunningTime="2025-09-30 12:39:57.607139127 +0000 UTC m=+1088.876376773"
Sep 30 12:39:57 crc kubenswrapper[4672]: I0930 12:39:57.622617 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a813f7c2-e727-42a0-8779-8619fe6e8165","Type":"ContainerStarted","Data":"fe79040cae28acda1b62a728289d5dd65a43f9a989a447ed3b717c6d2b506d93"}
Sep 30 12:39:57 crc kubenswrapper[4672]: I0930 12:39:57.622836 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a813f7c2-e727-42a0-8779-8619fe6e8165" containerName="ceilometer-central-agent" containerID="cri-o://af6331668bdba254199db134ae8885d681bf81141ac6733dbfa4d255be8e9c0e" gracePeriod=30
Sep 30 12:39:57 crc kubenswrapper[4672]: I0930 12:39:57.623140 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Sep 30 12:39:57 crc kubenswrapper[4672]: I0930 12:39:57.623500 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a813f7c2-e727-42a0-8779-8619fe6e8165" containerName="proxy-httpd" containerID="cri-o://fe79040cae28acda1b62a728289d5dd65a43f9a989a447ed3b717c6d2b506d93" gracePeriod=30
Sep 30 12:39:57 crc kubenswrapper[4672]: I0930 12:39:57.623567 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a813f7c2-e727-42a0-8779-8619fe6e8165" containerName="sg-core" containerID="cri-o://aff9da5c297f27e5f62f1e7cceb06e20b320e3340926ac4477a12a95bb4bf9ed" gracePeriod=30
Sep 30 12:39:57 crc kubenswrapper[4672]: I0930 12:39:57.623613 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a813f7c2-e727-42a0-8779-8619fe6e8165" containerName="ceilometer-notification-agent" containerID="cri-o://8d223d3a6e9db292e6fb7098fd6f146e3fc9591954560d7a29c4a778db39c109" gracePeriod=30
Sep 30 12:39:57 crc kubenswrapper[4672]: I0930 12:39:57.628620 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-rd9t8"
Sep 30 12:39:57 crc kubenswrapper[4672]: I0930 12:39:57.629100 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7c888d4d9d-4nr45" event={"ID":"7b67355f-8081-4014-ad68-6e31faa794b1","Type":"ContainerStarted","Data":"48b48ddf93dc5b45bdb25e9ad468ea7dce95030c4102beacefb0bf36a6e0cd1c"}
Sep 30 12:39:57 crc kubenswrapper[4672]: I0930 12:39:57.784355 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-87bdb45dc-rd86m"
Sep 30 12:39:57 crc kubenswrapper[4672]: I0930 12:39:57.880885 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.717981382 podStartE2EDuration="1m2.880859635s" podCreationTimestamp="2025-09-30 12:38:55 +0000 UTC" firstStartedPulling="2025-09-30 12:38:57.037617742 +0000 UTC m=+1028.306855398" lastFinishedPulling="2025-09-30 12:39:57.200495995 +0000 UTC m=+1088.469733651" observedRunningTime="2025-09-30 12:39:57.6754333 +0000 UTC m=+1088.944670956" watchObservedRunningTime="2025-09-30 12:39:57.880859635 +0000 UTC m=+1089.150097281"
Sep 30 12:39:57 crc kubenswrapper[4672]: I0930 12:39:57.932343 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7fc454b69-wncsb"]
Sep 30 12:39:57 crc kubenswrapper[4672]: I0930 12:39:57.932656 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7fc454b69-wncsb" podUID="984ee7c7-9f78-45ea-876e-82c967e7c4fc" containerName="dnsmasq-dns" containerID="cri-o://8693125bc289223634c777a33aeafbcd9eb9a16affbcccfb8d0b7915623cce33" gracePeriod=10
Sep 30 12:39:57 crc kubenswrapper[4672]: I0930 12:39:57.972658 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-85554c85d5-pkzn2"]
Sep 30 12:39:57 crc kubenswrapper[4672]: E0930 12:39:57.973059 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db83994c-a577-4d20-a544-3950abb7273b" containerName="neutron-db-sync"
Sep 30 12:39:57 crc kubenswrapper[4672]: I0930 12:39:57.973071 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="db83994c-a577-4d20-a544-3950abb7273b" containerName="neutron-db-sync"
Sep 30 12:39:57 crc kubenswrapper[4672]: I0930 12:39:57.973313 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="db83994c-a577-4d20-a544-3950abb7273b" containerName="neutron-db-sync"
Sep 30 12:39:57 crc kubenswrapper[4672]: I0930 12:39:57.974292 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85554c85d5-pkzn2"
Sep 30 12:39:58 crc kubenswrapper[4672]: I0930 12:39:58.063777 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-85554c85d5-pkzn2"]
Sep 30 12:39:58 crc kubenswrapper[4672]: I0930 12:39:58.085595 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7998babe-964f-4ea9-a7a5-c11d5c7a6912-dns-swift-storage-0\") pod \"dnsmasq-dns-85554c85d5-pkzn2\" (UID: \"7998babe-964f-4ea9-a7a5-c11d5c7a6912\") " pod="openstack/dnsmasq-dns-85554c85d5-pkzn2"
Sep 30 12:39:58 crc kubenswrapper[4672]: I0930 12:39:58.085684 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7998babe-964f-4ea9-a7a5-c11d5c7a6912-config\") pod \"dnsmasq-dns-85554c85d5-pkzn2\" (UID: \"7998babe-964f-4ea9-a7a5-c11d5c7a6912\") " pod="openstack/dnsmasq-dns-85554c85d5-pkzn2"
Sep 30 12:39:58 crc kubenswrapper[4672]: I0930 12:39:58.085753 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7998babe-964f-4ea9-a7a5-c11d5c7a6912-dns-svc\") pod \"dnsmasq-dns-85554c85d5-pkzn2\" (UID: \"7998babe-964f-4ea9-a7a5-c11d5c7a6912\") " pod="openstack/dnsmasq-dns-85554c85d5-pkzn2"
Sep 30 12:39:58 crc kubenswrapper[4672]: I0930 12:39:58.085778 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z57m4\" (UniqueName: \"kubernetes.io/projected/7998babe-964f-4ea9-a7a5-c11d5c7a6912-kube-api-access-z57m4\") pod \"dnsmasq-dns-85554c85d5-pkzn2\" (UID: \"7998babe-964f-4ea9-a7a5-c11d5c7a6912\") " pod="openstack/dnsmasq-dns-85554c85d5-pkzn2"
Sep 30 12:39:58 crc kubenswrapper[4672]: I0930 12:39:58.085817 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7998babe-964f-4ea9-a7a5-c11d5c7a6912-ovsdbserver-sb\") pod \"dnsmasq-dns-85554c85d5-pkzn2\" (UID: \"7998babe-964f-4ea9-a7a5-c11d5c7a6912\") " pod="openstack/dnsmasq-dns-85554c85d5-pkzn2"
Sep 30 12:39:58 crc kubenswrapper[4672]: I0930 12:39:58.085876 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7998babe-964f-4ea9-a7a5-c11d5c7a6912-ovsdbserver-nb\") pod \"dnsmasq-dns-85554c85d5-pkzn2\" (UID: \"7998babe-964f-4ea9-a7a5-c11d5c7a6912\") " pod="openstack/dnsmasq-dns-85554c85d5-pkzn2"
Sep 30 12:39:58 crc kubenswrapper[4672]: I0930 12:39:58.169334 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-56684fbfb-t69x4"]
Sep 30 12:39:58 crc kubenswrapper[4672]: I0930 12:39:58.171017 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-56684fbfb-t69x4"
Sep 30 12:39:58 crc kubenswrapper[4672]: I0930 12:39:58.190872 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7998babe-964f-4ea9-a7a5-c11d5c7a6912-dns-swift-storage-0\") pod \"dnsmasq-dns-85554c85d5-pkzn2\" (UID: \"7998babe-964f-4ea9-a7a5-c11d5c7a6912\") " pod="openstack/dnsmasq-dns-85554c85d5-pkzn2"
Sep 30 12:39:58 crc kubenswrapper[4672]: I0930 12:39:58.190904 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-xnjsm"
Sep 30 12:39:58 crc kubenswrapper[4672]: I0930 12:39:58.190935 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7998babe-964f-4ea9-a7a5-c11d5c7a6912-config\") pod \"dnsmasq-dns-85554c85d5-pkzn2\" (UID: \"7998babe-964f-4ea9-a7a5-c11d5c7a6912\") " pod="openstack/dnsmasq-dns-85554c85d5-pkzn2"
Sep 30 12:39:58 crc kubenswrapper[4672]: I0930 12:39:58.190996 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7998babe-964f-4ea9-a7a5-c11d5c7a6912-dns-svc\") pod \"dnsmasq-dns-85554c85d5-pkzn2\" (UID: \"7998babe-964f-4ea9-a7a5-c11d5c7a6912\") " pod="openstack/dnsmasq-dns-85554c85d5-pkzn2"
Sep 30 12:39:58 crc kubenswrapper[4672]: I0930 12:39:58.191027 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z57m4\" (UniqueName: \"kubernetes.io/projected/7998babe-964f-4ea9-a7a5-c11d5c7a6912-kube-api-access-z57m4\") pod \"dnsmasq-dns-85554c85d5-pkzn2\" (UID: \"7998babe-964f-4ea9-a7a5-c11d5c7a6912\") " pod="openstack/dnsmasq-dns-85554c85d5-pkzn2"
Sep 30 12:39:58 crc kubenswrapper[4672]: I0930 12:39:58.191056 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7998babe-964f-4ea9-a7a5-c11d5c7a6912-ovsdbserver-sb\") pod \"dnsmasq-dns-85554c85d5-pkzn2\" (UID: \"7998babe-964f-4ea9-a7a5-c11d5c7a6912\") " pod="openstack/dnsmasq-dns-85554c85d5-pkzn2"
Sep 30 12:39:58 crc kubenswrapper[4672]: I0930 12:39:58.191105 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7998babe-964f-4ea9-a7a5-c11d5c7a6912-ovsdbserver-nb\") pod \"dnsmasq-dns-85554c85d5-pkzn2\" (UID: \"7998babe-964f-4ea9-a7a5-c11d5c7a6912\") " pod="openstack/dnsmasq-dns-85554c85d5-pkzn2"
Sep 30 12:39:58 crc kubenswrapper[4672]: I0930 12:39:58.191192 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config"
Sep 30 12:39:58 crc kubenswrapper[4672]: I0930 12:39:58.191358 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config"
Sep 30 12:39:58 crc kubenswrapper[4672]: I0930 12:39:58.191798 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs"
Sep 30 12:39:58 crc kubenswrapper[4672]: I0930 12:39:58.192253 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7998babe-964f-4ea9-a7a5-c11d5c7a6912-ovsdbserver-nb\") pod \"dnsmasq-dns-85554c85d5-pkzn2\" (UID: \"7998babe-964f-4ea9-a7a5-c11d5c7a6912\") " pod="openstack/dnsmasq-dns-85554c85d5-pkzn2"
Sep 30 12:39:58 crc kubenswrapper[4672]: I0930 12:39:58.192999 4672 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7998babe-964f-4ea9-a7a5-c11d5c7a6912-dns-svc\") pod \"dnsmasq-dns-85554c85d5-pkzn2\" (UID: \"7998babe-964f-4ea9-a7a5-c11d5c7a6912\") " pod="openstack/dnsmasq-dns-85554c85d5-pkzn2" Sep 30 12:39:58 crc kubenswrapper[4672]: I0930 12:39:58.194156 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7998babe-964f-4ea9-a7a5-c11d5c7a6912-config\") pod \"dnsmasq-dns-85554c85d5-pkzn2\" (UID: \"7998babe-964f-4ea9-a7a5-c11d5c7a6912\") " pod="openstack/dnsmasq-dns-85554c85d5-pkzn2" Sep 30 12:39:58 crc kubenswrapper[4672]: I0930 12:39:58.198618 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7998babe-964f-4ea9-a7a5-c11d5c7a6912-dns-swift-storage-0\") pod \"dnsmasq-dns-85554c85d5-pkzn2\" (UID: \"7998babe-964f-4ea9-a7a5-c11d5c7a6912\") " pod="openstack/dnsmasq-dns-85554c85d5-pkzn2" Sep 30 12:39:58 crc kubenswrapper[4672]: I0930 12:39:58.202839 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7998babe-964f-4ea9-a7a5-c11d5c7a6912-ovsdbserver-sb\") pod \"dnsmasq-dns-85554c85d5-pkzn2\" (UID: \"7998babe-964f-4ea9-a7a5-c11d5c7a6912\") " pod="openstack/dnsmasq-dns-85554c85d5-pkzn2" Sep 30 12:39:58 crc kubenswrapper[4672]: I0930 12:39:58.215011 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-56684fbfb-t69x4"] Sep 30 12:39:58 crc kubenswrapper[4672]: I0930 12:39:58.280012 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z57m4\" (UniqueName: \"kubernetes.io/projected/7998babe-964f-4ea9-a7a5-c11d5c7a6912-kube-api-access-z57m4\") pod \"dnsmasq-dns-85554c85d5-pkzn2\" (UID: \"7998babe-964f-4ea9-a7a5-c11d5c7a6912\") " pod="openstack/dnsmasq-dns-85554c85d5-pkzn2" Sep 30 12:39:58 crc kubenswrapper[4672]: I0930 12:39:58.294313 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/6d32cb40-920a-4b27-bf27-9362601aabae-config\") pod \"neutron-56684fbfb-t69x4\" (UID: \"6d32cb40-920a-4b27-bf27-9362601aabae\") " pod="openstack/neutron-56684fbfb-t69x4" Sep 30 12:39:58 crc kubenswrapper[4672]: I0930 12:39:58.294367 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/6d32cb40-920a-4b27-bf27-9362601aabae-httpd-config\") pod \"neutron-56684fbfb-t69x4\" (UID: \"6d32cb40-920a-4b27-bf27-9362601aabae\") " pod="openstack/neutron-56684fbfb-t69x4" Sep 30 12:39:58 crc kubenswrapper[4672]: I0930 12:39:58.294429 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rctx4\" (UniqueName: \"kubernetes.io/projected/6d32cb40-920a-4b27-bf27-9362601aabae-kube-api-access-rctx4\") pod \"neutron-56684fbfb-t69x4\" (UID: \"6d32cb40-920a-4b27-bf27-9362601aabae\") " pod="openstack/neutron-56684fbfb-t69x4" Sep 30 12:39:58 crc kubenswrapper[4672]: I0930 12:39:58.294463 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6d32cb40-920a-4b27-bf27-9362601aabae-ovndb-tls-certs\") pod \"neutron-56684fbfb-t69x4\" (UID: \"6d32cb40-920a-4b27-bf27-9362601aabae\") " pod="openstack/neutron-56684fbfb-t69x4" Sep 30 12:39:58 crc 
kubenswrapper[4672]: I0930 12:39:58.294504 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d32cb40-920a-4b27-bf27-9362601aabae-combined-ca-bundle\") pod \"neutron-56684fbfb-t69x4\" (UID: \"6d32cb40-920a-4b27-bf27-9362601aabae\") " pod="openstack/neutron-56684fbfb-t69x4" Sep 30 12:39:58 crc kubenswrapper[4672]: I0930 12:39:58.399256 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d32cb40-920a-4b27-bf27-9362601aabae-combined-ca-bundle\") pod \"neutron-56684fbfb-t69x4\" (UID: \"6d32cb40-920a-4b27-bf27-9362601aabae\") " pod="openstack/neutron-56684fbfb-t69x4" Sep 30 12:39:58 crc kubenswrapper[4672]: I0930 12:39:58.399371 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/6d32cb40-920a-4b27-bf27-9362601aabae-config\") pod \"neutron-56684fbfb-t69x4\" (UID: \"6d32cb40-920a-4b27-bf27-9362601aabae\") " pod="openstack/neutron-56684fbfb-t69x4" Sep 30 12:39:58 crc kubenswrapper[4672]: I0930 12:39:58.399399 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/6d32cb40-920a-4b27-bf27-9362601aabae-httpd-config\") pod \"neutron-56684fbfb-t69x4\" (UID: \"6d32cb40-920a-4b27-bf27-9362601aabae\") " pod="openstack/neutron-56684fbfb-t69x4" Sep 30 12:39:58 crc kubenswrapper[4672]: I0930 12:39:58.399459 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rctx4\" (UniqueName: \"kubernetes.io/projected/6d32cb40-920a-4b27-bf27-9362601aabae-kube-api-access-rctx4\") pod \"neutron-56684fbfb-t69x4\" (UID: \"6d32cb40-920a-4b27-bf27-9362601aabae\") " pod="openstack/neutron-56684fbfb-t69x4" Sep 30 12:39:58 crc kubenswrapper[4672]: I0930 12:39:58.399490 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6d32cb40-920a-4b27-bf27-9362601aabae-ovndb-tls-certs\") pod \"neutron-56684fbfb-t69x4\" (UID: \"6d32cb40-920a-4b27-bf27-9362601aabae\") " pod="openstack/neutron-56684fbfb-t69x4" Sep 30 12:39:58 crc kubenswrapper[4672]: I0930 12:39:58.413213 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/6d32cb40-920a-4b27-bf27-9362601aabae-config\") pod \"neutron-56684fbfb-t69x4\" (UID: \"6d32cb40-920a-4b27-bf27-9362601aabae\") " pod="openstack/neutron-56684fbfb-t69x4" Sep 30 12:39:58 crc kubenswrapper[4672]: I0930 12:39:58.413633 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-85554c85d5-pkzn2" Sep 30 12:39:58 crc kubenswrapper[4672]: I0930 12:39:58.413752 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/6d32cb40-920a-4b27-bf27-9362601aabae-httpd-config\") pod \"neutron-56684fbfb-t69x4\" (UID: \"6d32cb40-920a-4b27-bf27-9362601aabae\") " pod="openstack/neutron-56684fbfb-t69x4" Sep 30 12:39:58 crc kubenswrapper[4672]: I0930 12:39:58.414537 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6d32cb40-920a-4b27-bf27-9362601aabae-ovndb-tls-certs\") pod \"neutron-56684fbfb-t69x4\" (UID: \"6d32cb40-920a-4b27-bf27-9362601aabae\") " pod="openstack/neutron-56684fbfb-t69x4" Sep 30 12:39:58 crc kubenswrapper[4672]: I0930 12:39:58.428340 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d32cb40-920a-4b27-bf27-9362601aabae-combined-ca-bundle\") pod \"neutron-56684fbfb-t69x4\" (UID: \"6d32cb40-920a-4b27-bf27-9362601aabae\") " pod="openstack/neutron-56684fbfb-t69x4" Sep 30 12:39:58 crc kubenswrapper[4672]: I0930 12:39:58.445964 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rctx4\" (UniqueName: \"kubernetes.io/projected/6d32cb40-920a-4b27-bf27-9362601aabae-kube-api-access-rctx4\") pod \"neutron-56684fbfb-t69x4\" (UID: \"6d32cb40-920a-4b27-bf27-9362601aabae\") " pod="openstack/neutron-56684fbfb-t69x4" Sep 30 12:39:58 crc kubenswrapper[4672]: I0930 12:39:58.540484 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-56684fbfb-t69x4" Sep 30 12:39:58 crc kubenswrapper[4672]: I0930 12:39:58.738872 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-llp7f" event={"ID":"9b56cfc5-1c9c-41f5-91ac-b0fbd79c7b6a","Type":"ContainerStarted","Data":"c9b8ac0d2c2944aef03ffe7f96160d4edab5aa59318146e34fd05f492a680c0d"} Sep 30 12:39:58 crc kubenswrapper[4672]: I0930 12:39:58.783467 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-llp7f" podStartSLOduration=15.192623706 podStartE2EDuration="54.783445369s" podCreationTimestamp="2025-09-30 12:39:04 +0000 UTC" firstStartedPulling="2025-09-30 12:39:17.582928689 +0000 UTC m=+1048.852166335" lastFinishedPulling="2025-09-30 12:39:57.173750352 +0000 UTC m=+1088.442987998" observedRunningTime="2025-09-30 12:39:58.771681779 +0000 UTC m=+1090.040919445" watchObservedRunningTime="2025-09-30 12:39:58.783445369 +0000 UTC m=+1090.052683015" Sep 30 12:39:58 crc kubenswrapper[4672]: I0930 12:39:58.784725 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7c888d4d9d-4nr45" event={"ID":"7b67355f-8081-4014-ad68-6e31faa794b1","Type":"ContainerStarted","Data":"1d425c910181010bffd512fb8d3cac177a40082423212451d45be1a42949d274"} Sep 30 12:39:58 crc kubenswrapper[4672]: I0930 12:39:58.785898 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7c888d4d9d-4nr45" Sep 30 12:39:58 crc kubenswrapper[4672]: I0930 12:39:58.785921 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7c888d4d9d-4nr45" Sep 30 12:39:58 crc kubenswrapper[4672]: I0930 12:39:58.859556 4672 generic.go:334] "Generic (PLEG): container finished" podID="a813f7c2-e727-42a0-8779-8619fe6e8165" 
containerID="fe79040cae28acda1b62a728289d5dd65a43f9a989a447ed3b717c6d2b506d93" exitCode=0 Sep 30 12:39:58 crc kubenswrapper[4672]: I0930 12:39:58.859595 4672 generic.go:334] "Generic (PLEG): container finished" podID="a813f7c2-e727-42a0-8779-8619fe6e8165" containerID="aff9da5c297f27e5f62f1e7cceb06e20b320e3340926ac4477a12a95bb4bf9ed" exitCode=2 Sep 30 12:39:58 crc kubenswrapper[4672]: I0930 12:39:58.859678 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a813f7c2-e727-42a0-8779-8619fe6e8165","Type":"ContainerDied","Data":"fe79040cae28acda1b62a728289d5dd65a43f9a989a447ed3b717c6d2b506d93"} Sep 30 12:39:58 crc kubenswrapper[4672]: I0930 12:39:58.859704 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a813f7c2-e727-42a0-8779-8619fe6e8165","Type":"ContainerDied","Data":"aff9da5c297f27e5f62f1e7cceb06e20b320e3340926ac4477a12a95bb4bf9ed"} Sep 30 12:39:58 crc kubenswrapper[4672]: I0930 12:39:58.885159 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-7c888d4d9d-4nr45" podStartSLOduration=7.885134425 podStartE2EDuration="7.885134425s" podCreationTimestamp="2025-09-30 12:39:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 12:39:58.827061612 +0000 UTC m=+1090.096299258" watchObservedRunningTime="2025-09-30 12:39:58.885134425 +0000 UTC m=+1090.154372071" Sep 30 12:39:58 crc kubenswrapper[4672]: I0930 12:39:58.923728 4672 generic.go:334] "Generic (PLEG): container finished" podID="984ee7c7-9f78-45ea-876e-82c967e7c4fc" containerID="8693125bc289223634c777a33aeafbcd9eb9a16affbcccfb8d0b7915623cce33" exitCode=0 Sep 30 12:39:58 crc kubenswrapper[4672]: I0930 12:39:58.924824 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fc454b69-wncsb" event={"ID":"984ee7c7-9f78-45ea-876e-82c967e7c4fc","Type":"ContainerDied","Data":"8693125bc289223634c777a33aeafbcd9eb9a16affbcccfb8d0b7915623cce33"} Sep 30 12:39:58 crc kubenswrapper[4672]: I0930 12:39:58.973661 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7fc454b69-wncsb" Sep 30 12:39:59 crc kubenswrapper[4672]: I0930 12:39:59.072027 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/984ee7c7-9f78-45ea-876e-82c967e7c4fc-config\") pod \"984ee7c7-9f78-45ea-876e-82c967e7c4fc\" (UID: \"984ee7c7-9f78-45ea-876e-82c967e7c4fc\") " Sep 30 12:39:59 crc kubenswrapper[4672]: I0930 12:39:59.072073 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/984ee7c7-9f78-45ea-876e-82c967e7c4fc-ovsdbserver-sb\") pod \"984ee7c7-9f78-45ea-876e-82c967e7c4fc\" (UID: \"984ee7c7-9f78-45ea-876e-82c967e7c4fc\") " Sep 30 12:39:59 crc kubenswrapper[4672]: I0930 12:39:59.072135 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/984ee7c7-9f78-45ea-876e-82c967e7c4fc-dns-swift-storage-0\") pod \"984ee7c7-9f78-45ea-876e-82c967e7c4fc\" (UID: \"984ee7c7-9f78-45ea-876e-82c967e7c4fc\") " Sep 30 12:39:59 crc kubenswrapper[4672]: I0930 12:39:59.072298 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8mpj4\" (UniqueName: \"kubernetes.io/projected/984ee7c7-9f78-45ea-876e-82c967e7c4fc-kube-api-access-8mpj4\") pod \"984ee7c7-9f78-45ea-876e-82c967e7c4fc\" (UID: \"984ee7c7-9f78-45ea-876e-82c967e7c4fc\") " Sep 30 12:39:59 crc kubenswrapper[4672]: I0930 12:39:59.072340 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/984ee7c7-9f78-45ea-876e-82c967e7c4fc-ovsdbserver-nb\") pod \"984ee7c7-9f78-45ea-876e-82c967e7c4fc\" (UID: \"984ee7c7-9f78-45ea-876e-82c967e7c4fc\") " Sep 30 12:39:59 crc kubenswrapper[4672]: I0930 12:39:59.072430 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/984ee7c7-9f78-45ea-876e-82c967e7c4fc-dns-svc\") pod \"984ee7c7-9f78-45ea-876e-82c967e7c4fc\" (UID: \"984ee7c7-9f78-45ea-876e-82c967e7c4fc\") " Sep 30 12:39:59 crc kubenswrapper[4672]: I0930 12:39:59.094307 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/984ee7c7-9f78-45ea-876e-82c967e7c4fc-kube-api-access-8mpj4" (OuterVolumeSpecName: "kube-api-access-8mpj4") pod "984ee7c7-9f78-45ea-876e-82c967e7c4fc" (UID: "984ee7c7-9f78-45ea-876e-82c967e7c4fc"). InnerVolumeSpecName "kube-api-access-8mpj4". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 12:39:59 crc kubenswrapper[4672]: I0930 12:39:59.174771 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8mpj4\" (UniqueName: \"kubernetes.io/projected/984ee7c7-9f78-45ea-876e-82c967e7c4fc-kube-api-access-8mpj4\") on node \"crc\" DevicePath \"\"" Sep 30 12:39:59 crc kubenswrapper[4672]: I0930 12:39:59.209255 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/984ee7c7-9f78-45ea-876e-82c967e7c4fc-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "984ee7c7-9f78-45ea-876e-82c967e7c4fc" (UID: "984ee7c7-9f78-45ea-876e-82c967e7c4fc"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 12:39:59 crc kubenswrapper[4672]: I0930 12:39:59.222647 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/984ee7c7-9f78-45ea-876e-82c967e7c4fc-config" (OuterVolumeSpecName: "config") pod "984ee7c7-9f78-45ea-876e-82c967e7c4fc" (UID: "984ee7c7-9f78-45ea-876e-82c967e7c4fc"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 12:39:59 crc kubenswrapper[4672]: I0930 12:39:59.245387 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/984ee7c7-9f78-45ea-876e-82c967e7c4fc-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "984ee7c7-9f78-45ea-876e-82c967e7c4fc" (UID: "984ee7c7-9f78-45ea-876e-82c967e7c4fc"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 12:39:59 crc kubenswrapper[4672]: I0930 12:39:59.258065 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-85554c85d5-pkzn2"] Sep 30 12:39:59 crc kubenswrapper[4672]: I0930 12:39:59.262347 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/984ee7c7-9f78-45ea-876e-82c967e7c4fc-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "984ee7c7-9f78-45ea-876e-82c967e7c4fc" (UID: "984ee7c7-9f78-45ea-876e-82c967e7c4fc"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 12:39:59 crc kubenswrapper[4672]: I0930 12:39:59.264318 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/984ee7c7-9f78-45ea-876e-82c967e7c4fc-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "984ee7c7-9f78-45ea-876e-82c967e7c4fc" (UID: "984ee7c7-9f78-45ea-876e-82c967e7c4fc"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 12:39:59 crc kubenswrapper[4672]: I0930 12:39:59.276570 4672 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/984ee7c7-9f78-45ea-876e-82c967e7c4fc-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Sep 30 12:39:59 crc kubenswrapper[4672]: I0930 12:39:59.276609 4672 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/984ee7c7-9f78-45ea-876e-82c967e7c4fc-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 30 12:39:59 crc kubenswrapper[4672]: I0930 12:39:59.276618 4672 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/984ee7c7-9f78-45ea-876e-82c967e7c4fc-config\") on node \"crc\" DevicePath \"\"" Sep 30 12:39:59 crc kubenswrapper[4672]: I0930 12:39:59.276627 4672 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/984ee7c7-9f78-45ea-876e-82c967e7c4fc-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Sep 30 12:39:59 crc kubenswrapper[4672]: I0930 12:39:59.276639 4672 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/984ee7c7-9f78-45ea-876e-82c967e7c4fc-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Sep 30 12:39:59 crc kubenswrapper[4672]: I0930 12:39:59.508934 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-56684fbfb-t69x4"] Sep 30 12:39:59 crc kubenswrapper[4672]: W0930 12:39:59.542986 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6d32cb40_920a_4b27_bf27_9362601aabae.slice/crio-9b938eb24d9bb4f2ed2f5af361cd3c2bab6517eec7201fa60470285c1ac7126a WatchSource:0}: Error finding container 9b938eb24d9bb4f2ed2f5af361cd3c2bab6517eec7201fa60470285c1ac7126a: Status 404 returned error can't find the container with id 9b938eb24d9bb4f2ed2f5af361cd3c2bab6517eec7201fa60470285c1ac7126a Sep 30 12:39:59 crc kubenswrapper[4672]: I0930 12:39:59.943660 4672 generic.go:334] "Generic (PLEG): container finished" podID="7998babe-964f-4ea9-a7a5-c11d5c7a6912" containerID="3dbd5f5ea1fe4f36a583bd46e4d9a3f1216ea4da2366eb7b963f5da37a0def87" exitCode=0 Sep 30 12:39:59 crc kubenswrapper[4672]: I0930 12:39:59.943737 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85554c85d5-pkzn2" event={"ID":"7998babe-964f-4ea9-a7a5-c11d5c7a6912","Type":"ContainerDied","Data":"3dbd5f5ea1fe4f36a583bd46e4d9a3f1216ea4da2366eb7b963f5da37a0def87"} Sep 30 12:39:59 crc kubenswrapper[4672]: I0930 12:39:59.944136 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85554c85d5-pkzn2" event={"ID":"7998babe-964f-4ea9-a7a5-c11d5c7a6912","Type":"ContainerStarted","Data":"4d5d624a895f9e3589f21d22931a44a6d88d30b8020ea734e35c87d88afcd311"} Sep 30 12:39:59 crc kubenswrapper[4672]: I0930 12:39:59.955345 4672 generic.go:334] "Generic (PLEG): container finished" podID="a813f7c2-e727-42a0-8779-8619fe6e8165" containerID="af6331668bdba254199db134ae8885d681bf81141ac6733dbfa4d255be8e9c0e" exitCode=0 Sep 30 12:39:59 crc kubenswrapper[4672]: I0930 12:39:59.955422 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a813f7c2-e727-42a0-8779-8619fe6e8165","Type":"ContainerDied","Data":"af6331668bdba254199db134ae8885d681bf81141ac6733dbfa4d255be8e9c0e"} Sep 30 12:39:59 crc kubenswrapper[4672]: 
I0930 12:39:59.964645 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-56684fbfb-t69x4" event={"ID":"6d32cb40-920a-4b27-bf27-9362601aabae","Type":"ContainerStarted","Data":"9b938eb24d9bb4f2ed2f5af361cd3c2bab6517eec7201fa60470285c1ac7126a"} Sep 30 12:39:59 crc kubenswrapper[4672]: I0930 12:39:59.983129 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fc454b69-wncsb" event={"ID":"984ee7c7-9f78-45ea-876e-82c967e7c4fc","Type":"ContainerDied","Data":"40c1f7df4db717bc0d14d8a9378d6da9d0c8039cca40c36dade8a689226d98a3"} Sep 30 12:39:59 crc kubenswrapper[4672]: I0930 12:39:59.983189 4672 scope.go:117] "RemoveContainer" containerID="8693125bc289223634c777a33aeafbcd9eb9a16affbcccfb8d0b7915623cce33" Sep 30 12:39:59 crc kubenswrapper[4672]: I0930 12:39:59.983215 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fc454b69-wncsb" Sep 30 12:40:00 crc kubenswrapper[4672]: I0930 12:40:00.037356 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7fc454b69-wncsb"] Sep 30 12:40:00 crc kubenswrapper[4672]: I0930 12:40:00.071964 4672 scope.go:117] "RemoveContainer" containerID="5ca1a950e67c2b92179249f2326b865b66d17b8817b63536e97df2ac1c3f9297" Sep 30 12:40:00 crc kubenswrapper[4672]: I0930 12:40:00.080128 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7fc454b69-wncsb"] Sep 30 12:40:00 crc kubenswrapper[4672]: I0930 12:40:00.965708 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-6b7c4888f-v7vn2"] Sep 30 12:40:00 crc kubenswrapper[4672]: E0930 12:40:00.967071 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="984ee7c7-9f78-45ea-876e-82c967e7c4fc" containerName="dnsmasq-dns" Sep 30 12:40:00 crc kubenswrapper[4672]: I0930 12:40:00.967097 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="984ee7c7-9f78-45ea-876e-82c967e7c4fc" containerName="dnsmasq-dns" Sep 30 12:40:00 crc kubenswrapper[4672]: E0930 12:40:00.967174 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="984ee7c7-9f78-45ea-876e-82c967e7c4fc" containerName="init" Sep 30 12:40:00 crc kubenswrapper[4672]: I0930 12:40:00.967185 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="984ee7c7-9f78-45ea-876e-82c967e7c4fc" containerName="init" Sep 30 12:40:00 crc kubenswrapper[4672]: I0930 12:40:00.967483 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="984ee7c7-9f78-45ea-876e-82c967e7c4fc" containerName="dnsmasq-dns" Sep 30 12:40:00 crc kubenswrapper[4672]: I0930 12:40:00.969085 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-6b7c4888f-v7vn2" Sep 30 12:40:00 crc kubenswrapper[4672]: I0930 12:40:00.977111 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Sep 30 12:40:00 crc kubenswrapper[4672]: I0930 12:40:00.985095 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Sep 30 12:40:00 crc kubenswrapper[4672]: I0930 12:40:00.987641 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6b7c4888f-v7vn2"] Sep 30 12:40:00 crc kubenswrapper[4672]: I0930 12:40:00.990898 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-56684fbfb-t69x4" event={"ID":"6d32cb40-920a-4b27-bf27-9362601aabae","Type":"ContainerStarted","Data":"15fdddb0b77d09b83380a6be6077d802c2a049b3ac07e92a42275e8023b517ab"} Sep 30 12:40:01 crc kubenswrapper[4672]: I0930 12:40:01.123763 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f394ad91-f6fb-4d7a-8508-d8fede494686-public-tls-certs\") pod \"neutron-6b7c4888f-v7vn2\" (UID: \"f394ad91-f6fb-4d7a-8508-d8fede494686\") " pod="openstack/neutron-6b7c4888f-v7vn2" Sep 30 12:40:01 crc kubenswrapper[4672]: I0930 12:40:01.124059 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7nflp\" (UniqueName: \"kubernetes.io/projected/f394ad91-f6fb-4d7a-8508-d8fede494686-kube-api-access-7nflp\") pod \"neutron-6b7c4888f-v7vn2\" (UID: \"f394ad91-f6fb-4d7a-8508-d8fede494686\") " pod="openstack/neutron-6b7c4888f-v7vn2" Sep 30 12:40:01 crc kubenswrapper[4672]: I0930 12:40:01.125057 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f394ad91-f6fb-4d7a-8508-d8fede494686-internal-tls-certs\") pod \"neutron-6b7c4888f-v7vn2\" (UID: \"f394ad91-f6fb-4d7a-8508-d8fede494686\") " pod="openstack/neutron-6b7c4888f-v7vn2" Sep 30 12:40:01 crc kubenswrapper[4672]: I0930 12:40:01.125123 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/f394ad91-f6fb-4d7a-8508-d8fede494686-httpd-config\") pod \"neutron-6b7c4888f-v7vn2\" (UID: \"f394ad91-f6fb-4d7a-8508-d8fede494686\") " pod="openstack/neutron-6b7c4888f-v7vn2" Sep 30 12:40:01 crc kubenswrapper[4672]: I0930 12:40:01.125181 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f394ad91-f6fb-4d7a-8508-d8fede494686-combined-ca-bundle\") pod \"neutron-6b7c4888f-v7vn2\" (UID: \"f394ad91-f6fb-4d7a-8508-d8fede494686\") " pod="openstack/neutron-6b7c4888f-v7vn2" Sep 30 12:40:01 crc kubenswrapper[4672]: I0930 12:40:01.125248 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f394ad91-f6fb-4d7a-8508-d8fede494686-config\") pod \"neutron-6b7c4888f-v7vn2\" (UID: \"f394ad91-f6fb-4d7a-8508-d8fede494686\") " pod="openstack/neutron-6b7c4888f-v7vn2" Sep 30 12:40:01 crc kubenswrapper[4672]: I0930 12:40:01.125352 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f394ad91-f6fb-4d7a-8508-d8fede494686-ovndb-tls-certs\") pod 
\"neutron-6b7c4888f-v7vn2\" (UID: \"f394ad91-f6fb-4d7a-8508-d8fede494686\") " pod="openstack/neutron-6b7c4888f-v7vn2" Sep 30 12:40:01 crc kubenswrapper[4672]: I0930 12:40:01.139138 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-8679776b6d-5ltf9" Sep 30 12:40:01 crc kubenswrapper[4672]: I0930 12:40:01.227204 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/f394ad91-f6fb-4d7a-8508-d8fede494686-httpd-config\") pod \"neutron-6b7c4888f-v7vn2\" (UID: \"f394ad91-f6fb-4d7a-8508-d8fede494686\") " pod="openstack/neutron-6b7c4888f-v7vn2" Sep 30 12:40:01 crc kubenswrapper[4672]: I0930 12:40:01.227556 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f394ad91-f6fb-4d7a-8508-d8fede494686-combined-ca-bundle\") pod \"neutron-6b7c4888f-v7vn2\" (UID: \"f394ad91-f6fb-4d7a-8508-d8fede494686\") " pod="openstack/neutron-6b7c4888f-v7vn2" Sep 30 12:40:01 crc kubenswrapper[4672]: I0930 12:40:01.227673 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f394ad91-f6fb-4d7a-8508-d8fede494686-config\") pod \"neutron-6b7c4888f-v7vn2\" (UID: \"f394ad91-f6fb-4d7a-8508-d8fede494686\") " pod="openstack/neutron-6b7c4888f-v7vn2" Sep 30 12:40:01 crc kubenswrapper[4672]: I0930 12:40:01.227806 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f394ad91-f6fb-4d7a-8508-d8fede494686-ovndb-tls-certs\") pod \"neutron-6b7c4888f-v7vn2\" (UID: \"f394ad91-f6fb-4d7a-8508-d8fede494686\") " pod="openstack/neutron-6b7c4888f-v7vn2" Sep 30 12:40:01 crc kubenswrapper[4672]: I0930 12:40:01.227912 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f394ad91-f6fb-4d7a-8508-d8fede494686-public-tls-certs\") pod \"neutron-6b7c4888f-v7vn2\" (UID: \"f394ad91-f6fb-4d7a-8508-d8fede494686\") " pod="openstack/neutron-6b7c4888f-v7vn2" Sep 30 12:40:01 crc kubenswrapper[4672]: I0930 12:40:01.227994 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7nflp\" (UniqueName: \"kubernetes.io/projected/f394ad91-f6fb-4d7a-8508-d8fede494686-kube-api-access-7nflp\") pod \"neutron-6b7c4888f-v7vn2\" (UID: \"f394ad91-f6fb-4d7a-8508-d8fede494686\") " pod="openstack/neutron-6b7c4888f-v7vn2" Sep 30 12:40:01 crc kubenswrapper[4672]: I0930 12:40:01.228074 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f394ad91-f6fb-4d7a-8508-d8fede494686-internal-tls-certs\") pod \"neutron-6b7c4888f-v7vn2\" (UID: \"f394ad91-f6fb-4d7a-8508-d8fede494686\") " pod="openstack/neutron-6b7c4888f-v7vn2" Sep 30 12:40:01 crc kubenswrapper[4672]: I0930 12:40:01.234906 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f394ad91-f6fb-4d7a-8508-d8fede494686-internal-tls-certs\") pod \"neutron-6b7c4888f-v7vn2\" (UID: \"f394ad91-f6fb-4d7a-8508-d8fede494686\") " pod="openstack/neutron-6b7c4888f-v7vn2" Sep 30 12:40:01 crc kubenswrapper[4672]: I0930 12:40:01.237123 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/f394ad91-f6fb-4d7a-8508-d8fede494686-ovndb-tls-certs\") pod \"neutron-6b7c4888f-v7vn2\" (UID: \"f394ad91-f6fb-4d7a-8508-d8fede494686\") " pod="openstack/neutron-6b7c4888f-v7vn2" Sep 30 12:40:01 crc kubenswrapper[4672]: I0930 12:40:01.238896 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f394ad91-f6fb-4d7a-8508-d8fede494686-combined-ca-bundle\") pod \"neutron-6b7c4888f-v7vn2\" (UID: \"f394ad91-f6fb-4d7a-8508-d8fede494686\") " pod="openstack/neutron-6b7c4888f-v7vn2" Sep 30 12:40:01 crc kubenswrapper[4672]: I0930 12:40:01.242856 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/f394ad91-f6fb-4d7a-8508-d8fede494686-httpd-config\") pod \"neutron-6b7c4888f-v7vn2\" (UID: \"f394ad91-f6fb-4d7a-8508-d8fede494686\") " pod="openstack/neutron-6b7c4888f-v7vn2" Sep 30 12:40:01 crc kubenswrapper[4672]: I0930 12:40:01.243426 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/f394ad91-f6fb-4d7a-8508-d8fede494686-config\") pod \"neutron-6b7c4888f-v7vn2\" (UID: \"f394ad91-f6fb-4d7a-8508-d8fede494686\") " pod="openstack/neutron-6b7c4888f-v7vn2" Sep 30 12:40:01 crc kubenswrapper[4672]: I0930 12:40:01.254974 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7nflp\" (UniqueName: \"kubernetes.io/projected/f394ad91-f6fb-4d7a-8508-d8fede494686-kube-api-access-7nflp\") pod \"neutron-6b7c4888f-v7vn2\" (UID: \"f394ad91-f6fb-4d7a-8508-d8fede494686\") " pod="openstack/neutron-6b7c4888f-v7vn2" Sep 30 12:40:01 crc kubenswrapper[4672]: I0930 12:40:01.258377 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f394ad91-f6fb-4d7a-8508-d8fede494686-public-tls-certs\") pod \"neutron-6b7c4888f-v7vn2\" (UID: \"f394ad91-f6fb-4d7a-8508-d8fede494686\") " pod="openstack/neutron-6b7c4888f-v7vn2" Sep 30 12:40:01 crc kubenswrapper[4672]: I0930 12:40:01.285994 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-6b7c4888f-v7vn2" Sep 30 12:40:01 crc kubenswrapper[4672]: I0930 12:40:01.292648 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-8679776b6d-5ltf9" Sep 30 12:40:01 crc kubenswrapper[4672]: I0930 12:40:01.434465 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="984ee7c7-9f78-45ea-876e-82c967e7c4fc" path="/var/lib/kubelet/pods/984ee7c7-9f78-45ea-876e-82c967e7c4fc/volumes" Sep 30 12:40:02 crc kubenswrapper[4672]: I0930 12:40:02.011728 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6b7c4888f-v7vn2"] Sep 30 12:40:02 crc kubenswrapper[4672]: I0930 12:40:02.024569 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85554c85d5-pkzn2" event={"ID":"7998babe-964f-4ea9-a7a5-c11d5c7a6912","Type":"ContainerStarted","Data":"4f5ec6fc0656c183408c9bf1d16178763050e93f2d4f808f42477bdef2ddea32"} Sep 30 12:40:02 crc kubenswrapper[4672]: I0930 12:40:02.026399 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-85554c85d5-pkzn2" Sep 30 12:40:02 crc kubenswrapper[4672]: I0930 12:40:02.036666 4672 generic.go:334] "Generic (PLEG): container finished" podID="a813f7c2-e727-42a0-8779-8619fe6e8165" containerID="8d223d3a6e9db292e6fb7098fd6f146e3fc9591954560d7a29c4a778db39c109" exitCode=0 Sep 30 12:40:02 crc kubenswrapper[4672]: I0930 12:40:02.036734 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a813f7c2-e727-42a0-8779-8619fe6e8165","Type":"ContainerDied","Data":"8d223d3a6e9db292e6fb7098fd6f146e3fc9591954560d7a29c4a778db39c109"} Sep 30 12:40:02 crc kubenswrapper[4672]: W0930 12:40:02.050120 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf394ad91_f6fb_4d7a_8508_d8fede494686.slice/crio-bed41c6d88011f80ab3e7d4db4556c94c616518beb24251966a758be4cd6aa4f WatchSource:0}: Error finding container bed41c6d88011f80ab3e7d4db4556c94c616518beb24251966a758be4cd6aa4f: Status 404 returned error can't find the container with id bed41c6d88011f80ab3e7d4db4556c94c616518beb24251966a758be4cd6aa4f Sep 30 12:40:02 crc kubenswrapper[4672]: I0930 12:40:02.066193 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-85554c85d5-pkzn2" podStartSLOduration=5.06617449 podStartE2EDuration="5.06617449s" podCreationTimestamp="2025-09-30 12:39:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 12:40:02.058161485 +0000 UTC m=+1093.327399131" watchObservedRunningTime="2025-09-30 12:40:02.06617449 +0000 UTC m=+1093.335412136" Sep 30 12:40:02 crc kubenswrapper[4672]: I0930 12:40:02.282712 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Sep 30 12:40:02 crc kubenswrapper[4672]: I0930 12:40:02.366248 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a813f7c2-e727-42a0-8779-8619fe6e8165-sg-core-conf-yaml\") pod \"a813f7c2-e727-42a0-8779-8619fe6e8165\" (UID: \"a813f7c2-e727-42a0-8779-8619fe6e8165\") " Sep 30 12:40:02 crc kubenswrapper[4672]: I0930 12:40:02.366409 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a813f7c2-e727-42a0-8779-8619fe6e8165-config-data\") pod \"a813f7c2-e727-42a0-8779-8619fe6e8165\" (UID: \"a813f7c2-e727-42a0-8779-8619fe6e8165\") " Sep 30 12:40:02 crc kubenswrapper[4672]: I0930 12:40:02.366480 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a813f7c2-e727-42a0-8779-8619fe6e8165-run-httpd\") pod \"a813f7c2-e727-42a0-8779-8619fe6e8165\" (UID: \"a813f7c2-e727-42a0-8779-8619fe6e8165\") " Sep 30 12:40:02 crc kubenswrapper[4672]: I0930 12:40:02.366520 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2r2qw\" (UniqueName: \"kubernetes.io/projected/a813f7c2-e727-42a0-8779-8619fe6e8165-kube-api-access-2r2qw\") pod \"a813f7c2-e727-42a0-8779-8619fe6e8165\" (UID: \"a813f7c2-e727-42a0-8779-8619fe6e8165\") " Sep 30 12:40:02 crc kubenswrapper[4672]: I0930 12:40:02.366547 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a813f7c2-e727-42a0-8779-8619fe6e8165-log-httpd\") pod \"a813f7c2-e727-42a0-8779-8619fe6e8165\" (UID: \"a813f7c2-e727-42a0-8779-8619fe6e8165\") " Sep 30 12:40:02 crc kubenswrapper[4672]: I0930 12:40:02.366573 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a813f7c2-e727-42a0-8779-8619fe6e8165-scripts\") pod \"a813f7c2-e727-42a0-8779-8619fe6e8165\" (UID: \"a813f7c2-e727-42a0-8779-8619fe6e8165\") " Sep 30 12:40:02 crc kubenswrapper[4672]: I0930 12:40:02.366623 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a813f7c2-e727-42a0-8779-8619fe6e8165-combined-ca-bundle\") pod \"a813f7c2-e727-42a0-8779-8619fe6e8165\" (UID: \"a813f7c2-e727-42a0-8779-8619fe6e8165\") " Sep 30 12:40:02 crc kubenswrapper[4672]: I0930 12:40:02.373663 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a813f7c2-e727-42a0-8779-8619fe6e8165-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "a813f7c2-e727-42a0-8779-8619fe6e8165" (UID: "a813f7c2-e727-42a0-8779-8619fe6e8165"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 12:40:02 crc kubenswrapper[4672]: I0930 12:40:02.380519 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a813f7c2-e727-42a0-8779-8619fe6e8165-scripts" (OuterVolumeSpecName: "scripts") pod "a813f7c2-e727-42a0-8779-8619fe6e8165" (UID: "a813f7c2-e727-42a0-8779-8619fe6e8165"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:40:02 crc kubenswrapper[4672]: I0930 12:40:02.380525 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a813f7c2-e727-42a0-8779-8619fe6e8165-kube-api-access-2r2qw" (OuterVolumeSpecName: "kube-api-access-2r2qw") pod "a813f7c2-e727-42a0-8779-8619fe6e8165" (UID: "a813f7c2-e727-42a0-8779-8619fe6e8165"). InnerVolumeSpecName "kube-api-access-2r2qw". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 12:40:02 crc kubenswrapper[4672]: I0930 12:40:02.387381 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a813f7c2-e727-42a0-8779-8619fe6e8165-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "a813f7c2-e727-42a0-8779-8619fe6e8165" (UID: "a813f7c2-e727-42a0-8779-8619fe6e8165"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 12:40:02 crc kubenswrapper[4672]: I0930 12:40:02.405388 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a813f7c2-e727-42a0-8779-8619fe6e8165-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "a813f7c2-e727-42a0-8779-8619fe6e8165" (UID: "a813f7c2-e727-42a0-8779-8619fe6e8165"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:40:02 crc kubenswrapper[4672]: I0930 12:40:02.455475 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a813f7c2-e727-42a0-8779-8619fe6e8165-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a813f7c2-e727-42a0-8779-8619fe6e8165" (UID: "a813f7c2-e727-42a0-8779-8619fe6e8165"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:40:02 crc kubenswrapper[4672]: I0930 12:40:02.464910 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-844b6c9474-6tpzt" Sep 30 12:40:02 crc kubenswrapper[4672]: I0930 12:40:02.469258 4672 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a813f7c2-e727-42a0-8779-8619fe6e8165-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Sep 30 12:40:02 crc kubenswrapper[4672]: I0930 12:40:02.472533 4672 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a813f7c2-e727-42a0-8779-8619fe6e8165-run-httpd\") on node \"crc\" DevicePath \"\"" Sep 30 12:40:02 crc kubenswrapper[4672]: I0930 12:40:02.472570 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2r2qw\" (UniqueName: \"kubernetes.io/projected/a813f7c2-e727-42a0-8779-8619fe6e8165-kube-api-access-2r2qw\") on node \"crc\" DevicePath \"\"" Sep 30 12:40:02 crc kubenswrapper[4672]: I0930 12:40:02.472583 4672 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a813f7c2-e727-42a0-8779-8619fe6e8165-log-httpd\") on node \"crc\" DevicePath \"\"" Sep 30 12:40:02 crc kubenswrapper[4672]: I0930 12:40:02.472618 4672 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a813f7c2-e727-42a0-8779-8619fe6e8165-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 12:40:02 crc kubenswrapper[4672]: I0930 12:40:02.472632 4672 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a813f7c2-e727-42a0-8779-8619fe6e8165-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 12:40:02 crc kubenswrapper[4672]: I0930 12:40:02.505623 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a813f7c2-e727-42a0-8779-8619fe6e8165-config-data" (OuterVolumeSpecName: "config-data") pod "a813f7c2-e727-42a0-8779-8619fe6e8165" (UID: "a813f7c2-e727-42a0-8779-8619fe6e8165"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:40:02 crc kubenswrapper[4672]: I0930 12:40:02.574458 4672 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a813f7c2-e727-42a0-8779-8619fe6e8165-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 12:40:03 crc kubenswrapper[4672]: I0930 12:40:03.048677 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Sep 30 12:40:03 crc kubenswrapper[4672]: I0930 12:40:03.048666 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a813f7c2-e727-42a0-8779-8619fe6e8165","Type":"ContainerDied","Data":"3e73427b7aa04fefba3f1788211511545670e1e6c8065a95bfc261c48926476f"} Sep 30 12:40:03 crc kubenswrapper[4672]: I0930 12:40:03.048833 4672 scope.go:117] "RemoveContainer" containerID="fe79040cae28acda1b62a728289d5dd65a43f9a989a447ed3b717c6d2b506d93" Sep 30 12:40:03 crc kubenswrapper[4672]: I0930 12:40:03.053364 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-56684fbfb-t69x4" event={"ID":"6d32cb40-920a-4b27-bf27-9362601aabae","Type":"ContainerStarted","Data":"eedcd0739a4165cbc9d83126ec509cfa0997545c55c7999884f73e45d31d62e0"} Sep 30 12:40:03 crc kubenswrapper[4672]: I0930 12:40:03.054427 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-56684fbfb-t69x4" Sep 30 12:40:03 crc kubenswrapper[4672]: I0930 12:40:03.080298 4672 scope.go:117] "RemoveContainer" containerID="aff9da5c297f27e5f62f1e7cceb06e20b320e3340926ac4477a12a95bb4bf9ed" Sep 30 12:40:03 crc kubenswrapper[4672]: I0930 12:40:03.084594 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6b7c4888f-v7vn2" event={"ID":"f394ad91-f6fb-4d7a-8508-d8fede494686","Type":"ContainerStarted","Data":"f32b3f0fa7bb1a89bd3651c3c2cd01f156635b6c1250c5c500633451de975cc2"} Sep 30 12:40:03 crc kubenswrapper[4672]: I0930 12:40:03.084635 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6b7c4888f-v7vn2" event={"ID":"f394ad91-f6fb-4d7a-8508-d8fede494686","Type":"ContainerStarted","Data":"22be049df510f8695a56ea7438e11c205245703fe27814783f95205714459af0"} Sep 30 12:40:03 crc kubenswrapper[4672]: I0930 12:40:03.084648 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6b7c4888f-v7vn2" event={"ID":"f394ad91-f6fb-4d7a-8508-d8fede494686","Type":"ContainerStarted","Data":"bed41c6d88011f80ab3e7d4db4556c94c616518beb24251966a758be4cd6aa4f"} Sep 30 12:40:03 crc kubenswrapper[4672]: I0930 12:40:03.085479 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-6b7c4888f-v7vn2" Sep 30 12:40:03 crc kubenswrapper[4672]: I0930 12:40:03.106631 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-56684fbfb-t69x4" podStartSLOduration=5.106607623 podStartE2EDuration="5.106607623s" podCreationTimestamp="2025-09-30 12:39:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 12:40:03.099149973 +0000 UTC m=+1094.368387619" watchObservedRunningTime="2025-09-30 12:40:03.106607623 +0000 UTC m=+1094.375845269" Sep 30 12:40:03 crc kubenswrapper[4672]: I0930 12:40:03.116392 4672 scope.go:117] "RemoveContainer" containerID="8d223d3a6e9db292e6fb7098fd6f146e3fc9591954560d7a29c4a778db39c109" Sep 30 12:40:03 crc kubenswrapper[4672]: I0930 12:40:03.128581 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Sep 30 12:40:03 crc kubenswrapper[4672]: I0930 12:40:03.134552 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Sep 30 12:40:03 crc kubenswrapper[4672]: I0930 12:40:03.156802 4672 scope.go:117] "RemoveContainer" containerID="af6331668bdba254199db134ae8885d681bf81141ac6733dbfa4d255be8e9c0e" Sep 30 12:40:03 crc kubenswrapper[4672]: 
I0930 12:40:03.160329 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Sep 30 12:40:03 crc kubenswrapper[4672]: E0930 12:40:03.160796 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a813f7c2-e727-42a0-8779-8619fe6e8165" containerName="ceilometer-notification-agent" Sep 30 12:40:03 crc kubenswrapper[4672]: I0930 12:40:03.160822 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="a813f7c2-e727-42a0-8779-8619fe6e8165" containerName="ceilometer-notification-agent" Sep 30 12:40:03 crc kubenswrapper[4672]: E0930 12:40:03.160842 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a813f7c2-e727-42a0-8779-8619fe6e8165" containerName="ceilometer-central-agent" Sep 30 12:40:03 crc kubenswrapper[4672]: I0930 12:40:03.160851 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="a813f7c2-e727-42a0-8779-8619fe6e8165" containerName="ceilometer-central-agent" Sep 30 12:40:03 crc kubenswrapper[4672]: E0930 12:40:03.160890 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a813f7c2-e727-42a0-8779-8619fe6e8165" containerName="sg-core" Sep 30 12:40:03 crc kubenswrapper[4672]: I0930 12:40:03.160901 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="a813f7c2-e727-42a0-8779-8619fe6e8165" containerName="sg-core" Sep 30 12:40:03 crc kubenswrapper[4672]: E0930 12:40:03.160916 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a813f7c2-e727-42a0-8779-8619fe6e8165" containerName="proxy-httpd" Sep 30 12:40:03 crc kubenswrapper[4672]: I0930 12:40:03.160924 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="a813f7c2-e727-42a0-8779-8619fe6e8165" containerName="proxy-httpd" Sep 30 12:40:03 crc kubenswrapper[4672]: I0930 12:40:03.161106 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="a813f7c2-e727-42a0-8779-8619fe6e8165" containerName="proxy-httpd" Sep 30 12:40:03 crc kubenswrapper[4672]: I0930 12:40:03.161119 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="a813f7c2-e727-42a0-8779-8619fe6e8165" containerName="ceilometer-notification-agent" Sep 30 12:40:03 crc kubenswrapper[4672]: I0930 12:40:03.161138 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="a813f7c2-e727-42a0-8779-8619fe6e8165" containerName="sg-core" Sep 30 12:40:03 crc kubenswrapper[4672]: I0930 12:40:03.161157 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="a813f7c2-e727-42a0-8779-8619fe6e8165" containerName="ceilometer-central-agent" Sep 30 12:40:03 crc kubenswrapper[4672]: I0930 12:40:03.163091 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Sep 30 12:40:03 crc kubenswrapper[4672]: I0930 12:40:03.164235 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-6b7c4888f-v7vn2" podStartSLOduration=3.164208794 podStartE2EDuration="3.164208794s" podCreationTimestamp="2025-09-30 12:40:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 12:40:03.142682134 +0000 UTC m=+1094.411919780" watchObservedRunningTime="2025-09-30 12:40:03.164208794 +0000 UTC m=+1094.433446440" Sep 30 12:40:03 crc kubenswrapper[4672]: I0930 12:40:03.166069 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Sep 30 12:40:03 crc kubenswrapper[4672]: I0930 12:40:03.168425 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Sep 30 12:40:03 crc kubenswrapper[4672]: I0930 12:40:03.205449 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Sep 30 12:40:03 crc kubenswrapper[4672]: I0930 12:40:03.290654 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-df9zn\" (UniqueName: \"kubernetes.io/projected/85c475fb-05e1-46e5-a396-6ea675ef8084-kube-api-access-df9zn\") pod \"ceilometer-0\" (UID: \"85c475fb-05e1-46e5-a396-6ea675ef8084\") " pod="openstack/ceilometer-0" Sep 30 12:40:03 crc kubenswrapper[4672]: I0930 12:40:03.290733 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/85c475fb-05e1-46e5-a396-6ea675ef8084-log-httpd\") pod \"ceilometer-0\" (UID: \"85c475fb-05e1-46e5-a396-6ea675ef8084\") " pod="openstack/ceilometer-0" Sep 30 12:40:03 crc kubenswrapper[4672]: I0930 12:40:03.290769 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/85c475fb-05e1-46e5-a396-6ea675ef8084-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"85c475fb-05e1-46e5-a396-6ea675ef8084\") " pod="openstack/ceilometer-0" Sep 30 12:40:03 crc kubenswrapper[4672]: I0930 12:40:03.290801 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85c475fb-05e1-46e5-a396-6ea675ef8084-config-data\") pod \"ceilometer-0\" (UID: \"85c475fb-05e1-46e5-a396-6ea675ef8084\") " pod="openstack/ceilometer-0" Sep 30 12:40:03 crc kubenswrapper[4672]: I0930 12:40:03.290875 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/85c475fb-05e1-46e5-a396-6ea675ef8084-scripts\") pod \"ceilometer-0\" (UID: \"85c475fb-05e1-46e5-a396-6ea675ef8084\") " pod="openstack/ceilometer-0" Sep 30 12:40:03 crc kubenswrapper[4672]: I0930 12:40:03.290915 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85c475fb-05e1-46e5-a396-6ea675ef8084-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"85c475fb-05e1-46e5-a396-6ea675ef8084\") " pod="openstack/ceilometer-0" Sep 30 12:40:03 crc kubenswrapper[4672]: I0930 12:40:03.290941 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/85c475fb-05e1-46e5-a396-6ea675ef8084-run-httpd\") pod \"ceilometer-0\" (UID: \"85c475fb-05e1-46e5-a396-6ea675ef8084\") " pod="openstack/ceilometer-0" Sep 30 12:40:03 crc kubenswrapper[4672]: I0930 12:40:03.392538 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-df9zn\" (UniqueName: \"kubernetes.io/projected/85c475fb-05e1-46e5-a396-6ea675ef8084-kube-api-access-df9zn\") pod \"ceilometer-0\" (UID: \"85c475fb-05e1-46e5-a396-6ea675ef8084\") " pod="openstack/ceilometer-0" Sep 30 12:40:03 crc kubenswrapper[4672]: I0930 12:40:03.392893 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/85c475fb-05e1-46e5-a396-6ea675ef8084-log-httpd\") pod \"ceilometer-0\" (UID: \"85c475fb-05e1-46e5-a396-6ea675ef8084\") " pod="openstack/ceilometer-0" Sep 30 12:40:03 crc kubenswrapper[4672]: I0930 12:40:03.392916 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/85c475fb-05e1-46e5-a396-6ea675ef8084-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"85c475fb-05e1-46e5-a396-6ea675ef8084\") " pod="openstack/ceilometer-0" Sep 30 12:40:03 crc kubenswrapper[4672]: I0930 12:40:03.392937 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85c475fb-05e1-46e5-a396-6ea675ef8084-config-data\") pod \"ceilometer-0\" (UID: \"85c475fb-05e1-46e5-a396-6ea675ef8084\") " pod="openstack/ceilometer-0" Sep 30 12:40:03 crc kubenswrapper[4672]: I0930 12:40:03.392966 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/85c475fb-05e1-46e5-a396-6ea675ef8084-scripts\") pod \"ceilometer-0\" (UID: \"85c475fb-05e1-46e5-a396-6ea675ef8084\") " pod="openstack/ceilometer-0" Sep 30 12:40:03 crc kubenswrapper[4672]: I0930 12:40:03.392988 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85c475fb-05e1-46e5-a396-6ea675ef8084-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"85c475fb-05e1-46e5-a396-6ea675ef8084\") " pod="openstack/ceilometer-0" Sep 30 12:40:03 crc kubenswrapper[4672]: I0930 12:40:03.393006 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/85c475fb-05e1-46e5-a396-6ea675ef8084-run-httpd\") pod \"ceilometer-0\" (UID: \"85c475fb-05e1-46e5-a396-6ea675ef8084\") " pod="openstack/ceilometer-0" Sep 30 12:40:03 crc kubenswrapper[4672]: I0930 12:40:03.393241 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/85c475fb-05e1-46e5-a396-6ea675ef8084-log-httpd\") pod \"ceilometer-0\" (UID: \"85c475fb-05e1-46e5-a396-6ea675ef8084\") " pod="openstack/ceilometer-0" Sep 30 12:40:03 crc kubenswrapper[4672]: I0930 12:40:03.393432 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/85c475fb-05e1-46e5-a396-6ea675ef8084-run-httpd\") pod \"ceilometer-0\" (UID: \"85c475fb-05e1-46e5-a396-6ea675ef8084\") " pod="openstack/ceilometer-0" Sep 30 12:40:03 crc kubenswrapper[4672]: I0930 12:40:03.399925 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/85c475fb-05e1-46e5-a396-6ea675ef8084-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"85c475fb-05e1-46e5-a396-6ea675ef8084\") " pod="openstack/ceilometer-0" Sep 30 12:40:03 crc kubenswrapper[4672]: I0930 12:40:03.409624 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-c5bf9886d-9nhb9" Sep 30 12:40:03 crc kubenswrapper[4672]: I0930 12:40:03.417797 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-df9zn\" (UniqueName: \"kubernetes.io/projected/85c475fb-05e1-46e5-a396-6ea675ef8084-kube-api-access-df9zn\") pod \"ceilometer-0\" (UID: \"85c475fb-05e1-46e5-a396-6ea675ef8084\") " pod="openstack/ceilometer-0" Sep 30 12:40:03 crc kubenswrapper[4672]: I0930 12:40:03.418642 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/85c475fb-05e1-46e5-a396-6ea675ef8084-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"85c475fb-05e1-46e5-a396-6ea675ef8084\") " pod="openstack/ceilometer-0" Sep 30 12:40:03 crc kubenswrapper[4672]: I0930 12:40:03.419005 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/85c475fb-05e1-46e5-a396-6ea675ef8084-scripts\") pod \"ceilometer-0\" (UID: \"85c475fb-05e1-46e5-a396-6ea675ef8084\") " pod="openstack/ceilometer-0" Sep 30 12:40:03 crc kubenswrapper[4672]: I0930 12:40:03.419119 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85c475fb-05e1-46e5-a396-6ea675ef8084-config-data\") pod \"ceilometer-0\" (UID: \"85c475fb-05e1-46e5-a396-6ea675ef8084\") " pod="openstack/ceilometer-0" Sep 30 12:40:03 crc kubenswrapper[4672]: I0930 12:40:03.429043 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a813f7c2-e727-42a0-8779-8619fe6e8165" path="/var/lib/kubelet/pods/a813f7c2-e727-42a0-8779-8619fe6e8165/volumes" Sep 30 12:40:03 crc kubenswrapper[4672]: I0930 12:40:03.482666 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Sep 30 12:40:04 crc kubenswrapper[4672]: I0930 12:40:04.025195 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Sep 30 12:40:04 crc kubenswrapper[4672]: I0930 12:40:04.126169 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"85c475fb-05e1-46e5-a396-6ea675ef8084","Type":"ContainerStarted","Data":"c1e1e832634a5beaf540c1da75d8e3028a68cf1fddb40ed71f559a9ff075e95f"} Sep 30 12:40:04 crc kubenswrapper[4672]: I0930 12:40:04.931493 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-844b6c9474-6tpzt" Sep 30 12:40:05 crc kubenswrapper[4672]: I0930 12:40:05.021683 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-8bdf69cc8-lsxz6"] Sep 30 12:40:05 crc kubenswrapper[4672]: I0930 12:40:05.021953 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-8bdf69cc8-lsxz6" podUID="4e153eb6-5f25-4214-8e8a-14c37a36fc06" containerName="horizon-log" containerID="cri-o://b9bc69dd2b46d0c9dba498dbc9b87ca058be1c0ccc1ce3bb2e23ba57697d55b3" gracePeriod=30 Sep 30 12:40:05 crc kubenswrapper[4672]: I0930 12:40:05.022465 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-8bdf69cc8-lsxz6" podUID="4e153eb6-5f25-4214-8e8a-14c37a36fc06" containerName="horizon" containerID="cri-o://66b370cdbebbcc3bf0abe4218b26bfdbdc8a2747d1cda48afa909303b72ead33" gracePeriod=30 Sep 30 12:40:05 crc kubenswrapper[4672]: I0930 12:40:05.167480 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"85c475fb-05e1-46e5-a396-6ea675ef8084","Type":"ContainerStarted","Data":"7342a51e4335f536ba49f316d02f4636252f1367a7f1db16251b89724da83868"} Sep 30 12:40:05 crc kubenswrapper[4672]: I0930 12:40:05.167538 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"85c475fb-05e1-46e5-a396-6ea675ef8084","Type":"ContainerStarted","Data":"38a65d69fcc43cb31ef1360c5518e4baa6266766e91f38a5f13a82dd1f93027c"} Sep 30 12:40:05 crc kubenswrapper[4672]: I0930 12:40:05.383541 4672 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/watcher-decision-engine-0" Sep 30 12:40:05 crc kubenswrapper[4672]: I0930 12:40:05.383939 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-decision-engine-0" Sep 30 12:40:05 crc kubenswrapper[4672]: I0930 12:40:05.385586 4672 scope.go:117] "RemoveContainer" containerID="50db62107d03a042bff857d072808d9278376d38aa180a36c8e188d5d240b6c0" Sep 30 12:40:06 crc kubenswrapper[4672]: I0930 12:40:06.181235 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"4f1bee84-650b-4f0b-a657-e6701ee51823","Type":"ContainerStarted","Data":"9ede9a566346adcbabd817801ad9e25696c1143a09108dedaf87167ca42fa5f3"} Sep 30 12:40:06 crc kubenswrapper[4672]: I0930 12:40:06.185390 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"85c475fb-05e1-46e5-a396-6ea675ef8084","Type":"ContainerStarted","Data":"29408e85ecb19ec43497a2855301a731ee87f306b26ed600fba9b5bed991e7ad"} Sep 30 12:40:07 crc kubenswrapper[4672]: I0930 12:40:07.258343 4672 generic.go:334] "Generic (PLEG): container finished" podID="4e153eb6-5f25-4214-8e8a-14c37a36fc06" containerID="66b370cdbebbcc3bf0abe4218b26bfdbdc8a2747d1cda48afa909303b72ead33" exitCode=0 Sep 30 
12:40:07 crc kubenswrapper[4672]: I0930 12:40:07.261379 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-8bdf69cc8-lsxz6" event={"ID":"4e153eb6-5f25-4214-8e8a-14c37a36fc06","Type":"ContainerDied","Data":"66b370cdbebbcc3bf0abe4218b26bfdbdc8a2747d1cda48afa909303b72ead33"} Sep 30 12:40:07 crc kubenswrapper[4672]: I0930 12:40:07.777301 4672 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-8bdf69cc8-lsxz6" podUID="4e153eb6-5f25-4214-8e8a-14c37a36fc06" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.163:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.163:8443: connect: connection refused" Sep 30 12:40:08 crc kubenswrapper[4672]: I0930 12:40:08.232737 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Sep 30 12:40:08 crc kubenswrapper[4672]: I0930 12:40:08.239234 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Sep 30 12:40:08 crc kubenswrapper[4672]: I0930 12:40:08.243171 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Sep 30 12:40:08 crc kubenswrapper[4672]: I0930 12:40:08.243173 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Sep 30 12:40:08 crc kubenswrapper[4672]: I0930 12:40:08.243351 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-hn2dp" Sep 30 12:40:08 crc kubenswrapper[4672]: I0930 12:40:08.255568 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Sep 30 12:40:08 crc kubenswrapper[4672]: I0930 12:40:08.284489 4672 generic.go:334] "Generic (PLEG): container finished" podID="9b56cfc5-1c9c-41f5-91ac-b0fbd79c7b6a" containerID="c9b8ac0d2c2944aef03ffe7f96160d4edab5aa59318146e34fd05f492a680c0d" exitCode=0 Sep 30 12:40:08 crc kubenswrapper[4672]: I0930 12:40:08.284629 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-llp7f" event={"ID":"9b56cfc5-1c9c-41f5-91ac-b0fbd79c7b6a","Type":"ContainerDied","Data":"c9b8ac0d2c2944aef03ffe7f96160d4edab5aa59318146e34fd05f492a680c0d"} Sep 30 12:40:08 crc kubenswrapper[4672]: I0930 12:40:08.295012 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"85c475fb-05e1-46e5-a396-6ea675ef8084","Type":"ContainerStarted","Data":"a5d8df97f79020c7138fb924b03190513eed489e3594702b5dd41daea10fb8ae"} Sep 30 12:40:08 crc kubenswrapper[4672]: I0930 12:40:08.296064 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Sep 30 12:40:08 crc kubenswrapper[4672]: I0930 12:40:08.357701 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4rvql\" (UniqueName: \"kubernetes.io/projected/c8982043-0fe6-4e59-901d-22a0d2e9a351-kube-api-access-4rvql\") pod \"openstackclient\" (UID: \"c8982043-0fe6-4e59-901d-22a0d2e9a351\") " pod="openstack/openstackclient" Sep 30 12:40:08 crc kubenswrapper[4672]: I0930 12:40:08.357815 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/c8982043-0fe6-4e59-901d-22a0d2e9a351-openstack-config\") pod \"openstackclient\" (UID: \"c8982043-0fe6-4e59-901d-22a0d2e9a351\") " pod="openstack/openstackclient" Sep 30 12:40:08 crc kubenswrapper[4672]: I0930 12:40:08.357927 4672 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/c8982043-0fe6-4e59-901d-22a0d2e9a351-openstack-config-secret\") pod \"openstackclient\" (UID: \"c8982043-0fe6-4e59-901d-22a0d2e9a351\") " pod="openstack/openstackclient" Sep 30 12:40:08 crc kubenswrapper[4672]: I0930 12:40:08.357944 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8982043-0fe6-4e59-901d-22a0d2e9a351-combined-ca-bundle\") pod \"openstackclient\" (UID: \"c8982043-0fe6-4e59-901d-22a0d2e9a351\") " pod="openstack/openstackclient" Sep 30 12:40:08 crc kubenswrapper[4672]: I0930 12:40:08.416016 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-85554c85d5-pkzn2" Sep 30 12:40:08 crc kubenswrapper[4672]: I0930 12:40:08.439431 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.376664409 podStartE2EDuration="5.439413373s" podCreationTimestamp="2025-09-30 12:40:03 +0000 UTC" firstStartedPulling="2025-09-30 12:40:04.029363212 +0000 UTC m=+1095.298600868" lastFinishedPulling="2025-09-30 12:40:07.092112186 +0000 UTC m=+1098.361349832" observedRunningTime="2025-09-30 12:40:08.343982566 +0000 UTC m=+1099.613220232" watchObservedRunningTime="2025-09-30 12:40:08.439413373 +0000 UTC m=+1099.708651019" Sep 30 12:40:08 crc kubenswrapper[4672]: I0930 12:40:08.459234 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/c8982043-0fe6-4e59-901d-22a0d2e9a351-openstack-config\") pod \"openstackclient\" (UID: \"c8982043-0fe6-4e59-901d-22a0d2e9a351\") " pod="openstack/openstackclient" Sep 30 12:40:08 crc kubenswrapper[4672]: I0930 12:40:08.459394 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/c8982043-0fe6-4e59-901d-22a0d2e9a351-openstack-config-secret\") pod \"openstackclient\" (UID: \"c8982043-0fe6-4e59-901d-22a0d2e9a351\") " pod="openstack/openstackclient" Sep 30 12:40:08 crc kubenswrapper[4672]: I0930 12:40:08.459414 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8982043-0fe6-4e59-901d-22a0d2e9a351-combined-ca-bundle\") pod \"openstackclient\" (UID: \"c8982043-0fe6-4e59-901d-22a0d2e9a351\") " pod="openstack/openstackclient" Sep 30 12:40:08 crc kubenswrapper[4672]: I0930 12:40:08.459519 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4rvql\" (UniqueName: \"kubernetes.io/projected/c8982043-0fe6-4e59-901d-22a0d2e9a351-kube-api-access-4rvql\") pod \"openstackclient\" (UID: \"c8982043-0fe6-4e59-901d-22a0d2e9a351\") " pod="openstack/openstackclient" Sep 30 12:40:08 crc kubenswrapper[4672]: I0930 12:40:08.461252 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/c8982043-0fe6-4e59-901d-22a0d2e9a351-openstack-config\") pod \"openstackclient\" (UID: \"c8982043-0fe6-4e59-901d-22a0d2e9a351\") " pod="openstack/openstackclient" Sep 30 12:40:08 crc kubenswrapper[4672]: I0930 12:40:08.468379 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/c8982043-0fe6-4e59-901d-22a0d2e9a351-combined-ca-bundle\") pod \"openstackclient\" (UID: \"c8982043-0fe6-4e59-901d-22a0d2e9a351\") " pod="openstack/openstackclient" Sep 30 12:40:08 crc kubenswrapper[4672]: I0930 12:40:08.479062 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/c8982043-0fe6-4e59-901d-22a0d2e9a351-openstack-config-secret\") pod \"openstackclient\" (UID: \"c8982043-0fe6-4e59-901d-22a0d2e9a351\") " pod="openstack/openstackclient" Sep 30 12:40:08 crc kubenswrapper[4672]: I0930 12:40:08.493393 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4rvql\" (UniqueName: \"kubernetes.io/projected/c8982043-0fe6-4e59-901d-22a0d2e9a351-kube-api-access-4rvql\") pod \"openstackclient\" (UID: \"c8982043-0fe6-4e59-901d-22a0d2e9a351\") " pod="openstack/openstackclient" Sep 30 12:40:08 crc kubenswrapper[4672]: I0930 12:40:08.510379 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-87bdb45dc-rd86m"] Sep 30 12:40:08 crc kubenswrapper[4672]: I0930 12:40:08.510725 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-87bdb45dc-rd86m" podUID="b4c9f76e-4eaf-4d2c-a47c-a2051ed93ba2" containerName="dnsmasq-dns" containerID="cri-o://a2dae8ca75c42aaca8cdefa9f55d04972e24db9dbd29681732d67195693be198" gracePeriod=10 Sep 30 12:40:08 crc kubenswrapper[4672]: I0930 12:40:08.527917 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Sep 30 12:40:08 crc kubenswrapper[4672]: I0930 12:40:08.528723 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Sep 30 12:40:08 crc kubenswrapper[4672]: I0930 12:40:08.571830 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Sep 30 12:40:08 crc kubenswrapper[4672]: I0930 12:40:08.576510 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Sep 30 12:40:08 crc kubenswrapper[4672]: I0930 12:40:08.577773 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Sep 30 12:40:08 crc kubenswrapper[4672]: I0930 12:40:08.589881 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Sep 30 12:40:08 crc kubenswrapper[4672]: I0930 12:40:08.663753 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28f655e1-08b3-4618-8864-2020e883f99c-combined-ca-bundle\") pod \"openstackclient\" (UID: \"28f655e1-08b3-4618-8864-2020e883f99c\") " pod="openstack/openstackclient" Sep 30 12:40:08 crc kubenswrapper[4672]: I0930 12:40:08.663813 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/28f655e1-08b3-4618-8864-2020e883f99c-openstack-config\") pod \"openstackclient\" (UID: \"28f655e1-08b3-4618-8864-2020e883f99c\") " pod="openstack/openstackclient" Sep 30 12:40:08 crc kubenswrapper[4672]: I0930 12:40:08.663841 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nwwkt\" (UniqueName: \"kubernetes.io/projected/28f655e1-08b3-4618-8864-2020e883f99c-kube-api-access-nwwkt\") pod \"openstackclient\" (UID: \"28f655e1-08b3-4618-8864-2020e883f99c\") " pod="openstack/openstackclient" Sep 30 12:40:08 crc kubenswrapper[4672]: I0930 12:40:08.663990 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/28f655e1-08b3-4618-8864-2020e883f99c-openstack-config-secret\") pod \"openstackclient\" (UID: \"28f655e1-08b3-4618-8864-2020e883f99c\") " pod="openstack/openstackclient" Sep 30 12:40:08 crc kubenswrapper[4672]: I0930 12:40:08.767626 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28f655e1-08b3-4618-8864-2020e883f99c-combined-ca-bundle\") pod \"openstackclient\" (UID: \"28f655e1-08b3-4618-8864-2020e883f99c\") " pod="openstack/openstackclient" Sep 30 12:40:08 crc kubenswrapper[4672]: I0930 12:40:08.767674 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/28f655e1-08b3-4618-8864-2020e883f99c-openstack-config\") pod \"openstackclient\" (UID: \"28f655e1-08b3-4618-8864-2020e883f99c\") " pod="openstack/openstackclient" Sep 30 12:40:08 crc kubenswrapper[4672]: I0930 12:40:08.767695 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nwwkt\" (UniqueName: \"kubernetes.io/projected/28f655e1-08b3-4618-8864-2020e883f99c-kube-api-access-nwwkt\") pod \"openstackclient\" (UID: \"28f655e1-08b3-4618-8864-2020e883f99c\") " pod="openstack/openstackclient" Sep 30 12:40:08 crc kubenswrapper[4672]: I0930 12:40:08.767723 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/28f655e1-08b3-4618-8864-2020e883f99c-openstack-config-secret\") pod \"openstackclient\" (UID: \"28f655e1-08b3-4618-8864-2020e883f99c\") " pod="openstack/openstackclient" Sep 30 12:40:08 crc kubenswrapper[4672]: I0930 12:40:08.772310 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/28f655e1-08b3-4618-8864-2020e883f99c-openstack-config\") pod \"openstackclient\" (UID: 
\"28f655e1-08b3-4618-8864-2020e883f99c\") " pod="openstack/openstackclient" Sep 30 12:40:08 crc kubenswrapper[4672]: I0930 12:40:08.779139 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/28f655e1-08b3-4618-8864-2020e883f99c-openstack-config-secret\") pod \"openstackclient\" (UID: \"28f655e1-08b3-4618-8864-2020e883f99c\") " pod="openstack/openstackclient" Sep 30 12:40:08 crc kubenswrapper[4672]: I0930 12:40:08.788912 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28f655e1-08b3-4618-8864-2020e883f99c-combined-ca-bundle\") pod \"openstackclient\" (UID: \"28f655e1-08b3-4618-8864-2020e883f99c\") " pod="openstack/openstackclient" Sep 30 12:40:08 crc kubenswrapper[4672]: I0930 12:40:08.802824 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nwwkt\" (UniqueName: \"kubernetes.io/projected/28f655e1-08b3-4618-8864-2020e883f99c-kube-api-access-nwwkt\") pod \"openstackclient\" (UID: \"28f655e1-08b3-4618-8864-2020e883f99c\") " pod="openstack/openstackclient" Sep 30 12:40:08 crc kubenswrapper[4672]: E0930 12:40:08.839065 4672 log.go:32] "RunPodSandbox from runtime service failed" err=< Sep 30 12:40:08 crc kubenswrapper[4672]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_openstackclient_openstack_c8982043-0fe6-4e59-901d-22a0d2e9a351_0(74a839276c8d9d7c3a9836596ff7464a708017d99175027239e22d24ed3ff49d): error adding pod openstack_openstackclient to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"74a839276c8d9d7c3a9836596ff7464a708017d99175027239e22d24ed3ff49d" Netns:"/var/run/netns/45c1a970-3301-4b2b-980c-3d42172db45b" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openstack;K8S_POD_NAME=openstackclient;K8S_POD_INFRA_CONTAINER_ID=74a839276c8d9d7c3a9836596ff7464a708017d99175027239e22d24ed3ff49d;K8S_POD_UID=c8982043-0fe6-4e59-901d-22a0d2e9a351" Path:"" ERRORED: error configuring pod [openstack/openstackclient] networking: Multus: [openstack/openstackclient/c8982043-0fe6-4e59-901d-22a0d2e9a351]: expected pod UID "c8982043-0fe6-4e59-901d-22a0d2e9a351" but got "28f655e1-08b3-4618-8864-2020e883f99c" from Kube API Sep 30 12:40:08 crc kubenswrapper[4672]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Sep 30 12:40:08 crc kubenswrapper[4672]: > Sep 30 12:40:08 crc kubenswrapper[4672]: E0930 12:40:08.839136 4672 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Sep 30 12:40:08 crc kubenswrapper[4672]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_openstackclient_openstack_c8982043-0fe6-4e59-901d-22a0d2e9a351_0(74a839276c8d9d7c3a9836596ff7464a708017d99175027239e22d24ed3ff49d): error adding pod openstack_openstackclient to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"74a839276c8d9d7c3a9836596ff7464a708017d99175027239e22d24ed3ff49d" 
Netns:"/var/run/netns/45c1a970-3301-4b2b-980c-3d42172db45b" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openstack;K8S_POD_NAME=openstackclient;K8S_POD_INFRA_CONTAINER_ID=74a839276c8d9d7c3a9836596ff7464a708017d99175027239e22d24ed3ff49d;K8S_POD_UID=c8982043-0fe6-4e59-901d-22a0d2e9a351" Path:"" ERRORED: error configuring pod [openstack/openstackclient] networking: Multus: [openstack/openstackclient/c8982043-0fe6-4e59-901d-22a0d2e9a351]: expected pod UID "c8982043-0fe6-4e59-901d-22a0d2e9a351" but got "28f655e1-08b3-4618-8864-2020e883f99c" from Kube API Sep 30 12:40:08 crc kubenswrapper[4672]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Sep 30 12:40:08 crc kubenswrapper[4672]: > pod="openstack/openstackclient" Sep 30 12:40:08 crc kubenswrapper[4672]: I0930 12:40:08.894576 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-7c888d4d9d-4nr45" Sep 30 12:40:08 crc kubenswrapper[4672]: I0930 12:40:08.982953 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-7c888d4d9d-4nr45" Sep 30 12:40:09 crc kubenswrapper[4672]: I0930 12:40:09.016283 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Sep 30 12:40:09 crc kubenswrapper[4672]: I0930 12:40:09.060993 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-8679776b6d-5ltf9"] Sep 30 12:40:09 crc kubenswrapper[4672]: I0930 12:40:09.061198 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-8679776b6d-5ltf9" podUID="91b5c532-f855-46f2-967b-c53fc7f0ffee" containerName="barbican-api-log" containerID="cri-o://a7421a9ec7c3523789edfcf1025f60a25c066b03056cb50d0c1e11f088795c32" gracePeriod=30 Sep 30 12:40:09 crc kubenswrapper[4672]: I0930 12:40:09.061689 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-8679776b6d-5ltf9" podUID="91b5c532-f855-46f2-967b-c53fc7f0ffee" containerName="barbican-api" containerID="cri-o://942b2d771116a4d5f33d53d0e4d71b134ec103c8d95b81a353d7feec16663451" gracePeriod=30 Sep 30 12:40:09 crc kubenswrapper[4672]: I0930 12:40:09.397766 4672 generic.go:334] "Generic (PLEG): container finished" podID="91b5c532-f855-46f2-967b-c53fc7f0ffee" containerID="a7421a9ec7c3523789edfcf1025f60a25c066b03056cb50d0c1e11f088795c32" exitCode=143 Sep 30 12:40:09 crc kubenswrapper[4672]: I0930 12:40:09.397894 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-8679776b6d-5ltf9" event={"ID":"91b5c532-f855-46f2-967b-c53fc7f0ffee","Type":"ContainerDied","Data":"a7421a9ec7c3523789edfcf1025f60a25c066b03056cb50d0c1e11f088795c32"} Sep 30 12:40:09 crc kubenswrapper[4672]: I0930 12:40:09.418568 4672 generic.go:334] "Generic (PLEG): container finished" podID="b4c9f76e-4eaf-4d2c-a47c-a2051ed93ba2" containerID="a2dae8ca75c42aaca8cdefa9f55d04972e24db9dbd29681732d67195693be198" exitCode=0 Sep 30 12:40:09 crc kubenswrapper[4672]: I0930 12:40:09.420401 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Sep 30 12:40:09 crc kubenswrapper[4672]: I0930 12:40:09.454816 4672 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="c8982043-0fe6-4e59-901d-22a0d2e9a351" podUID="28f655e1-08b3-4618-8864-2020e883f99c" Sep 30 12:40:09 crc kubenswrapper[4672]: I0930 12:40:09.456862 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-87bdb45dc-rd86m" event={"ID":"b4c9f76e-4eaf-4d2c-a47c-a2051ed93ba2","Type":"ContainerDied","Data":"a2dae8ca75c42aaca8cdefa9f55d04972e24db9dbd29681732d67195693be198"} Sep 30 12:40:09 crc kubenswrapper[4672]: I0930 12:40:09.472498 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Sep 30 12:40:09 crc kubenswrapper[4672]: I0930 12:40:09.492614 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/c8982043-0fe6-4e59-901d-22a0d2e9a351-openstack-config\") pod \"c8982043-0fe6-4e59-901d-22a0d2e9a351\" (UID: \"c8982043-0fe6-4e59-901d-22a0d2e9a351\") " Sep 30 12:40:09 crc kubenswrapper[4672]: I0930 12:40:09.492754 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8982043-0fe6-4e59-901d-22a0d2e9a351-combined-ca-bundle\") pod \"c8982043-0fe6-4e59-901d-22a0d2e9a351\" (UID: \"c8982043-0fe6-4e59-901d-22a0d2e9a351\") " Sep 30 12:40:09 crc kubenswrapper[4672]: I0930 12:40:09.492861 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4rvql\" (UniqueName: \"kubernetes.io/projected/c8982043-0fe6-4e59-901d-22a0d2e9a351-kube-api-access-4rvql\") pod \"c8982043-0fe6-4e59-901d-22a0d2e9a351\" (UID: \"c8982043-0fe6-4e59-901d-22a0d2e9a351\") " Sep 30 12:40:09 crc kubenswrapper[4672]: I0930 12:40:09.492914 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/c8982043-0fe6-4e59-901d-22a0d2e9a351-openstack-config-secret\") pod \"c8982043-0fe6-4e59-901d-22a0d2e9a351\" (UID: \"c8982043-0fe6-4e59-901d-22a0d2e9a351\") " Sep 30 12:40:09 crc kubenswrapper[4672]: I0930 12:40:09.493141 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c8982043-0fe6-4e59-901d-22a0d2e9a351-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "c8982043-0fe6-4e59-901d-22a0d2e9a351" (UID: "c8982043-0fe6-4e59-901d-22a0d2e9a351"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 12:40:09 crc kubenswrapper[4672]: I0930 12:40:09.493526 4672 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/c8982043-0fe6-4e59-901d-22a0d2e9a351-openstack-config\") on node \"crc\" DevicePath \"\"" Sep 30 12:40:09 crc kubenswrapper[4672]: I0930 12:40:09.508545 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c8982043-0fe6-4e59-901d-22a0d2e9a351-kube-api-access-4rvql" (OuterVolumeSpecName: "kube-api-access-4rvql") pod "c8982043-0fe6-4e59-901d-22a0d2e9a351" (UID: "c8982043-0fe6-4e59-901d-22a0d2e9a351"). InnerVolumeSpecName "kube-api-access-4rvql". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 12:40:09 crc kubenswrapper[4672]: I0930 12:40:09.509557 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8982043-0fe6-4e59-901d-22a0d2e9a351-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c8982043-0fe6-4e59-901d-22a0d2e9a351" (UID: "c8982043-0fe6-4e59-901d-22a0d2e9a351"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:40:09 crc kubenswrapper[4672]: I0930 12:40:09.511519 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8982043-0fe6-4e59-901d-22a0d2e9a351-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "c8982043-0fe6-4e59-901d-22a0d2e9a351" (UID: "c8982043-0fe6-4e59-901d-22a0d2e9a351"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:40:09 crc kubenswrapper[4672]: I0930 12:40:09.597469 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4rvql\" (UniqueName: \"kubernetes.io/projected/c8982043-0fe6-4e59-901d-22a0d2e9a351-kube-api-access-4rvql\") on node \"crc\" DevicePath \"\"" Sep 30 12:40:09 crc kubenswrapper[4672]: I0930 12:40:09.597503 4672 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/c8982043-0fe6-4e59-901d-22a0d2e9a351-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Sep 30 12:40:09 crc kubenswrapper[4672]: I0930 12:40:09.597513 4672 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8982043-0fe6-4e59-901d-22a0d2e9a351-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 12:40:09 crc kubenswrapper[4672]: I0930 12:40:09.827784 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Sep 30 12:40:09 crc kubenswrapper[4672]: I0930 12:40:09.832197 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-87bdb45dc-rd86m" Sep 30 12:40:09 crc kubenswrapper[4672]: I0930 12:40:09.923900 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b4c9f76e-4eaf-4d2c-a47c-a2051ed93ba2-dns-swift-storage-0\") pod \"b4c9f76e-4eaf-4d2c-a47c-a2051ed93ba2\" (UID: \"b4c9f76e-4eaf-4d2c-a47c-a2051ed93ba2\") " Sep 30 12:40:09 crc kubenswrapper[4672]: I0930 12:40:09.924156 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b4c9f76e-4eaf-4d2c-a47c-a2051ed93ba2-ovsdbserver-sb\") pod \"b4c9f76e-4eaf-4d2c-a47c-a2051ed93ba2\" (UID: \"b4c9f76e-4eaf-4d2c-a47c-a2051ed93ba2\") " Sep 30 12:40:09 crc kubenswrapper[4672]: I0930 12:40:09.924217 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b4c9f76e-4eaf-4d2c-a47c-a2051ed93ba2-dns-svc\") pod \"b4c9f76e-4eaf-4d2c-a47c-a2051ed93ba2\" (UID: \"b4c9f76e-4eaf-4d2c-a47c-a2051ed93ba2\") " Sep 30 12:40:09 crc kubenswrapper[4672]: I0930 12:40:09.924378 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b4c9f76e-4eaf-4d2c-a47c-a2051ed93ba2-ovsdbserver-nb\") pod \"b4c9f76e-4eaf-4d2c-a47c-a2051ed93ba2\" (UID: \"b4c9f76e-4eaf-4d2c-a47c-a2051ed93ba2\") " Sep 30 12:40:09 crc kubenswrapper[4672]: I0930 12:40:09.924422 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b4c9f76e-4eaf-4d2c-a47c-a2051ed93ba2-config\") pod \"b4c9f76e-4eaf-4d2c-a47c-a2051ed93ba2\" (UID: \"b4c9f76e-4eaf-4d2c-a47c-a2051ed93ba2\") " Sep 30 12:40:09 crc kubenswrapper[4672]: I0930 12:40:09.924464 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lsb8k\" (UniqueName: \"kubernetes.io/projected/b4c9f76e-4eaf-4d2c-a47c-a2051ed93ba2-kube-api-access-lsb8k\") pod \"b4c9f76e-4eaf-4d2c-a47c-a2051ed93ba2\" (UID: \"b4c9f76e-4eaf-4d2c-a47c-a2051ed93ba2\") " Sep 30 12:40:09 crc kubenswrapper[4672]: I0930 12:40:09.946525 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b4c9f76e-4eaf-4d2c-a47c-a2051ed93ba2-kube-api-access-lsb8k" (OuterVolumeSpecName: "kube-api-access-lsb8k") pod "b4c9f76e-4eaf-4d2c-a47c-a2051ed93ba2" (UID: "b4c9f76e-4eaf-4d2c-a47c-a2051ed93ba2"). InnerVolumeSpecName "kube-api-access-lsb8k". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 12:40:10 crc kubenswrapper[4672]: I0930 12:40:10.026250 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lsb8k\" (UniqueName: \"kubernetes.io/projected/b4c9f76e-4eaf-4d2c-a47c-a2051ed93ba2-kube-api-access-lsb8k\") on node \"crc\" DevicePath \"\"" Sep 30 12:40:10 crc kubenswrapper[4672]: I0930 12:40:10.041993 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b4c9f76e-4eaf-4d2c-a47c-a2051ed93ba2-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b4c9f76e-4eaf-4d2c-a47c-a2051ed93ba2" (UID: "b4c9f76e-4eaf-4d2c-a47c-a2051ed93ba2"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 12:40:10 crc kubenswrapper[4672]: I0930 12:40:10.049548 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b4c9f76e-4eaf-4d2c-a47c-a2051ed93ba2-config" (OuterVolumeSpecName: "config") pod "b4c9f76e-4eaf-4d2c-a47c-a2051ed93ba2" (UID: "b4c9f76e-4eaf-4d2c-a47c-a2051ed93ba2"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 12:40:10 crc kubenswrapper[4672]: I0930 12:40:10.061567 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b4c9f76e-4eaf-4d2c-a47c-a2051ed93ba2-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "b4c9f76e-4eaf-4d2c-a47c-a2051ed93ba2" (UID: "b4c9f76e-4eaf-4d2c-a47c-a2051ed93ba2"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 12:40:10 crc kubenswrapper[4672]: I0930 12:40:10.062797 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b4c9f76e-4eaf-4d2c-a47c-a2051ed93ba2-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "b4c9f76e-4eaf-4d2c-a47c-a2051ed93ba2" (UID: "b4c9f76e-4eaf-4d2c-a47c-a2051ed93ba2"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 12:40:10 crc kubenswrapper[4672]: I0930 12:40:10.069911 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b4c9f76e-4eaf-4d2c-a47c-a2051ed93ba2-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "b4c9f76e-4eaf-4d2c-a47c-a2051ed93ba2" (UID: "b4c9f76e-4eaf-4d2c-a47c-a2051ed93ba2"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 12:40:10 crc kubenswrapper[4672]: I0930 12:40:10.129588 4672 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b4c9f76e-4eaf-4d2c-a47c-a2051ed93ba2-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Sep 30 12:40:10 crc kubenswrapper[4672]: I0930 12:40:10.129639 4672 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b4c9f76e-4eaf-4d2c-a47c-a2051ed93ba2-config\") on node \"crc\" DevicePath \"\"" Sep 30 12:40:10 crc kubenswrapper[4672]: I0930 12:40:10.129653 4672 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b4c9f76e-4eaf-4d2c-a47c-a2051ed93ba2-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Sep 30 12:40:10 crc kubenswrapper[4672]: I0930 12:40:10.129671 4672 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b4c9f76e-4eaf-4d2c-a47c-a2051ed93ba2-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Sep 30 12:40:10 crc kubenswrapper[4672]: I0930 12:40:10.129683 4672 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b4c9f76e-4eaf-4d2c-a47c-a2051ed93ba2-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 30 12:40:10 crc kubenswrapper[4672]: I0930 12:40:10.150200 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-llp7f" Sep 30 12:40:10 crc kubenswrapper[4672]: I0930 12:40:10.230236 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9b56cfc5-1c9c-41f5-91ac-b0fbd79c7b6a-scripts\") pod \"9b56cfc5-1c9c-41f5-91ac-b0fbd79c7b6a\" (UID: \"9b56cfc5-1c9c-41f5-91ac-b0fbd79c7b6a\") " Sep 30 12:40:10 crc kubenswrapper[4672]: I0930 12:40:10.230348 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b56cfc5-1c9c-41f5-91ac-b0fbd79c7b6a-combined-ca-bundle\") pod \"9b56cfc5-1c9c-41f5-91ac-b0fbd79c7b6a\" (UID: \"9b56cfc5-1c9c-41f5-91ac-b0fbd79c7b6a\") " Sep 30 12:40:10 crc kubenswrapper[4672]: I0930 12:40:10.230396 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b56cfc5-1c9c-41f5-91ac-b0fbd79c7b6a-config-data\") pod \"9b56cfc5-1c9c-41f5-91ac-b0fbd79c7b6a\" (UID: \"9b56cfc5-1c9c-41f5-91ac-b0fbd79c7b6a\") " Sep 30 12:40:10 crc kubenswrapper[4672]: I0930 12:40:10.230452 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9b56cfc5-1c9c-41f5-91ac-b0fbd79c7b6a-etc-machine-id\") pod \"9b56cfc5-1c9c-41f5-91ac-b0fbd79c7b6a\" (UID: \"9b56cfc5-1c9c-41f5-91ac-b0fbd79c7b6a\") " Sep 30 12:40:10 crc kubenswrapper[4672]: I0930 12:40:10.230481 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/9b56cfc5-1c9c-41f5-91ac-b0fbd79c7b6a-db-sync-config-data\") pod \"9b56cfc5-1c9c-41f5-91ac-b0fbd79c7b6a\" (UID: \"9b56cfc5-1c9c-41f5-91ac-b0fbd79c7b6a\") " Sep 30 12:40:10 crc kubenswrapper[4672]: I0930 12:40:10.230538 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tzgds\" (UniqueName: \"kubernetes.io/projected/9b56cfc5-1c9c-41f5-91ac-b0fbd79c7b6a-kube-api-access-tzgds\") pod \"9b56cfc5-1c9c-41f5-91ac-b0fbd79c7b6a\" (UID: \"9b56cfc5-1c9c-41f5-91ac-b0fbd79c7b6a\") " Sep 30 12:40:10 crc kubenswrapper[4672]: I0930 12:40:10.231173 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9b56cfc5-1c9c-41f5-91ac-b0fbd79c7b6a-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "9b56cfc5-1c9c-41f5-91ac-b0fbd79c7b6a" (UID: "9b56cfc5-1c9c-41f5-91ac-b0fbd79c7b6a"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 12:40:10 crc kubenswrapper[4672]: I0930 12:40:10.235435 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b56cfc5-1c9c-41f5-91ac-b0fbd79c7b6a-scripts" (OuterVolumeSpecName: "scripts") pod "9b56cfc5-1c9c-41f5-91ac-b0fbd79c7b6a" (UID: "9b56cfc5-1c9c-41f5-91ac-b0fbd79c7b6a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:40:10 crc kubenswrapper[4672]: I0930 12:40:10.236102 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b56cfc5-1c9c-41f5-91ac-b0fbd79c7b6a-kube-api-access-tzgds" (OuterVolumeSpecName: "kube-api-access-tzgds") pod "9b56cfc5-1c9c-41f5-91ac-b0fbd79c7b6a" (UID: "9b56cfc5-1c9c-41f5-91ac-b0fbd79c7b6a"). InnerVolumeSpecName "kube-api-access-tzgds". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 12:40:10 crc kubenswrapper[4672]: I0930 12:40:10.236644 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b56cfc5-1c9c-41f5-91ac-b0fbd79c7b6a-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "9b56cfc5-1c9c-41f5-91ac-b0fbd79c7b6a" (UID: "9b56cfc5-1c9c-41f5-91ac-b0fbd79c7b6a"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:40:10 crc kubenswrapper[4672]: I0930 12:40:10.290476 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b56cfc5-1c9c-41f5-91ac-b0fbd79c7b6a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9b56cfc5-1c9c-41f5-91ac-b0fbd79c7b6a" (UID: "9b56cfc5-1c9c-41f5-91ac-b0fbd79c7b6a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:40:10 crc kubenswrapper[4672]: I0930 12:40:10.317579 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b56cfc5-1c9c-41f5-91ac-b0fbd79c7b6a-config-data" (OuterVolumeSpecName: "config-data") pod "9b56cfc5-1c9c-41f5-91ac-b0fbd79c7b6a" (UID: "9b56cfc5-1c9c-41f5-91ac-b0fbd79c7b6a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:40:10 crc kubenswrapper[4672]: I0930 12:40:10.332609 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tzgds\" (UniqueName: \"kubernetes.io/projected/9b56cfc5-1c9c-41f5-91ac-b0fbd79c7b6a-kube-api-access-tzgds\") on node \"crc\" DevicePath \"\"" Sep 30 12:40:10 crc kubenswrapper[4672]: I0930 12:40:10.332644 4672 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9b56cfc5-1c9c-41f5-91ac-b0fbd79c7b6a-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 12:40:10 crc kubenswrapper[4672]: I0930 12:40:10.332658 4672 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b56cfc5-1c9c-41f5-91ac-b0fbd79c7b6a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 12:40:10 crc kubenswrapper[4672]: I0930 12:40:10.332677 4672 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b56cfc5-1c9c-41f5-91ac-b0fbd79c7b6a-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 12:40:10 crc kubenswrapper[4672]: I0930 12:40:10.332690 4672 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9b56cfc5-1c9c-41f5-91ac-b0fbd79c7b6a-etc-machine-id\") on node \"crc\" DevicePath \"\"" Sep 30 12:40:10 crc kubenswrapper[4672]: I0930 12:40:10.332700 4672 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/9b56cfc5-1c9c-41f5-91ac-b0fbd79c7b6a-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 12:40:10 crc kubenswrapper[4672]: I0930 12:40:10.452069 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"28f655e1-08b3-4618-8864-2020e883f99c","Type":"ContainerStarted","Data":"49b3eb8b1258d2a5a5b1333947eab75f5fdd180c9c8bfebf5a1a088e63b042f4"} Sep 30 12:40:10 crc kubenswrapper[4672]: I0930 12:40:10.464858 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-87bdb45dc-rd86m" 
event={"ID":"b4c9f76e-4eaf-4d2c-a47c-a2051ed93ba2","Type":"ContainerDied","Data":"01a1f00a22e6646c290e3d7cdfdee8d19cede08cc01e5f7b16f4ea5b8aebb2b4"} Sep 30 12:40:10 crc kubenswrapper[4672]: I0930 12:40:10.464919 4672 scope.go:117] "RemoveContainer" containerID="a2dae8ca75c42aaca8cdefa9f55d04972e24db9dbd29681732d67195693be198" Sep 30 12:40:10 crc kubenswrapper[4672]: I0930 12:40:10.465060 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-87bdb45dc-rd86m" Sep 30 12:40:10 crc kubenswrapper[4672]: I0930 12:40:10.477415 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-llp7f" event={"ID":"9b56cfc5-1c9c-41f5-91ac-b0fbd79c7b6a","Type":"ContainerDied","Data":"6259c9fe43696eb4e5fa556d709e9d949210c01c9d2a0f15decec09cb95b30da"} Sep 30 12:40:10 crc kubenswrapper[4672]: I0930 12:40:10.477686 4672 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6259c9fe43696eb4e5fa556d709e9d949210c01c9d2a0f15decec09cb95b30da" Sep 30 12:40:10 crc kubenswrapper[4672]: I0930 12:40:10.477488 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-llp7f" Sep 30 12:40:10 crc kubenswrapper[4672]: I0930 12:40:10.477453 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Sep 30 12:40:10 crc kubenswrapper[4672]: I0930 12:40:10.514497 4672 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="c8982043-0fe6-4e59-901d-22a0d2e9a351" podUID="28f655e1-08b3-4618-8864-2020e883f99c" Sep 30 12:40:10 crc kubenswrapper[4672]: I0930 12:40:10.614650 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Sep 30 12:40:10 crc kubenswrapper[4672]: E0930 12:40:10.615049 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b56cfc5-1c9c-41f5-91ac-b0fbd79c7b6a" containerName="cinder-db-sync" Sep 30 12:40:10 crc kubenswrapper[4672]: I0930 12:40:10.615062 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b56cfc5-1c9c-41f5-91ac-b0fbd79c7b6a" containerName="cinder-db-sync" Sep 30 12:40:10 crc kubenswrapper[4672]: E0930 12:40:10.615092 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4c9f76e-4eaf-4d2c-a47c-a2051ed93ba2" containerName="dnsmasq-dns" Sep 30 12:40:10 crc kubenswrapper[4672]: I0930 12:40:10.615100 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4c9f76e-4eaf-4d2c-a47c-a2051ed93ba2" containerName="dnsmasq-dns" Sep 30 12:40:10 crc kubenswrapper[4672]: E0930 12:40:10.615113 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4c9f76e-4eaf-4d2c-a47c-a2051ed93ba2" containerName="init" Sep 30 12:40:10 crc kubenswrapper[4672]: I0930 12:40:10.615119 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4c9f76e-4eaf-4d2c-a47c-a2051ed93ba2" containerName="init" Sep 30 12:40:10 crc kubenswrapper[4672]: I0930 12:40:10.615323 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b56cfc5-1c9c-41f5-91ac-b0fbd79c7b6a" containerName="cinder-db-sync" Sep 30 12:40:10 crc kubenswrapper[4672]: I0930 12:40:10.615344 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4c9f76e-4eaf-4d2c-a47c-a2051ed93ba2" containerName="dnsmasq-dns" Sep 30 12:40:10 crc kubenswrapper[4672]: I0930 12:40:10.616330 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Sep 30 12:40:10 crc kubenswrapper[4672]: I0930 12:40:10.624067 4672 scope.go:117] "RemoveContainer" containerID="f47c4a088df60f928a4b40e38da0134120f0babe26ab003bd0faa04e95716642" Sep 30 12:40:10 crc kubenswrapper[4672]: I0930 12:40:10.624749 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Sep 30 12:40:10 crc kubenswrapper[4672]: I0930 12:40:10.624900 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Sep 30 12:40:10 crc kubenswrapper[4672]: I0930 12:40:10.625000 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Sep 30 12:40:10 crc kubenswrapper[4672]: I0930 12:40:10.625071 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-m4shp" Sep 30 12:40:10 crc kubenswrapper[4672]: I0930 12:40:10.641421 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2d9c4035-a25a-460e-b9bc-dbe05f4e6272-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"2d9c4035-a25a-460e-b9bc-dbe05f4e6272\") " pod="openstack/cinder-scheduler-0" Sep 30 12:40:10 crc kubenswrapper[4672]: I0930 12:40:10.641462 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d9c4035-a25a-460e-b9bc-dbe05f4e6272-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"2d9c4035-a25a-460e-b9bc-dbe05f4e6272\") " pod="openstack/cinder-scheduler-0" Sep 30 12:40:10 crc kubenswrapper[4672]: I0930 12:40:10.641485 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2d9c4035-a25a-460e-b9bc-dbe05f4e6272-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"2d9c4035-a25a-460e-b9bc-dbe05f4e6272\") " pod="openstack/cinder-scheduler-0" Sep 30 12:40:10 crc kubenswrapper[4672]: I0930 12:40:10.641524 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d9c4035-a25a-460e-b9bc-dbe05f4e6272-config-data\") pod \"cinder-scheduler-0\" (UID: \"2d9c4035-a25a-460e-b9bc-dbe05f4e6272\") " pod="openstack/cinder-scheduler-0" Sep 30 12:40:10 crc kubenswrapper[4672]: I0930 12:40:10.641541 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2d9c4035-a25a-460e-b9bc-dbe05f4e6272-scripts\") pod \"cinder-scheduler-0\" (UID: \"2d9c4035-a25a-460e-b9bc-dbe05f4e6272\") " pod="openstack/cinder-scheduler-0" Sep 30 12:40:10 crc kubenswrapper[4672]: I0930 12:40:10.641627 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ztqzd\" (UniqueName: \"kubernetes.io/projected/2d9c4035-a25a-460e-b9bc-dbe05f4e6272-kube-api-access-ztqzd\") pod \"cinder-scheduler-0\" (UID: \"2d9c4035-a25a-460e-b9bc-dbe05f4e6272\") " pod="openstack/cinder-scheduler-0" Sep 30 12:40:10 crc kubenswrapper[4672]: I0930 12:40:10.669344 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Sep 30 12:40:10 crc kubenswrapper[4672]: I0930 12:40:10.688767 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7cd88f5d9f-4b79l"] Sep 
30 12:40:10 crc kubenswrapper[4672]: I0930 12:40:10.700739 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7cd88f5d9f-4b79l" Sep 30 12:40:10 crc kubenswrapper[4672]: I0930 12:40:10.702406 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-87bdb45dc-rd86m"] Sep 30 12:40:10 crc kubenswrapper[4672]: I0930 12:40:10.727229 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-87bdb45dc-rd86m"] Sep 30 12:40:10 crc kubenswrapper[4672]: I0930 12:40:10.747326 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7cd88f5d9f-4b79l"] Sep 30 12:40:10 crc kubenswrapper[4672]: I0930 12:40:10.749508 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/794ccc31-97c8-40b7-b6ab-cc0d3a9946d7-ovsdbserver-nb\") pod \"dnsmasq-dns-7cd88f5d9f-4b79l\" (UID: \"794ccc31-97c8-40b7-b6ab-cc0d3a9946d7\") " pod="openstack/dnsmasq-dns-7cd88f5d9f-4b79l" Sep 30 12:40:10 crc kubenswrapper[4672]: I0930 12:40:10.749545 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/794ccc31-97c8-40b7-b6ab-cc0d3a9946d7-ovsdbserver-sb\") pod \"dnsmasq-dns-7cd88f5d9f-4b79l\" (UID: \"794ccc31-97c8-40b7-b6ab-cc0d3a9946d7\") " pod="openstack/dnsmasq-dns-7cd88f5d9f-4b79l" Sep 30 12:40:10 crc kubenswrapper[4672]: I0930 12:40:10.749607 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-djd8z\" (UniqueName: \"kubernetes.io/projected/794ccc31-97c8-40b7-b6ab-cc0d3a9946d7-kube-api-access-djd8z\") pod \"dnsmasq-dns-7cd88f5d9f-4b79l\" (UID: \"794ccc31-97c8-40b7-b6ab-cc0d3a9946d7\") " pod="openstack/dnsmasq-dns-7cd88f5d9f-4b79l" Sep 30 12:40:10 crc kubenswrapper[4672]: I0930 12:40:10.749624 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/794ccc31-97c8-40b7-b6ab-cc0d3a9946d7-dns-swift-storage-0\") pod \"dnsmasq-dns-7cd88f5d9f-4b79l\" (UID: \"794ccc31-97c8-40b7-b6ab-cc0d3a9946d7\") " pod="openstack/dnsmasq-dns-7cd88f5d9f-4b79l" Sep 30 12:40:10 crc kubenswrapper[4672]: I0930 12:40:10.749651 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ztqzd\" (UniqueName: \"kubernetes.io/projected/2d9c4035-a25a-460e-b9bc-dbe05f4e6272-kube-api-access-ztqzd\") pod \"cinder-scheduler-0\" (UID: \"2d9c4035-a25a-460e-b9bc-dbe05f4e6272\") " pod="openstack/cinder-scheduler-0" Sep 30 12:40:10 crc kubenswrapper[4672]: I0930 12:40:10.749677 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/794ccc31-97c8-40b7-b6ab-cc0d3a9946d7-dns-svc\") pod \"dnsmasq-dns-7cd88f5d9f-4b79l\" (UID: \"794ccc31-97c8-40b7-b6ab-cc0d3a9946d7\") " pod="openstack/dnsmasq-dns-7cd88f5d9f-4b79l" Sep 30 12:40:10 crc kubenswrapper[4672]: I0930 12:40:10.749721 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2d9c4035-a25a-460e-b9bc-dbe05f4e6272-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"2d9c4035-a25a-460e-b9bc-dbe05f4e6272\") " pod="openstack/cinder-scheduler-0" Sep 30 12:40:10 crc kubenswrapper[4672]: I0930 12:40:10.749737 4672 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d9c4035-a25a-460e-b9bc-dbe05f4e6272-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"2d9c4035-a25a-460e-b9bc-dbe05f4e6272\") " pod="openstack/cinder-scheduler-0" Sep 30 12:40:10 crc kubenswrapper[4672]: I0930 12:40:10.749759 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2d9c4035-a25a-460e-b9bc-dbe05f4e6272-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"2d9c4035-a25a-460e-b9bc-dbe05f4e6272\") " pod="openstack/cinder-scheduler-0" Sep 30 12:40:10 crc kubenswrapper[4672]: I0930 12:40:10.749795 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d9c4035-a25a-460e-b9bc-dbe05f4e6272-config-data\") pod \"cinder-scheduler-0\" (UID: \"2d9c4035-a25a-460e-b9bc-dbe05f4e6272\") " pod="openstack/cinder-scheduler-0" Sep 30 12:40:10 crc kubenswrapper[4672]: I0930 12:40:10.749812 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2d9c4035-a25a-460e-b9bc-dbe05f4e6272-scripts\") pod \"cinder-scheduler-0\" (UID: \"2d9c4035-a25a-460e-b9bc-dbe05f4e6272\") " pod="openstack/cinder-scheduler-0" Sep 30 12:40:10 crc kubenswrapper[4672]: I0930 12:40:10.749834 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/794ccc31-97c8-40b7-b6ab-cc0d3a9946d7-config\") pod \"dnsmasq-dns-7cd88f5d9f-4b79l\" (UID: \"794ccc31-97c8-40b7-b6ab-cc0d3a9946d7\") " pod="openstack/dnsmasq-dns-7cd88f5d9f-4b79l" Sep 30 12:40:10 crc kubenswrapper[4672]: I0930 12:40:10.750171 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2d9c4035-a25a-460e-b9bc-dbe05f4e6272-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"2d9c4035-a25a-460e-b9bc-dbe05f4e6272\") " pod="openstack/cinder-scheduler-0" Sep 30 12:40:10 crc kubenswrapper[4672]: I0930 12:40:10.777793 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2d9c4035-a25a-460e-b9bc-dbe05f4e6272-scripts\") pod \"cinder-scheduler-0\" (UID: \"2d9c4035-a25a-460e-b9bc-dbe05f4e6272\") " pod="openstack/cinder-scheduler-0" Sep 30 12:40:10 crc kubenswrapper[4672]: I0930 12:40:10.777961 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d9c4035-a25a-460e-b9bc-dbe05f4e6272-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"2d9c4035-a25a-460e-b9bc-dbe05f4e6272\") " pod="openstack/cinder-scheduler-0" Sep 30 12:40:10 crc kubenswrapper[4672]: I0930 12:40:10.778605 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d9c4035-a25a-460e-b9bc-dbe05f4e6272-config-data\") pod \"cinder-scheduler-0\" (UID: \"2d9c4035-a25a-460e-b9bc-dbe05f4e6272\") " pod="openstack/cinder-scheduler-0" Sep 30 12:40:10 crc kubenswrapper[4672]: I0930 12:40:10.779526 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2d9c4035-a25a-460e-b9bc-dbe05f4e6272-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"2d9c4035-a25a-460e-b9bc-dbe05f4e6272\") " 
pod="openstack/cinder-scheduler-0" Sep 30 12:40:10 crc kubenswrapper[4672]: I0930 12:40:10.785094 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ztqzd\" (UniqueName: \"kubernetes.io/projected/2d9c4035-a25a-460e-b9bc-dbe05f4e6272-kube-api-access-ztqzd\") pod \"cinder-scheduler-0\" (UID: \"2d9c4035-a25a-460e-b9bc-dbe05f4e6272\") " pod="openstack/cinder-scheduler-0" Sep 30 12:40:10 crc kubenswrapper[4672]: I0930 12:40:10.836356 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Sep 30 12:40:10 crc kubenswrapper[4672]: I0930 12:40:10.838151 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Sep 30 12:40:10 crc kubenswrapper[4672]: I0930 12:40:10.851191 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Sep 30 12:40:10 crc kubenswrapper[4672]: I0930 12:40:10.853608 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/794ccc31-97c8-40b7-b6ab-cc0d3a9946d7-config\") pod \"dnsmasq-dns-7cd88f5d9f-4b79l\" (UID: \"794ccc31-97c8-40b7-b6ab-cc0d3a9946d7\") " pod="openstack/dnsmasq-dns-7cd88f5d9f-4b79l" Sep 30 12:40:10 crc kubenswrapper[4672]: I0930 12:40:10.853655 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/794ccc31-97c8-40b7-b6ab-cc0d3a9946d7-ovsdbserver-nb\") pod \"dnsmasq-dns-7cd88f5d9f-4b79l\" (UID: \"794ccc31-97c8-40b7-b6ab-cc0d3a9946d7\") " pod="openstack/dnsmasq-dns-7cd88f5d9f-4b79l" Sep 30 12:40:10 crc kubenswrapper[4672]: I0930 12:40:10.853685 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/794ccc31-97c8-40b7-b6ab-cc0d3a9946d7-ovsdbserver-sb\") pod \"dnsmasq-dns-7cd88f5d9f-4b79l\" (UID: \"794ccc31-97c8-40b7-b6ab-cc0d3a9946d7\") " pod="openstack/dnsmasq-dns-7cd88f5d9f-4b79l" Sep 30 12:40:10 crc kubenswrapper[4672]: I0930 12:40:10.853742 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-djd8z\" (UniqueName: \"kubernetes.io/projected/794ccc31-97c8-40b7-b6ab-cc0d3a9946d7-kube-api-access-djd8z\") pod \"dnsmasq-dns-7cd88f5d9f-4b79l\" (UID: \"794ccc31-97c8-40b7-b6ab-cc0d3a9946d7\") " pod="openstack/dnsmasq-dns-7cd88f5d9f-4b79l" Sep 30 12:40:10 crc kubenswrapper[4672]: I0930 12:40:10.853760 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/794ccc31-97c8-40b7-b6ab-cc0d3a9946d7-dns-swift-storage-0\") pod \"dnsmasq-dns-7cd88f5d9f-4b79l\" (UID: \"794ccc31-97c8-40b7-b6ab-cc0d3a9946d7\") " pod="openstack/dnsmasq-dns-7cd88f5d9f-4b79l" Sep 30 12:40:10 crc kubenswrapper[4672]: I0930 12:40:10.853794 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/794ccc31-97c8-40b7-b6ab-cc0d3a9946d7-dns-svc\") pod \"dnsmasq-dns-7cd88f5d9f-4b79l\" (UID: \"794ccc31-97c8-40b7-b6ab-cc0d3a9946d7\") " pod="openstack/dnsmasq-dns-7cd88f5d9f-4b79l" Sep 30 12:40:10 crc kubenswrapper[4672]: I0930 12:40:10.854700 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/794ccc31-97c8-40b7-b6ab-cc0d3a9946d7-dns-svc\") pod \"dnsmasq-dns-7cd88f5d9f-4b79l\" (UID: \"794ccc31-97c8-40b7-b6ab-cc0d3a9946d7\") " 
pod="openstack/dnsmasq-dns-7cd88f5d9f-4b79l" Sep 30 12:40:10 crc kubenswrapper[4672]: I0930 12:40:10.854949 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/794ccc31-97c8-40b7-b6ab-cc0d3a9946d7-ovsdbserver-sb\") pod \"dnsmasq-dns-7cd88f5d9f-4b79l\" (UID: \"794ccc31-97c8-40b7-b6ab-cc0d3a9946d7\") " pod="openstack/dnsmasq-dns-7cd88f5d9f-4b79l" Sep 30 12:40:10 crc kubenswrapper[4672]: I0930 12:40:10.855482 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/794ccc31-97c8-40b7-b6ab-cc0d3a9946d7-dns-swift-storage-0\") pod \"dnsmasq-dns-7cd88f5d9f-4b79l\" (UID: \"794ccc31-97c8-40b7-b6ab-cc0d3a9946d7\") " pod="openstack/dnsmasq-dns-7cd88f5d9f-4b79l" Sep 30 12:40:10 crc kubenswrapper[4672]: I0930 12:40:10.855922 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/794ccc31-97c8-40b7-b6ab-cc0d3a9946d7-config\") pod \"dnsmasq-dns-7cd88f5d9f-4b79l\" (UID: \"794ccc31-97c8-40b7-b6ab-cc0d3a9946d7\") " pod="openstack/dnsmasq-dns-7cd88f5d9f-4b79l" Sep 30 12:40:10 crc kubenswrapper[4672]: I0930 12:40:10.856023 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/794ccc31-97c8-40b7-b6ab-cc0d3a9946d7-ovsdbserver-nb\") pod \"dnsmasq-dns-7cd88f5d9f-4b79l\" (UID: \"794ccc31-97c8-40b7-b6ab-cc0d3a9946d7\") " pod="openstack/dnsmasq-dns-7cd88f5d9f-4b79l" Sep 30 12:40:10 crc kubenswrapper[4672]: I0930 12:40:10.877884 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Sep 30 12:40:10 crc kubenswrapper[4672]: I0930 12:40:10.908099 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-djd8z\" (UniqueName: \"kubernetes.io/projected/794ccc31-97c8-40b7-b6ab-cc0d3a9946d7-kube-api-access-djd8z\") pod \"dnsmasq-dns-7cd88f5d9f-4b79l\" (UID: \"794ccc31-97c8-40b7-b6ab-cc0d3a9946d7\") " pod="openstack/dnsmasq-dns-7cd88f5d9f-4b79l" Sep 30 12:40:10 crc kubenswrapper[4672]: I0930 12:40:10.954160 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Sep 30 12:40:10 crc kubenswrapper[4672]: I0930 12:40:10.955392 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r9hff\" (UniqueName: \"kubernetes.io/projected/582bc0ed-92ee-4696-9605-05e5e1c684f9-kube-api-access-r9hff\") pod \"cinder-api-0\" (UID: \"582bc0ed-92ee-4696-9605-05e5e1c684f9\") " pod="openstack/cinder-api-0" Sep 30 12:40:10 crc kubenswrapper[4672]: I0930 12:40:10.955587 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/582bc0ed-92ee-4696-9605-05e5e1c684f9-config-data-custom\") pod \"cinder-api-0\" (UID: \"582bc0ed-92ee-4696-9605-05e5e1c684f9\") " pod="openstack/cinder-api-0" Sep 30 12:40:10 crc kubenswrapper[4672]: I0930 12:40:10.955701 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/582bc0ed-92ee-4696-9605-05e5e1c684f9-logs\") pod \"cinder-api-0\" (UID: \"582bc0ed-92ee-4696-9605-05e5e1c684f9\") " pod="openstack/cinder-api-0" Sep 30 12:40:10 crc kubenswrapper[4672]: I0930 12:40:10.955788 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/582bc0ed-92ee-4696-9605-05e5e1c684f9-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"582bc0ed-92ee-4696-9605-05e5e1c684f9\") " pod="openstack/cinder-api-0" Sep 30 12:40:10 crc kubenswrapper[4672]: I0930 12:40:10.955880 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/582bc0ed-92ee-4696-9605-05e5e1c684f9-scripts\") pod \"cinder-api-0\" (UID: \"582bc0ed-92ee-4696-9605-05e5e1c684f9\") " pod="openstack/cinder-api-0" Sep 30 12:40:10 crc kubenswrapper[4672]: I0930 12:40:10.955942 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/582bc0ed-92ee-4696-9605-05e5e1c684f9-config-data\") pod \"cinder-api-0\" (UID: \"582bc0ed-92ee-4696-9605-05e5e1c684f9\") " pod="openstack/cinder-api-0" Sep 30 12:40:10 crc kubenswrapper[4672]: I0930 12:40:10.956009 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/582bc0ed-92ee-4696-9605-05e5e1c684f9-etc-machine-id\") pod \"cinder-api-0\" (UID: \"582bc0ed-92ee-4696-9605-05e5e1c684f9\") " pod="openstack/cinder-api-0" Sep 30 12:40:11 crc kubenswrapper[4672]: I0930 12:40:11.059185 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/582bc0ed-92ee-4696-9605-05e5e1c684f9-scripts\") pod \"cinder-api-0\" (UID: \"582bc0ed-92ee-4696-9605-05e5e1c684f9\") " pod="openstack/cinder-api-0" Sep 30 12:40:11 crc kubenswrapper[4672]: I0930 12:40:11.059695 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/582bc0ed-92ee-4696-9605-05e5e1c684f9-config-data\") pod \"cinder-api-0\" (UID: \"582bc0ed-92ee-4696-9605-05e5e1c684f9\") " pod="openstack/cinder-api-0" Sep 30 12:40:11 crc kubenswrapper[4672]: I0930 12:40:11.059774 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/582bc0ed-92ee-4696-9605-05e5e1c684f9-etc-machine-id\") pod \"cinder-api-0\" (UID: \"582bc0ed-92ee-4696-9605-05e5e1c684f9\") " pod="openstack/cinder-api-0" Sep 30 12:40:11 crc kubenswrapper[4672]: I0930 12:40:11.059866 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r9hff\" (UniqueName: \"kubernetes.io/projected/582bc0ed-92ee-4696-9605-05e5e1c684f9-kube-api-access-r9hff\") pod \"cinder-api-0\" (UID: \"582bc0ed-92ee-4696-9605-05e5e1c684f9\") " pod="openstack/cinder-api-0" Sep 30 12:40:11 crc kubenswrapper[4672]: I0930 12:40:11.059997 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/582bc0ed-92ee-4696-9605-05e5e1c684f9-config-data-custom\") pod \"cinder-api-0\" (UID: \"582bc0ed-92ee-4696-9605-05e5e1c684f9\") " pod="openstack/cinder-api-0" Sep 30 12:40:11 crc kubenswrapper[4672]: I0930 12:40:11.060088 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/582bc0ed-92ee-4696-9605-05e5e1c684f9-logs\") pod \"cinder-api-0\" (UID: \"582bc0ed-92ee-4696-9605-05e5e1c684f9\") " pod="openstack/cinder-api-0" Sep 30 12:40:11 crc kubenswrapper[4672]: I0930 12:40:11.060157 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/582bc0ed-92ee-4696-9605-05e5e1c684f9-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"582bc0ed-92ee-4696-9605-05e5e1c684f9\") " pod="openstack/cinder-api-0" Sep 30 12:40:11 crc kubenswrapper[4672]: I0930 12:40:11.060355 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/582bc0ed-92ee-4696-9605-05e5e1c684f9-etc-machine-id\") pod \"cinder-api-0\" (UID: \"582bc0ed-92ee-4696-9605-05e5e1c684f9\") " pod="openstack/cinder-api-0" Sep 30 12:40:11 crc kubenswrapper[4672]: I0930 12:40:11.060759 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/582bc0ed-92ee-4696-9605-05e5e1c684f9-logs\") pod \"cinder-api-0\" (UID: \"582bc0ed-92ee-4696-9605-05e5e1c684f9\") " pod="openstack/cinder-api-0" Sep 30 12:40:11 crc kubenswrapper[4672]: I0930 12:40:11.068047 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/582bc0ed-92ee-4696-9605-05e5e1c684f9-scripts\") pod \"cinder-api-0\" (UID: \"582bc0ed-92ee-4696-9605-05e5e1c684f9\") " pod="openstack/cinder-api-0" Sep 30 12:40:11 crc kubenswrapper[4672]: I0930 12:40:11.068168 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/582bc0ed-92ee-4696-9605-05e5e1c684f9-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"582bc0ed-92ee-4696-9605-05e5e1c684f9\") " pod="openstack/cinder-api-0" Sep 30 12:40:11 crc kubenswrapper[4672]: I0930 12:40:11.072377 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/582bc0ed-92ee-4696-9605-05e5e1c684f9-config-data-custom\") pod \"cinder-api-0\" (UID: \"582bc0ed-92ee-4696-9605-05e5e1c684f9\") " pod="openstack/cinder-api-0" Sep 30 12:40:11 crc kubenswrapper[4672]: I0930 12:40:11.076785 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/582bc0ed-92ee-4696-9605-05e5e1c684f9-config-data\") pod \"cinder-api-0\" (UID: \"582bc0ed-92ee-4696-9605-05e5e1c684f9\") " pod="openstack/cinder-api-0" Sep 30 12:40:11 crc kubenswrapper[4672]: I0930 12:40:11.095856 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r9hff\" (UniqueName: \"kubernetes.io/projected/582bc0ed-92ee-4696-9605-05e5e1c684f9-kube-api-access-r9hff\") pod \"cinder-api-0\" (UID: \"582bc0ed-92ee-4696-9605-05e5e1c684f9\") " pod="openstack/cinder-api-0" Sep 30 12:40:11 crc kubenswrapper[4672]: I0930 12:40:11.158580 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7cd88f5d9f-4b79l" Sep 30 12:40:11 crc kubenswrapper[4672]: I0930 12:40:11.257663 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Sep 30 12:40:11 crc kubenswrapper[4672]: I0930 12:40:11.448322 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b4c9f76e-4eaf-4d2c-a47c-a2051ed93ba2" path="/var/lib/kubelet/pods/b4c9f76e-4eaf-4d2c-a47c-a2051ed93ba2/volumes" Sep 30 12:40:11 crc kubenswrapper[4672]: I0930 12:40:11.449298 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c8982043-0fe6-4e59-901d-22a0d2e9a351" path="/var/lib/kubelet/pods/c8982043-0fe6-4e59-901d-22a0d2e9a351/volumes" Sep 30 12:40:11 crc kubenswrapper[4672]: I0930 12:40:11.680377 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Sep 30 12:40:11 crc kubenswrapper[4672]: I0930 12:40:11.911673 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7cd88f5d9f-4b79l"] Sep 30 12:40:12 crc kubenswrapper[4672]: I0930 12:40:12.121405 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Sep 30 12:40:12 crc kubenswrapper[4672]: I0930 12:40:12.582244 4672 generic.go:334] "Generic (PLEG): container finished" podID="794ccc31-97c8-40b7-b6ab-cc0d3a9946d7" containerID="c046827976c63bd11dc57bea69df9b0a26a7ec36ffc84fc6ef05711ae83c32d3" exitCode=0 Sep 30 12:40:12 crc kubenswrapper[4672]: I0930 12:40:12.583597 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7cd88f5d9f-4b79l" event={"ID":"794ccc31-97c8-40b7-b6ab-cc0d3a9946d7","Type":"ContainerDied","Data":"c046827976c63bd11dc57bea69df9b0a26a7ec36ffc84fc6ef05711ae83c32d3"} Sep 30 12:40:12 crc kubenswrapper[4672]: I0930 12:40:12.583674 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7cd88f5d9f-4b79l" event={"ID":"794ccc31-97c8-40b7-b6ab-cc0d3a9946d7","Type":"ContainerStarted","Data":"0c619d2604d71f142a2c83ca5f492252a7b5169f65add151279214d7ee4fe269"} Sep 30 12:40:12 crc kubenswrapper[4672]: I0930 12:40:12.679599 4672 generic.go:334] "Generic (PLEG): container finished" podID="91b5c532-f855-46f2-967b-c53fc7f0ffee" containerID="942b2d771116a4d5f33d53d0e4d71b134ec103c8d95b81a353d7feec16663451" exitCode=0 Sep 30 12:40:12 crc kubenswrapper[4672]: I0930 12:40:12.679681 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-8679776b6d-5ltf9" event={"ID":"91b5c532-f855-46f2-967b-c53fc7f0ffee","Type":"ContainerDied","Data":"942b2d771116a4d5f33d53d0e4d71b134ec103c8d95b81a353d7feec16663451"} Sep 30 12:40:12 crc kubenswrapper[4672]: I0930 12:40:12.729984 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" 
event={"ID":"582bc0ed-92ee-4696-9605-05e5e1c684f9","Type":"ContainerStarted","Data":"baf9e53664135f5ef8d4b59217bc337d28ab0a230166ba209aedd4b0359ab7f7"} Sep 30 12:40:12 crc kubenswrapper[4672]: I0930 12:40:12.757988 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"2d9c4035-a25a-460e-b9bc-dbe05f4e6272","Type":"ContainerStarted","Data":"6c43048a5110bae2136a2222df43e25a515991bbcbcf3c4fb05a4559da15b10c"} Sep 30 12:40:13 crc kubenswrapper[4672]: I0930 12:40:13.054403 4672 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-844b6c9474-6tpzt" podUID="2659b35e-ecb1-416b-8a94-690759645536" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.164:8443/dashboard/auth/login/?next=/dashboard/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Sep 30 12:40:13 crc kubenswrapper[4672]: I0930 12:40:13.091929 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-8679776b6d-5ltf9" Sep 30 12:40:13 crc kubenswrapper[4672]: I0930 12:40:13.247348 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/91b5c532-f855-46f2-967b-c53fc7f0ffee-logs\") pod \"91b5c532-f855-46f2-967b-c53fc7f0ffee\" (UID: \"91b5c532-f855-46f2-967b-c53fc7f0ffee\") " Sep 30 12:40:13 crc kubenswrapper[4672]: I0930 12:40:13.247497 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b8tzs\" (UniqueName: \"kubernetes.io/projected/91b5c532-f855-46f2-967b-c53fc7f0ffee-kube-api-access-b8tzs\") pod \"91b5c532-f855-46f2-967b-c53fc7f0ffee\" (UID: \"91b5c532-f855-46f2-967b-c53fc7f0ffee\") " Sep 30 12:40:13 crc kubenswrapper[4672]: I0930 12:40:13.247533 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91b5c532-f855-46f2-967b-c53fc7f0ffee-combined-ca-bundle\") pod \"91b5c532-f855-46f2-967b-c53fc7f0ffee\" (UID: \"91b5c532-f855-46f2-967b-c53fc7f0ffee\") " Sep 30 12:40:13 crc kubenswrapper[4672]: I0930 12:40:13.247643 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/91b5c532-f855-46f2-967b-c53fc7f0ffee-config-data-custom\") pod \"91b5c532-f855-46f2-967b-c53fc7f0ffee\" (UID: \"91b5c532-f855-46f2-967b-c53fc7f0ffee\") " Sep 30 12:40:13 crc kubenswrapper[4672]: I0930 12:40:13.247671 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/91b5c532-f855-46f2-967b-c53fc7f0ffee-logs" (OuterVolumeSpecName: "logs") pod "91b5c532-f855-46f2-967b-c53fc7f0ffee" (UID: "91b5c532-f855-46f2-967b-c53fc7f0ffee"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 12:40:13 crc kubenswrapper[4672]: I0930 12:40:13.247744 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/91b5c532-f855-46f2-967b-c53fc7f0ffee-config-data\") pod \"91b5c532-f855-46f2-967b-c53fc7f0ffee\" (UID: \"91b5c532-f855-46f2-967b-c53fc7f0ffee\") " Sep 30 12:40:13 crc kubenswrapper[4672]: I0930 12:40:13.249067 4672 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/91b5c532-f855-46f2-967b-c53fc7f0ffee-logs\") on node \"crc\" DevicePath \"\"" Sep 30 12:40:13 crc kubenswrapper[4672]: I0930 12:40:13.251854 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91b5c532-f855-46f2-967b-c53fc7f0ffee-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "91b5c532-f855-46f2-967b-c53fc7f0ffee" (UID: "91b5c532-f855-46f2-967b-c53fc7f0ffee"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:40:13 crc kubenswrapper[4672]: I0930 12:40:13.253094 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/91b5c532-f855-46f2-967b-c53fc7f0ffee-kube-api-access-b8tzs" (OuterVolumeSpecName: "kube-api-access-b8tzs") pod "91b5c532-f855-46f2-967b-c53fc7f0ffee" (UID: "91b5c532-f855-46f2-967b-c53fc7f0ffee"). InnerVolumeSpecName "kube-api-access-b8tzs". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 12:40:13 crc kubenswrapper[4672]: I0930 12:40:13.309450 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91b5c532-f855-46f2-967b-c53fc7f0ffee-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "91b5c532-f855-46f2-967b-c53fc7f0ffee" (UID: "91b5c532-f855-46f2-967b-c53fc7f0ffee"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:40:13 crc kubenswrapper[4672]: I0930 12:40:13.348594 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91b5c532-f855-46f2-967b-c53fc7f0ffee-config-data" (OuterVolumeSpecName: "config-data") pod "91b5c532-f855-46f2-967b-c53fc7f0ffee" (UID: "91b5c532-f855-46f2-967b-c53fc7f0ffee"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:40:13 crc kubenswrapper[4672]: I0930 12:40:13.350368 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b8tzs\" (UniqueName: \"kubernetes.io/projected/91b5c532-f855-46f2-967b-c53fc7f0ffee-kube-api-access-b8tzs\") on node \"crc\" DevicePath \"\"" Sep 30 12:40:13 crc kubenswrapper[4672]: I0930 12:40:13.350593 4672 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91b5c532-f855-46f2-967b-c53fc7f0ffee-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 12:40:13 crc kubenswrapper[4672]: I0930 12:40:13.350683 4672 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/91b5c532-f855-46f2-967b-c53fc7f0ffee-config-data-custom\") on node \"crc\" DevicePath \"\"" Sep 30 12:40:13 crc kubenswrapper[4672]: I0930 12:40:13.350750 4672 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/91b5c532-f855-46f2-967b-c53fc7f0ffee-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 12:40:13 crc kubenswrapper[4672]: I0930 12:40:13.569760 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Sep 30 12:40:13 crc kubenswrapper[4672]: I0930 12:40:13.725759 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-76bc57698f-2dqjq"] Sep 30 12:40:13 crc kubenswrapper[4672]: E0930 12:40:13.726196 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91b5c532-f855-46f2-967b-c53fc7f0ffee" containerName="barbican-api-log" Sep 30 12:40:13 crc kubenswrapper[4672]: I0930 12:40:13.726212 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="91b5c532-f855-46f2-967b-c53fc7f0ffee" containerName="barbican-api-log" Sep 30 12:40:13 crc kubenswrapper[4672]: E0930 12:40:13.726236 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91b5c532-f855-46f2-967b-c53fc7f0ffee" containerName="barbican-api" Sep 30 12:40:13 crc kubenswrapper[4672]: I0930 12:40:13.726243 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="91b5c532-f855-46f2-967b-c53fc7f0ffee" containerName="barbican-api" Sep 30 12:40:13 crc kubenswrapper[4672]: I0930 12:40:13.726497 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="91b5c532-f855-46f2-967b-c53fc7f0ffee" containerName="barbican-api" Sep 30 12:40:13 crc kubenswrapper[4672]: I0930 12:40:13.726522 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="91b5c532-f855-46f2-967b-c53fc7f0ffee" containerName="barbican-api-log" Sep 30 12:40:13 crc kubenswrapper[4672]: I0930 12:40:13.727593 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-76bc57698f-2dqjq" Sep 30 12:40:13 crc kubenswrapper[4672]: I0930 12:40:13.737176 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-76bc57698f-2dqjq"] Sep 30 12:40:13 crc kubenswrapper[4672]: I0930 12:40:13.746365 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Sep 30 12:40:13 crc kubenswrapper[4672]: I0930 12:40:13.746505 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Sep 30 12:40:13 crc kubenswrapper[4672]: I0930 12:40:13.746658 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Sep 30 12:40:13 crc kubenswrapper[4672]: I0930 12:40:13.760063 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/eb422aba-f5c2-4822-bd48-bba56e4dc451-run-httpd\") pod \"swift-proxy-76bc57698f-2dqjq\" (UID: \"eb422aba-f5c2-4822-bd48-bba56e4dc451\") " pod="openstack/swift-proxy-76bc57698f-2dqjq" Sep 30 12:40:13 crc kubenswrapper[4672]: I0930 12:40:13.760113 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-chzkz\" (UniqueName: \"kubernetes.io/projected/eb422aba-f5c2-4822-bd48-bba56e4dc451-kube-api-access-chzkz\") pod \"swift-proxy-76bc57698f-2dqjq\" (UID: \"eb422aba-f5c2-4822-bd48-bba56e4dc451\") " pod="openstack/swift-proxy-76bc57698f-2dqjq" Sep 30 12:40:13 crc kubenswrapper[4672]: I0930 12:40:13.760136 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb422aba-f5c2-4822-bd48-bba56e4dc451-combined-ca-bundle\") pod \"swift-proxy-76bc57698f-2dqjq\" (UID: \"eb422aba-f5c2-4822-bd48-bba56e4dc451\") " pod="openstack/swift-proxy-76bc57698f-2dqjq" Sep 30 12:40:13 crc kubenswrapper[4672]: I0930 12:40:13.760183 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/eb422aba-f5c2-4822-bd48-bba56e4dc451-etc-swift\") pod \"swift-proxy-76bc57698f-2dqjq\" (UID: \"eb422aba-f5c2-4822-bd48-bba56e4dc451\") " pod="openstack/swift-proxy-76bc57698f-2dqjq" Sep 30 12:40:13 crc kubenswrapper[4672]: I0930 12:40:13.760218 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/eb422aba-f5c2-4822-bd48-bba56e4dc451-public-tls-certs\") pod \"swift-proxy-76bc57698f-2dqjq\" (UID: \"eb422aba-f5c2-4822-bd48-bba56e4dc451\") " pod="openstack/swift-proxy-76bc57698f-2dqjq" Sep 30 12:40:13 crc kubenswrapper[4672]: I0930 12:40:13.760369 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/eb422aba-f5c2-4822-bd48-bba56e4dc451-internal-tls-certs\") pod \"swift-proxy-76bc57698f-2dqjq\" (UID: \"eb422aba-f5c2-4822-bd48-bba56e4dc451\") " pod="openstack/swift-proxy-76bc57698f-2dqjq" Sep 30 12:40:13 crc kubenswrapper[4672]: I0930 12:40:13.760394 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb422aba-f5c2-4822-bd48-bba56e4dc451-config-data\") pod \"swift-proxy-76bc57698f-2dqjq\" (UID: \"eb422aba-f5c2-4822-bd48-bba56e4dc451\") " 
pod="openstack/swift-proxy-76bc57698f-2dqjq" Sep 30 12:40:13 crc kubenswrapper[4672]: I0930 12:40:13.760418 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/eb422aba-f5c2-4822-bd48-bba56e4dc451-log-httpd\") pod \"swift-proxy-76bc57698f-2dqjq\" (UID: \"eb422aba-f5c2-4822-bd48-bba56e4dc451\") " pod="openstack/swift-proxy-76bc57698f-2dqjq" Sep 30 12:40:13 crc kubenswrapper[4672]: I0930 12:40:13.789781 4672 generic.go:334] "Generic (PLEG): container finished" podID="4f1bee84-650b-4f0b-a657-e6701ee51823" containerID="9ede9a566346adcbabd817801ad9e25696c1143a09108dedaf87167ca42fa5f3" exitCode=1 Sep 30 12:40:13 crc kubenswrapper[4672]: I0930 12:40:13.789851 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"4f1bee84-650b-4f0b-a657-e6701ee51823","Type":"ContainerDied","Data":"9ede9a566346adcbabd817801ad9e25696c1143a09108dedaf87167ca42fa5f3"} Sep 30 12:40:13 crc kubenswrapper[4672]: I0930 12:40:13.789891 4672 scope.go:117] "RemoveContainer" containerID="50db62107d03a042bff857d072808d9278376d38aa180a36c8e188d5d240b6c0" Sep 30 12:40:13 crc kubenswrapper[4672]: I0930 12:40:13.790621 4672 scope.go:117] "RemoveContainer" containerID="9ede9a566346adcbabd817801ad9e25696c1143a09108dedaf87167ca42fa5f3" Sep 30 12:40:13 crc kubenswrapper[4672]: E0930 12:40:13.790902 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 20s restarting failed container=watcher-decision-engine pod=watcher-decision-engine-0_openstack(4f1bee84-650b-4f0b-a657-e6701ee51823)\"" pod="openstack/watcher-decision-engine-0" podUID="4f1bee84-650b-4f0b-a657-e6701ee51823" Sep 30 12:40:13 crc kubenswrapper[4672]: I0930 12:40:13.804330 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"2d9c4035-a25a-460e-b9bc-dbe05f4e6272","Type":"ContainerStarted","Data":"9989d443b0999bfc366753ecd2d90f19bed8c71eea22b903b90fc82977cb5434"} Sep 30 12:40:13 crc kubenswrapper[4672]: I0930 12:40:13.862340 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/eb422aba-f5c2-4822-bd48-bba56e4dc451-etc-swift\") pod \"swift-proxy-76bc57698f-2dqjq\" (UID: \"eb422aba-f5c2-4822-bd48-bba56e4dc451\") " pod="openstack/swift-proxy-76bc57698f-2dqjq" Sep 30 12:40:13 crc kubenswrapper[4672]: I0930 12:40:13.862418 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/eb422aba-f5c2-4822-bd48-bba56e4dc451-public-tls-certs\") pod \"swift-proxy-76bc57698f-2dqjq\" (UID: \"eb422aba-f5c2-4822-bd48-bba56e4dc451\") " pod="openstack/swift-proxy-76bc57698f-2dqjq" Sep 30 12:40:13 crc kubenswrapper[4672]: I0930 12:40:13.862506 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/eb422aba-f5c2-4822-bd48-bba56e4dc451-internal-tls-certs\") pod \"swift-proxy-76bc57698f-2dqjq\" (UID: \"eb422aba-f5c2-4822-bd48-bba56e4dc451\") " pod="openstack/swift-proxy-76bc57698f-2dqjq" Sep 30 12:40:13 crc kubenswrapper[4672]: I0930 12:40:13.862532 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb422aba-f5c2-4822-bd48-bba56e4dc451-config-data\") pod 
\"swift-proxy-76bc57698f-2dqjq\" (UID: \"eb422aba-f5c2-4822-bd48-bba56e4dc451\") " pod="openstack/swift-proxy-76bc57698f-2dqjq" Sep 30 12:40:13 crc kubenswrapper[4672]: I0930 12:40:13.862578 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/eb422aba-f5c2-4822-bd48-bba56e4dc451-log-httpd\") pod \"swift-proxy-76bc57698f-2dqjq\" (UID: \"eb422aba-f5c2-4822-bd48-bba56e4dc451\") " pod="openstack/swift-proxy-76bc57698f-2dqjq" Sep 30 12:40:13 crc kubenswrapper[4672]: I0930 12:40:13.862624 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/eb422aba-f5c2-4822-bd48-bba56e4dc451-run-httpd\") pod \"swift-proxy-76bc57698f-2dqjq\" (UID: \"eb422aba-f5c2-4822-bd48-bba56e4dc451\") " pod="openstack/swift-proxy-76bc57698f-2dqjq" Sep 30 12:40:13 crc kubenswrapper[4672]: I0930 12:40:13.862653 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-chzkz\" (UniqueName: \"kubernetes.io/projected/eb422aba-f5c2-4822-bd48-bba56e4dc451-kube-api-access-chzkz\") pod \"swift-proxy-76bc57698f-2dqjq\" (UID: \"eb422aba-f5c2-4822-bd48-bba56e4dc451\") " pod="openstack/swift-proxy-76bc57698f-2dqjq" Sep 30 12:40:13 crc kubenswrapper[4672]: I0930 12:40:13.862679 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb422aba-f5c2-4822-bd48-bba56e4dc451-combined-ca-bundle\") pod \"swift-proxy-76bc57698f-2dqjq\" (UID: \"eb422aba-f5c2-4822-bd48-bba56e4dc451\") " pod="openstack/swift-proxy-76bc57698f-2dqjq" Sep 30 12:40:13 crc kubenswrapper[4672]: I0930 12:40:13.875677 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/eb422aba-f5c2-4822-bd48-bba56e4dc451-run-httpd\") pod \"swift-proxy-76bc57698f-2dqjq\" (UID: \"eb422aba-f5c2-4822-bd48-bba56e4dc451\") " pod="openstack/swift-proxy-76bc57698f-2dqjq" Sep 30 12:40:13 crc kubenswrapper[4672]: I0930 12:40:13.875955 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/eb422aba-f5c2-4822-bd48-bba56e4dc451-log-httpd\") pod \"swift-proxy-76bc57698f-2dqjq\" (UID: \"eb422aba-f5c2-4822-bd48-bba56e4dc451\") " pod="openstack/swift-proxy-76bc57698f-2dqjq" Sep 30 12:40:13 crc kubenswrapper[4672]: I0930 12:40:13.876963 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb422aba-f5c2-4822-bd48-bba56e4dc451-combined-ca-bundle\") pod \"swift-proxy-76bc57698f-2dqjq\" (UID: \"eb422aba-f5c2-4822-bd48-bba56e4dc451\") " pod="openstack/swift-proxy-76bc57698f-2dqjq" Sep 30 12:40:13 crc kubenswrapper[4672]: I0930 12:40:13.883875 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/eb422aba-f5c2-4822-bd48-bba56e4dc451-internal-tls-certs\") pod \"swift-proxy-76bc57698f-2dqjq\" (UID: \"eb422aba-f5c2-4822-bd48-bba56e4dc451\") " pod="openstack/swift-proxy-76bc57698f-2dqjq" Sep 30 12:40:13 crc kubenswrapper[4672]: I0930 12:40:13.885336 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/eb422aba-f5c2-4822-bd48-bba56e4dc451-etc-swift\") pod \"swift-proxy-76bc57698f-2dqjq\" (UID: \"eb422aba-f5c2-4822-bd48-bba56e4dc451\") " 
pod="openstack/swift-proxy-76bc57698f-2dqjq" Sep 30 12:40:13 crc kubenswrapper[4672]: I0930 12:40:13.891155 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb422aba-f5c2-4822-bd48-bba56e4dc451-config-data\") pod \"swift-proxy-76bc57698f-2dqjq\" (UID: \"eb422aba-f5c2-4822-bd48-bba56e4dc451\") " pod="openstack/swift-proxy-76bc57698f-2dqjq" Sep 30 12:40:13 crc kubenswrapper[4672]: I0930 12:40:13.892127 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/eb422aba-f5c2-4822-bd48-bba56e4dc451-public-tls-certs\") pod \"swift-proxy-76bc57698f-2dqjq\" (UID: \"eb422aba-f5c2-4822-bd48-bba56e4dc451\") " pod="openstack/swift-proxy-76bc57698f-2dqjq" Sep 30 12:40:13 crc kubenswrapper[4672]: I0930 12:40:13.914547 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-chzkz\" (UniqueName: \"kubernetes.io/projected/eb422aba-f5c2-4822-bd48-bba56e4dc451-kube-api-access-chzkz\") pod \"swift-proxy-76bc57698f-2dqjq\" (UID: \"eb422aba-f5c2-4822-bd48-bba56e4dc451\") " pod="openstack/swift-proxy-76bc57698f-2dqjq" Sep 30 12:40:13 crc kubenswrapper[4672]: I0930 12:40:13.923233 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7cd88f5d9f-4b79l" event={"ID":"794ccc31-97c8-40b7-b6ab-cc0d3a9946d7","Type":"ContainerStarted","Data":"16ae50728b02ce642ba9cc52fa14ff05e2bf4e7d5c66837e67a183bf7e1ae30e"} Sep 30 12:40:13 crc kubenswrapper[4672]: I0930 12:40:13.923635 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7cd88f5d9f-4b79l" Sep 30 12:40:13 crc kubenswrapper[4672]: I0930 12:40:13.974657 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7cd88f5d9f-4b79l" podStartSLOduration=3.974640294 podStartE2EDuration="3.974640294s" podCreationTimestamp="2025-09-30 12:40:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 12:40:13.973857194 +0000 UTC m=+1105.243094840" watchObservedRunningTime="2025-09-30 12:40:13.974640294 +0000 UTC m=+1105.243877940" Sep 30 12:40:14 crc kubenswrapper[4672]: I0930 12:40:14.073724 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-8679776b6d-5ltf9" event={"ID":"91b5c532-f855-46f2-967b-c53fc7f0ffee","Type":"ContainerDied","Data":"13f071aa67adefe4e58df5faa85faf1804fffe7f9dd73acf07f6835d333ade1d"} Sep 30 12:40:14 crc kubenswrapper[4672]: I0930 12:40:14.073893 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-8679776b6d-5ltf9" Sep 30 12:40:14 crc kubenswrapper[4672]: I0930 12:40:14.084859 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-76bc57698f-2dqjq" Sep 30 12:40:14 crc kubenswrapper[4672]: I0930 12:40:14.111032 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"582bc0ed-92ee-4696-9605-05e5e1c684f9","Type":"ContainerStarted","Data":"63b9121aa06c925b6471256a151d3133fe88c9ed70371db1f37a60140522f39a"} Sep 30 12:40:14 crc kubenswrapper[4672]: I0930 12:40:14.149890 4672 scope.go:117] "RemoveContainer" containerID="942b2d771116a4d5f33d53d0e4d71b134ec103c8d95b81a353d7feec16663451" Sep 30 12:40:14 crc kubenswrapper[4672]: I0930 12:40:14.151508 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-8679776b6d-5ltf9"] Sep 30 12:40:14 crc kubenswrapper[4672]: I0930 12:40:14.158823 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-8679776b6d-5ltf9"] Sep 30 12:40:14 crc kubenswrapper[4672]: I0930 12:40:14.276554 4672 scope.go:117] "RemoveContainer" containerID="a7421a9ec7c3523789edfcf1025f60a25c066b03056cb50d0c1e11f088795c32" Sep 30 12:40:14 crc kubenswrapper[4672]: I0930 12:40:14.656395 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Sep 30 12:40:14 crc kubenswrapper[4672]: I0930 12:40:14.656646 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="85c475fb-05e1-46e5-a396-6ea675ef8084" containerName="ceilometer-central-agent" containerID="cri-o://38a65d69fcc43cb31ef1360c5518e4baa6266766e91f38a5f13a82dd1f93027c" gracePeriod=30 Sep 30 12:40:14 crc kubenswrapper[4672]: I0930 12:40:14.657055 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="85c475fb-05e1-46e5-a396-6ea675ef8084" containerName="proxy-httpd" containerID="cri-o://a5d8df97f79020c7138fb924b03190513eed489e3594702b5dd41daea10fb8ae" gracePeriod=30 Sep 30 12:40:14 crc kubenswrapper[4672]: I0930 12:40:14.657103 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="85c475fb-05e1-46e5-a396-6ea675ef8084" containerName="sg-core" containerID="cri-o://29408e85ecb19ec43497a2855301a731ee87f306b26ed600fba9b5bed991e7ad" gracePeriod=30 Sep 30 12:40:14 crc kubenswrapper[4672]: I0930 12:40:14.657136 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="85c475fb-05e1-46e5-a396-6ea675ef8084" containerName="ceilometer-notification-agent" containerID="cri-o://7342a51e4335f536ba49f316d02f4636252f1367a7f1db16251b89724da83868" gracePeriod=30 Sep 30 12:40:14 crc kubenswrapper[4672]: I0930 12:40:14.916748 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-76bc57698f-2dqjq"] Sep 30 12:40:15 crc kubenswrapper[4672]: I0930 12:40:15.140456 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"2d9c4035-a25a-460e-b9bc-dbe05f4e6272","Type":"ContainerStarted","Data":"f0fb977e5bc59e0ccbfef62faf5cba5e922cfa4b41bff63f473227ebe98a893c"} Sep 30 12:40:15 crc kubenswrapper[4672]: I0930 12:40:15.150144 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-76bc57698f-2dqjq" event={"ID":"eb422aba-f5c2-4822-bd48-bba56e4dc451","Type":"ContainerStarted","Data":"26b9819f928dc295e9d6d598485fb3dc4e0f908547a65a974da3e1668813c5c0"} Sep 30 12:40:15 crc kubenswrapper[4672]: I0930 12:40:15.160647 4672 generic.go:334] "Generic (PLEG): container finished" podID="85c475fb-05e1-46e5-a396-6ea675ef8084" 
containerID="a5d8df97f79020c7138fb924b03190513eed489e3594702b5dd41daea10fb8ae" exitCode=0 Sep 30 12:40:15 crc kubenswrapper[4672]: I0930 12:40:15.160691 4672 generic.go:334] "Generic (PLEG): container finished" podID="85c475fb-05e1-46e5-a396-6ea675ef8084" containerID="29408e85ecb19ec43497a2855301a731ee87f306b26ed600fba9b5bed991e7ad" exitCode=2 Sep 30 12:40:15 crc kubenswrapper[4672]: I0930 12:40:15.160775 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"85c475fb-05e1-46e5-a396-6ea675ef8084","Type":"ContainerDied","Data":"a5d8df97f79020c7138fb924b03190513eed489e3594702b5dd41daea10fb8ae"} Sep 30 12:40:15 crc kubenswrapper[4672]: I0930 12:40:15.160809 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"85c475fb-05e1-46e5-a396-6ea675ef8084","Type":"ContainerDied","Data":"29408e85ecb19ec43497a2855301a731ee87f306b26ed600fba9b5bed991e7ad"} Sep 30 12:40:15 crc kubenswrapper[4672]: I0930 12:40:15.163454 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=4.700429947 podStartE2EDuration="5.163440328s" podCreationTimestamp="2025-09-30 12:40:10 +0000 UTC" firstStartedPulling="2025-09-30 12:40:11.713897212 +0000 UTC m=+1102.983134848" lastFinishedPulling="2025-09-30 12:40:12.176907583 +0000 UTC m=+1103.446145229" observedRunningTime="2025-09-30 12:40:15.15918448 +0000 UTC m=+1106.428422126" watchObservedRunningTime="2025-09-30 12:40:15.163440328 +0000 UTC m=+1106.432677974" Sep 30 12:40:15 crc kubenswrapper[4672]: I0930 12:40:15.166097 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"582bc0ed-92ee-4696-9605-05e5e1c684f9","Type":"ContainerStarted","Data":"e811108a08afa7898bc3a88bffd23c22774898340540e6e3e7a418e8ad8cbcf3"} Sep 30 12:40:15 crc kubenswrapper[4672]: I0930 12:40:15.166234 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="582bc0ed-92ee-4696-9605-05e5e1c684f9" containerName="cinder-api-log" containerID="cri-o://63b9121aa06c925b6471256a151d3133fe88c9ed70371db1f37a60140522f39a" gracePeriod=30 Sep 30 12:40:15 crc kubenswrapper[4672]: I0930 12:40:15.166371 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="582bc0ed-92ee-4696-9605-05e5e1c684f9" containerName="cinder-api" containerID="cri-o://e811108a08afa7898bc3a88bffd23c22774898340540e6e3e7a418e8ad8cbcf3" gracePeriod=30 Sep 30 12:40:15 crc kubenswrapper[4672]: I0930 12:40:15.166609 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Sep 30 12:40:15 crc kubenswrapper[4672]: I0930 12:40:15.187074 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=5.187056344 podStartE2EDuration="5.187056344s" podCreationTimestamp="2025-09-30 12:40:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 12:40:15.186472669 +0000 UTC m=+1106.455710305" watchObservedRunningTime="2025-09-30 12:40:15.187056344 +0000 UTC m=+1106.456293990" Sep 30 12:40:15 crc kubenswrapper[4672]: I0930 12:40:15.383960 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0" Sep 30 12:40:15 crc kubenswrapper[4672]: I0930 12:40:15.384187 4672 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0" Sep 30 12:40:15 crc kubenswrapper[4672]: I0930 12:40:15.385051 4672 scope.go:117] "RemoveContainer" containerID="9ede9a566346adcbabd817801ad9e25696c1143a09108dedaf87167ca42fa5f3" Sep 30 12:40:15 crc kubenswrapper[4672]: E0930 12:40:15.385933 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 20s restarting failed container=watcher-decision-engine pod=watcher-decision-engine-0_openstack(4f1bee84-650b-4f0b-a657-e6701ee51823)\"" pod="openstack/watcher-decision-engine-0" podUID="4f1bee84-650b-4f0b-a657-e6701ee51823" Sep 30 12:40:15 crc kubenswrapper[4672]: I0930 12:40:15.467930 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="91b5c532-f855-46f2-967b-c53fc7f0ffee" path="/var/lib/kubelet/pods/91b5c532-f855-46f2-967b-c53fc7f0ffee/volumes" Sep 30 12:40:15 crc kubenswrapper[4672]: I0930 12:40:15.849223 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Sep 30 12:40:15 crc kubenswrapper[4672]: I0930 12:40:15.950892 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/85c475fb-05e1-46e5-a396-6ea675ef8084-sg-core-conf-yaml\") pod \"85c475fb-05e1-46e5-a396-6ea675ef8084\" (UID: \"85c475fb-05e1-46e5-a396-6ea675ef8084\") " Sep 30 12:40:15 crc kubenswrapper[4672]: I0930 12:40:15.950961 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/85c475fb-05e1-46e5-a396-6ea675ef8084-log-httpd\") pod \"85c475fb-05e1-46e5-a396-6ea675ef8084\" (UID: \"85c475fb-05e1-46e5-a396-6ea675ef8084\") " Sep 30 12:40:15 crc kubenswrapper[4672]: I0930 12:40:15.951099 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/85c475fb-05e1-46e5-a396-6ea675ef8084-run-httpd\") pod \"85c475fb-05e1-46e5-a396-6ea675ef8084\" (UID: \"85c475fb-05e1-46e5-a396-6ea675ef8084\") " Sep 30 12:40:15 crc kubenswrapper[4672]: I0930 12:40:15.951182 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85c475fb-05e1-46e5-a396-6ea675ef8084-combined-ca-bundle\") pod \"85c475fb-05e1-46e5-a396-6ea675ef8084\" (UID: \"85c475fb-05e1-46e5-a396-6ea675ef8084\") " Sep 30 12:40:15 crc kubenswrapper[4672]: I0930 12:40:15.951312 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85c475fb-05e1-46e5-a396-6ea675ef8084-config-data\") pod \"85c475fb-05e1-46e5-a396-6ea675ef8084\" (UID: \"85c475fb-05e1-46e5-a396-6ea675ef8084\") " Sep 30 12:40:15 crc kubenswrapper[4672]: I0930 12:40:15.951406 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-df9zn\" (UniqueName: \"kubernetes.io/projected/85c475fb-05e1-46e5-a396-6ea675ef8084-kube-api-access-df9zn\") pod \"85c475fb-05e1-46e5-a396-6ea675ef8084\" (UID: \"85c475fb-05e1-46e5-a396-6ea675ef8084\") " Sep 30 12:40:15 crc kubenswrapper[4672]: I0930 12:40:15.951434 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/85c475fb-05e1-46e5-a396-6ea675ef8084-scripts\") pod \"85c475fb-05e1-46e5-a396-6ea675ef8084\" (UID: 
\"85c475fb-05e1-46e5-a396-6ea675ef8084\") " Sep 30 12:40:15 crc kubenswrapper[4672]: I0930 12:40:15.953061 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/85c475fb-05e1-46e5-a396-6ea675ef8084-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "85c475fb-05e1-46e5-a396-6ea675ef8084" (UID: "85c475fb-05e1-46e5-a396-6ea675ef8084"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 12:40:15 crc kubenswrapper[4672]: I0930 12:40:15.953671 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/85c475fb-05e1-46e5-a396-6ea675ef8084-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "85c475fb-05e1-46e5-a396-6ea675ef8084" (UID: "85c475fb-05e1-46e5-a396-6ea675ef8084"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 12:40:15 crc kubenswrapper[4672]: I0930 12:40:15.960418 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/85c475fb-05e1-46e5-a396-6ea675ef8084-kube-api-access-df9zn" (OuterVolumeSpecName: "kube-api-access-df9zn") pod "85c475fb-05e1-46e5-a396-6ea675ef8084" (UID: "85c475fb-05e1-46e5-a396-6ea675ef8084"). InnerVolumeSpecName "kube-api-access-df9zn". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 12:40:15 crc kubenswrapper[4672]: I0930 12:40:15.960485 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Sep 30 12:40:15 crc kubenswrapper[4672]: I0930 12:40:15.966423 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85c475fb-05e1-46e5-a396-6ea675ef8084-scripts" (OuterVolumeSpecName: "scripts") pod "85c475fb-05e1-46e5-a396-6ea675ef8084" (UID: "85c475fb-05e1-46e5-a396-6ea675ef8084"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:40:15 crc kubenswrapper[4672]: I0930 12:40:15.987011 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85c475fb-05e1-46e5-a396-6ea675ef8084-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "85c475fb-05e1-46e5-a396-6ea675ef8084" (UID: "85c475fb-05e1-46e5-a396-6ea675ef8084"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:40:16 crc kubenswrapper[4672]: I0930 12:40:16.054056 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-df9zn\" (UniqueName: \"kubernetes.io/projected/85c475fb-05e1-46e5-a396-6ea675ef8084-kube-api-access-df9zn\") on node \"crc\" DevicePath \"\"" Sep 30 12:40:16 crc kubenswrapper[4672]: I0930 12:40:16.054254 4672 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/85c475fb-05e1-46e5-a396-6ea675ef8084-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 12:40:16 crc kubenswrapper[4672]: I0930 12:40:16.054357 4672 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/85c475fb-05e1-46e5-a396-6ea675ef8084-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Sep 30 12:40:16 crc kubenswrapper[4672]: I0930 12:40:16.054434 4672 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/85c475fb-05e1-46e5-a396-6ea675ef8084-log-httpd\") on node \"crc\" DevicePath \"\"" Sep 30 12:40:16 crc kubenswrapper[4672]: I0930 12:40:16.054486 4672 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/85c475fb-05e1-46e5-a396-6ea675ef8084-run-httpd\") on node \"crc\" DevicePath \"\"" Sep 30 12:40:16 crc kubenswrapper[4672]: I0930 12:40:16.063813 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85c475fb-05e1-46e5-a396-6ea675ef8084-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "85c475fb-05e1-46e5-a396-6ea675ef8084" (UID: "85c475fb-05e1-46e5-a396-6ea675ef8084"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:40:16 crc kubenswrapper[4672]: I0930 12:40:16.071609 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Sep 30 12:40:16 crc kubenswrapper[4672]: I0930 12:40:16.109864 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85c475fb-05e1-46e5-a396-6ea675ef8084-config-data" (OuterVolumeSpecName: "config-data") pod "85c475fb-05e1-46e5-a396-6ea675ef8084" (UID: "85c475fb-05e1-46e5-a396-6ea675ef8084"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:40:16 crc kubenswrapper[4672]: I0930 12:40:16.158055 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/582bc0ed-92ee-4696-9605-05e5e1c684f9-etc-machine-id\") pod \"582bc0ed-92ee-4696-9605-05e5e1c684f9\" (UID: \"582bc0ed-92ee-4696-9605-05e5e1c684f9\") " Sep 30 12:40:16 crc kubenswrapper[4672]: I0930 12:40:16.158117 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/582bc0ed-92ee-4696-9605-05e5e1c684f9-config-data-custom\") pod \"582bc0ed-92ee-4696-9605-05e5e1c684f9\" (UID: \"582bc0ed-92ee-4696-9605-05e5e1c684f9\") " Sep 30 12:40:16 crc kubenswrapper[4672]: I0930 12:40:16.158229 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/582bc0ed-92ee-4696-9605-05e5e1c684f9-scripts\") pod \"582bc0ed-92ee-4696-9605-05e5e1c684f9\" (UID: \"582bc0ed-92ee-4696-9605-05e5e1c684f9\") " Sep 30 12:40:16 crc kubenswrapper[4672]: I0930 12:40:16.158288 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/582bc0ed-92ee-4696-9605-05e5e1c684f9-config-data\") pod \"582bc0ed-92ee-4696-9605-05e5e1c684f9\" (UID: \"582bc0ed-92ee-4696-9605-05e5e1c684f9\") " Sep 30 12:40:16 crc kubenswrapper[4672]: I0930 12:40:16.158337 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r9hff\" (UniqueName: \"kubernetes.io/projected/582bc0ed-92ee-4696-9605-05e5e1c684f9-kube-api-access-r9hff\") pod \"582bc0ed-92ee-4696-9605-05e5e1c684f9\" (UID: \"582bc0ed-92ee-4696-9605-05e5e1c684f9\") " Sep 30 12:40:16 crc kubenswrapper[4672]: I0930 12:40:16.158370 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/582bc0ed-92ee-4696-9605-05e5e1c684f9-combined-ca-bundle\") pod \"582bc0ed-92ee-4696-9605-05e5e1c684f9\" (UID: \"582bc0ed-92ee-4696-9605-05e5e1c684f9\") " Sep 30 12:40:16 crc kubenswrapper[4672]: I0930 12:40:16.158462 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/582bc0ed-92ee-4696-9605-05e5e1c684f9-logs\") pod \"582bc0ed-92ee-4696-9605-05e5e1c684f9\" (UID: \"582bc0ed-92ee-4696-9605-05e5e1c684f9\") " Sep 30 12:40:16 crc kubenswrapper[4672]: I0930 12:40:16.158555 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/582bc0ed-92ee-4696-9605-05e5e1c684f9-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "582bc0ed-92ee-4696-9605-05e5e1c684f9" (UID: "582bc0ed-92ee-4696-9605-05e5e1c684f9"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 12:40:16 crc kubenswrapper[4672]: I0930 12:40:16.158938 4672 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/582bc0ed-92ee-4696-9605-05e5e1c684f9-etc-machine-id\") on node \"crc\" DevicePath \"\"" Sep 30 12:40:16 crc kubenswrapper[4672]: I0930 12:40:16.158955 4672 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85c475fb-05e1-46e5-a396-6ea675ef8084-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 12:40:16 crc kubenswrapper[4672]: I0930 12:40:16.158965 4672 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85c475fb-05e1-46e5-a396-6ea675ef8084-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 12:40:16 crc kubenswrapper[4672]: I0930 12:40:16.159178 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/582bc0ed-92ee-4696-9605-05e5e1c684f9-logs" (OuterVolumeSpecName: "logs") pod "582bc0ed-92ee-4696-9605-05e5e1c684f9" (UID: "582bc0ed-92ee-4696-9605-05e5e1c684f9"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 12:40:16 crc kubenswrapper[4672]: I0930 12:40:16.165018 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/582bc0ed-92ee-4696-9605-05e5e1c684f9-scripts" (OuterVolumeSpecName: "scripts") pod "582bc0ed-92ee-4696-9605-05e5e1c684f9" (UID: "582bc0ed-92ee-4696-9605-05e5e1c684f9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:40:16 crc kubenswrapper[4672]: I0930 12:40:16.165171 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/582bc0ed-92ee-4696-9605-05e5e1c684f9-kube-api-access-r9hff" (OuterVolumeSpecName: "kube-api-access-r9hff") pod "582bc0ed-92ee-4696-9605-05e5e1c684f9" (UID: "582bc0ed-92ee-4696-9605-05e5e1c684f9"). InnerVolumeSpecName "kube-api-access-r9hff". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 12:40:16 crc kubenswrapper[4672]: I0930 12:40:16.166396 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/582bc0ed-92ee-4696-9605-05e5e1c684f9-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "582bc0ed-92ee-4696-9605-05e5e1c684f9" (UID: "582bc0ed-92ee-4696-9605-05e5e1c684f9"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:40:16 crc kubenswrapper[4672]: I0930 12:40:16.198398 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/582bc0ed-92ee-4696-9605-05e5e1c684f9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "582bc0ed-92ee-4696-9605-05e5e1c684f9" (UID: "582bc0ed-92ee-4696-9605-05e5e1c684f9"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:40:16 crc kubenswrapper[4672]: I0930 12:40:16.200252 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-76bc57698f-2dqjq" event={"ID":"eb422aba-f5c2-4822-bd48-bba56e4dc451","Type":"ContainerStarted","Data":"db9593c0c22d60f00950d9885e2b811b7af94144b5550b4106d9ac08f0e93e44"} Sep 30 12:40:16 crc kubenswrapper[4672]: I0930 12:40:16.200313 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-76bc57698f-2dqjq" event={"ID":"eb422aba-f5c2-4822-bd48-bba56e4dc451","Type":"ContainerStarted","Data":"c9d50e1d7e3acac24be86f39aad25c453b6239919c8eaea30ce1e48f1c8fd1b9"} Sep 30 12:40:16 crc kubenswrapper[4672]: I0930 12:40:16.200655 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-76bc57698f-2dqjq" Sep 30 12:40:16 crc kubenswrapper[4672]: I0930 12:40:16.200682 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-76bc57698f-2dqjq" Sep 30 12:40:16 crc kubenswrapper[4672]: I0930 12:40:16.204025 4672 generic.go:334] "Generic (PLEG): container finished" podID="85c475fb-05e1-46e5-a396-6ea675ef8084" containerID="7342a51e4335f536ba49f316d02f4636252f1367a7f1db16251b89724da83868" exitCode=0 Sep 30 12:40:16 crc kubenswrapper[4672]: I0930 12:40:16.204048 4672 generic.go:334] "Generic (PLEG): container finished" podID="85c475fb-05e1-46e5-a396-6ea675ef8084" containerID="38a65d69fcc43cb31ef1360c5518e4baa6266766e91f38a5f13a82dd1f93027c" exitCode=0 Sep 30 12:40:16 crc kubenswrapper[4672]: I0930 12:40:16.204093 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"85c475fb-05e1-46e5-a396-6ea675ef8084","Type":"ContainerDied","Data":"7342a51e4335f536ba49f316d02f4636252f1367a7f1db16251b89724da83868"} Sep 30 12:40:16 crc kubenswrapper[4672]: I0930 12:40:16.204122 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"85c475fb-05e1-46e5-a396-6ea675ef8084","Type":"ContainerDied","Data":"38a65d69fcc43cb31ef1360c5518e4baa6266766e91f38a5f13a82dd1f93027c"} Sep 30 12:40:16 crc kubenswrapper[4672]: I0930 12:40:16.204133 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"85c475fb-05e1-46e5-a396-6ea675ef8084","Type":"ContainerDied","Data":"c1e1e832634a5beaf540c1da75d8e3028a68cf1fddb40ed71f559a9ff075e95f"} Sep 30 12:40:16 crc kubenswrapper[4672]: I0930 12:40:16.204148 4672 scope.go:117] "RemoveContainer" containerID="a5d8df97f79020c7138fb924b03190513eed489e3594702b5dd41daea10fb8ae" Sep 30 12:40:16 crc kubenswrapper[4672]: I0930 12:40:16.204290 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Sep 30 12:40:16 crc kubenswrapper[4672]: I0930 12:40:16.209714 4672 generic.go:334] "Generic (PLEG): container finished" podID="582bc0ed-92ee-4696-9605-05e5e1c684f9" containerID="e811108a08afa7898bc3a88bffd23c22774898340540e6e3e7a418e8ad8cbcf3" exitCode=0 Sep 30 12:40:16 crc kubenswrapper[4672]: I0930 12:40:16.209754 4672 generic.go:334] "Generic (PLEG): container finished" podID="582bc0ed-92ee-4696-9605-05e5e1c684f9" containerID="63b9121aa06c925b6471256a151d3133fe88c9ed70371db1f37a60140522f39a" exitCode=143 Sep 30 12:40:16 crc kubenswrapper[4672]: I0930 12:40:16.209973 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Sep 30 12:40:16 crc kubenswrapper[4672]: I0930 12:40:16.209971 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"582bc0ed-92ee-4696-9605-05e5e1c684f9","Type":"ContainerDied","Data":"e811108a08afa7898bc3a88bffd23c22774898340540e6e3e7a418e8ad8cbcf3"} Sep 30 12:40:16 crc kubenswrapper[4672]: I0930 12:40:16.210162 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"582bc0ed-92ee-4696-9605-05e5e1c684f9","Type":"ContainerDied","Data":"63b9121aa06c925b6471256a151d3133fe88c9ed70371db1f37a60140522f39a"} Sep 30 12:40:16 crc kubenswrapper[4672]: I0930 12:40:16.210223 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"582bc0ed-92ee-4696-9605-05e5e1c684f9","Type":"ContainerDied","Data":"baf9e53664135f5ef8d4b59217bc337d28ab0a230166ba209aedd4b0359ab7f7"} Sep 30 12:40:16 crc kubenswrapper[4672]: I0930 12:40:16.210458 4672 scope.go:117] "RemoveContainer" containerID="9ede9a566346adcbabd817801ad9e25696c1143a09108dedaf87167ca42fa5f3" Sep 30 12:40:16 crc kubenswrapper[4672]: E0930 12:40:16.210692 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 20s restarting failed container=watcher-decision-engine pod=watcher-decision-engine-0_openstack(4f1bee84-650b-4f0b-a657-e6701ee51823)\"" pod="openstack/watcher-decision-engine-0" podUID="4f1bee84-650b-4f0b-a657-e6701ee51823" Sep 30 12:40:16 crc kubenswrapper[4672]: I0930 12:40:16.220722 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/582bc0ed-92ee-4696-9605-05e5e1c684f9-config-data" (OuterVolumeSpecName: "config-data") pod "582bc0ed-92ee-4696-9605-05e5e1c684f9" (UID: "582bc0ed-92ee-4696-9605-05e5e1c684f9"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:40:16 crc kubenswrapper[4672]: I0930 12:40:16.225958 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-76bc57698f-2dqjq" podStartSLOduration=3.225938204 podStartE2EDuration="3.225938204s" podCreationTimestamp="2025-09-30 12:40:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 12:40:16.222193149 +0000 UTC m=+1107.491430795" watchObservedRunningTime="2025-09-30 12:40:16.225938204 +0000 UTC m=+1107.495175850" Sep 30 12:40:16 crc kubenswrapper[4672]: I0930 12:40:16.235489 4672 scope.go:117] "RemoveContainer" containerID="29408e85ecb19ec43497a2855301a731ee87f306b26ed600fba9b5bed991e7ad" Sep 30 12:40:16 crc kubenswrapper[4672]: I0930 12:40:16.253570 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Sep 30 12:40:16 crc kubenswrapper[4672]: I0930 12:40:16.260423 4672 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/582bc0ed-92ee-4696-9605-05e5e1c684f9-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 12:40:16 crc kubenswrapper[4672]: I0930 12:40:16.260458 4672 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/582bc0ed-92ee-4696-9605-05e5e1c684f9-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 12:40:16 crc kubenswrapper[4672]: I0930 12:40:16.260471 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r9hff\" (UniqueName: \"kubernetes.io/projected/582bc0ed-92ee-4696-9605-05e5e1c684f9-kube-api-access-r9hff\") on node \"crc\" DevicePath \"\"" Sep 30 12:40:16 crc kubenswrapper[4672]: I0930 12:40:16.260483 4672 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/582bc0ed-92ee-4696-9605-05e5e1c684f9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 12:40:16 crc kubenswrapper[4672]: I0930 12:40:16.260494 4672 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/582bc0ed-92ee-4696-9605-05e5e1c684f9-logs\") on node \"crc\" DevicePath \"\"" Sep 30 12:40:16 crc kubenswrapper[4672]: I0930 12:40:16.260504 4672 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/582bc0ed-92ee-4696-9605-05e5e1c684f9-config-data-custom\") on node \"crc\" DevicePath \"\"" Sep 30 12:40:16 crc kubenswrapper[4672]: I0930 12:40:16.279421 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Sep 30 12:40:16 crc kubenswrapper[4672]: I0930 12:40:16.292228 4672 scope.go:117] "RemoveContainer" containerID="7342a51e4335f536ba49f316d02f4636252f1367a7f1db16251b89724da83868" Sep 30 12:40:16 crc kubenswrapper[4672]: I0930 12:40:16.293560 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Sep 30 12:40:16 crc kubenswrapper[4672]: E0930 12:40:16.294092 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85c475fb-05e1-46e5-a396-6ea675ef8084" containerName="sg-core" Sep 30 12:40:16 crc kubenswrapper[4672]: I0930 12:40:16.294111 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="85c475fb-05e1-46e5-a396-6ea675ef8084" containerName="sg-core" Sep 30 12:40:16 crc kubenswrapper[4672]: E0930 12:40:16.294128 4672 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="85c475fb-05e1-46e5-a396-6ea675ef8084" containerName="ceilometer-notification-agent" Sep 30 12:40:16 crc kubenswrapper[4672]: I0930 12:40:16.294139 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="85c475fb-05e1-46e5-a396-6ea675ef8084" containerName="ceilometer-notification-agent" Sep 30 12:40:16 crc kubenswrapper[4672]: E0930 12:40:16.294161 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85c475fb-05e1-46e5-a396-6ea675ef8084" containerName="ceilometer-central-agent" Sep 30 12:40:16 crc kubenswrapper[4672]: I0930 12:40:16.294167 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="85c475fb-05e1-46e5-a396-6ea675ef8084" containerName="ceilometer-central-agent" Sep 30 12:40:16 crc kubenswrapper[4672]: E0930 12:40:16.294188 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85c475fb-05e1-46e5-a396-6ea675ef8084" containerName="proxy-httpd" Sep 30 12:40:16 crc kubenswrapper[4672]: I0930 12:40:16.294194 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="85c475fb-05e1-46e5-a396-6ea675ef8084" containerName="proxy-httpd" Sep 30 12:40:16 crc kubenswrapper[4672]: E0930 12:40:16.294203 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="582bc0ed-92ee-4696-9605-05e5e1c684f9" containerName="cinder-api" Sep 30 12:40:16 crc kubenswrapper[4672]: I0930 12:40:16.294209 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="582bc0ed-92ee-4696-9605-05e5e1c684f9" containerName="cinder-api" Sep 30 12:40:16 crc kubenswrapper[4672]: E0930 12:40:16.294227 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="582bc0ed-92ee-4696-9605-05e5e1c684f9" containerName="cinder-api-log" Sep 30 12:40:16 crc kubenswrapper[4672]: I0930 12:40:16.294234 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="582bc0ed-92ee-4696-9605-05e5e1c684f9" containerName="cinder-api-log" Sep 30 12:40:16 crc kubenswrapper[4672]: I0930 12:40:16.295327 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="582bc0ed-92ee-4696-9605-05e5e1c684f9" containerName="cinder-api" Sep 30 12:40:16 crc kubenswrapper[4672]: I0930 12:40:16.295350 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="85c475fb-05e1-46e5-a396-6ea675ef8084" containerName="proxy-httpd" Sep 30 12:40:16 crc kubenswrapper[4672]: I0930 12:40:16.295379 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="582bc0ed-92ee-4696-9605-05e5e1c684f9" containerName="cinder-api-log" Sep 30 12:40:16 crc kubenswrapper[4672]: I0930 12:40:16.295392 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="85c475fb-05e1-46e5-a396-6ea675ef8084" containerName="ceilometer-notification-agent" Sep 30 12:40:16 crc kubenswrapper[4672]: I0930 12:40:16.295407 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="85c475fb-05e1-46e5-a396-6ea675ef8084" containerName="ceilometer-central-agent" Sep 30 12:40:16 crc kubenswrapper[4672]: I0930 12:40:16.295419 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="85c475fb-05e1-46e5-a396-6ea675ef8084" containerName="sg-core" Sep 30 12:40:16 crc kubenswrapper[4672]: I0930 12:40:16.297103 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Sep 30 12:40:16 crc kubenswrapper[4672]: I0930 12:40:16.311799 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Sep 30 12:40:16 crc kubenswrapper[4672]: I0930 12:40:16.317357 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Sep 30 12:40:16 crc kubenswrapper[4672]: I0930 12:40:16.318974 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Sep 30 12:40:16 crc kubenswrapper[4672]: I0930 12:40:16.340132 4672 scope.go:117] "RemoveContainer" containerID="38a65d69fcc43cb31ef1360c5518e4baa6266766e91f38a5f13a82dd1f93027c" Sep 30 12:40:16 crc kubenswrapper[4672]: I0930 12:40:16.374292 4672 scope.go:117] "RemoveContainer" containerID="a5d8df97f79020c7138fb924b03190513eed489e3594702b5dd41daea10fb8ae" Sep 30 12:40:16 crc kubenswrapper[4672]: E0930 12:40:16.374770 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a5d8df97f79020c7138fb924b03190513eed489e3594702b5dd41daea10fb8ae\": container with ID starting with a5d8df97f79020c7138fb924b03190513eed489e3594702b5dd41daea10fb8ae not found: ID does not exist" containerID="a5d8df97f79020c7138fb924b03190513eed489e3594702b5dd41daea10fb8ae" Sep 30 12:40:16 crc kubenswrapper[4672]: I0930 12:40:16.374819 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a5d8df97f79020c7138fb924b03190513eed489e3594702b5dd41daea10fb8ae"} err="failed to get container status \"a5d8df97f79020c7138fb924b03190513eed489e3594702b5dd41daea10fb8ae\": rpc error: code = NotFound desc = could not find container \"a5d8df97f79020c7138fb924b03190513eed489e3594702b5dd41daea10fb8ae\": container with ID starting with a5d8df97f79020c7138fb924b03190513eed489e3594702b5dd41daea10fb8ae not found: ID does not exist" Sep 30 12:40:16 crc kubenswrapper[4672]: I0930 12:40:16.374845 4672 scope.go:117] "RemoveContainer" containerID="29408e85ecb19ec43497a2855301a731ee87f306b26ed600fba9b5bed991e7ad" Sep 30 12:40:16 crc kubenswrapper[4672]: E0930 12:40:16.375233 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"29408e85ecb19ec43497a2855301a731ee87f306b26ed600fba9b5bed991e7ad\": container with ID starting with 29408e85ecb19ec43497a2855301a731ee87f306b26ed600fba9b5bed991e7ad not found: ID does not exist" containerID="29408e85ecb19ec43497a2855301a731ee87f306b26ed600fba9b5bed991e7ad" Sep 30 12:40:16 crc kubenswrapper[4672]: I0930 12:40:16.375272 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"29408e85ecb19ec43497a2855301a731ee87f306b26ed600fba9b5bed991e7ad"} err="failed to get container status \"29408e85ecb19ec43497a2855301a731ee87f306b26ed600fba9b5bed991e7ad\": rpc error: code = NotFound desc = could not find container \"29408e85ecb19ec43497a2855301a731ee87f306b26ed600fba9b5bed991e7ad\": container with ID starting with 29408e85ecb19ec43497a2855301a731ee87f306b26ed600fba9b5bed991e7ad not found: ID does not exist" Sep 30 12:40:16 crc kubenswrapper[4672]: I0930 12:40:16.375290 4672 scope.go:117] "RemoveContainer" containerID="7342a51e4335f536ba49f316d02f4636252f1367a7f1db16251b89724da83868" Sep 30 12:40:16 crc kubenswrapper[4672]: E0930 12:40:16.375496 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"7342a51e4335f536ba49f316d02f4636252f1367a7f1db16251b89724da83868\": container with ID starting with 7342a51e4335f536ba49f316d02f4636252f1367a7f1db16251b89724da83868 not found: ID does not exist" containerID="7342a51e4335f536ba49f316d02f4636252f1367a7f1db16251b89724da83868" Sep 30 12:40:16 crc kubenswrapper[4672]: I0930 12:40:16.375521 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7342a51e4335f536ba49f316d02f4636252f1367a7f1db16251b89724da83868"} err="failed to get container status \"7342a51e4335f536ba49f316d02f4636252f1367a7f1db16251b89724da83868\": rpc error: code = NotFound desc = could not find container \"7342a51e4335f536ba49f316d02f4636252f1367a7f1db16251b89724da83868\": container with ID starting with 7342a51e4335f536ba49f316d02f4636252f1367a7f1db16251b89724da83868 not found: ID does not exist" Sep 30 12:40:16 crc kubenswrapper[4672]: I0930 12:40:16.375537 4672 scope.go:117] "RemoveContainer" containerID="38a65d69fcc43cb31ef1360c5518e4baa6266766e91f38a5f13a82dd1f93027c" Sep 30 12:40:16 crc kubenswrapper[4672]: E0930 12:40:16.375729 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"38a65d69fcc43cb31ef1360c5518e4baa6266766e91f38a5f13a82dd1f93027c\": container with ID starting with 38a65d69fcc43cb31ef1360c5518e4baa6266766e91f38a5f13a82dd1f93027c not found: ID does not exist" containerID="38a65d69fcc43cb31ef1360c5518e4baa6266766e91f38a5f13a82dd1f93027c" Sep 30 12:40:16 crc kubenswrapper[4672]: I0930 12:40:16.375750 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"38a65d69fcc43cb31ef1360c5518e4baa6266766e91f38a5f13a82dd1f93027c"} err="failed to get container status \"38a65d69fcc43cb31ef1360c5518e4baa6266766e91f38a5f13a82dd1f93027c\": rpc error: code = NotFound desc = could not find container \"38a65d69fcc43cb31ef1360c5518e4baa6266766e91f38a5f13a82dd1f93027c\": container with ID starting with 38a65d69fcc43cb31ef1360c5518e4baa6266766e91f38a5f13a82dd1f93027c not found: ID does not exist" Sep 30 12:40:16 crc kubenswrapper[4672]: I0930 12:40:16.375763 4672 scope.go:117] "RemoveContainer" containerID="a5d8df97f79020c7138fb924b03190513eed489e3594702b5dd41daea10fb8ae" Sep 30 12:40:16 crc kubenswrapper[4672]: I0930 12:40:16.375941 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a5d8df97f79020c7138fb924b03190513eed489e3594702b5dd41daea10fb8ae"} err="failed to get container status \"a5d8df97f79020c7138fb924b03190513eed489e3594702b5dd41daea10fb8ae\": rpc error: code = NotFound desc = could not find container \"a5d8df97f79020c7138fb924b03190513eed489e3594702b5dd41daea10fb8ae\": container with ID starting with a5d8df97f79020c7138fb924b03190513eed489e3594702b5dd41daea10fb8ae not found: ID does not exist" Sep 30 12:40:16 crc kubenswrapper[4672]: I0930 12:40:16.375957 4672 scope.go:117] "RemoveContainer" containerID="29408e85ecb19ec43497a2855301a731ee87f306b26ed600fba9b5bed991e7ad" Sep 30 12:40:16 crc kubenswrapper[4672]: I0930 12:40:16.376107 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"29408e85ecb19ec43497a2855301a731ee87f306b26ed600fba9b5bed991e7ad"} err="failed to get container status \"29408e85ecb19ec43497a2855301a731ee87f306b26ed600fba9b5bed991e7ad\": rpc error: code = NotFound desc = could not find container \"29408e85ecb19ec43497a2855301a731ee87f306b26ed600fba9b5bed991e7ad\": container with ID starting with 
29408e85ecb19ec43497a2855301a731ee87f306b26ed600fba9b5bed991e7ad not found: ID does not exist" Sep 30 12:40:16 crc kubenswrapper[4672]: I0930 12:40:16.376126 4672 scope.go:117] "RemoveContainer" containerID="7342a51e4335f536ba49f316d02f4636252f1367a7f1db16251b89724da83868" Sep 30 12:40:16 crc kubenswrapper[4672]: I0930 12:40:16.376604 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7342a51e4335f536ba49f316d02f4636252f1367a7f1db16251b89724da83868"} err="failed to get container status \"7342a51e4335f536ba49f316d02f4636252f1367a7f1db16251b89724da83868\": rpc error: code = NotFound desc = could not find container \"7342a51e4335f536ba49f316d02f4636252f1367a7f1db16251b89724da83868\": container with ID starting with 7342a51e4335f536ba49f316d02f4636252f1367a7f1db16251b89724da83868 not found: ID does not exist" Sep 30 12:40:16 crc kubenswrapper[4672]: I0930 12:40:16.376658 4672 scope.go:117] "RemoveContainer" containerID="38a65d69fcc43cb31ef1360c5518e4baa6266766e91f38a5f13a82dd1f93027c" Sep 30 12:40:16 crc kubenswrapper[4672]: I0930 12:40:16.376879 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"38a65d69fcc43cb31ef1360c5518e4baa6266766e91f38a5f13a82dd1f93027c"} err="failed to get container status \"38a65d69fcc43cb31ef1360c5518e4baa6266766e91f38a5f13a82dd1f93027c\": rpc error: code = NotFound desc = could not find container \"38a65d69fcc43cb31ef1360c5518e4baa6266766e91f38a5f13a82dd1f93027c\": container with ID starting with 38a65d69fcc43cb31ef1360c5518e4baa6266766e91f38a5f13a82dd1f93027c not found: ID does not exist" Sep 30 12:40:16 crc kubenswrapper[4672]: I0930 12:40:16.376897 4672 scope.go:117] "RemoveContainer" containerID="e811108a08afa7898bc3a88bffd23c22774898340540e6e3e7a418e8ad8cbcf3" Sep 30 12:40:16 crc kubenswrapper[4672]: I0930 12:40:16.418733 4672 scope.go:117] "RemoveContainer" containerID="63b9121aa06c925b6471256a151d3133fe88c9ed70371db1f37a60140522f39a" Sep 30 12:40:16 crc kubenswrapper[4672]: I0930 12:40:16.450485 4672 scope.go:117] "RemoveContainer" containerID="e811108a08afa7898bc3a88bffd23c22774898340540e6e3e7a418e8ad8cbcf3" Sep 30 12:40:16 crc kubenswrapper[4672]: E0930 12:40:16.452188 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e811108a08afa7898bc3a88bffd23c22774898340540e6e3e7a418e8ad8cbcf3\": container with ID starting with e811108a08afa7898bc3a88bffd23c22774898340540e6e3e7a418e8ad8cbcf3 not found: ID does not exist" containerID="e811108a08afa7898bc3a88bffd23c22774898340540e6e3e7a418e8ad8cbcf3" Sep 30 12:40:16 crc kubenswrapper[4672]: I0930 12:40:16.452243 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e811108a08afa7898bc3a88bffd23c22774898340540e6e3e7a418e8ad8cbcf3"} err="failed to get container status \"e811108a08afa7898bc3a88bffd23c22774898340540e6e3e7a418e8ad8cbcf3\": rpc error: code = NotFound desc = could not find container \"e811108a08afa7898bc3a88bffd23c22774898340540e6e3e7a418e8ad8cbcf3\": container with ID starting with e811108a08afa7898bc3a88bffd23c22774898340540e6e3e7a418e8ad8cbcf3 not found: ID does not exist" Sep 30 12:40:16 crc kubenswrapper[4672]: I0930 12:40:16.452288 4672 scope.go:117] "RemoveContainer" containerID="63b9121aa06c925b6471256a151d3133fe88c9ed70371db1f37a60140522f39a" Sep 30 12:40:16 crc kubenswrapper[4672]: E0930 12:40:16.452636 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: 
code = NotFound desc = could not find container \"63b9121aa06c925b6471256a151d3133fe88c9ed70371db1f37a60140522f39a\": container with ID starting with 63b9121aa06c925b6471256a151d3133fe88c9ed70371db1f37a60140522f39a not found: ID does not exist" containerID="63b9121aa06c925b6471256a151d3133fe88c9ed70371db1f37a60140522f39a" Sep 30 12:40:16 crc kubenswrapper[4672]: I0930 12:40:16.452669 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"63b9121aa06c925b6471256a151d3133fe88c9ed70371db1f37a60140522f39a"} err="failed to get container status \"63b9121aa06c925b6471256a151d3133fe88c9ed70371db1f37a60140522f39a\": rpc error: code = NotFound desc = could not find container \"63b9121aa06c925b6471256a151d3133fe88c9ed70371db1f37a60140522f39a\": container with ID starting with 63b9121aa06c925b6471256a151d3133fe88c9ed70371db1f37a60140522f39a not found: ID does not exist" Sep 30 12:40:16 crc kubenswrapper[4672]: I0930 12:40:16.452686 4672 scope.go:117] "RemoveContainer" containerID="e811108a08afa7898bc3a88bffd23c22774898340540e6e3e7a418e8ad8cbcf3" Sep 30 12:40:16 crc kubenswrapper[4672]: I0930 12:40:16.453066 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e811108a08afa7898bc3a88bffd23c22774898340540e6e3e7a418e8ad8cbcf3"} err="failed to get container status \"e811108a08afa7898bc3a88bffd23c22774898340540e6e3e7a418e8ad8cbcf3\": rpc error: code = NotFound desc = could not find container \"e811108a08afa7898bc3a88bffd23c22774898340540e6e3e7a418e8ad8cbcf3\": container with ID starting with e811108a08afa7898bc3a88bffd23c22774898340540e6e3e7a418e8ad8cbcf3 not found: ID does not exist" Sep 30 12:40:16 crc kubenswrapper[4672]: I0930 12:40:16.453097 4672 scope.go:117] "RemoveContainer" containerID="63b9121aa06c925b6471256a151d3133fe88c9ed70371db1f37a60140522f39a" Sep 30 12:40:16 crc kubenswrapper[4672]: I0930 12:40:16.453618 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"63b9121aa06c925b6471256a151d3133fe88c9ed70371db1f37a60140522f39a"} err="failed to get container status \"63b9121aa06c925b6471256a151d3133fe88c9ed70371db1f37a60140522f39a\": rpc error: code = NotFound desc = could not find container \"63b9121aa06c925b6471256a151d3133fe88c9ed70371db1f37a60140522f39a\": container with ID starting with 63b9121aa06c925b6471256a151d3133fe88c9ed70371db1f37a60140522f39a not found: ID does not exist" Sep 30 12:40:16 crc kubenswrapper[4672]: I0930 12:40:16.463964 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b0df982d-7077-4beb-b89b-71f80d0ff60c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b0df982d-7077-4beb-b89b-71f80d0ff60c\") " pod="openstack/ceilometer-0" Sep 30 12:40:16 crc kubenswrapper[4672]: I0930 12:40:16.464160 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gk57k\" (UniqueName: \"kubernetes.io/projected/b0df982d-7077-4beb-b89b-71f80d0ff60c-kube-api-access-gk57k\") pod \"ceilometer-0\" (UID: \"b0df982d-7077-4beb-b89b-71f80d0ff60c\") " pod="openstack/ceilometer-0" Sep 30 12:40:16 crc kubenswrapper[4672]: I0930 12:40:16.464320 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b0df982d-7077-4beb-b89b-71f80d0ff60c-run-httpd\") pod \"ceilometer-0\" (UID: 
\"b0df982d-7077-4beb-b89b-71f80d0ff60c\") " pod="openstack/ceilometer-0" Sep 30 12:40:16 crc kubenswrapper[4672]: I0930 12:40:16.464483 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b0df982d-7077-4beb-b89b-71f80d0ff60c-log-httpd\") pod \"ceilometer-0\" (UID: \"b0df982d-7077-4beb-b89b-71f80d0ff60c\") " pod="openstack/ceilometer-0" Sep 30 12:40:16 crc kubenswrapper[4672]: I0930 12:40:16.464705 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0df982d-7077-4beb-b89b-71f80d0ff60c-config-data\") pod \"ceilometer-0\" (UID: \"b0df982d-7077-4beb-b89b-71f80d0ff60c\") " pod="openstack/ceilometer-0" Sep 30 12:40:16 crc kubenswrapper[4672]: I0930 12:40:16.464834 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b0df982d-7077-4beb-b89b-71f80d0ff60c-scripts\") pod \"ceilometer-0\" (UID: \"b0df982d-7077-4beb-b89b-71f80d0ff60c\") " pod="openstack/ceilometer-0" Sep 30 12:40:16 crc kubenswrapper[4672]: I0930 12:40:16.464876 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0df982d-7077-4beb-b89b-71f80d0ff60c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b0df982d-7077-4beb-b89b-71f80d0ff60c\") " pod="openstack/ceilometer-0" Sep 30 12:40:16 crc kubenswrapper[4672]: I0930 12:40:16.566871 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b0df982d-7077-4beb-b89b-71f80d0ff60c-log-httpd\") pod \"ceilometer-0\" (UID: \"b0df982d-7077-4beb-b89b-71f80d0ff60c\") " pod="openstack/ceilometer-0" Sep 30 12:40:16 crc kubenswrapper[4672]: I0930 12:40:16.566981 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0df982d-7077-4beb-b89b-71f80d0ff60c-config-data\") pod \"ceilometer-0\" (UID: \"b0df982d-7077-4beb-b89b-71f80d0ff60c\") " pod="openstack/ceilometer-0" Sep 30 12:40:16 crc kubenswrapper[4672]: I0930 12:40:16.567051 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b0df982d-7077-4beb-b89b-71f80d0ff60c-scripts\") pod \"ceilometer-0\" (UID: \"b0df982d-7077-4beb-b89b-71f80d0ff60c\") " pod="openstack/ceilometer-0" Sep 30 12:40:16 crc kubenswrapper[4672]: I0930 12:40:16.567094 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0df982d-7077-4beb-b89b-71f80d0ff60c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b0df982d-7077-4beb-b89b-71f80d0ff60c\") " pod="openstack/ceilometer-0" Sep 30 12:40:16 crc kubenswrapper[4672]: I0930 12:40:16.567232 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b0df982d-7077-4beb-b89b-71f80d0ff60c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b0df982d-7077-4beb-b89b-71f80d0ff60c\") " pod="openstack/ceilometer-0" Sep 30 12:40:16 crc kubenswrapper[4672]: I0930 12:40:16.567384 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b0df982d-7077-4beb-b89b-71f80d0ff60c-log-httpd\") pod 
\"ceilometer-0\" (UID: \"b0df982d-7077-4beb-b89b-71f80d0ff60c\") " pod="openstack/ceilometer-0" Sep 30 12:40:16 crc kubenswrapper[4672]: I0930 12:40:16.567406 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gk57k\" (UniqueName: \"kubernetes.io/projected/b0df982d-7077-4beb-b89b-71f80d0ff60c-kube-api-access-gk57k\") pod \"ceilometer-0\" (UID: \"b0df982d-7077-4beb-b89b-71f80d0ff60c\") " pod="openstack/ceilometer-0" Sep 30 12:40:16 crc kubenswrapper[4672]: I0930 12:40:16.567485 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b0df982d-7077-4beb-b89b-71f80d0ff60c-run-httpd\") pod \"ceilometer-0\" (UID: \"b0df982d-7077-4beb-b89b-71f80d0ff60c\") " pod="openstack/ceilometer-0" Sep 30 12:40:16 crc kubenswrapper[4672]: I0930 12:40:16.568420 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b0df982d-7077-4beb-b89b-71f80d0ff60c-run-httpd\") pod \"ceilometer-0\" (UID: \"b0df982d-7077-4beb-b89b-71f80d0ff60c\") " pod="openstack/ceilometer-0" Sep 30 12:40:16 crc kubenswrapper[4672]: I0930 12:40:16.571159 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0df982d-7077-4beb-b89b-71f80d0ff60c-config-data\") pod \"ceilometer-0\" (UID: \"b0df982d-7077-4beb-b89b-71f80d0ff60c\") " pod="openstack/ceilometer-0" Sep 30 12:40:16 crc kubenswrapper[4672]: I0930 12:40:16.578514 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b0df982d-7077-4beb-b89b-71f80d0ff60c-scripts\") pod \"ceilometer-0\" (UID: \"b0df982d-7077-4beb-b89b-71f80d0ff60c\") " pod="openstack/ceilometer-0" Sep 30 12:40:16 crc kubenswrapper[4672]: I0930 12:40:16.581532 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0df982d-7077-4beb-b89b-71f80d0ff60c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b0df982d-7077-4beb-b89b-71f80d0ff60c\") " pod="openstack/ceilometer-0" Sep 30 12:40:16 crc kubenswrapper[4672]: I0930 12:40:16.581570 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b0df982d-7077-4beb-b89b-71f80d0ff60c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b0df982d-7077-4beb-b89b-71f80d0ff60c\") " pod="openstack/ceilometer-0" Sep 30 12:40:16 crc kubenswrapper[4672]: I0930 12:40:16.588257 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gk57k\" (UniqueName: \"kubernetes.io/projected/b0df982d-7077-4beb-b89b-71f80d0ff60c-kube-api-access-gk57k\") pod \"ceilometer-0\" (UID: \"b0df982d-7077-4beb-b89b-71f80d0ff60c\") " pod="openstack/ceilometer-0" Sep 30 12:40:16 crc kubenswrapper[4672]: I0930 12:40:16.643771 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Sep 30 12:40:16 crc kubenswrapper[4672]: I0930 12:40:16.716603 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Sep 30 12:40:16 crc kubenswrapper[4672]: I0930 12:40:16.732327 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Sep 30 12:40:16 crc kubenswrapper[4672]: I0930 12:40:16.733890 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Sep 30 12:40:16 crc kubenswrapper[4672]: I0930 12:40:16.735330 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Sep 30 12:40:16 crc kubenswrapper[4672]: I0930 12:40:16.742315 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Sep 30 12:40:16 crc kubenswrapper[4672]: I0930 12:40:16.760457 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Sep 30 12:40:16 crc kubenswrapper[4672]: I0930 12:40:16.760495 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Sep 30 12:40:16 crc kubenswrapper[4672]: I0930 12:40:16.760567 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Sep 30 12:40:16 crc kubenswrapper[4672]: I0930 12:40:16.874489 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dca6226f-d8f3-4d33-9d30-20bb6cf8a3fd-scripts\") pod \"cinder-api-0\" (UID: \"dca6226f-d8f3-4d33-9d30-20bb6cf8a3fd\") " pod="openstack/cinder-api-0" Sep 30 12:40:16 crc kubenswrapper[4672]: I0930 12:40:16.874576 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/dca6226f-d8f3-4d33-9d30-20bb6cf8a3fd-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"dca6226f-d8f3-4d33-9d30-20bb6cf8a3fd\") " pod="openstack/cinder-api-0" Sep 30 12:40:16 crc kubenswrapper[4672]: I0930 12:40:16.874629 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/dca6226f-d8f3-4d33-9d30-20bb6cf8a3fd-etc-machine-id\") pod \"cinder-api-0\" (UID: \"dca6226f-d8f3-4d33-9d30-20bb6cf8a3fd\") " pod="openstack/cinder-api-0" Sep 30 12:40:16 crc kubenswrapper[4672]: I0930 12:40:16.874689 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dca6226f-d8f3-4d33-9d30-20bb6cf8a3fd-public-tls-certs\") pod \"cinder-api-0\" (UID: \"dca6226f-d8f3-4d33-9d30-20bb6cf8a3fd\") " pod="openstack/cinder-api-0" Sep 30 12:40:16 crc kubenswrapper[4672]: I0930 12:40:16.874717 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dca6226f-d8f3-4d33-9d30-20bb6cf8a3fd-config-data\") pod \"cinder-api-0\" (UID: \"dca6226f-d8f3-4d33-9d30-20bb6cf8a3fd\") " pod="openstack/cinder-api-0" Sep 30 12:40:16 crc kubenswrapper[4672]: I0930 12:40:16.874743 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dca6226f-d8f3-4d33-9d30-20bb6cf8a3fd-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"dca6226f-d8f3-4d33-9d30-20bb6cf8a3fd\") " pod="openstack/cinder-api-0" 
Sep 30 12:40:16 crc kubenswrapper[4672]: I0930 12:40:16.874795 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wzcjs\" (UniqueName: \"kubernetes.io/projected/dca6226f-d8f3-4d33-9d30-20bb6cf8a3fd-kube-api-access-wzcjs\") pod \"cinder-api-0\" (UID: \"dca6226f-d8f3-4d33-9d30-20bb6cf8a3fd\") " pod="openstack/cinder-api-0" Sep 30 12:40:16 crc kubenswrapper[4672]: I0930 12:40:16.874838 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dca6226f-d8f3-4d33-9d30-20bb6cf8a3fd-logs\") pod \"cinder-api-0\" (UID: \"dca6226f-d8f3-4d33-9d30-20bb6cf8a3fd\") " pod="openstack/cinder-api-0" Sep 30 12:40:16 crc kubenswrapper[4672]: I0930 12:40:16.874886 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dca6226f-d8f3-4d33-9d30-20bb6cf8a3fd-config-data-custom\") pod \"cinder-api-0\" (UID: \"dca6226f-d8f3-4d33-9d30-20bb6cf8a3fd\") " pod="openstack/cinder-api-0" Sep 30 12:40:16 crc kubenswrapper[4672]: I0930 12:40:16.976240 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dca6226f-d8f3-4d33-9d30-20bb6cf8a3fd-scripts\") pod \"cinder-api-0\" (UID: \"dca6226f-d8f3-4d33-9d30-20bb6cf8a3fd\") " pod="openstack/cinder-api-0" Sep 30 12:40:16 crc kubenswrapper[4672]: I0930 12:40:16.976627 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/dca6226f-d8f3-4d33-9d30-20bb6cf8a3fd-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"dca6226f-d8f3-4d33-9d30-20bb6cf8a3fd\") " pod="openstack/cinder-api-0" Sep 30 12:40:16 crc kubenswrapper[4672]: I0930 12:40:16.976695 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/dca6226f-d8f3-4d33-9d30-20bb6cf8a3fd-etc-machine-id\") pod \"cinder-api-0\" (UID: \"dca6226f-d8f3-4d33-9d30-20bb6cf8a3fd\") " pod="openstack/cinder-api-0" Sep 30 12:40:16 crc kubenswrapper[4672]: I0930 12:40:16.976748 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dca6226f-d8f3-4d33-9d30-20bb6cf8a3fd-public-tls-certs\") pod \"cinder-api-0\" (UID: \"dca6226f-d8f3-4d33-9d30-20bb6cf8a3fd\") " pod="openstack/cinder-api-0" Sep 30 12:40:16 crc kubenswrapper[4672]: I0930 12:40:16.976776 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dca6226f-d8f3-4d33-9d30-20bb6cf8a3fd-config-data\") pod \"cinder-api-0\" (UID: \"dca6226f-d8f3-4d33-9d30-20bb6cf8a3fd\") " pod="openstack/cinder-api-0" Sep 30 12:40:16 crc kubenswrapper[4672]: I0930 12:40:16.976798 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dca6226f-d8f3-4d33-9d30-20bb6cf8a3fd-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"dca6226f-d8f3-4d33-9d30-20bb6cf8a3fd\") " pod="openstack/cinder-api-0" Sep 30 12:40:16 crc kubenswrapper[4672]: I0930 12:40:16.976854 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wzcjs\" (UniqueName: \"kubernetes.io/projected/dca6226f-d8f3-4d33-9d30-20bb6cf8a3fd-kube-api-access-wzcjs\") pod \"cinder-api-0\" (UID: 
\"dca6226f-d8f3-4d33-9d30-20bb6cf8a3fd\") " pod="openstack/cinder-api-0" Sep 30 12:40:16 crc kubenswrapper[4672]: I0930 12:40:16.976901 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dca6226f-d8f3-4d33-9d30-20bb6cf8a3fd-logs\") pod \"cinder-api-0\" (UID: \"dca6226f-d8f3-4d33-9d30-20bb6cf8a3fd\") " pod="openstack/cinder-api-0" Sep 30 12:40:16 crc kubenswrapper[4672]: I0930 12:40:16.976949 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dca6226f-d8f3-4d33-9d30-20bb6cf8a3fd-config-data-custom\") pod \"cinder-api-0\" (UID: \"dca6226f-d8f3-4d33-9d30-20bb6cf8a3fd\") " pod="openstack/cinder-api-0" Sep 30 12:40:16 crc kubenswrapper[4672]: I0930 12:40:16.978990 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/dca6226f-d8f3-4d33-9d30-20bb6cf8a3fd-etc-machine-id\") pod \"cinder-api-0\" (UID: \"dca6226f-d8f3-4d33-9d30-20bb6cf8a3fd\") " pod="openstack/cinder-api-0" Sep 30 12:40:16 crc kubenswrapper[4672]: I0930 12:40:16.980648 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dca6226f-d8f3-4d33-9d30-20bb6cf8a3fd-logs\") pod \"cinder-api-0\" (UID: \"dca6226f-d8f3-4d33-9d30-20bb6cf8a3fd\") " pod="openstack/cinder-api-0" Sep 30 12:40:16 crc kubenswrapper[4672]: I0930 12:40:16.993036 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dca6226f-d8f3-4d33-9d30-20bb6cf8a3fd-scripts\") pod \"cinder-api-0\" (UID: \"dca6226f-d8f3-4d33-9d30-20bb6cf8a3fd\") " pod="openstack/cinder-api-0" Sep 30 12:40:16 crc kubenswrapper[4672]: I0930 12:40:16.995249 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dca6226f-d8f3-4d33-9d30-20bb6cf8a3fd-config-data\") pod \"cinder-api-0\" (UID: \"dca6226f-d8f3-4d33-9d30-20bb6cf8a3fd\") " pod="openstack/cinder-api-0" Sep 30 12:40:16 crc kubenswrapper[4672]: I0930 12:40:16.995902 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/dca6226f-d8f3-4d33-9d30-20bb6cf8a3fd-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"dca6226f-d8f3-4d33-9d30-20bb6cf8a3fd\") " pod="openstack/cinder-api-0" Sep 30 12:40:16 crc kubenswrapper[4672]: I0930 12:40:16.996333 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dca6226f-d8f3-4d33-9d30-20bb6cf8a3fd-public-tls-certs\") pod \"cinder-api-0\" (UID: \"dca6226f-d8f3-4d33-9d30-20bb6cf8a3fd\") " pod="openstack/cinder-api-0" Sep 30 12:40:17 crc kubenswrapper[4672]: I0930 12:40:17.000223 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dca6226f-d8f3-4d33-9d30-20bb6cf8a3fd-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"dca6226f-d8f3-4d33-9d30-20bb6cf8a3fd\") " pod="openstack/cinder-api-0" Sep 30 12:40:17 crc kubenswrapper[4672]: I0930 12:40:17.012697 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dca6226f-d8f3-4d33-9d30-20bb6cf8a3fd-config-data-custom\") pod \"cinder-api-0\" (UID: \"dca6226f-d8f3-4d33-9d30-20bb6cf8a3fd\") " pod="openstack/cinder-api-0" Sep 30 12:40:17 crc 
kubenswrapper[4672]: I0930 12:40:17.020972 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wzcjs\" (UniqueName: \"kubernetes.io/projected/dca6226f-d8f3-4d33-9d30-20bb6cf8a3fd-kube-api-access-wzcjs\") pod \"cinder-api-0\" (UID: \"dca6226f-d8f3-4d33-9d30-20bb6cf8a3fd\") " pod="openstack/cinder-api-0" Sep 30 12:40:17 crc kubenswrapper[4672]: I0930 12:40:17.218532 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Sep 30 12:40:17 crc kubenswrapper[4672]: W0930 12:40:17.341782 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb0df982d_7077_4beb_b89b_71f80d0ff60c.slice/crio-d82c9135e5af4d1cfe673c5992d3033b360a7c0544ef21b1503ac80cadd34c8a WatchSource:0}: Error finding container d82c9135e5af4d1cfe673c5992d3033b360a7c0544ef21b1503ac80cadd34c8a: Status 404 returned error can't find the container with id d82c9135e5af4d1cfe673c5992d3033b360a7c0544ef21b1503ac80cadd34c8a Sep 30 12:40:17 crc kubenswrapper[4672]: I0930 12:40:17.347713 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Sep 30 12:40:17 crc kubenswrapper[4672]: I0930 12:40:17.431238 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="582bc0ed-92ee-4696-9605-05e5e1c684f9" path="/var/lib/kubelet/pods/582bc0ed-92ee-4696-9605-05e5e1c684f9/volumes" Sep 30 12:40:17 crc kubenswrapper[4672]: I0930 12:40:17.432117 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="85c475fb-05e1-46e5-a396-6ea675ef8084" path="/var/lib/kubelet/pods/85c475fb-05e1-46e5-a396-6ea675ef8084/volumes" Sep 30 12:40:17 crc kubenswrapper[4672]: I0930 12:40:17.779070 4672 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-8bdf69cc8-lsxz6" podUID="4e153eb6-5f25-4214-8e8a-14c37a36fc06" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.163:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.163:8443: connect: connection refused" Sep 30 12:40:17 crc kubenswrapper[4672]: I0930 12:40:17.854788 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Sep 30 12:40:17 crc kubenswrapper[4672]: W0930 12:40:17.871169 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddca6226f_d8f3_4d33_9d30_20bb6cf8a3fd.slice/crio-ce5fd8d0a5cce9b10edc58d1b3b127b7953abadfcc3c8ba7944caccb364be954 WatchSource:0}: Error finding container ce5fd8d0a5cce9b10edc58d1b3b127b7953abadfcc3c8ba7944caccb364be954: Status 404 returned error can't find the container with id ce5fd8d0a5cce9b10edc58d1b3b127b7953abadfcc3c8ba7944caccb364be954 Sep 30 12:40:17 crc kubenswrapper[4672]: I0930 12:40:17.913388 4672 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-8679776b6d-5ltf9" podUID="91b5c532-f855-46f2-967b-c53fc7f0ffee" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.174:9311/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Sep 30 12:40:17 crc kubenswrapper[4672]: I0930 12:40:17.913733 4672 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-8679776b6d-5ltf9" podUID="91b5c532-f855-46f2-967b-c53fc7f0ffee" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.174:9311/healthcheck\": context deadline exceeded (Client.Timeout exceeded while 
awaiting headers)" Sep 30 12:40:18 crc kubenswrapper[4672]: I0930 12:40:18.302298 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b0df982d-7077-4beb-b89b-71f80d0ff60c","Type":"ContainerStarted","Data":"0ab65509d23060ca08255981abbcbc6d62f2ddb4cd4e56bda339ec7af2a291f2"} Sep 30 12:40:18 crc kubenswrapper[4672]: I0930 12:40:18.302621 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b0df982d-7077-4beb-b89b-71f80d0ff60c","Type":"ContainerStarted","Data":"554bc7b264b9b315010e9f7f81bcf8f5cf55d91fce9c72601b686b4343921408"} Sep 30 12:40:18 crc kubenswrapper[4672]: I0930 12:40:18.302635 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b0df982d-7077-4beb-b89b-71f80d0ff60c","Type":"ContainerStarted","Data":"d82c9135e5af4d1cfe673c5992d3033b360a7c0544ef21b1503ac80cadd34c8a"} Sep 30 12:40:18 crc kubenswrapper[4672]: I0930 12:40:18.304712 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"dca6226f-d8f3-4d33-9d30-20bb6cf8a3fd","Type":"ContainerStarted","Data":"ce5fd8d0a5cce9b10edc58d1b3b127b7953abadfcc3c8ba7944caccb364be954"} Sep 30 12:40:18 crc kubenswrapper[4672]: I0930 12:40:18.674930 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-6b77669d6-hlcjq" Sep 30 12:40:18 crc kubenswrapper[4672]: I0930 12:40:18.681693 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-6b77669d6-hlcjq" Sep 30 12:40:19 crc kubenswrapper[4672]: I0930 12:40:19.319596 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"dca6226f-d8f3-4d33-9d30-20bb6cf8a3fd","Type":"ContainerStarted","Data":"da68424989bb8de95e26e20a6ad9f75337ae760c1faa881ea6722e52663de589"} Sep 30 12:40:19 crc kubenswrapper[4672]: I0930 12:40:19.324107 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b0df982d-7077-4beb-b89b-71f80d0ff60c","Type":"ContainerStarted","Data":"2a7a9bb780685179558ca557a5cc71758d032f6908cd011ecc193fd13b6ffe0b"} Sep 30 12:40:20 crc kubenswrapper[4672]: I0930 12:40:20.336698 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"dca6226f-d8f3-4d33-9d30-20bb6cf8a3fd","Type":"ContainerStarted","Data":"f02e9de402ce2e09e8adcc35f621c55779de85065d8d580be9b30a2471665f16"} Sep 30 12:40:20 crc kubenswrapper[4672]: I0930 12:40:20.337035 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Sep 30 12:40:20 crc kubenswrapper[4672]: I0930 12:40:20.370890 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=4.370868024 podStartE2EDuration="4.370868024s" podCreationTimestamp="2025-09-30 12:40:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 12:40:20.356859451 +0000 UTC m=+1111.626097097" watchObservedRunningTime="2025-09-30 12:40:20.370868024 +0000 UTC m=+1111.640105670" Sep 30 12:40:21 crc kubenswrapper[4672]: I0930 12:40:21.160586 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7cd88f5d9f-4b79l" Sep 30 12:40:21 crc kubenswrapper[4672]: I0930 12:40:21.264787 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-85554c85d5-pkzn2"] Sep 30 12:40:21 crc 
kubenswrapper[4672]: I0930 12:40:21.265042 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-85554c85d5-pkzn2" podUID="7998babe-964f-4ea9-a7a5-c11d5c7a6912" containerName="dnsmasq-dns" containerID="cri-o://4f5ec6fc0656c183408c9bf1d16178763050e93f2d4f808f42477bdef2ddea32" gracePeriod=10 Sep 30 12:40:21 crc kubenswrapper[4672]: I0930 12:40:21.374333 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b0df982d-7077-4beb-b89b-71f80d0ff60c","Type":"ContainerStarted","Data":"7487975c7a56fd498fa2817ac6806020d38cf78f7b968d700e38e3e4d42711ab"} Sep 30 12:40:21 crc kubenswrapper[4672]: I0930 12:40:21.374405 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Sep 30 12:40:21 crc kubenswrapper[4672]: I0930 12:40:21.409946 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.222664357 podStartE2EDuration="5.409926668s" podCreationTimestamp="2025-09-30 12:40:16 +0000 UTC" firstStartedPulling="2025-09-30 12:40:17.349637105 +0000 UTC m=+1108.618874741" lastFinishedPulling="2025-09-30 12:40:20.536899406 +0000 UTC m=+1111.806137052" observedRunningTime="2025-09-30 12:40:21.395952986 +0000 UTC m=+1112.665190632" watchObservedRunningTime="2025-09-30 12:40:21.409926668 +0000 UTC m=+1112.679164314" Sep 30 12:40:21 crc kubenswrapper[4672]: I0930 12:40:21.414877 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Sep 30 12:40:21 crc kubenswrapper[4672]: I0930 12:40:21.493589 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Sep 30 12:40:21 crc kubenswrapper[4672]: I0930 12:40:21.961392 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-85554c85d5-pkzn2" Sep 30 12:40:22 crc kubenswrapper[4672]: I0930 12:40:22.092154 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7998babe-964f-4ea9-a7a5-c11d5c7a6912-dns-swift-storage-0\") pod \"7998babe-964f-4ea9-a7a5-c11d5c7a6912\" (UID: \"7998babe-964f-4ea9-a7a5-c11d5c7a6912\") " Sep 30 12:40:22 crc kubenswrapper[4672]: I0930 12:40:22.092217 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7998babe-964f-4ea9-a7a5-c11d5c7a6912-ovsdbserver-nb\") pod \"7998babe-964f-4ea9-a7a5-c11d5c7a6912\" (UID: \"7998babe-964f-4ea9-a7a5-c11d5c7a6912\") " Sep 30 12:40:22 crc kubenswrapper[4672]: I0930 12:40:22.092321 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7998babe-964f-4ea9-a7a5-c11d5c7a6912-config\") pod \"7998babe-964f-4ea9-a7a5-c11d5c7a6912\" (UID: \"7998babe-964f-4ea9-a7a5-c11d5c7a6912\") " Sep 30 12:40:22 crc kubenswrapper[4672]: I0930 12:40:22.092356 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7998babe-964f-4ea9-a7a5-c11d5c7a6912-ovsdbserver-sb\") pod \"7998babe-964f-4ea9-a7a5-c11d5c7a6912\" (UID: \"7998babe-964f-4ea9-a7a5-c11d5c7a6912\") " Sep 30 12:40:22 crc kubenswrapper[4672]: I0930 12:40:22.092418 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7998babe-964f-4ea9-a7a5-c11d5c7a6912-dns-svc\") pod \"7998babe-964f-4ea9-a7a5-c11d5c7a6912\" (UID: \"7998babe-964f-4ea9-a7a5-c11d5c7a6912\") " Sep 30 12:40:22 crc kubenswrapper[4672]: I0930 12:40:22.092461 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z57m4\" (UniqueName: \"kubernetes.io/projected/7998babe-964f-4ea9-a7a5-c11d5c7a6912-kube-api-access-z57m4\") pod \"7998babe-964f-4ea9-a7a5-c11d5c7a6912\" (UID: \"7998babe-964f-4ea9-a7a5-c11d5c7a6912\") " Sep 30 12:40:22 crc kubenswrapper[4672]: I0930 12:40:22.099845 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7998babe-964f-4ea9-a7a5-c11d5c7a6912-kube-api-access-z57m4" (OuterVolumeSpecName: "kube-api-access-z57m4") pod "7998babe-964f-4ea9-a7a5-c11d5c7a6912" (UID: "7998babe-964f-4ea9-a7a5-c11d5c7a6912"). InnerVolumeSpecName "kube-api-access-z57m4". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 12:40:22 crc kubenswrapper[4672]: I0930 12:40:22.159642 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7998babe-964f-4ea9-a7a5-c11d5c7a6912-config" (OuterVolumeSpecName: "config") pod "7998babe-964f-4ea9-a7a5-c11d5c7a6912" (UID: "7998babe-964f-4ea9-a7a5-c11d5c7a6912"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 12:40:22 crc kubenswrapper[4672]: I0930 12:40:22.160670 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7998babe-964f-4ea9-a7a5-c11d5c7a6912-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "7998babe-964f-4ea9-a7a5-c11d5c7a6912" (UID: "7998babe-964f-4ea9-a7a5-c11d5c7a6912"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 12:40:22 crc kubenswrapper[4672]: I0930 12:40:22.161668 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7998babe-964f-4ea9-a7a5-c11d5c7a6912-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "7998babe-964f-4ea9-a7a5-c11d5c7a6912" (UID: "7998babe-964f-4ea9-a7a5-c11d5c7a6912"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 12:40:22 crc kubenswrapper[4672]: I0930 12:40:22.174927 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7998babe-964f-4ea9-a7a5-c11d5c7a6912-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "7998babe-964f-4ea9-a7a5-c11d5c7a6912" (UID: "7998babe-964f-4ea9-a7a5-c11d5c7a6912"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 12:40:22 crc kubenswrapper[4672]: I0930 12:40:22.194466 4672 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7998babe-964f-4ea9-a7a5-c11d5c7a6912-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Sep 30 12:40:22 crc kubenswrapper[4672]: I0930 12:40:22.194498 4672 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7998babe-964f-4ea9-a7a5-c11d5c7a6912-config\") on node \"crc\" DevicePath \"\"" Sep 30 12:40:22 crc kubenswrapper[4672]: I0930 12:40:22.194507 4672 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7998babe-964f-4ea9-a7a5-c11d5c7a6912-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Sep 30 12:40:22 crc kubenswrapper[4672]: I0930 12:40:22.194515 4672 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7998babe-964f-4ea9-a7a5-c11d5c7a6912-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 30 12:40:22 crc kubenswrapper[4672]: I0930 12:40:22.194524 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z57m4\" (UniqueName: \"kubernetes.io/projected/7998babe-964f-4ea9-a7a5-c11d5c7a6912-kube-api-access-z57m4\") on node \"crc\" DevicePath \"\"" Sep 30 12:40:22 crc kubenswrapper[4672]: I0930 12:40:22.202165 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7998babe-964f-4ea9-a7a5-c11d5c7a6912-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "7998babe-964f-4ea9-a7a5-c11d5c7a6912" (UID: "7998babe-964f-4ea9-a7a5-c11d5c7a6912"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 12:40:22 crc kubenswrapper[4672]: I0930 12:40:22.296281 4672 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7998babe-964f-4ea9-a7a5-c11d5c7a6912-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Sep 30 12:40:22 crc kubenswrapper[4672]: I0930 12:40:22.385197 4672 generic.go:334] "Generic (PLEG): container finished" podID="7998babe-964f-4ea9-a7a5-c11d5c7a6912" containerID="4f5ec6fc0656c183408c9bf1d16178763050e93f2d4f808f42477bdef2ddea32" exitCode=0 Sep 30 12:40:22 crc kubenswrapper[4672]: I0930 12:40:22.385232 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85554c85d5-pkzn2" event={"ID":"7998babe-964f-4ea9-a7a5-c11d5c7a6912","Type":"ContainerDied","Data":"4f5ec6fc0656c183408c9bf1d16178763050e93f2d4f808f42477bdef2ddea32"} Sep 30 12:40:22 crc kubenswrapper[4672]: I0930 12:40:22.385306 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85554c85d5-pkzn2" event={"ID":"7998babe-964f-4ea9-a7a5-c11d5c7a6912","Type":"ContainerDied","Data":"4d5d624a895f9e3589f21d22931a44a6d88d30b8020ea734e35c87d88afcd311"} Sep 30 12:40:22 crc kubenswrapper[4672]: I0930 12:40:22.385324 4672 scope.go:117] "RemoveContainer" containerID="4f5ec6fc0656c183408c9bf1d16178763050e93f2d4f808f42477bdef2ddea32" Sep 30 12:40:22 crc kubenswrapper[4672]: I0930 12:40:22.385343 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85554c85d5-pkzn2" Sep 30 12:40:22 crc kubenswrapper[4672]: I0930 12:40:22.386624 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="2d9c4035-a25a-460e-b9bc-dbe05f4e6272" containerName="cinder-scheduler" containerID="cri-o://9989d443b0999bfc366753ecd2d90f19bed8c71eea22b903b90fc82977cb5434" gracePeriod=30 Sep 30 12:40:22 crc kubenswrapper[4672]: I0930 12:40:22.386701 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="2d9c4035-a25a-460e-b9bc-dbe05f4e6272" containerName="probe" containerID="cri-o://f0fb977e5bc59e0ccbfef62faf5cba5e922cfa4b41bff63f473227ebe98a893c" gracePeriod=30 Sep 30 12:40:22 crc kubenswrapper[4672]: I0930 12:40:22.421452 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-85554c85d5-pkzn2"] Sep 30 12:40:22 crc kubenswrapper[4672]: I0930 12:40:22.433336 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-85554c85d5-pkzn2"] Sep 30 12:40:23 crc kubenswrapper[4672]: I0930 12:40:23.401575 4672 generic.go:334] "Generic (PLEG): container finished" podID="2d9c4035-a25a-460e-b9bc-dbe05f4e6272" containerID="f0fb977e5bc59e0ccbfef62faf5cba5e922cfa4b41bff63f473227ebe98a893c" exitCode=0 Sep 30 12:40:23 crc kubenswrapper[4672]: I0930 12:40:23.401621 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"2d9c4035-a25a-460e-b9bc-dbe05f4e6272","Type":"ContainerDied","Data":"f0fb977e5bc59e0ccbfef62faf5cba5e922cfa4b41bff63f473227ebe98a893c"} Sep 30 12:40:23 crc kubenswrapper[4672]: I0930 12:40:23.439688 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7998babe-964f-4ea9-a7a5-c11d5c7a6912" path="/var/lib/kubelet/pods/7998babe-964f-4ea9-a7a5-c11d5c7a6912/volumes" Sep 30 12:40:24 crc kubenswrapper[4672]: I0930 12:40:24.090013 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/swift-proxy-76bc57698f-2dqjq" Sep 30 12:40:24 crc kubenswrapper[4672]: I0930 12:40:24.119897 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-76bc57698f-2dqjq" Sep 30 12:40:24 crc kubenswrapper[4672]: I0930 12:40:24.413423 4672 generic.go:334] "Generic (PLEG): container finished" podID="2d9c4035-a25a-460e-b9bc-dbe05f4e6272" containerID="9989d443b0999bfc366753ecd2d90f19bed8c71eea22b903b90fc82977cb5434" exitCode=0 Sep 30 12:40:24 crc kubenswrapper[4672]: I0930 12:40:24.413504 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"2d9c4035-a25a-460e-b9bc-dbe05f4e6272","Type":"ContainerDied","Data":"9989d443b0999bfc366753ecd2d90f19bed8c71eea22b903b90fc82977cb5434"} Sep 30 12:40:26 crc kubenswrapper[4672]: I0930 12:40:26.417694 4672 scope.go:117] "RemoveContainer" containerID="9ede9a566346adcbabd817801ad9e25696c1143a09108dedaf87167ca42fa5f3" Sep 30 12:40:26 crc kubenswrapper[4672]: E0930 12:40:26.418275 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 20s restarting failed container=watcher-decision-engine pod=watcher-decision-engine-0_openstack(4f1bee84-650b-4f0b-a657-e6701ee51823)\"" pod="openstack/watcher-decision-engine-0" podUID="4f1bee84-650b-4f0b-a657-e6701ee51823" Sep 30 12:40:27 crc kubenswrapper[4672]: I0930 12:40:27.777068 4672 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-8bdf69cc8-lsxz6" podUID="4e153eb6-5f25-4214-8e8a-14c37a36fc06" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.163:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.163:8443: connect: connection refused" Sep 30 12:40:27 crc kubenswrapper[4672]: I0930 12:40:27.777208 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-8bdf69cc8-lsxz6" Sep 30 12:40:27 crc kubenswrapper[4672]: I0930 12:40:27.972008 4672 scope.go:117] "RemoveContainer" containerID="3dbd5f5ea1fe4f36a583bd46e4d9a3f1216ea4da2366eb7b963f5da37a0def87" Sep 30 12:40:28 crc kubenswrapper[4672]: I0930 12:40:28.167006 4672 scope.go:117] "RemoveContainer" containerID="4f5ec6fc0656c183408c9bf1d16178763050e93f2d4f808f42477bdef2ddea32" Sep 30 12:40:28 crc kubenswrapper[4672]: E0930 12:40:28.171127 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4f5ec6fc0656c183408c9bf1d16178763050e93f2d4f808f42477bdef2ddea32\": container with ID starting with 4f5ec6fc0656c183408c9bf1d16178763050e93f2d4f808f42477bdef2ddea32 not found: ID does not exist" containerID="4f5ec6fc0656c183408c9bf1d16178763050e93f2d4f808f42477bdef2ddea32" Sep 30 12:40:28 crc kubenswrapper[4672]: I0930 12:40:28.171168 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f5ec6fc0656c183408c9bf1d16178763050e93f2d4f808f42477bdef2ddea32"} err="failed to get container status \"4f5ec6fc0656c183408c9bf1d16178763050e93f2d4f808f42477bdef2ddea32\": rpc error: code = NotFound desc = could not find container \"4f5ec6fc0656c183408c9bf1d16178763050e93f2d4f808f42477bdef2ddea32\": container with ID starting with 4f5ec6fc0656c183408c9bf1d16178763050e93f2d4f808f42477bdef2ddea32 not found: ID does not exist" Sep 30 12:40:28 crc kubenswrapper[4672]: I0930 12:40:28.171193 4672 scope.go:117] "RemoveContainer" 
containerID="3dbd5f5ea1fe4f36a583bd46e4d9a3f1216ea4da2366eb7b963f5da37a0def87" Sep 30 12:40:28 crc kubenswrapper[4672]: E0930 12:40:28.171724 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3dbd5f5ea1fe4f36a583bd46e4d9a3f1216ea4da2366eb7b963f5da37a0def87\": container with ID starting with 3dbd5f5ea1fe4f36a583bd46e4d9a3f1216ea4da2366eb7b963f5da37a0def87 not found: ID does not exist" containerID="3dbd5f5ea1fe4f36a583bd46e4d9a3f1216ea4da2366eb7b963f5da37a0def87" Sep 30 12:40:28 crc kubenswrapper[4672]: I0930 12:40:28.171746 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3dbd5f5ea1fe4f36a583bd46e4d9a3f1216ea4da2366eb7b963f5da37a0def87"} err="failed to get container status \"3dbd5f5ea1fe4f36a583bd46e4d9a3f1216ea4da2366eb7b963f5da37a0def87\": rpc error: code = NotFound desc = could not find container \"3dbd5f5ea1fe4f36a583bd46e4d9a3f1216ea4da2366eb7b963f5da37a0def87\": container with ID starting with 3dbd5f5ea1fe4f36a583bd46e4d9a3f1216ea4da2366eb7b963f5da37a0def87 not found: ID does not exist" Sep 30 12:40:28 crc kubenswrapper[4672]: I0930 12:40:28.261792 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Sep 30 12:40:28 crc kubenswrapper[4672]: I0930 12:40:28.316856 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d9c4035-a25a-460e-b9bc-dbe05f4e6272-config-data\") pod \"2d9c4035-a25a-460e-b9bc-dbe05f4e6272\" (UID: \"2d9c4035-a25a-460e-b9bc-dbe05f4e6272\") " Sep 30 12:40:28 crc kubenswrapper[4672]: I0930 12:40:28.316959 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2d9c4035-a25a-460e-b9bc-dbe05f4e6272-etc-machine-id\") pod \"2d9c4035-a25a-460e-b9bc-dbe05f4e6272\" (UID: \"2d9c4035-a25a-460e-b9bc-dbe05f4e6272\") " Sep 30 12:40:28 crc kubenswrapper[4672]: I0930 12:40:28.316986 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ztqzd\" (UniqueName: \"kubernetes.io/projected/2d9c4035-a25a-460e-b9bc-dbe05f4e6272-kube-api-access-ztqzd\") pod \"2d9c4035-a25a-460e-b9bc-dbe05f4e6272\" (UID: \"2d9c4035-a25a-460e-b9bc-dbe05f4e6272\") " Sep 30 12:40:28 crc kubenswrapper[4672]: I0930 12:40:28.317024 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d9c4035-a25a-460e-b9bc-dbe05f4e6272-combined-ca-bundle\") pod \"2d9c4035-a25a-460e-b9bc-dbe05f4e6272\" (UID: \"2d9c4035-a25a-460e-b9bc-dbe05f4e6272\") " Sep 30 12:40:28 crc kubenswrapper[4672]: I0930 12:40:28.317087 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2d9c4035-a25a-460e-b9bc-dbe05f4e6272-scripts\") pod \"2d9c4035-a25a-460e-b9bc-dbe05f4e6272\" (UID: \"2d9c4035-a25a-460e-b9bc-dbe05f4e6272\") " Sep 30 12:40:28 crc kubenswrapper[4672]: I0930 12:40:28.317155 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2d9c4035-a25a-460e-b9bc-dbe05f4e6272-config-data-custom\") pod \"2d9c4035-a25a-460e-b9bc-dbe05f4e6272\" (UID: \"2d9c4035-a25a-460e-b9bc-dbe05f4e6272\") " Sep 30 12:40:28 crc kubenswrapper[4672]: I0930 12:40:28.317517 4672 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2d9c4035-a25a-460e-b9bc-dbe05f4e6272-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "2d9c4035-a25a-460e-b9bc-dbe05f4e6272" (UID: "2d9c4035-a25a-460e-b9bc-dbe05f4e6272"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 12:40:28 crc kubenswrapper[4672]: I0930 12:40:28.317948 4672 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2d9c4035-a25a-460e-b9bc-dbe05f4e6272-etc-machine-id\") on node \"crc\" DevicePath \"\"" Sep 30 12:40:28 crc kubenswrapper[4672]: I0930 12:40:28.322397 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d9c4035-a25a-460e-b9bc-dbe05f4e6272-scripts" (OuterVolumeSpecName: "scripts") pod "2d9c4035-a25a-460e-b9bc-dbe05f4e6272" (UID: "2d9c4035-a25a-460e-b9bc-dbe05f4e6272"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:40:28 crc kubenswrapper[4672]: I0930 12:40:28.323423 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d9c4035-a25a-460e-b9bc-dbe05f4e6272-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "2d9c4035-a25a-460e-b9bc-dbe05f4e6272" (UID: "2d9c4035-a25a-460e-b9bc-dbe05f4e6272"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:40:28 crc kubenswrapper[4672]: I0930 12:40:28.328754 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d9c4035-a25a-460e-b9bc-dbe05f4e6272-kube-api-access-ztqzd" (OuterVolumeSpecName: "kube-api-access-ztqzd") pod "2d9c4035-a25a-460e-b9bc-dbe05f4e6272" (UID: "2d9c4035-a25a-460e-b9bc-dbe05f4e6272"). InnerVolumeSpecName "kube-api-access-ztqzd". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 12:40:28 crc kubenswrapper[4672]: I0930 12:40:28.392535 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d9c4035-a25a-460e-b9bc-dbe05f4e6272-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2d9c4035-a25a-460e-b9bc-dbe05f4e6272" (UID: "2d9c4035-a25a-460e-b9bc-dbe05f4e6272"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:40:28 crc kubenswrapper[4672]: I0930 12:40:28.419575 4672 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2d9c4035-a25a-460e-b9bc-dbe05f4e6272-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 12:40:28 crc kubenswrapper[4672]: I0930 12:40:28.419600 4672 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2d9c4035-a25a-460e-b9bc-dbe05f4e6272-config-data-custom\") on node \"crc\" DevicePath \"\"" Sep 30 12:40:28 crc kubenswrapper[4672]: I0930 12:40:28.419610 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ztqzd\" (UniqueName: \"kubernetes.io/projected/2d9c4035-a25a-460e-b9bc-dbe05f4e6272-kube-api-access-ztqzd\") on node \"crc\" DevicePath \"\"" Sep 30 12:40:28 crc kubenswrapper[4672]: I0930 12:40:28.419621 4672 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d9c4035-a25a-460e-b9bc-dbe05f4e6272-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 12:40:28 crc kubenswrapper[4672]: I0930 12:40:28.425318 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d9c4035-a25a-460e-b9bc-dbe05f4e6272-config-data" (OuterVolumeSpecName: "config-data") pod "2d9c4035-a25a-460e-b9bc-dbe05f4e6272" (UID: "2d9c4035-a25a-460e-b9bc-dbe05f4e6272"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:40:28 crc kubenswrapper[4672]: I0930 12:40:28.468370 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"28f655e1-08b3-4618-8864-2020e883f99c","Type":"ContainerStarted","Data":"64e71f186f29c02b5e65e9ab3ad6ebd3c6ed4d6522fa1dfdb603e2d0c92624b6"} Sep 30 12:40:28 crc kubenswrapper[4672]: I0930 12:40:28.472742 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"2d9c4035-a25a-460e-b9bc-dbe05f4e6272","Type":"ContainerDied","Data":"6c43048a5110bae2136a2222df43e25a515991bbcbcf3c4fb05a4559da15b10c"} Sep 30 12:40:28 crc kubenswrapper[4672]: I0930 12:40:28.472799 4672 scope.go:117] "RemoveContainer" containerID="f0fb977e5bc59e0ccbfef62faf5cba5e922cfa4b41bff63f473227ebe98a893c" Sep 30 12:40:28 crc kubenswrapper[4672]: I0930 12:40:28.472914 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Sep 30 12:40:28 crc kubenswrapper[4672]: I0930 12:40:28.487662 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.329343016 podStartE2EDuration="20.487643455s" podCreationTimestamp="2025-09-30 12:40:08 +0000 UTC" firstStartedPulling="2025-09-30 12:40:09.839410806 +0000 UTC m=+1101.108648452" lastFinishedPulling="2025-09-30 12:40:27.997711245 +0000 UTC m=+1119.266948891" observedRunningTime="2025-09-30 12:40:28.481826848 +0000 UTC m=+1119.751064494" watchObservedRunningTime="2025-09-30 12:40:28.487643455 +0000 UTC m=+1119.756881101" Sep 30 12:40:28 crc kubenswrapper[4672]: I0930 12:40:28.510770 4672 scope.go:117] "RemoveContainer" containerID="9989d443b0999bfc366753ecd2d90f19bed8c71eea22b903b90fc82977cb5434" Sep 30 12:40:28 crc kubenswrapper[4672]: I0930 12:40:28.513560 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Sep 30 12:40:28 crc kubenswrapper[4672]: I0930 12:40:28.531931 4672 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d9c4035-a25a-460e-b9bc-dbe05f4e6272-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 12:40:28 crc kubenswrapper[4672]: I0930 12:40:28.533294 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Sep 30 12:40:28 crc kubenswrapper[4672]: I0930 12:40:28.546288 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Sep 30 12:40:28 crc kubenswrapper[4672]: E0930 12:40:28.546733 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d9c4035-a25a-460e-b9bc-dbe05f4e6272" containerName="cinder-scheduler" Sep 30 12:40:28 crc kubenswrapper[4672]: I0930 12:40:28.546751 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d9c4035-a25a-460e-b9bc-dbe05f4e6272" containerName="cinder-scheduler" Sep 30 12:40:28 crc kubenswrapper[4672]: E0930 12:40:28.546777 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d9c4035-a25a-460e-b9bc-dbe05f4e6272" containerName="probe" Sep 30 12:40:28 crc kubenswrapper[4672]: I0930 12:40:28.546784 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d9c4035-a25a-460e-b9bc-dbe05f4e6272" containerName="probe" Sep 30 12:40:28 crc kubenswrapper[4672]: E0930 12:40:28.546806 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7998babe-964f-4ea9-a7a5-c11d5c7a6912" containerName="init" Sep 30 12:40:28 crc kubenswrapper[4672]: I0930 12:40:28.546812 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="7998babe-964f-4ea9-a7a5-c11d5c7a6912" containerName="init" Sep 30 12:40:28 crc kubenswrapper[4672]: E0930 12:40:28.546826 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7998babe-964f-4ea9-a7a5-c11d5c7a6912" containerName="dnsmasq-dns" Sep 30 12:40:28 crc kubenswrapper[4672]: I0930 12:40:28.546832 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="7998babe-964f-4ea9-a7a5-c11d5c7a6912" containerName="dnsmasq-dns" Sep 30 12:40:28 crc kubenswrapper[4672]: I0930 12:40:28.546997 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d9c4035-a25a-460e-b9bc-dbe05f4e6272" containerName="probe" Sep 30 12:40:28 crc kubenswrapper[4672]: I0930 12:40:28.547016 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="7998babe-964f-4ea9-a7a5-c11d5c7a6912" containerName="dnsmasq-dns" Sep 30 12:40:28 crc kubenswrapper[4672]: I0930 
12:40:28.547038 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d9c4035-a25a-460e-b9bc-dbe05f4e6272" containerName="cinder-scheduler" Sep 30 12:40:28 crc kubenswrapper[4672]: I0930 12:40:28.548094 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Sep 30 12:40:28 crc kubenswrapper[4672]: I0930 12:40:28.555337 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Sep 30 12:40:28 crc kubenswrapper[4672]: I0930 12:40:28.556664 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Sep 30 12:40:28 crc kubenswrapper[4672]: I0930 12:40:28.569843 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-56684fbfb-t69x4" Sep 30 12:40:28 crc kubenswrapper[4672]: I0930 12:40:28.650066 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3140cbea-70fb-4d82-90d3-fa12c43fcf76-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"3140cbea-70fb-4d82-90d3-fa12c43fcf76\") " pod="openstack/cinder-scheduler-0" Sep 30 12:40:28 crc kubenswrapper[4672]: I0930 12:40:28.650149 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3140cbea-70fb-4d82-90d3-fa12c43fcf76-config-data\") pod \"cinder-scheduler-0\" (UID: \"3140cbea-70fb-4d82-90d3-fa12c43fcf76\") " pod="openstack/cinder-scheduler-0" Sep 30 12:40:28 crc kubenswrapper[4672]: I0930 12:40:28.650255 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3140cbea-70fb-4d82-90d3-fa12c43fcf76-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"3140cbea-70fb-4d82-90d3-fa12c43fcf76\") " pod="openstack/cinder-scheduler-0" Sep 30 12:40:28 crc kubenswrapper[4672]: I0930 12:40:28.650312 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3140cbea-70fb-4d82-90d3-fa12c43fcf76-scripts\") pod \"cinder-scheduler-0\" (UID: \"3140cbea-70fb-4d82-90d3-fa12c43fcf76\") " pod="openstack/cinder-scheduler-0" Sep 30 12:40:28 crc kubenswrapper[4672]: I0930 12:40:28.650359 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xj884\" (UniqueName: \"kubernetes.io/projected/3140cbea-70fb-4d82-90d3-fa12c43fcf76-kube-api-access-xj884\") pod \"cinder-scheduler-0\" (UID: \"3140cbea-70fb-4d82-90d3-fa12c43fcf76\") " pod="openstack/cinder-scheduler-0" Sep 30 12:40:28 crc kubenswrapper[4672]: I0930 12:40:28.650426 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3140cbea-70fb-4d82-90d3-fa12c43fcf76-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"3140cbea-70fb-4d82-90d3-fa12c43fcf76\") " pod="openstack/cinder-scheduler-0" Sep 30 12:40:28 crc kubenswrapper[4672]: I0930 12:40:28.753503 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3140cbea-70fb-4d82-90d3-fa12c43fcf76-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"3140cbea-70fb-4d82-90d3-fa12c43fcf76\") " pod="openstack/cinder-scheduler-0" Sep 30 12:40:28 crc 
kubenswrapper[4672]: I0930 12:40:28.753597 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3140cbea-70fb-4d82-90d3-fa12c43fcf76-config-data\") pod \"cinder-scheduler-0\" (UID: \"3140cbea-70fb-4d82-90d3-fa12c43fcf76\") " pod="openstack/cinder-scheduler-0" Sep 30 12:40:28 crc kubenswrapper[4672]: I0930 12:40:28.753678 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3140cbea-70fb-4d82-90d3-fa12c43fcf76-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"3140cbea-70fb-4d82-90d3-fa12c43fcf76\") " pod="openstack/cinder-scheduler-0" Sep 30 12:40:28 crc kubenswrapper[4672]: I0930 12:40:28.753718 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3140cbea-70fb-4d82-90d3-fa12c43fcf76-scripts\") pod \"cinder-scheduler-0\" (UID: \"3140cbea-70fb-4d82-90d3-fa12c43fcf76\") " pod="openstack/cinder-scheduler-0" Sep 30 12:40:28 crc kubenswrapper[4672]: I0930 12:40:28.753770 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xj884\" (UniqueName: \"kubernetes.io/projected/3140cbea-70fb-4d82-90d3-fa12c43fcf76-kube-api-access-xj884\") pod \"cinder-scheduler-0\" (UID: \"3140cbea-70fb-4d82-90d3-fa12c43fcf76\") " pod="openstack/cinder-scheduler-0" Sep 30 12:40:28 crc kubenswrapper[4672]: I0930 12:40:28.753836 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3140cbea-70fb-4d82-90d3-fa12c43fcf76-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"3140cbea-70fb-4d82-90d3-fa12c43fcf76\") " pod="openstack/cinder-scheduler-0" Sep 30 12:40:28 crc kubenswrapper[4672]: I0930 12:40:28.767670 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3140cbea-70fb-4d82-90d3-fa12c43fcf76-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"3140cbea-70fb-4d82-90d3-fa12c43fcf76\") " pod="openstack/cinder-scheduler-0" Sep 30 12:40:28 crc kubenswrapper[4672]: I0930 12:40:28.768088 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3140cbea-70fb-4d82-90d3-fa12c43fcf76-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"3140cbea-70fb-4d82-90d3-fa12c43fcf76\") " pod="openstack/cinder-scheduler-0" Sep 30 12:40:28 crc kubenswrapper[4672]: I0930 12:40:28.781173 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3140cbea-70fb-4d82-90d3-fa12c43fcf76-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"3140cbea-70fb-4d82-90d3-fa12c43fcf76\") " pod="openstack/cinder-scheduler-0" Sep 30 12:40:28 crc kubenswrapper[4672]: I0930 12:40:28.782639 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3140cbea-70fb-4d82-90d3-fa12c43fcf76-scripts\") pod \"cinder-scheduler-0\" (UID: \"3140cbea-70fb-4d82-90d3-fa12c43fcf76\") " pod="openstack/cinder-scheduler-0" Sep 30 12:40:28 crc kubenswrapper[4672]: I0930 12:40:28.790145 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3140cbea-70fb-4d82-90d3-fa12c43fcf76-config-data\") pod \"cinder-scheduler-0\" (UID: 
\"3140cbea-70fb-4d82-90d3-fa12c43fcf76\") " pod="openstack/cinder-scheduler-0" Sep 30 12:40:28 crc kubenswrapper[4672]: I0930 12:40:28.799009 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xj884\" (UniqueName: \"kubernetes.io/projected/3140cbea-70fb-4d82-90d3-fa12c43fcf76-kube-api-access-xj884\") pod \"cinder-scheduler-0\" (UID: \"3140cbea-70fb-4d82-90d3-fa12c43fcf76\") " pod="openstack/cinder-scheduler-0" Sep 30 12:40:28 crc kubenswrapper[4672]: I0930 12:40:28.881373 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Sep 30 12:40:29 crc kubenswrapper[4672]: I0930 12:40:29.415877 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Sep 30 12:40:29 crc kubenswrapper[4672]: I0930 12:40:29.439311 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2d9c4035-a25a-460e-b9bc-dbe05f4e6272" path="/var/lib/kubelet/pods/2d9c4035-a25a-460e-b9bc-dbe05f4e6272/volumes" Sep 30 12:40:29 crc kubenswrapper[4672]: I0930 12:40:29.486596 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"3140cbea-70fb-4d82-90d3-fa12c43fcf76","Type":"ContainerStarted","Data":"11d2f4b326110d6b0d03ffe8e3235262e6a0aa98bc19dbeb07d26bc44b8774cd"} Sep 30 12:40:29 crc kubenswrapper[4672]: I0930 12:40:29.592554 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Sep 30 12:40:29 crc kubenswrapper[4672]: I0930 12:40:29.593889 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b0df982d-7077-4beb-b89b-71f80d0ff60c" containerName="ceilometer-central-agent" containerID="cri-o://554bc7b264b9b315010e9f7f81bcf8f5cf55d91fce9c72601b686b4343921408" gracePeriod=30 Sep 30 12:40:29 crc kubenswrapper[4672]: I0930 12:40:29.595079 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b0df982d-7077-4beb-b89b-71f80d0ff60c" containerName="proxy-httpd" containerID="cri-o://7487975c7a56fd498fa2817ac6806020d38cf78f7b968d700e38e3e4d42711ab" gracePeriod=30 Sep 30 12:40:29 crc kubenswrapper[4672]: I0930 12:40:29.596087 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b0df982d-7077-4beb-b89b-71f80d0ff60c" containerName="sg-core" containerID="cri-o://2a7a9bb780685179558ca557a5cc71758d032f6908cd011ecc193fd13b6ffe0b" gracePeriod=30 Sep 30 12:40:29 crc kubenswrapper[4672]: I0930 12:40:29.596204 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b0df982d-7077-4beb-b89b-71f80d0ff60c" containerName="ceilometer-notification-agent" containerID="cri-o://0ab65509d23060ca08255981abbcbc6d62f2ddb4cd4e56bda339ec7af2a291f2" gracePeriod=30 Sep 30 12:40:29 crc kubenswrapper[4672]: I0930 12:40:29.915794 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Sep 30 12:40:30 crc kubenswrapper[4672]: I0930 12:40:30.500088 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"3140cbea-70fb-4d82-90d3-fa12c43fcf76","Type":"ContainerStarted","Data":"56fc7da0738ba826e31ecb87aa261a5832a62e0efd391e5958e2737c7c263a87"} Sep 30 12:40:30 crc kubenswrapper[4672]: I0930 12:40:30.512786 4672 generic.go:334] "Generic (PLEG): container finished" podID="b0df982d-7077-4beb-b89b-71f80d0ff60c" 
containerID="7487975c7a56fd498fa2817ac6806020d38cf78f7b968d700e38e3e4d42711ab" exitCode=0 Sep 30 12:40:30 crc kubenswrapper[4672]: I0930 12:40:30.512818 4672 generic.go:334] "Generic (PLEG): container finished" podID="b0df982d-7077-4beb-b89b-71f80d0ff60c" containerID="2a7a9bb780685179558ca557a5cc71758d032f6908cd011ecc193fd13b6ffe0b" exitCode=2 Sep 30 12:40:30 crc kubenswrapper[4672]: I0930 12:40:30.512827 4672 generic.go:334] "Generic (PLEG): container finished" podID="b0df982d-7077-4beb-b89b-71f80d0ff60c" containerID="554bc7b264b9b315010e9f7f81bcf8f5cf55d91fce9c72601b686b4343921408" exitCode=0 Sep 30 12:40:30 crc kubenswrapper[4672]: I0930 12:40:30.512846 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b0df982d-7077-4beb-b89b-71f80d0ff60c","Type":"ContainerDied","Data":"7487975c7a56fd498fa2817ac6806020d38cf78f7b968d700e38e3e4d42711ab"} Sep 30 12:40:30 crc kubenswrapper[4672]: I0930 12:40:30.512873 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b0df982d-7077-4beb-b89b-71f80d0ff60c","Type":"ContainerDied","Data":"2a7a9bb780685179558ca557a5cc71758d032f6908cd011ecc193fd13b6ffe0b"} Sep 30 12:40:30 crc kubenswrapper[4672]: I0930 12:40:30.512883 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b0df982d-7077-4beb-b89b-71f80d0ff60c","Type":"ContainerDied","Data":"554bc7b264b9b315010e9f7f81bcf8f5cf55d91fce9c72601b686b4343921408"} Sep 30 12:40:31 crc kubenswrapper[4672]: I0930 12:40:31.310820 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-6b7c4888f-v7vn2" Sep 30 12:40:31 crc kubenswrapper[4672]: I0930 12:40:31.385304 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-56684fbfb-t69x4"] Sep 30 12:40:31 crc kubenswrapper[4672]: I0930 12:40:31.385570 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-56684fbfb-t69x4" podUID="6d32cb40-920a-4b27-bf27-9362601aabae" containerName="neutron-api" containerID="cri-o://15fdddb0b77d09b83380a6be6077d802c2a049b3ac07e92a42275e8023b517ab" gracePeriod=30 Sep 30 12:40:31 crc kubenswrapper[4672]: I0930 12:40:31.386660 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-56684fbfb-t69x4" podUID="6d32cb40-920a-4b27-bf27-9362601aabae" containerName="neutron-httpd" containerID="cri-o://eedcd0739a4165cbc9d83126ec509cfa0997545c55c7999884f73e45d31d62e0" gracePeriod=30 Sep 30 12:40:31 crc kubenswrapper[4672]: I0930 12:40:31.526933 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"3140cbea-70fb-4d82-90d3-fa12c43fcf76","Type":"ContainerStarted","Data":"063f4bb1676236bc29289dfc18140a85db175d6608fd32c254f0385ddb9fae2b"} Sep 30 12:40:31 crc kubenswrapper[4672]: I0930 12:40:31.556945 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.556921307 podStartE2EDuration="3.556921307s" podCreationTimestamp="2025-09-30 12:40:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 12:40:31.552972598 +0000 UTC m=+1122.822210274" watchObservedRunningTime="2025-09-30 12:40:31.556921307 +0000 UTC m=+1122.826158963" Sep 30 12:40:32 crc kubenswrapper[4672]: I0930 12:40:32.051315 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/glance-default-internal-api-0"] Sep 30 12:40:32 crc kubenswrapper[4672]: I0930 12:40:32.051627 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="74285663-1d6c-4a5f-8fe4-5406010f7ad7" containerName="glance-log" containerID="cri-o://d74e8f999400e418647fed7fb090090cc62f2a029e4d018ea2767558c4255785" gracePeriod=30 Sep 30 12:40:32 crc kubenswrapper[4672]: I0930 12:40:32.051728 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="74285663-1d6c-4a5f-8fe4-5406010f7ad7" containerName="glance-httpd" containerID="cri-o://fc33fa8c4b6ade8c830adbda8d589972a4d69532494b8c12e5a44526c2b83487" gracePeriod=30 Sep 30 12:40:32 crc kubenswrapper[4672]: I0930 12:40:32.537839 4672 generic.go:334] "Generic (PLEG): container finished" podID="6d32cb40-920a-4b27-bf27-9362601aabae" containerID="eedcd0739a4165cbc9d83126ec509cfa0997545c55c7999884f73e45d31d62e0" exitCode=0 Sep 30 12:40:32 crc kubenswrapper[4672]: I0930 12:40:32.537887 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-56684fbfb-t69x4" event={"ID":"6d32cb40-920a-4b27-bf27-9362601aabae","Type":"ContainerDied","Data":"eedcd0739a4165cbc9d83126ec509cfa0997545c55c7999884f73e45d31d62e0"} Sep 30 12:40:32 crc kubenswrapper[4672]: I0930 12:40:32.540666 4672 generic.go:334] "Generic (PLEG): container finished" podID="74285663-1d6c-4a5f-8fe4-5406010f7ad7" containerID="d74e8f999400e418647fed7fb090090cc62f2a029e4d018ea2767558c4255785" exitCode=143 Sep 30 12:40:32 crc kubenswrapper[4672]: I0930 12:40:32.540723 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"74285663-1d6c-4a5f-8fe4-5406010f7ad7","Type":"ContainerDied","Data":"d74e8f999400e418647fed7fb090090cc62f2a029e4d018ea2767558c4255785"} Sep 30 12:40:33 crc kubenswrapper[4672]: I0930 12:40:33.883087 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Sep 30 12:40:34 crc kubenswrapper[4672]: I0930 12:40:34.365529 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Sep 30 12:40:34 crc kubenswrapper[4672]: I0930 12:40:34.371762 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Sep 30 12:40:34 crc kubenswrapper[4672]: I0930 12:40:34.417620 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"74285663-1d6c-4a5f-8fe4-5406010f7ad7\" (UID: \"74285663-1d6c-4a5f-8fe4-5406010f7ad7\") " Sep 30 12:40:34 crc kubenswrapper[4672]: I0930 12:40:34.417660 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74285663-1d6c-4a5f-8fe4-5406010f7ad7-config-data\") pod \"74285663-1d6c-4a5f-8fe4-5406010f7ad7\" (UID: \"74285663-1d6c-4a5f-8fe4-5406010f7ad7\") " Sep 30 12:40:34 crc kubenswrapper[4672]: I0930 12:40:34.417810 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/74285663-1d6c-4a5f-8fe4-5406010f7ad7-httpd-run\") pod \"74285663-1d6c-4a5f-8fe4-5406010f7ad7\" (UID: \"74285663-1d6c-4a5f-8fe4-5406010f7ad7\") " Sep 30 12:40:34 crc kubenswrapper[4672]: I0930 12:40:34.417878 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/74285663-1d6c-4a5f-8fe4-5406010f7ad7-internal-tls-certs\") pod \"74285663-1d6c-4a5f-8fe4-5406010f7ad7\" (UID: \"74285663-1d6c-4a5f-8fe4-5406010f7ad7\") " Sep 30 12:40:34 crc kubenswrapper[4672]: I0930 12:40:34.417925 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74285663-1d6c-4a5f-8fe4-5406010f7ad7-combined-ca-bundle\") pod \"74285663-1d6c-4a5f-8fe4-5406010f7ad7\" (UID: \"74285663-1d6c-4a5f-8fe4-5406010f7ad7\") " Sep 30 12:40:34 crc kubenswrapper[4672]: I0930 12:40:34.417955 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/74285663-1d6c-4a5f-8fe4-5406010f7ad7-scripts\") pod \"74285663-1d6c-4a5f-8fe4-5406010f7ad7\" (UID: \"74285663-1d6c-4a5f-8fe4-5406010f7ad7\") " Sep 30 12:40:34 crc kubenswrapper[4672]: I0930 12:40:34.418007 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/74285663-1d6c-4a5f-8fe4-5406010f7ad7-logs\") pod \"74285663-1d6c-4a5f-8fe4-5406010f7ad7\" (UID: \"74285663-1d6c-4a5f-8fe4-5406010f7ad7\") " Sep 30 12:40:34 crc kubenswrapper[4672]: I0930 12:40:34.418026 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tkwxb\" (UniqueName: \"kubernetes.io/projected/74285663-1d6c-4a5f-8fe4-5406010f7ad7-kube-api-access-tkwxb\") pod \"74285663-1d6c-4a5f-8fe4-5406010f7ad7\" (UID: \"74285663-1d6c-4a5f-8fe4-5406010f7ad7\") " Sep 30 12:40:34 crc kubenswrapper[4672]: I0930 12:40:34.424157 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/74285663-1d6c-4a5f-8fe4-5406010f7ad7-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "74285663-1d6c-4a5f-8fe4-5406010f7ad7" (UID: "74285663-1d6c-4a5f-8fe4-5406010f7ad7"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 12:40:34 crc kubenswrapper[4672]: I0930 12:40:34.427904 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/74285663-1d6c-4a5f-8fe4-5406010f7ad7-logs" (OuterVolumeSpecName: "logs") pod "74285663-1d6c-4a5f-8fe4-5406010f7ad7" (UID: "74285663-1d6c-4a5f-8fe4-5406010f7ad7"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 12:40:34 crc kubenswrapper[4672]: I0930 12:40:34.433013 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "glance") pod "74285663-1d6c-4a5f-8fe4-5406010f7ad7" (UID: "74285663-1d6c-4a5f-8fe4-5406010f7ad7"). InnerVolumeSpecName "local-storage08-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Sep 30 12:40:34 crc kubenswrapper[4672]: I0930 12:40:34.437969 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/74285663-1d6c-4a5f-8fe4-5406010f7ad7-kube-api-access-tkwxb" (OuterVolumeSpecName: "kube-api-access-tkwxb") pod "74285663-1d6c-4a5f-8fe4-5406010f7ad7" (UID: "74285663-1d6c-4a5f-8fe4-5406010f7ad7"). InnerVolumeSpecName "kube-api-access-tkwxb". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 12:40:34 crc kubenswrapper[4672]: I0930 12:40:34.479046 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74285663-1d6c-4a5f-8fe4-5406010f7ad7-scripts" (OuterVolumeSpecName: "scripts") pod "74285663-1d6c-4a5f-8fe4-5406010f7ad7" (UID: "74285663-1d6c-4a5f-8fe4-5406010f7ad7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:40:34 crc kubenswrapper[4672]: E0930 12:40:34.498890 4672 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/74285663-1d6c-4a5f-8fe4-5406010f7ad7-internal-tls-certs podName:74285663-1d6c-4a5f-8fe4-5406010f7ad7 nodeName:}" failed. No retries permitted until 2025-09-30 12:40:34.998860735 +0000 UTC m=+1126.268098381 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "internal-tls-certs" (UniqueName: "kubernetes.io/secret/74285663-1d6c-4a5f-8fe4-5406010f7ad7-internal-tls-certs") pod "74285663-1d6c-4a5f-8fe4-5406010f7ad7" (UID: "74285663-1d6c-4a5f-8fe4-5406010f7ad7") : error deleting /var/lib/kubelet/pods/74285663-1d6c-4a5f-8fe4-5406010f7ad7/volume-subpaths: remove /var/lib/kubelet/pods/74285663-1d6c-4a5f-8fe4-5406010f7ad7/volume-subpaths: no such file or directory Sep 30 12:40:34 crc kubenswrapper[4672]: I0930 12:40:34.501420 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74285663-1d6c-4a5f-8fe4-5406010f7ad7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "74285663-1d6c-4a5f-8fe4-5406010f7ad7" (UID: "74285663-1d6c-4a5f-8fe4-5406010f7ad7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:40:34 crc kubenswrapper[4672]: I0930 12:40:34.501858 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74285663-1d6c-4a5f-8fe4-5406010f7ad7-config-data" (OuterVolumeSpecName: "config-data") pod "74285663-1d6c-4a5f-8fe4-5406010f7ad7" (UID: "74285663-1d6c-4a5f-8fe4-5406010f7ad7"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:40:34 crc kubenswrapper[4672]: I0930 12:40:34.519446 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b0df982d-7077-4beb-b89b-71f80d0ff60c-run-httpd\") pod \"b0df982d-7077-4beb-b89b-71f80d0ff60c\" (UID: \"b0df982d-7077-4beb-b89b-71f80d0ff60c\") " Sep 30 12:40:34 crc kubenswrapper[4672]: I0930 12:40:34.519496 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gk57k\" (UniqueName: \"kubernetes.io/projected/b0df982d-7077-4beb-b89b-71f80d0ff60c-kube-api-access-gk57k\") pod \"b0df982d-7077-4beb-b89b-71f80d0ff60c\" (UID: \"b0df982d-7077-4beb-b89b-71f80d0ff60c\") " Sep 30 12:40:34 crc kubenswrapper[4672]: I0930 12:40:34.519534 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b0df982d-7077-4beb-b89b-71f80d0ff60c-scripts\") pod \"b0df982d-7077-4beb-b89b-71f80d0ff60c\" (UID: \"b0df982d-7077-4beb-b89b-71f80d0ff60c\") " Sep 30 12:40:34 crc kubenswrapper[4672]: I0930 12:40:34.519568 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0df982d-7077-4beb-b89b-71f80d0ff60c-config-data\") pod \"b0df982d-7077-4beb-b89b-71f80d0ff60c\" (UID: \"b0df982d-7077-4beb-b89b-71f80d0ff60c\") " Sep 30 12:40:34 crc kubenswrapper[4672]: I0930 12:40:34.519609 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0df982d-7077-4beb-b89b-71f80d0ff60c-combined-ca-bundle\") pod \"b0df982d-7077-4beb-b89b-71f80d0ff60c\" (UID: \"b0df982d-7077-4beb-b89b-71f80d0ff60c\") " Sep 30 12:40:34 crc kubenswrapper[4672]: I0930 12:40:34.519735 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b0df982d-7077-4beb-b89b-71f80d0ff60c-sg-core-conf-yaml\") pod \"b0df982d-7077-4beb-b89b-71f80d0ff60c\" (UID: \"b0df982d-7077-4beb-b89b-71f80d0ff60c\") " Sep 30 12:40:34 crc kubenswrapper[4672]: I0930 12:40:34.519780 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b0df982d-7077-4beb-b89b-71f80d0ff60c-log-httpd\") pod \"b0df982d-7077-4beb-b89b-71f80d0ff60c\" (UID: \"b0df982d-7077-4beb-b89b-71f80d0ff60c\") " Sep 30 12:40:34 crc kubenswrapper[4672]: I0930 12:40:34.520167 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b0df982d-7077-4beb-b89b-71f80d0ff60c-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "b0df982d-7077-4beb-b89b-71f80d0ff60c" (UID: "b0df982d-7077-4beb-b89b-71f80d0ff60c"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 12:40:34 crc kubenswrapper[4672]: I0930 12:40:34.520543 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b0df982d-7077-4beb-b89b-71f80d0ff60c-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "b0df982d-7077-4beb-b89b-71f80d0ff60c" (UID: "b0df982d-7077-4beb-b89b-71f80d0ff60c"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 12:40:34 crc kubenswrapper[4672]: I0930 12:40:34.520883 4672 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/74285663-1d6c-4a5f-8fe4-5406010f7ad7-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 12:40:34 crc kubenswrapper[4672]: I0930 12:40:34.520901 4672 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/74285663-1d6c-4a5f-8fe4-5406010f7ad7-logs\") on node \"crc\" DevicePath \"\"" Sep 30 12:40:34 crc kubenswrapper[4672]: I0930 12:40:34.520912 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tkwxb\" (UniqueName: \"kubernetes.io/projected/74285663-1d6c-4a5f-8fe4-5406010f7ad7-kube-api-access-tkwxb\") on node \"crc\" DevicePath \"\"" Sep 30 12:40:34 crc kubenswrapper[4672]: I0930 12:40:34.520924 4672 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b0df982d-7077-4beb-b89b-71f80d0ff60c-run-httpd\") on node \"crc\" DevicePath \"\"" Sep 30 12:40:34 crc kubenswrapper[4672]: I0930 12:40:34.520947 4672 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" " Sep 30 12:40:34 crc kubenswrapper[4672]: I0930 12:40:34.520957 4672 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74285663-1d6c-4a5f-8fe4-5406010f7ad7-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 12:40:34 crc kubenswrapper[4672]: I0930 12:40:34.520965 4672 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/74285663-1d6c-4a5f-8fe4-5406010f7ad7-httpd-run\") on node \"crc\" DevicePath \"\"" Sep 30 12:40:34 crc kubenswrapper[4672]: I0930 12:40:34.520975 4672 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b0df982d-7077-4beb-b89b-71f80d0ff60c-log-httpd\") on node \"crc\" DevicePath \"\"" Sep 30 12:40:34 crc kubenswrapper[4672]: I0930 12:40:34.520985 4672 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74285663-1d6c-4a5f-8fe4-5406010f7ad7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 12:40:34 crc kubenswrapper[4672]: I0930 12:40:34.525381 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b0df982d-7077-4beb-b89b-71f80d0ff60c-kube-api-access-gk57k" (OuterVolumeSpecName: "kube-api-access-gk57k") pod "b0df982d-7077-4beb-b89b-71f80d0ff60c" (UID: "b0df982d-7077-4beb-b89b-71f80d0ff60c"). InnerVolumeSpecName "kube-api-access-gk57k". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 12:40:34 crc kubenswrapper[4672]: I0930 12:40:34.529310 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0df982d-7077-4beb-b89b-71f80d0ff60c-scripts" (OuterVolumeSpecName: "scripts") pod "b0df982d-7077-4beb-b89b-71f80d0ff60c" (UID: "b0df982d-7077-4beb-b89b-71f80d0ff60c"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:40:34 crc kubenswrapper[4672]: I0930 12:40:34.553501 4672 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc" Sep 30 12:40:34 crc kubenswrapper[4672]: I0930 12:40:34.554638 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0df982d-7077-4beb-b89b-71f80d0ff60c-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "b0df982d-7077-4beb-b89b-71f80d0ff60c" (UID: "b0df982d-7077-4beb-b89b-71f80d0ff60c"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:40:34 crc kubenswrapper[4672]: I0930 12:40:34.561426 4672 generic.go:334] "Generic (PLEG): container finished" podID="74285663-1d6c-4a5f-8fe4-5406010f7ad7" containerID="fc33fa8c4b6ade8c830adbda8d589972a4d69532494b8c12e5a44526c2b83487" exitCode=0 Sep 30 12:40:34 crc kubenswrapper[4672]: I0930 12:40:34.561489 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"74285663-1d6c-4a5f-8fe4-5406010f7ad7","Type":"ContainerDied","Data":"fc33fa8c4b6ade8c830adbda8d589972a4d69532494b8c12e5a44526c2b83487"} Sep 30 12:40:34 crc kubenswrapper[4672]: I0930 12:40:34.561519 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"74285663-1d6c-4a5f-8fe4-5406010f7ad7","Type":"ContainerDied","Data":"c4fa4b4697a7d4e559c9dfefb63c7925003f2cf08d57e152c3d0ba680a3a9338"} Sep 30 12:40:34 crc kubenswrapper[4672]: I0930 12:40:34.561537 4672 scope.go:117] "RemoveContainer" containerID="fc33fa8c4b6ade8c830adbda8d589972a4d69532494b8c12e5a44526c2b83487" Sep 30 12:40:34 crc kubenswrapper[4672]: I0930 12:40:34.561692 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Sep 30 12:40:34 crc kubenswrapper[4672]: I0930 12:40:34.568856 4672 generic.go:334] "Generic (PLEG): container finished" podID="b0df982d-7077-4beb-b89b-71f80d0ff60c" containerID="0ab65509d23060ca08255981abbcbc6d62f2ddb4cd4e56bda339ec7af2a291f2" exitCode=0 Sep 30 12:40:34 crc kubenswrapper[4672]: I0930 12:40:34.568908 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b0df982d-7077-4beb-b89b-71f80d0ff60c","Type":"ContainerDied","Data":"0ab65509d23060ca08255981abbcbc6d62f2ddb4cd4e56bda339ec7af2a291f2"} Sep 30 12:40:34 crc kubenswrapper[4672]: I0930 12:40:34.568918 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Sep 30 12:40:34 crc kubenswrapper[4672]: I0930 12:40:34.568941 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b0df982d-7077-4beb-b89b-71f80d0ff60c","Type":"ContainerDied","Data":"d82c9135e5af4d1cfe673c5992d3033b360a7c0544ef21b1503ac80cadd34c8a"} Sep 30 12:40:34 crc kubenswrapper[4672]: I0930 12:40:34.593775 4672 scope.go:117] "RemoveContainer" containerID="d74e8f999400e418647fed7fb090090cc62f2a029e4d018ea2767558c4255785" Sep 30 12:40:34 crc kubenswrapper[4672]: I0930 12:40:34.613727 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0df982d-7077-4beb-b89b-71f80d0ff60c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b0df982d-7077-4beb-b89b-71f80d0ff60c" (UID: "b0df982d-7077-4beb-b89b-71f80d0ff60c"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:40:34 crc kubenswrapper[4672]: I0930 12:40:34.623102 4672 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b0df982d-7077-4beb-b89b-71f80d0ff60c-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Sep 30 12:40:34 crc kubenswrapper[4672]: I0930 12:40:34.623138 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gk57k\" (UniqueName: \"kubernetes.io/projected/b0df982d-7077-4beb-b89b-71f80d0ff60c-kube-api-access-gk57k\") on node \"crc\" DevicePath \"\"" Sep 30 12:40:34 crc kubenswrapper[4672]: I0930 12:40:34.623154 4672 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\"" Sep 30 12:40:34 crc kubenswrapper[4672]: I0930 12:40:34.623171 4672 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b0df982d-7077-4beb-b89b-71f80d0ff60c-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 12:40:34 crc kubenswrapper[4672]: I0930 12:40:34.623183 4672 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0df982d-7077-4beb-b89b-71f80d0ff60c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 12:40:34 crc kubenswrapper[4672]: I0930 12:40:34.623377 4672 scope.go:117] "RemoveContainer" containerID="fc33fa8c4b6ade8c830adbda8d589972a4d69532494b8c12e5a44526c2b83487" Sep 30 12:40:34 crc kubenswrapper[4672]: E0930 12:40:34.624037 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fc33fa8c4b6ade8c830adbda8d589972a4d69532494b8c12e5a44526c2b83487\": container with ID starting with fc33fa8c4b6ade8c830adbda8d589972a4d69532494b8c12e5a44526c2b83487 not found: ID does not exist" containerID="fc33fa8c4b6ade8c830adbda8d589972a4d69532494b8c12e5a44526c2b83487" Sep 30 12:40:34 crc kubenswrapper[4672]: I0930 12:40:34.624068 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc33fa8c4b6ade8c830adbda8d589972a4d69532494b8c12e5a44526c2b83487"} err="failed to get container status \"fc33fa8c4b6ade8c830adbda8d589972a4d69532494b8c12e5a44526c2b83487\": rpc error: code = NotFound desc = could not find container \"fc33fa8c4b6ade8c830adbda8d589972a4d69532494b8c12e5a44526c2b83487\": container with ID starting with fc33fa8c4b6ade8c830adbda8d589972a4d69532494b8c12e5a44526c2b83487 not found: ID does not exist" Sep 30 12:40:34 crc kubenswrapper[4672]: I0930 12:40:34.624090 4672 scope.go:117] "RemoveContainer" containerID="d74e8f999400e418647fed7fb090090cc62f2a029e4d018ea2767558c4255785" Sep 30 12:40:34 crc kubenswrapper[4672]: E0930 12:40:34.625733 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d74e8f999400e418647fed7fb090090cc62f2a029e4d018ea2767558c4255785\": container with ID starting with d74e8f999400e418647fed7fb090090cc62f2a029e4d018ea2767558c4255785 not found: ID does not exist" containerID="d74e8f999400e418647fed7fb090090cc62f2a029e4d018ea2767558c4255785" Sep 30 12:40:34 crc kubenswrapper[4672]: I0930 12:40:34.625759 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d74e8f999400e418647fed7fb090090cc62f2a029e4d018ea2767558c4255785"} err="failed to get 
container status \"d74e8f999400e418647fed7fb090090cc62f2a029e4d018ea2767558c4255785\": rpc error: code = NotFound desc = could not find container \"d74e8f999400e418647fed7fb090090cc62f2a029e4d018ea2767558c4255785\": container with ID starting with d74e8f999400e418647fed7fb090090cc62f2a029e4d018ea2767558c4255785 not found: ID does not exist" Sep 30 12:40:34 crc kubenswrapper[4672]: I0930 12:40:34.625778 4672 scope.go:117] "RemoveContainer" containerID="7487975c7a56fd498fa2817ac6806020d38cf78f7b968d700e38e3e4d42711ab" Sep 30 12:40:34 crc kubenswrapper[4672]: I0930 12:40:34.638611 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0df982d-7077-4beb-b89b-71f80d0ff60c-config-data" (OuterVolumeSpecName: "config-data") pod "b0df982d-7077-4beb-b89b-71f80d0ff60c" (UID: "b0df982d-7077-4beb-b89b-71f80d0ff60c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:40:34 crc kubenswrapper[4672]: I0930 12:40:34.643956 4672 scope.go:117] "RemoveContainer" containerID="2a7a9bb780685179558ca557a5cc71758d032f6908cd011ecc193fd13b6ffe0b" Sep 30 12:40:34 crc kubenswrapper[4672]: I0930 12:40:34.663413 4672 scope.go:117] "RemoveContainer" containerID="0ab65509d23060ca08255981abbcbc6d62f2ddb4cd4e56bda339ec7af2a291f2" Sep 30 12:40:34 crc kubenswrapper[4672]: I0930 12:40:34.689465 4672 scope.go:117] "RemoveContainer" containerID="554bc7b264b9b315010e9f7f81bcf8f5cf55d91fce9c72601b686b4343921408" Sep 30 12:40:34 crc kubenswrapper[4672]: I0930 12:40:34.722988 4672 scope.go:117] "RemoveContainer" containerID="7487975c7a56fd498fa2817ac6806020d38cf78f7b968d700e38e3e4d42711ab" Sep 30 12:40:34 crc kubenswrapper[4672]: E0930 12:40:34.723563 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7487975c7a56fd498fa2817ac6806020d38cf78f7b968d700e38e3e4d42711ab\": container with ID starting with 7487975c7a56fd498fa2817ac6806020d38cf78f7b968d700e38e3e4d42711ab not found: ID does not exist" containerID="7487975c7a56fd498fa2817ac6806020d38cf78f7b968d700e38e3e4d42711ab" Sep 30 12:40:34 crc kubenswrapper[4672]: I0930 12:40:34.723590 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7487975c7a56fd498fa2817ac6806020d38cf78f7b968d700e38e3e4d42711ab"} err="failed to get container status \"7487975c7a56fd498fa2817ac6806020d38cf78f7b968d700e38e3e4d42711ab\": rpc error: code = NotFound desc = could not find container \"7487975c7a56fd498fa2817ac6806020d38cf78f7b968d700e38e3e4d42711ab\": container with ID starting with 7487975c7a56fd498fa2817ac6806020d38cf78f7b968d700e38e3e4d42711ab not found: ID does not exist" Sep 30 12:40:34 crc kubenswrapper[4672]: I0930 12:40:34.723612 4672 scope.go:117] "RemoveContainer" containerID="2a7a9bb780685179558ca557a5cc71758d032f6908cd011ecc193fd13b6ffe0b" Sep 30 12:40:34 crc kubenswrapper[4672]: E0930 12:40:34.723848 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2a7a9bb780685179558ca557a5cc71758d032f6908cd011ecc193fd13b6ffe0b\": container with ID starting with 2a7a9bb780685179558ca557a5cc71758d032f6908cd011ecc193fd13b6ffe0b not found: ID does not exist" containerID="2a7a9bb780685179558ca557a5cc71758d032f6908cd011ecc193fd13b6ffe0b" Sep 30 12:40:34 crc kubenswrapper[4672]: I0930 12:40:34.723867 4672 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"2a7a9bb780685179558ca557a5cc71758d032f6908cd011ecc193fd13b6ffe0b"} err="failed to get container status \"2a7a9bb780685179558ca557a5cc71758d032f6908cd011ecc193fd13b6ffe0b\": rpc error: code = NotFound desc = could not find container \"2a7a9bb780685179558ca557a5cc71758d032f6908cd011ecc193fd13b6ffe0b\": container with ID starting with 2a7a9bb780685179558ca557a5cc71758d032f6908cd011ecc193fd13b6ffe0b not found: ID does not exist" Sep 30 12:40:34 crc kubenswrapper[4672]: I0930 12:40:34.723879 4672 scope.go:117] "RemoveContainer" containerID="0ab65509d23060ca08255981abbcbc6d62f2ddb4cd4e56bda339ec7af2a291f2" Sep 30 12:40:34 crc kubenswrapper[4672]: E0930 12:40:34.724062 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0ab65509d23060ca08255981abbcbc6d62f2ddb4cd4e56bda339ec7af2a291f2\": container with ID starting with 0ab65509d23060ca08255981abbcbc6d62f2ddb4cd4e56bda339ec7af2a291f2 not found: ID does not exist" containerID="0ab65509d23060ca08255981abbcbc6d62f2ddb4cd4e56bda339ec7af2a291f2" Sep 30 12:40:34 crc kubenswrapper[4672]: I0930 12:40:34.724079 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ab65509d23060ca08255981abbcbc6d62f2ddb4cd4e56bda339ec7af2a291f2"} err="failed to get container status \"0ab65509d23060ca08255981abbcbc6d62f2ddb4cd4e56bda339ec7af2a291f2\": rpc error: code = NotFound desc = could not find container \"0ab65509d23060ca08255981abbcbc6d62f2ddb4cd4e56bda339ec7af2a291f2\": container with ID starting with 0ab65509d23060ca08255981abbcbc6d62f2ddb4cd4e56bda339ec7af2a291f2 not found: ID does not exist" Sep 30 12:40:34 crc kubenswrapper[4672]: I0930 12:40:34.724092 4672 scope.go:117] "RemoveContainer" containerID="554bc7b264b9b315010e9f7f81bcf8f5cf55d91fce9c72601b686b4343921408" Sep 30 12:40:34 crc kubenswrapper[4672]: E0930 12:40:34.724251 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"554bc7b264b9b315010e9f7f81bcf8f5cf55d91fce9c72601b686b4343921408\": container with ID starting with 554bc7b264b9b315010e9f7f81bcf8f5cf55d91fce9c72601b686b4343921408 not found: ID does not exist" containerID="554bc7b264b9b315010e9f7f81bcf8f5cf55d91fce9c72601b686b4343921408" Sep 30 12:40:34 crc kubenswrapper[4672]: I0930 12:40:34.724282 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"554bc7b264b9b315010e9f7f81bcf8f5cf55d91fce9c72601b686b4343921408"} err="failed to get container status \"554bc7b264b9b315010e9f7f81bcf8f5cf55d91fce9c72601b686b4343921408\": rpc error: code = NotFound desc = could not find container \"554bc7b264b9b315010e9f7f81bcf8f5cf55d91fce9c72601b686b4343921408\": container with ID starting with 554bc7b264b9b315010e9f7f81bcf8f5cf55d91fce9c72601b686b4343921408 not found: ID does not exist" Sep 30 12:40:34 crc kubenswrapper[4672]: I0930 12:40:34.724739 4672 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0df982d-7077-4beb-b89b-71f80d0ff60c-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 12:40:34 crc kubenswrapper[4672]: I0930 12:40:34.900673 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Sep 30 12:40:34 crc kubenswrapper[4672]: I0930 12:40:34.911646 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Sep 30 12:40:34 crc kubenswrapper[4672]: I0930 12:40:34.928410 
4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Sep 30 12:40:34 crc kubenswrapper[4672]: E0930 12:40:34.928825 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0df982d-7077-4beb-b89b-71f80d0ff60c" containerName="proxy-httpd" Sep 30 12:40:34 crc kubenswrapper[4672]: I0930 12:40:34.928846 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0df982d-7077-4beb-b89b-71f80d0ff60c" containerName="proxy-httpd" Sep 30 12:40:34 crc kubenswrapper[4672]: E0930 12:40:34.928866 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0df982d-7077-4beb-b89b-71f80d0ff60c" containerName="ceilometer-central-agent" Sep 30 12:40:34 crc kubenswrapper[4672]: I0930 12:40:34.928873 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0df982d-7077-4beb-b89b-71f80d0ff60c" containerName="ceilometer-central-agent" Sep 30 12:40:34 crc kubenswrapper[4672]: E0930 12:40:34.928890 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0df982d-7077-4beb-b89b-71f80d0ff60c" containerName="sg-core" Sep 30 12:40:34 crc kubenswrapper[4672]: I0930 12:40:34.928896 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0df982d-7077-4beb-b89b-71f80d0ff60c" containerName="sg-core" Sep 30 12:40:34 crc kubenswrapper[4672]: E0930 12:40:34.928913 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0df982d-7077-4beb-b89b-71f80d0ff60c" containerName="ceilometer-notification-agent" Sep 30 12:40:34 crc kubenswrapper[4672]: I0930 12:40:34.928921 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0df982d-7077-4beb-b89b-71f80d0ff60c" containerName="ceilometer-notification-agent" Sep 30 12:40:34 crc kubenswrapper[4672]: E0930 12:40:34.928943 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74285663-1d6c-4a5f-8fe4-5406010f7ad7" containerName="glance-log" Sep 30 12:40:34 crc kubenswrapper[4672]: I0930 12:40:34.928951 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="74285663-1d6c-4a5f-8fe4-5406010f7ad7" containerName="glance-log" Sep 30 12:40:34 crc kubenswrapper[4672]: E0930 12:40:34.928967 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74285663-1d6c-4a5f-8fe4-5406010f7ad7" containerName="glance-httpd" Sep 30 12:40:34 crc kubenswrapper[4672]: I0930 12:40:34.928972 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="74285663-1d6c-4a5f-8fe4-5406010f7ad7" containerName="glance-httpd" Sep 30 12:40:34 crc kubenswrapper[4672]: I0930 12:40:34.929173 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0df982d-7077-4beb-b89b-71f80d0ff60c" containerName="sg-core" Sep 30 12:40:34 crc kubenswrapper[4672]: I0930 12:40:34.929183 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="74285663-1d6c-4a5f-8fe4-5406010f7ad7" containerName="glance-log" Sep 30 12:40:34 crc kubenswrapper[4672]: I0930 12:40:34.929191 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0df982d-7077-4beb-b89b-71f80d0ff60c" containerName="ceilometer-central-agent" Sep 30 12:40:34 crc kubenswrapper[4672]: I0930 12:40:34.929203 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0df982d-7077-4beb-b89b-71f80d0ff60c" containerName="proxy-httpd" Sep 30 12:40:34 crc kubenswrapper[4672]: I0930 12:40:34.929212 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="74285663-1d6c-4a5f-8fe4-5406010f7ad7" containerName="glance-httpd" Sep 30 12:40:34 crc kubenswrapper[4672]: I0930 12:40:34.929220 4672 
memory_manager.go:354] "RemoveStaleState removing state" podUID="b0df982d-7077-4beb-b89b-71f80d0ff60c" containerName="ceilometer-notification-agent" Sep 30 12:40:34 crc kubenswrapper[4672]: I0930 12:40:34.931411 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Sep 30 12:40:34 crc kubenswrapper[4672]: I0930 12:40:34.933948 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Sep 30 12:40:34 crc kubenswrapper[4672]: I0930 12:40:34.935634 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Sep 30 12:40:34 crc kubenswrapper[4672]: I0930 12:40:34.947549 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Sep 30 12:40:35 crc kubenswrapper[4672]: I0930 12:40:35.036257 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/74285663-1d6c-4a5f-8fe4-5406010f7ad7-internal-tls-certs\") pod \"74285663-1d6c-4a5f-8fe4-5406010f7ad7\" (UID: \"74285663-1d6c-4a5f-8fe4-5406010f7ad7\") " Sep 30 12:40:35 crc kubenswrapper[4672]: I0930 12:40:35.036737 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c6a15e81-6934-4bba-818b-df43d93cbd0e-log-httpd\") pod \"ceilometer-0\" (UID: \"c6a15e81-6934-4bba-818b-df43d93cbd0e\") " pod="openstack/ceilometer-0" Sep 30 12:40:35 crc kubenswrapper[4672]: I0930 12:40:35.036793 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c6a15e81-6934-4bba-818b-df43d93cbd0e-run-httpd\") pod \"ceilometer-0\" (UID: \"c6a15e81-6934-4bba-818b-df43d93cbd0e\") " pod="openstack/ceilometer-0" Sep 30 12:40:35 crc kubenswrapper[4672]: I0930 12:40:35.036830 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c6a15e81-6934-4bba-818b-df43d93cbd0e-scripts\") pod \"ceilometer-0\" (UID: \"c6a15e81-6934-4bba-818b-df43d93cbd0e\") " pod="openstack/ceilometer-0" Sep 30 12:40:35 crc kubenswrapper[4672]: I0930 12:40:35.036907 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c6a15e81-6934-4bba-818b-df43d93cbd0e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c6a15e81-6934-4bba-818b-df43d93cbd0e\") " pod="openstack/ceilometer-0" Sep 30 12:40:35 crc kubenswrapper[4672]: I0930 12:40:35.036942 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6a15e81-6934-4bba-818b-df43d93cbd0e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c6a15e81-6934-4bba-818b-df43d93cbd0e\") " pod="openstack/ceilometer-0" Sep 30 12:40:35 crc kubenswrapper[4672]: I0930 12:40:35.037011 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7v6p2\" (UniqueName: \"kubernetes.io/projected/c6a15e81-6934-4bba-818b-df43d93cbd0e-kube-api-access-7v6p2\") pod \"ceilometer-0\" (UID: \"c6a15e81-6934-4bba-818b-df43d93cbd0e\") " pod="openstack/ceilometer-0" Sep 30 12:40:35 crc kubenswrapper[4672]: I0930 12:40:35.037044 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c6a15e81-6934-4bba-818b-df43d93cbd0e-config-data\") pod \"ceilometer-0\" (UID: \"c6a15e81-6934-4bba-818b-df43d93cbd0e\") " pod="openstack/ceilometer-0" Sep 30 12:40:35 crc kubenswrapper[4672]: I0930 12:40:35.050887 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74285663-1d6c-4a5f-8fe4-5406010f7ad7-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "74285663-1d6c-4a5f-8fe4-5406010f7ad7" (UID: "74285663-1d6c-4a5f-8fe4-5406010f7ad7"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:40:35 crc kubenswrapper[4672]: I0930 12:40:35.138455 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c6a15e81-6934-4bba-818b-df43d93cbd0e-log-httpd\") pod \"ceilometer-0\" (UID: \"c6a15e81-6934-4bba-818b-df43d93cbd0e\") " pod="openstack/ceilometer-0" Sep 30 12:40:35 crc kubenswrapper[4672]: I0930 12:40:35.138502 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c6a15e81-6934-4bba-818b-df43d93cbd0e-run-httpd\") pod \"ceilometer-0\" (UID: \"c6a15e81-6934-4bba-818b-df43d93cbd0e\") " pod="openstack/ceilometer-0" Sep 30 12:40:35 crc kubenswrapper[4672]: I0930 12:40:35.138527 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c6a15e81-6934-4bba-818b-df43d93cbd0e-scripts\") pod \"ceilometer-0\" (UID: \"c6a15e81-6934-4bba-818b-df43d93cbd0e\") " pod="openstack/ceilometer-0" Sep 30 12:40:35 crc kubenswrapper[4672]: I0930 12:40:35.138556 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c6a15e81-6934-4bba-818b-df43d93cbd0e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c6a15e81-6934-4bba-818b-df43d93cbd0e\") " pod="openstack/ceilometer-0" Sep 30 12:40:35 crc kubenswrapper[4672]: I0930 12:40:35.138579 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6a15e81-6934-4bba-818b-df43d93cbd0e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c6a15e81-6934-4bba-818b-df43d93cbd0e\") " pod="openstack/ceilometer-0" Sep 30 12:40:35 crc kubenswrapper[4672]: I0930 12:40:35.138615 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7v6p2\" (UniqueName: \"kubernetes.io/projected/c6a15e81-6934-4bba-818b-df43d93cbd0e-kube-api-access-7v6p2\") pod \"ceilometer-0\" (UID: \"c6a15e81-6934-4bba-818b-df43d93cbd0e\") " pod="openstack/ceilometer-0" Sep 30 12:40:35 crc kubenswrapper[4672]: I0930 12:40:35.138634 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c6a15e81-6934-4bba-818b-df43d93cbd0e-config-data\") pod \"ceilometer-0\" (UID: \"c6a15e81-6934-4bba-818b-df43d93cbd0e\") " pod="openstack/ceilometer-0" Sep 30 12:40:35 crc kubenswrapper[4672]: I0930 12:40:35.138723 4672 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/74285663-1d6c-4a5f-8fe4-5406010f7ad7-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Sep 30 12:40:35 crc kubenswrapper[4672]: I0930 12:40:35.139642 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c6a15e81-6934-4bba-818b-df43d93cbd0e-log-httpd\") pod \"ceilometer-0\" (UID: \"c6a15e81-6934-4bba-818b-df43d93cbd0e\") " pod="openstack/ceilometer-0" Sep 30 12:40:35 crc kubenswrapper[4672]: I0930 12:40:35.139825 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c6a15e81-6934-4bba-818b-df43d93cbd0e-run-httpd\") pod \"ceilometer-0\" (UID: \"c6a15e81-6934-4bba-818b-df43d93cbd0e\") " pod="openstack/ceilometer-0" Sep 30 12:40:35 crc kubenswrapper[4672]: I0930 12:40:35.143862 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c6a15e81-6934-4bba-818b-df43d93cbd0e-scripts\") pod \"ceilometer-0\" (UID: \"c6a15e81-6934-4bba-818b-df43d93cbd0e\") " pod="openstack/ceilometer-0" Sep 30 12:40:35 crc kubenswrapper[4672]: I0930 12:40:35.148434 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c6a15e81-6934-4bba-818b-df43d93cbd0e-config-data\") pod \"ceilometer-0\" (UID: \"c6a15e81-6934-4bba-818b-df43d93cbd0e\") " pod="openstack/ceilometer-0" Sep 30 12:40:35 crc kubenswrapper[4672]: I0930 12:40:35.149350 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6a15e81-6934-4bba-818b-df43d93cbd0e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c6a15e81-6934-4bba-818b-df43d93cbd0e\") " pod="openstack/ceilometer-0" Sep 30 12:40:35 crc kubenswrapper[4672]: I0930 12:40:35.164734 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7v6p2\" (UniqueName: \"kubernetes.io/projected/c6a15e81-6934-4bba-818b-df43d93cbd0e-kube-api-access-7v6p2\") pod \"ceilometer-0\" (UID: \"c6a15e81-6934-4bba-818b-df43d93cbd0e\") " pod="openstack/ceilometer-0" Sep 30 12:40:35 crc kubenswrapper[4672]: I0930 12:40:35.169743 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c6a15e81-6934-4bba-818b-df43d93cbd0e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c6a15e81-6934-4bba-818b-df43d93cbd0e\") " pod="openstack/ceilometer-0" Sep 30 12:40:35 crc kubenswrapper[4672]: I0930 12:40:35.246968 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 30 12:40:35 crc kubenswrapper[4672]: I0930 12:40:35.253840 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Sep 30 12:40:35 crc kubenswrapper[4672]: I0930 12:40:35.281608 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 30 12:40:35 crc kubenswrapper[4672]: I0930 12:40:35.285020 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 30 12:40:35 crc kubenswrapper[4672]: I0930 12:40:35.287598 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Sep 30 12:40:35 crc kubenswrapper[4672]: I0930 12:40:35.297545 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 30 12:40:35 crc kubenswrapper[4672]: I0930 12:40:35.303913 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Sep 30 12:40:35 crc kubenswrapper[4672]: I0930 12:40:35.304012 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Sep 30 12:40:35 crc kubenswrapper[4672]: I0930 12:40:35.346565 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b221aeb1-8ec1-4805-9f6f-174a3a9ecd8a-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"b221aeb1-8ec1-4805-9f6f-174a3a9ecd8a\") " pod="openstack/glance-default-internal-api-0" Sep 30 12:40:35 crc kubenswrapper[4672]: I0930 12:40:35.346668 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b221aeb1-8ec1-4805-9f6f-174a3a9ecd8a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"b221aeb1-8ec1-4805-9f6f-174a3a9ecd8a\") " pod="openstack/glance-default-internal-api-0" Sep 30 12:40:35 crc kubenswrapper[4672]: I0930 12:40:35.346730 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"b221aeb1-8ec1-4805-9f6f-174a3a9ecd8a\") " pod="openstack/glance-default-internal-api-0" Sep 30 12:40:35 crc kubenswrapper[4672]: I0930 12:40:35.346764 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b221aeb1-8ec1-4805-9f6f-174a3a9ecd8a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"b221aeb1-8ec1-4805-9f6f-174a3a9ecd8a\") " pod="openstack/glance-default-internal-api-0" Sep 30 12:40:35 crc kubenswrapper[4672]: I0930 12:40:35.346802 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b221aeb1-8ec1-4805-9f6f-174a3a9ecd8a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"b221aeb1-8ec1-4805-9f6f-174a3a9ecd8a\") " pod="openstack/glance-default-internal-api-0" Sep 30 12:40:35 crc kubenswrapper[4672]: I0930 12:40:35.346829 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q9vf6\" (UniqueName: \"kubernetes.io/projected/b221aeb1-8ec1-4805-9f6f-174a3a9ecd8a-kube-api-access-q9vf6\") pod \"glance-default-internal-api-0\" (UID: \"b221aeb1-8ec1-4805-9f6f-174a3a9ecd8a\") " pod="openstack/glance-default-internal-api-0" Sep 30 12:40:35 crc kubenswrapper[4672]: I0930 12:40:35.346861 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b221aeb1-8ec1-4805-9f6f-174a3a9ecd8a-config-data\") pod \"glance-default-internal-api-0\" (UID: \"b221aeb1-8ec1-4805-9f6f-174a3a9ecd8a\") " pod="openstack/glance-default-internal-api-0" Sep 30 12:40:35 crc kubenswrapper[4672]: I0930 12:40:35.346883 4672 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b221aeb1-8ec1-4805-9f6f-174a3a9ecd8a-logs\") pod \"glance-default-internal-api-0\" (UID: \"b221aeb1-8ec1-4805-9f6f-174a3a9ecd8a\") " pod="openstack/glance-default-internal-api-0" Sep 30 12:40:35 crc kubenswrapper[4672]: I0930 12:40:35.382423 4672 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/watcher-decision-engine-0" Sep 30 12:40:35 crc kubenswrapper[4672]: I0930 12:40:35.383129 4672 scope.go:117] "RemoveContainer" containerID="9ede9a566346adcbabd817801ad9e25696c1143a09108dedaf87167ca42fa5f3" Sep 30 12:40:35 crc kubenswrapper[4672]: I0930 12:40:35.383495 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-decision-engine-0" Sep 30 12:40:35 crc kubenswrapper[4672]: I0930 12:40:35.454387 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b221aeb1-8ec1-4805-9f6f-174a3a9ecd8a-logs\") pod \"glance-default-internal-api-0\" (UID: \"b221aeb1-8ec1-4805-9f6f-174a3a9ecd8a\") " pod="openstack/glance-default-internal-api-0" Sep 30 12:40:35 crc kubenswrapper[4672]: I0930 12:40:35.454497 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b221aeb1-8ec1-4805-9f6f-174a3a9ecd8a-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"b221aeb1-8ec1-4805-9f6f-174a3a9ecd8a\") " pod="openstack/glance-default-internal-api-0" Sep 30 12:40:35 crc kubenswrapper[4672]: I0930 12:40:35.454587 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b221aeb1-8ec1-4805-9f6f-174a3a9ecd8a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"b221aeb1-8ec1-4805-9f6f-174a3a9ecd8a\") " pod="openstack/glance-default-internal-api-0" Sep 30 12:40:35 crc kubenswrapper[4672]: I0930 12:40:35.454624 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"b221aeb1-8ec1-4805-9f6f-174a3a9ecd8a\") " pod="openstack/glance-default-internal-api-0" Sep 30 12:40:35 crc kubenswrapper[4672]: I0930 12:40:35.454649 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b221aeb1-8ec1-4805-9f6f-174a3a9ecd8a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"b221aeb1-8ec1-4805-9f6f-174a3a9ecd8a\") " pod="openstack/glance-default-internal-api-0" Sep 30 12:40:35 crc kubenswrapper[4672]: I0930 12:40:35.454677 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b221aeb1-8ec1-4805-9f6f-174a3a9ecd8a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"b221aeb1-8ec1-4805-9f6f-174a3a9ecd8a\") " pod="openstack/glance-default-internal-api-0" Sep 30 12:40:35 crc kubenswrapper[4672]: I0930 12:40:35.454698 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q9vf6\" (UniqueName: \"kubernetes.io/projected/b221aeb1-8ec1-4805-9f6f-174a3a9ecd8a-kube-api-access-q9vf6\") pod \"glance-default-internal-api-0\" (UID: \"b221aeb1-8ec1-4805-9f6f-174a3a9ecd8a\") " pod="openstack/glance-default-internal-api-0" Sep 30 12:40:35 crc 
kubenswrapper[4672]: I0930 12:40:35.454722 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b221aeb1-8ec1-4805-9f6f-174a3a9ecd8a-config-data\") pod \"glance-default-internal-api-0\" (UID: \"b221aeb1-8ec1-4805-9f6f-174a3a9ecd8a\") " pod="openstack/glance-default-internal-api-0" Sep 30 12:40:35 crc kubenswrapper[4672]: I0930 12:40:35.457959 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b221aeb1-8ec1-4805-9f6f-174a3a9ecd8a-logs\") pod \"glance-default-internal-api-0\" (UID: \"b221aeb1-8ec1-4805-9f6f-174a3a9ecd8a\") " pod="openstack/glance-default-internal-api-0" Sep 30 12:40:35 crc kubenswrapper[4672]: I0930 12:40:35.458434 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b221aeb1-8ec1-4805-9f6f-174a3a9ecd8a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"b221aeb1-8ec1-4805-9f6f-174a3a9ecd8a\") " pod="openstack/glance-default-internal-api-0" Sep 30 12:40:35 crc kubenswrapper[4672]: I0930 12:40:35.460675 4672 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"b221aeb1-8ec1-4805-9f6f-174a3a9ecd8a\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/glance-default-internal-api-0" Sep 30 12:40:35 crc kubenswrapper[4672]: I0930 12:40:35.474337 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b221aeb1-8ec1-4805-9f6f-174a3a9ecd8a-config-data\") pod \"glance-default-internal-api-0\" (UID: \"b221aeb1-8ec1-4805-9f6f-174a3a9ecd8a\") " pod="openstack/glance-default-internal-api-0" Sep 30 12:40:35 crc kubenswrapper[4672]: I0930 12:40:35.476472 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b221aeb1-8ec1-4805-9f6f-174a3a9ecd8a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"b221aeb1-8ec1-4805-9f6f-174a3a9ecd8a\") " pod="openstack/glance-default-internal-api-0" Sep 30 12:40:35 crc kubenswrapper[4672]: I0930 12:40:35.476655 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b221aeb1-8ec1-4805-9f6f-174a3a9ecd8a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"b221aeb1-8ec1-4805-9f6f-174a3a9ecd8a\") " pod="openstack/glance-default-internal-api-0" Sep 30 12:40:35 crc kubenswrapper[4672]: I0930 12:40:35.478835 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b221aeb1-8ec1-4805-9f6f-174a3a9ecd8a-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"b221aeb1-8ec1-4805-9f6f-174a3a9ecd8a\") " pod="openstack/glance-default-internal-api-0" Sep 30 12:40:35 crc kubenswrapper[4672]: I0930 12:40:35.520989 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q9vf6\" (UniqueName: \"kubernetes.io/projected/b221aeb1-8ec1-4805-9f6f-174a3a9ecd8a-kube-api-access-q9vf6\") pod \"glance-default-internal-api-0\" (UID: \"b221aeb1-8ec1-4805-9f6f-174a3a9ecd8a\") " pod="openstack/glance-default-internal-api-0" Sep 30 12:40:35 crc kubenswrapper[4672]: I0930 12:40:35.521334 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="74285663-1d6c-4a5f-8fe4-5406010f7ad7" path="/var/lib/kubelet/pods/74285663-1d6c-4a5f-8fe4-5406010f7ad7/volumes" Sep 30 12:40:35 crc kubenswrapper[4672]: I0930 12:40:35.522567 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b0df982d-7077-4beb-b89b-71f80d0ff60c" path="/var/lib/kubelet/pods/b0df982d-7077-4beb-b89b-71f80d0ff60c/volumes" Sep 30 12:40:35 crc kubenswrapper[4672]: I0930 12:40:35.580379 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"b221aeb1-8ec1-4805-9f6f-174a3a9ecd8a\") " pod="openstack/glance-default-internal-api-0" Sep 30 12:40:35 crc kubenswrapper[4672]: I0930 12:40:35.638627 4672 generic.go:334] "Generic (PLEG): container finished" podID="4e153eb6-5f25-4214-8e8a-14c37a36fc06" containerID="b9bc69dd2b46d0c9dba498dbc9b87ca058be1c0ccc1ce3bb2e23ba57697d55b3" exitCode=137 Sep 30 12:40:35 crc kubenswrapper[4672]: I0930 12:40:35.638871 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-8bdf69cc8-lsxz6" event={"ID":"4e153eb6-5f25-4214-8e8a-14c37a36fc06","Type":"ContainerDied","Data":"b9bc69dd2b46d0c9dba498dbc9b87ca058be1c0ccc1ce3bb2e23ba57697d55b3"} Sep 30 12:40:35 crc kubenswrapper[4672]: I0930 12:40:35.697076 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-8bdf69cc8-lsxz6" Sep 30 12:40:35 crc kubenswrapper[4672]: I0930 12:40:35.707061 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Sep 30 12:40:35 crc kubenswrapper[4672]: I0930 12:40:35.860884 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4e153eb6-5f25-4214-8e8a-14c37a36fc06-scripts\") pod \"4e153eb6-5f25-4214-8e8a-14c37a36fc06\" (UID: \"4e153eb6-5f25-4214-8e8a-14c37a36fc06\") " Sep 30 12:40:35 crc kubenswrapper[4672]: I0930 12:40:35.860991 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/4e153eb6-5f25-4214-8e8a-14c37a36fc06-horizon-secret-key\") pod \"4e153eb6-5f25-4214-8e8a-14c37a36fc06\" (UID: \"4e153eb6-5f25-4214-8e8a-14c37a36fc06\") " Sep 30 12:40:35 crc kubenswrapper[4672]: I0930 12:40:35.861041 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/4e153eb6-5f25-4214-8e8a-14c37a36fc06-horizon-tls-certs\") pod \"4e153eb6-5f25-4214-8e8a-14c37a36fc06\" (UID: \"4e153eb6-5f25-4214-8e8a-14c37a36fc06\") " Sep 30 12:40:35 crc kubenswrapper[4672]: I0930 12:40:35.861110 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7jf5w\" (UniqueName: \"kubernetes.io/projected/4e153eb6-5f25-4214-8e8a-14c37a36fc06-kube-api-access-7jf5w\") pod \"4e153eb6-5f25-4214-8e8a-14c37a36fc06\" (UID: \"4e153eb6-5f25-4214-8e8a-14c37a36fc06\") " Sep 30 12:40:35 crc kubenswrapper[4672]: I0930 12:40:35.861184 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e153eb6-5f25-4214-8e8a-14c37a36fc06-combined-ca-bundle\") pod \"4e153eb6-5f25-4214-8e8a-14c37a36fc06\" (UID: \"4e153eb6-5f25-4214-8e8a-14c37a36fc06\") " Sep 30 12:40:35 crc kubenswrapper[4672]: I0930 12:40:35.861219 4672 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4e153eb6-5f25-4214-8e8a-14c37a36fc06-logs\") pod \"4e153eb6-5f25-4214-8e8a-14c37a36fc06\" (UID: \"4e153eb6-5f25-4214-8e8a-14c37a36fc06\") " Sep 30 12:40:35 crc kubenswrapper[4672]: I0930 12:40:35.861246 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4e153eb6-5f25-4214-8e8a-14c37a36fc06-config-data\") pod \"4e153eb6-5f25-4214-8e8a-14c37a36fc06\" (UID: \"4e153eb6-5f25-4214-8e8a-14c37a36fc06\") " Sep 30 12:40:35 crc kubenswrapper[4672]: I0930 12:40:35.864752 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4e153eb6-5f25-4214-8e8a-14c37a36fc06-logs" (OuterVolumeSpecName: "logs") pod "4e153eb6-5f25-4214-8e8a-14c37a36fc06" (UID: "4e153eb6-5f25-4214-8e8a-14c37a36fc06"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 12:40:35 crc kubenswrapper[4672]: I0930 12:40:35.870649 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e153eb6-5f25-4214-8e8a-14c37a36fc06-kube-api-access-7jf5w" (OuterVolumeSpecName: "kube-api-access-7jf5w") pod "4e153eb6-5f25-4214-8e8a-14c37a36fc06" (UID: "4e153eb6-5f25-4214-8e8a-14c37a36fc06"). InnerVolumeSpecName "kube-api-access-7jf5w". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 12:40:35 crc kubenswrapper[4672]: I0930 12:40:35.891892 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e153eb6-5f25-4214-8e8a-14c37a36fc06-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "4e153eb6-5f25-4214-8e8a-14c37a36fc06" (UID: "4e153eb6-5f25-4214-8e8a-14c37a36fc06"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:40:35 crc kubenswrapper[4672]: I0930 12:40:35.900279 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4e153eb6-5f25-4214-8e8a-14c37a36fc06-scripts" (OuterVolumeSpecName: "scripts") pod "4e153eb6-5f25-4214-8e8a-14c37a36fc06" (UID: "4e153eb6-5f25-4214-8e8a-14c37a36fc06"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 12:40:35 crc kubenswrapper[4672]: I0930 12:40:35.919155 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e153eb6-5f25-4214-8e8a-14c37a36fc06-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4e153eb6-5f25-4214-8e8a-14c37a36fc06" (UID: "4e153eb6-5f25-4214-8e8a-14c37a36fc06"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:40:35 crc kubenswrapper[4672]: I0930 12:40:35.935619 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4e153eb6-5f25-4214-8e8a-14c37a36fc06-config-data" (OuterVolumeSpecName: "config-data") pod "4e153eb6-5f25-4214-8e8a-14c37a36fc06" (UID: "4e153eb6-5f25-4214-8e8a-14c37a36fc06"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 12:40:35 crc kubenswrapper[4672]: I0930 12:40:35.963360 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e153eb6-5f25-4214-8e8a-14c37a36fc06-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "4e153eb6-5f25-4214-8e8a-14c37a36fc06" (UID: "4e153eb6-5f25-4214-8e8a-14c37a36fc06"). InnerVolumeSpecName "horizon-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:40:35 crc kubenswrapper[4672]: I0930 12:40:35.963663 4672 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e153eb6-5f25-4214-8e8a-14c37a36fc06-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 12:40:35 crc kubenswrapper[4672]: I0930 12:40:35.963693 4672 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4e153eb6-5f25-4214-8e8a-14c37a36fc06-logs\") on node \"crc\" DevicePath \"\"" Sep 30 12:40:35 crc kubenswrapper[4672]: I0930 12:40:35.963706 4672 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4e153eb6-5f25-4214-8e8a-14c37a36fc06-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 12:40:35 crc kubenswrapper[4672]: I0930 12:40:35.963718 4672 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4e153eb6-5f25-4214-8e8a-14c37a36fc06-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 12:40:35 crc kubenswrapper[4672]: I0930 12:40:35.963731 4672 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/4e153eb6-5f25-4214-8e8a-14c37a36fc06-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Sep 30 12:40:35 crc kubenswrapper[4672]: I0930 12:40:35.963741 4672 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/4e153eb6-5f25-4214-8e8a-14c37a36fc06-horizon-tls-certs\") on node \"crc\" DevicePath \"\"" Sep 30 12:40:35 crc kubenswrapper[4672]: I0930 12:40:35.963749 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7jf5w\" (UniqueName: \"kubernetes.io/projected/4e153eb6-5f25-4214-8e8a-14c37a36fc06-kube-api-access-7jf5w\") on node \"crc\" DevicePath \"\"" Sep 30 12:40:36 crc kubenswrapper[4672]: I0930 12:40:36.054495 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Sep 30 12:40:36 crc kubenswrapper[4672]: W0930 12:40:36.096399 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc6a15e81_6934_4bba_818b_df43d93cbd0e.slice/crio-6743952ad65a1c6c29961c011f5bd7fec59d05677d87779bec0f79c84bf8253f WatchSource:0}: Error finding container 6743952ad65a1c6c29961c011f5bd7fec59d05677d87779bec0f79c84bf8253f: Status 404 returned error can't find the container with id 6743952ad65a1c6c29961c011f5bd7fec59d05677d87779bec0f79c84bf8253f Sep 30 12:40:36 crc kubenswrapper[4672]: I0930 12:40:36.334113 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 30 12:40:36 crc kubenswrapper[4672]: W0930 12:40:36.344395 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb221aeb1_8ec1_4805_9f6f_174a3a9ecd8a.slice/crio-9534098e1e3c6b0f334c3baf242495457a8fe1d0499e8a26d009bb8af82360af WatchSource:0}: Error 
finding container 9534098e1e3c6b0f334c3baf242495457a8fe1d0499e8a26d009bb8af82360af: Status 404 returned error can't find the container with id 9534098e1e3c6b0f334c3baf242495457a8fe1d0499e8a26d009bb8af82360af Sep 30 12:40:36 crc kubenswrapper[4672]: I0930 12:40:36.484145 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-56684fbfb-t69x4" Sep 30 12:40:36 crc kubenswrapper[4672]: I0930 12:40:36.674715 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c6a15e81-6934-4bba-818b-df43d93cbd0e","Type":"ContainerStarted","Data":"34b1d86c515b99caba9c873f8c732fb65a6702d3f8185e0577f9eabc26c1c21a"} Sep 30 12:40:36 crc kubenswrapper[4672]: I0930 12:40:36.674762 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c6a15e81-6934-4bba-818b-df43d93cbd0e","Type":"ContainerStarted","Data":"2ea8295d08c2ad82e72e9839f2f51a033a8f1a727945f7163f4efcf15500fd8c"} Sep 30 12:40:36 crc kubenswrapper[4672]: I0930 12:40:36.674772 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c6a15e81-6934-4bba-818b-df43d93cbd0e","Type":"ContainerStarted","Data":"6743952ad65a1c6c29961c011f5bd7fec59d05677d87779bec0f79c84bf8253f"} Sep 30 12:40:36 crc kubenswrapper[4672]: I0930 12:40:36.675583 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/6d32cb40-920a-4b27-bf27-9362601aabae-config\") pod \"6d32cb40-920a-4b27-bf27-9362601aabae\" (UID: \"6d32cb40-920a-4b27-bf27-9362601aabae\") " Sep 30 12:40:36 crc kubenswrapper[4672]: I0930 12:40:36.675712 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6d32cb40-920a-4b27-bf27-9362601aabae-ovndb-tls-certs\") pod \"6d32cb40-920a-4b27-bf27-9362601aabae\" (UID: \"6d32cb40-920a-4b27-bf27-9362601aabae\") " Sep 30 12:40:36 crc kubenswrapper[4672]: I0930 12:40:36.675760 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d32cb40-920a-4b27-bf27-9362601aabae-combined-ca-bundle\") pod \"6d32cb40-920a-4b27-bf27-9362601aabae\" (UID: \"6d32cb40-920a-4b27-bf27-9362601aabae\") " Sep 30 12:40:36 crc kubenswrapper[4672]: I0930 12:40:36.675822 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/6d32cb40-920a-4b27-bf27-9362601aabae-httpd-config\") pod \"6d32cb40-920a-4b27-bf27-9362601aabae\" (UID: \"6d32cb40-920a-4b27-bf27-9362601aabae\") " Sep 30 12:40:36 crc kubenswrapper[4672]: I0930 12:40:36.675896 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rctx4\" (UniqueName: \"kubernetes.io/projected/6d32cb40-920a-4b27-bf27-9362601aabae-kube-api-access-rctx4\") pod \"6d32cb40-920a-4b27-bf27-9362601aabae\" (UID: \"6d32cb40-920a-4b27-bf27-9362601aabae\") " Sep 30 12:40:36 crc kubenswrapper[4672]: I0930 12:40:36.685248 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d32cb40-920a-4b27-bf27-9362601aabae-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "6d32cb40-920a-4b27-bf27-9362601aabae" (UID: "6d32cb40-920a-4b27-bf27-9362601aabae"). InnerVolumeSpecName "httpd-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:40:36 crc kubenswrapper[4672]: I0930 12:40:36.688391 4672 generic.go:334] "Generic (PLEG): container finished" podID="6d32cb40-920a-4b27-bf27-9362601aabae" containerID="15fdddb0b77d09b83380a6be6077d802c2a049b3ac07e92a42275e8023b517ab" exitCode=0 Sep 30 12:40:36 crc kubenswrapper[4672]: I0930 12:40:36.688469 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-56684fbfb-t69x4" event={"ID":"6d32cb40-920a-4b27-bf27-9362601aabae","Type":"ContainerDied","Data":"15fdddb0b77d09b83380a6be6077d802c2a049b3ac07e92a42275e8023b517ab"} Sep 30 12:40:36 crc kubenswrapper[4672]: I0930 12:40:36.688535 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-56684fbfb-t69x4" event={"ID":"6d32cb40-920a-4b27-bf27-9362601aabae","Type":"ContainerDied","Data":"9b938eb24d9bb4f2ed2f5af361cd3c2bab6517eec7201fa60470285c1ac7126a"} Sep 30 12:40:36 crc kubenswrapper[4672]: I0930 12:40:36.688559 4672 scope.go:117] "RemoveContainer" containerID="eedcd0739a4165cbc9d83126ec509cfa0997545c55c7999884f73e45d31d62e0" Sep 30 12:40:36 crc kubenswrapper[4672]: I0930 12:40:36.688711 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-56684fbfb-t69x4" Sep 30 12:40:36 crc kubenswrapper[4672]: I0930 12:40:36.689775 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d32cb40-920a-4b27-bf27-9362601aabae-kube-api-access-rctx4" (OuterVolumeSpecName: "kube-api-access-rctx4") pod "6d32cb40-920a-4b27-bf27-9362601aabae" (UID: "6d32cb40-920a-4b27-bf27-9362601aabae"). InnerVolumeSpecName "kube-api-access-rctx4". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 12:40:36 crc kubenswrapper[4672]: I0930 12:40:36.702620 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-8bdf69cc8-lsxz6" event={"ID":"4e153eb6-5f25-4214-8e8a-14c37a36fc06","Type":"ContainerDied","Data":"5d17aeee535d5ed1e97a0ffba135cdb2d653d3c559c871ce2f79cc6be6849d75"} Sep 30 12:40:36 crc kubenswrapper[4672]: I0930 12:40:36.702656 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-8bdf69cc8-lsxz6" Sep 30 12:40:36 crc kubenswrapper[4672]: I0930 12:40:36.707093 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"4f1bee84-650b-4f0b-a657-e6701ee51823","Type":"ContainerStarted","Data":"62edb181df0e1733ae74f66c8f79e9818a3ce7a19d6f9e981c0c89795c6ab971"} Sep 30 12:40:36 crc kubenswrapper[4672]: I0930 12:40:36.727243 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"b221aeb1-8ec1-4805-9f6f-174a3a9ecd8a","Type":"ContainerStarted","Data":"9534098e1e3c6b0f334c3baf242495457a8fe1d0499e8a26d009bb8af82360af"} Sep 30 12:40:36 crc kubenswrapper[4672]: I0930 12:40:36.728356 4672 scope.go:117] "RemoveContainer" containerID="15fdddb0b77d09b83380a6be6077d802c2a049b3ac07e92a42275e8023b517ab" Sep 30 12:40:36 crc kubenswrapper[4672]: I0930 12:40:36.762701 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-8bdf69cc8-lsxz6"] Sep 30 12:40:36 crc kubenswrapper[4672]: I0930 12:40:36.770081 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-8bdf69cc8-lsxz6"] Sep 30 12:40:36 crc kubenswrapper[4672]: I0930 12:40:36.771515 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d32cb40-920a-4b27-bf27-9362601aabae-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6d32cb40-920a-4b27-bf27-9362601aabae" (UID: "6d32cb40-920a-4b27-bf27-9362601aabae"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:40:36 crc kubenswrapper[4672]: I0930 12:40:36.778486 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rctx4\" (UniqueName: \"kubernetes.io/projected/6d32cb40-920a-4b27-bf27-9362601aabae-kube-api-access-rctx4\") on node \"crc\" DevicePath \"\"" Sep 30 12:40:36 crc kubenswrapper[4672]: I0930 12:40:36.778515 4672 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d32cb40-920a-4b27-bf27-9362601aabae-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 12:40:36 crc kubenswrapper[4672]: I0930 12:40:36.778525 4672 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/6d32cb40-920a-4b27-bf27-9362601aabae-httpd-config\") on node \"crc\" DevicePath \"\"" Sep 30 12:40:36 crc kubenswrapper[4672]: I0930 12:40:36.778634 4672 scope.go:117] "RemoveContainer" containerID="eedcd0739a4165cbc9d83126ec509cfa0997545c55c7999884f73e45d31d62e0" Sep 30 12:40:36 crc kubenswrapper[4672]: E0930 12:40:36.779008 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eedcd0739a4165cbc9d83126ec509cfa0997545c55c7999884f73e45d31d62e0\": container with ID starting with eedcd0739a4165cbc9d83126ec509cfa0997545c55c7999884f73e45d31d62e0 not found: ID does not exist" containerID="eedcd0739a4165cbc9d83126ec509cfa0997545c55c7999884f73e45d31d62e0" Sep 30 12:40:36 crc kubenswrapper[4672]: I0930 12:40:36.779036 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eedcd0739a4165cbc9d83126ec509cfa0997545c55c7999884f73e45d31d62e0"} err="failed to get container status \"eedcd0739a4165cbc9d83126ec509cfa0997545c55c7999884f73e45d31d62e0\": rpc error: code = NotFound desc = could not find container 
\"eedcd0739a4165cbc9d83126ec509cfa0997545c55c7999884f73e45d31d62e0\": container with ID starting with eedcd0739a4165cbc9d83126ec509cfa0997545c55c7999884f73e45d31d62e0 not found: ID does not exist" Sep 30 12:40:36 crc kubenswrapper[4672]: I0930 12:40:36.779056 4672 scope.go:117] "RemoveContainer" containerID="15fdddb0b77d09b83380a6be6077d802c2a049b3ac07e92a42275e8023b517ab" Sep 30 12:40:36 crc kubenswrapper[4672]: E0930 12:40:36.780129 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"15fdddb0b77d09b83380a6be6077d802c2a049b3ac07e92a42275e8023b517ab\": container with ID starting with 15fdddb0b77d09b83380a6be6077d802c2a049b3ac07e92a42275e8023b517ab not found: ID does not exist" containerID="15fdddb0b77d09b83380a6be6077d802c2a049b3ac07e92a42275e8023b517ab" Sep 30 12:40:36 crc kubenswrapper[4672]: I0930 12:40:36.780154 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"15fdddb0b77d09b83380a6be6077d802c2a049b3ac07e92a42275e8023b517ab"} err="failed to get container status \"15fdddb0b77d09b83380a6be6077d802c2a049b3ac07e92a42275e8023b517ab\": rpc error: code = NotFound desc = could not find container \"15fdddb0b77d09b83380a6be6077d802c2a049b3ac07e92a42275e8023b517ab\": container with ID starting with 15fdddb0b77d09b83380a6be6077d802c2a049b3ac07e92a42275e8023b517ab not found: ID does not exist" Sep 30 12:40:36 crc kubenswrapper[4672]: I0930 12:40:36.780171 4672 scope.go:117] "RemoveContainer" containerID="66b370cdbebbcc3bf0abe4218b26bfdbdc8a2747d1cda48afa909303b72ead33" Sep 30 12:40:36 crc kubenswrapper[4672]: I0930 12:40:36.783926 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d32cb40-920a-4b27-bf27-9362601aabae-config" (OuterVolumeSpecName: "config") pod "6d32cb40-920a-4b27-bf27-9362601aabae" (UID: "6d32cb40-920a-4b27-bf27-9362601aabae"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:40:36 crc kubenswrapper[4672]: I0930 12:40:36.845527 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d32cb40-920a-4b27-bf27-9362601aabae-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "6d32cb40-920a-4b27-bf27-9362601aabae" (UID: "6d32cb40-920a-4b27-bf27-9362601aabae"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:40:36 crc kubenswrapper[4672]: I0930 12:40:36.880753 4672 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/6d32cb40-920a-4b27-bf27-9362601aabae-config\") on node \"crc\" DevicePath \"\"" Sep 30 12:40:36 crc kubenswrapper[4672]: I0930 12:40:36.880789 4672 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6d32cb40-920a-4b27-bf27-9362601aabae-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Sep 30 12:40:37 crc kubenswrapper[4672]: I0930 12:40:37.039342 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-56684fbfb-t69x4"] Sep 30 12:40:37 crc kubenswrapper[4672]: I0930 12:40:37.047553 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-56684fbfb-t69x4"] Sep 30 12:40:37 crc kubenswrapper[4672]: I0930 12:40:37.093398 4672 scope.go:117] "RemoveContainer" containerID="b9bc69dd2b46d0c9dba498dbc9b87ca058be1c0ccc1ce3bb2e23ba57697d55b3" Sep 30 12:40:37 crc kubenswrapper[4672]: I0930 12:40:37.168165 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Sep 30 12:40:37 crc kubenswrapper[4672]: I0930 12:40:37.436676 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4e153eb6-5f25-4214-8e8a-14c37a36fc06" path="/var/lib/kubelet/pods/4e153eb6-5f25-4214-8e8a-14c37a36fc06/volumes" Sep 30 12:40:37 crc kubenswrapper[4672]: I0930 12:40:37.438020 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6d32cb40-920a-4b27-bf27-9362601aabae" path="/var/lib/kubelet/pods/6d32cb40-920a-4b27-bf27-9362601aabae/volumes" Sep 30 12:40:37 crc kubenswrapper[4672]: I0930 12:40:37.738852 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"b221aeb1-8ec1-4805-9f6f-174a3a9ecd8a","Type":"ContainerStarted","Data":"70910138266e2d46a8e7c12f6898fdba938af2bd82ccfad030b7722c36291096"} Sep 30 12:40:38 crc kubenswrapper[4672]: I0930 12:40:38.774685 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"b221aeb1-8ec1-4805-9f6f-174a3a9ecd8a","Type":"ContainerStarted","Data":"fc3e4de3632c7d2aa7dcc728ce31e9b5f5b212f2c8b685eaa9263e46072a2fd3"} Sep 30 12:40:38 crc kubenswrapper[4672]: I0930 12:40:38.803814 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.803795485 podStartE2EDuration="3.803795485s" podCreationTimestamp="2025-09-30 12:40:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 12:40:38.795938526 +0000 UTC m=+1130.065176182" watchObservedRunningTime="2025-09-30 12:40:38.803795485 +0000 UTC m=+1130.073033131" Sep 30 12:40:39 crc kubenswrapper[4672]: I0930 12:40:39.094710 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Sep 30 12:40:39 crc kubenswrapper[4672]: I0930 12:40:39.786061 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c6a15e81-6934-4bba-818b-df43d93cbd0e","Type":"ContainerStarted","Data":"66c976c0fd55d478e431c6cd55789e719ed88e1bef409c95fabc754feca39303"} Sep 30 12:40:41 crc kubenswrapper[4672]: I0930 12:40:41.474600 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] 
Sep 30 12:40:41 crc kubenswrapper[4672]: I0930 12:40:41.475194 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="0e1e7299-4168-4aba-917d-dea25752e400" containerName="glance-log" containerID="cri-o://cf68c535814b18f82b62c7e70459b1f0472be9e598870bcc2076ead00dc819b4" gracePeriod=30
Sep 30 12:40:41 crc kubenswrapper[4672]: I0930 12:40:41.475310 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="0e1e7299-4168-4aba-917d-dea25752e400" containerName="glance-httpd" containerID="cri-o://47016035c37933ccc06c63f168a8b55b6a70d23103e1681faca987a5aaac055d" gracePeriod=30
Sep 30 12:40:41 crc kubenswrapper[4672]: I0930 12:40:41.808475 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c6a15e81-6934-4bba-818b-df43d93cbd0e","Type":"ContainerStarted","Data":"55a7c1f3d44527ef207b2df72b4544080e8c1c4779d50cb372d70b244531b39a"}
Sep 30 12:40:41 crc kubenswrapper[4672]: I0930 12:40:41.808806 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c6a15e81-6934-4bba-818b-df43d93cbd0e" containerName="sg-core" containerID="cri-o://66c976c0fd55d478e431c6cd55789e719ed88e1bef409c95fabc754feca39303" gracePeriod=30
Sep 30 12:40:41 crc kubenswrapper[4672]: I0930 12:40:41.808748 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c6a15e81-6934-4bba-818b-df43d93cbd0e" containerName="proxy-httpd" containerID="cri-o://55a7c1f3d44527ef207b2df72b4544080e8c1c4779d50cb372d70b244531b39a" gracePeriod=30
Sep 30 12:40:41 crc kubenswrapper[4672]: I0930 12:40:41.808786 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c6a15e81-6934-4bba-818b-df43d93cbd0e" containerName="ceilometer-notification-agent" containerID="cri-o://34b1d86c515b99caba9c873f8c732fb65a6702d3f8185e0577f9eabc26c1c21a" gracePeriod=30
Sep 30 12:40:41 crc kubenswrapper[4672]: I0930 12:40:41.808821 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Sep 30 12:40:41 crc kubenswrapper[4672]: I0930 12:40:41.808692 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c6a15e81-6934-4bba-818b-df43d93cbd0e" containerName="ceilometer-central-agent" containerID="cri-o://2ea8295d08c2ad82e72e9839f2f51a033a8f1a727945f7163f4efcf15500fd8c" gracePeriod=30
Sep 30 12:40:41 crc kubenswrapper[4672]: I0930 12:40:41.814632 4672 generic.go:334] "Generic (PLEG): container finished" podID="0e1e7299-4168-4aba-917d-dea25752e400" containerID="cf68c535814b18f82b62c7e70459b1f0472be9e598870bcc2076ead00dc819b4" exitCode=143
Sep 30 12:40:41 crc kubenswrapper[4672]: I0930 12:40:41.814728 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"0e1e7299-4168-4aba-917d-dea25752e400","Type":"ContainerDied","Data":"cf68c535814b18f82b62c7e70459b1f0472be9e598870bcc2076ead00dc819b4"}
Sep 30 12:40:41 crc kubenswrapper[4672]: I0930 12:40:41.833514 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.121389458 podStartE2EDuration="7.833492687s" podCreationTimestamp="2025-09-30 12:40:34 +0000 UTC" firstStartedPulling="2025-09-30 12:40:36.102728159 +0000 UTC m=+1127.371965805" lastFinishedPulling="2025-09-30 12:40:40.814831388 +0000 UTC m=+1132.084069034" observedRunningTime="2025-09-30 12:40:41.827732402 +0000 UTC m=+1133.096970048" watchObservedRunningTime="2025-09-30 12:40:41.833492687 +0000 UTC m=+1133.102730333"
Sep 30 12:40:42 crc kubenswrapper[4672]: I0930 12:40:42.660806 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-9gpdk"]
Sep 30 12:40:42 crc kubenswrapper[4672]: E0930 12:40:42.661232 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d32cb40-920a-4b27-bf27-9362601aabae" containerName="neutron-api"
Sep 30 12:40:42 crc kubenswrapper[4672]: I0930 12:40:42.661243 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d32cb40-920a-4b27-bf27-9362601aabae" containerName="neutron-api"
Sep 30 12:40:42 crc kubenswrapper[4672]: E0930 12:40:42.661284 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d32cb40-920a-4b27-bf27-9362601aabae" containerName="neutron-httpd"
Sep 30 12:40:42 crc kubenswrapper[4672]: I0930 12:40:42.661291 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d32cb40-920a-4b27-bf27-9362601aabae" containerName="neutron-httpd"
Sep 30 12:40:42 crc kubenswrapper[4672]: E0930 12:40:42.661312 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e153eb6-5f25-4214-8e8a-14c37a36fc06" containerName="horizon-log"
Sep 30 12:40:42 crc kubenswrapper[4672]: I0930 12:40:42.661318 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e153eb6-5f25-4214-8e8a-14c37a36fc06" containerName="horizon-log"
Sep 30 12:40:42 crc kubenswrapper[4672]: E0930 12:40:42.661325 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e153eb6-5f25-4214-8e8a-14c37a36fc06" containerName="horizon"
Sep 30 12:40:42 crc kubenswrapper[4672]: I0930 12:40:42.661330 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e153eb6-5f25-4214-8e8a-14c37a36fc06" containerName="horizon"
Sep 30 12:40:42 crc kubenswrapper[4672]: I0930 12:40:42.661509 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d32cb40-920a-4b27-bf27-9362601aabae" containerName="neutron-httpd"
Sep 30 12:40:42 crc kubenswrapper[4672]: I0930 12:40:42.661527 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e153eb6-5f25-4214-8e8a-14c37a36fc06" containerName="horizon"
Sep 30 12:40:42 crc kubenswrapper[4672]: I0930 12:40:42.661550 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d32cb40-920a-4b27-bf27-9362601aabae" containerName="neutron-api"
Sep 30 12:40:42 crc kubenswrapper[4672]: I0930 12:40:42.661564 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e153eb6-5f25-4214-8e8a-14c37a36fc06" containerName="horizon-log"
Sep 30 12:40:42 crc kubenswrapper[4672]: I0930 12:40:42.662254 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-9gpdk"
Sep 30 12:40:42 crc kubenswrapper[4672]: I0930 12:40:42.669777 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-9gpdk"]
Sep 30 12:40:42 crc kubenswrapper[4672]: I0930 12:40:42.771919 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-fz28q"]
Sep 30 12:40:42 crc kubenswrapper[4672]: I0930 12:40:42.773844 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-fz28q"
Sep 30 12:40:42 crc kubenswrapper[4672]: I0930 12:40:42.784478 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-fz28q"]
Sep 30 12:40:42 crc kubenswrapper[4672]: I0930 12:40:42.802434 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5dfv5\" (UniqueName: \"kubernetes.io/projected/dc00fe07-7d49-4ca0-818d-fb129880f480-kube-api-access-5dfv5\") pod \"nova-api-db-create-9gpdk\" (UID: \"dc00fe07-7d49-4ca0-818d-fb129880f480\") " pod="openstack/nova-api-db-create-9gpdk"
Sep 30 12:40:42 crc kubenswrapper[4672]: I0930 12:40:42.825726 4672 generic.go:334] "Generic (PLEG): container finished" podID="0e1e7299-4168-4aba-917d-dea25752e400" containerID="47016035c37933ccc06c63f168a8b55b6a70d23103e1681faca987a5aaac055d" exitCode=0
Sep 30 12:40:42 crc kubenswrapper[4672]: I0930 12:40:42.825801 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"0e1e7299-4168-4aba-917d-dea25752e400","Type":"ContainerDied","Data":"47016035c37933ccc06c63f168a8b55b6a70d23103e1681faca987a5aaac055d"}
Sep 30 12:40:42 crc kubenswrapper[4672]: I0930 12:40:42.843499 4672 generic.go:334] "Generic (PLEG): container finished" podID="c6a15e81-6934-4bba-818b-df43d93cbd0e" containerID="55a7c1f3d44527ef207b2df72b4544080e8c1c4779d50cb372d70b244531b39a" exitCode=0
Sep 30 12:40:42 crc kubenswrapper[4672]: I0930 12:40:42.843533 4672 generic.go:334] "Generic (PLEG): container finished" podID="c6a15e81-6934-4bba-818b-df43d93cbd0e" containerID="66c976c0fd55d478e431c6cd55789e719ed88e1bef409c95fabc754feca39303" exitCode=2
Sep 30 12:40:42 crc kubenswrapper[4672]: I0930 12:40:42.843541 4672 generic.go:334] "Generic (PLEG): container finished" podID="c6a15e81-6934-4bba-818b-df43d93cbd0e" containerID="34b1d86c515b99caba9c873f8c732fb65a6702d3f8185e0577f9eabc26c1c21a" exitCode=0
Sep 30 12:40:42 crc kubenswrapper[4672]: I0930 12:40:42.843561 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c6a15e81-6934-4bba-818b-df43d93cbd0e","Type":"ContainerDied","Data":"55a7c1f3d44527ef207b2df72b4544080e8c1c4779d50cb372d70b244531b39a"}
Sep 30 12:40:42 crc kubenswrapper[4672]: I0930 12:40:42.843586 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c6a15e81-6934-4bba-818b-df43d93cbd0e","Type":"ContainerDied","Data":"66c976c0fd55d478e431c6cd55789e719ed88e1bef409c95fabc754feca39303"}
Sep 30 12:40:42 crc kubenswrapper[4672]: I0930 12:40:42.843595 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c6a15e81-6934-4bba-818b-df43d93cbd0e","Type":"ContainerDied","Data":"34b1d86c515b99caba9c873f8c732fb65a6702d3f8185e0577f9eabc26c1c21a"}
Sep 30 12:40:42 crc kubenswrapper[4672]: I0930 12:40:42.862142 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-88sxc"]
Sep 30 12:40:42 crc kubenswrapper[4672]: I0930 12:40:42.863370 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-88sxc" Sep 30 12:40:42 crc kubenswrapper[4672]: I0930 12:40:42.874813 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-88sxc"] Sep 30 12:40:42 crc kubenswrapper[4672]: I0930 12:40:42.904205 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5dfv5\" (UniqueName: \"kubernetes.io/projected/dc00fe07-7d49-4ca0-818d-fb129880f480-kube-api-access-5dfv5\") pod \"nova-api-db-create-9gpdk\" (UID: \"dc00fe07-7d49-4ca0-818d-fb129880f480\") " pod="openstack/nova-api-db-create-9gpdk" Sep 30 12:40:42 crc kubenswrapper[4672]: I0930 12:40:42.904725 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9stpj\" (UniqueName: \"kubernetes.io/projected/89525b90-403a-4caf-9443-3e041be85426-kube-api-access-9stpj\") pod \"nova-cell0-db-create-fz28q\" (UID: \"89525b90-403a-4caf-9443-3e041be85426\") " pod="openstack/nova-cell0-db-create-fz28q" Sep 30 12:40:42 crc kubenswrapper[4672]: I0930 12:40:42.924947 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5dfv5\" (UniqueName: \"kubernetes.io/projected/dc00fe07-7d49-4ca0-818d-fb129880f480-kube-api-access-5dfv5\") pod \"nova-api-db-create-9gpdk\" (UID: \"dc00fe07-7d49-4ca0-818d-fb129880f480\") " pod="openstack/nova-api-db-create-9gpdk" Sep 30 12:40:42 crc kubenswrapper[4672]: I0930 12:40:42.981596 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-9gpdk" Sep 30 12:40:43 crc kubenswrapper[4672]: I0930 12:40:43.000422 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Sep 30 12:40:43 crc kubenswrapper[4672]: I0930 12:40:43.006562 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9stpj\" (UniqueName: \"kubernetes.io/projected/89525b90-403a-4caf-9443-3e041be85426-kube-api-access-9stpj\") pod \"nova-cell0-db-create-fz28q\" (UID: \"89525b90-403a-4caf-9443-3e041be85426\") " pod="openstack/nova-cell0-db-create-fz28q" Sep 30 12:40:43 crc kubenswrapper[4672]: I0930 12:40:43.006654 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8mjmq\" (UniqueName: \"kubernetes.io/projected/f4260988-bcca-456e-a745-ee5789116a84-kube-api-access-8mjmq\") pod \"nova-cell1-db-create-88sxc\" (UID: \"f4260988-bcca-456e-a745-ee5789116a84\") " pod="openstack/nova-cell1-db-create-88sxc" Sep 30 12:40:43 crc kubenswrapper[4672]: I0930 12:40:43.030985 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9stpj\" (UniqueName: \"kubernetes.io/projected/89525b90-403a-4caf-9443-3e041be85426-kube-api-access-9stpj\") pod \"nova-cell0-db-create-fz28q\" (UID: \"89525b90-403a-4caf-9443-3e041be85426\") " pod="openstack/nova-cell0-db-create-fz28q" Sep 30 12:40:43 crc kubenswrapper[4672]: I0930 12:40:43.102543 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-fz28q" Sep 30 12:40:43 crc kubenswrapper[4672]: I0930 12:40:43.108166 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e1e7299-4168-4aba-917d-dea25752e400-combined-ca-bundle\") pod \"0e1e7299-4168-4aba-917d-dea25752e400\" (UID: \"0e1e7299-4168-4aba-917d-dea25752e400\") " Sep 30 12:40:43 crc kubenswrapper[4672]: I0930 12:40:43.108280 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0e1e7299-4168-4aba-917d-dea25752e400-public-tls-certs\") pod \"0e1e7299-4168-4aba-917d-dea25752e400\" (UID: \"0e1e7299-4168-4aba-917d-dea25752e400\") " Sep 30 12:40:43 crc kubenswrapper[4672]: I0930 12:40:43.108335 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0e1e7299-4168-4aba-917d-dea25752e400-logs\") pod \"0e1e7299-4168-4aba-917d-dea25752e400\" (UID: \"0e1e7299-4168-4aba-917d-dea25752e400\") " Sep 30 12:40:43 crc kubenswrapper[4672]: I0930 12:40:43.108366 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bghck\" (UniqueName: \"kubernetes.io/projected/0e1e7299-4168-4aba-917d-dea25752e400-kube-api-access-bghck\") pod \"0e1e7299-4168-4aba-917d-dea25752e400\" (UID: \"0e1e7299-4168-4aba-917d-dea25752e400\") " Sep 30 12:40:43 crc kubenswrapper[4672]: I0930 12:40:43.108434 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e1e7299-4168-4aba-917d-dea25752e400-config-data\") pod \"0e1e7299-4168-4aba-917d-dea25752e400\" (UID: \"0e1e7299-4168-4aba-917d-dea25752e400\") " Sep 30 12:40:43 crc kubenswrapper[4672]: I0930 12:40:43.108510 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"0e1e7299-4168-4aba-917d-dea25752e400\" (UID: \"0e1e7299-4168-4aba-917d-dea25752e400\") " Sep 30 12:40:43 crc kubenswrapper[4672]: I0930 12:40:43.108592 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0e1e7299-4168-4aba-917d-dea25752e400-scripts\") pod \"0e1e7299-4168-4aba-917d-dea25752e400\" (UID: \"0e1e7299-4168-4aba-917d-dea25752e400\") " Sep 30 12:40:43 crc kubenswrapper[4672]: I0930 12:40:43.108620 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0e1e7299-4168-4aba-917d-dea25752e400-httpd-run\") pod \"0e1e7299-4168-4aba-917d-dea25752e400\" (UID: \"0e1e7299-4168-4aba-917d-dea25752e400\") " Sep 30 12:40:43 crc kubenswrapper[4672]: I0930 12:40:43.109182 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8mjmq\" (UniqueName: \"kubernetes.io/projected/f4260988-bcca-456e-a745-ee5789116a84-kube-api-access-8mjmq\") pod \"nova-cell1-db-create-88sxc\" (UID: \"f4260988-bcca-456e-a745-ee5789116a84\") " pod="openstack/nova-cell1-db-create-88sxc" Sep 30 12:40:43 crc kubenswrapper[4672]: I0930 12:40:43.110098 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0e1e7299-4168-4aba-917d-dea25752e400-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "0e1e7299-4168-4aba-917d-dea25752e400" (UID: 
"0e1e7299-4168-4aba-917d-dea25752e400"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 12:40:43 crc kubenswrapper[4672]: I0930 12:40:43.110423 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0e1e7299-4168-4aba-917d-dea25752e400-logs" (OuterVolumeSpecName: "logs") pod "0e1e7299-4168-4aba-917d-dea25752e400" (UID: "0e1e7299-4168-4aba-917d-dea25752e400"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 12:40:43 crc kubenswrapper[4672]: I0930 12:40:43.112858 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "glance") pod "0e1e7299-4168-4aba-917d-dea25752e400" (UID: "0e1e7299-4168-4aba-917d-dea25752e400"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Sep 30 12:40:43 crc kubenswrapper[4672]: I0930 12:40:43.124506 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e1e7299-4168-4aba-917d-dea25752e400-kube-api-access-bghck" (OuterVolumeSpecName: "kube-api-access-bghck") pod "0e1e7299-4168-4aba-917d-dea25752e400" (UID: "0e1e7299-4168-4aba-917d-dea25752e400"). InnerVolumeSpecName "kube-api-access-bghck". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 12:40:43 crc kubenswrapper[4672]: I0930 12:40:43.125763 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e1e7299-4168-4aba-917d-dea25752e400-scripts" (OuterVolumeSpecName: "scripts") pod "0e1e7299-4168-4aba-917d-dea25752e400" (UID: "0e1e7299-4168-4aba-917d-dea25752e400"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:40:43 crc kubenswrapper[4672]: I0930 12:40:43.127016 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8mjmq\" (UniqueName: \"kubernetes.io/projected/f4260988-bcca-456e-a745-ee5789116a84-kube-api-access-8mjmq\") pod \"nova-cell1-db-create-88sxc\" (UID: \"f4260988-bcca-456e-a745-ee5789116a84\") " pod="openstack/nova-cell1-db-create-88sxc" Sep 30 12:40:43 crc kubenswrapper[4672]: I0930 12:40:43.211678 4672 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Sep 30 12:40:43 crc kubenswrapper[4672]: I0930 12:40:43.211717 4672 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0e1e7299-4168-4aba-917d-dea25752e400-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 12:40:43 crc kubenswrapper[4672]: I0930 12:40:43.211729 4672 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0e1e7299-4168-4aba-917d-dea25752e400-httpd-run\") on node \"crc\" DevicePath \"\"" Sep 30 12:40:43 crc kubenswrapper[4672]: I0930 12:40:43.211739 4672 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0e1e7299-4168-4aba-917d-dea25752e400-logs\") on node \"crc\" DevicePath \"\"" Sep 30 12:40:43 crc kubenswrapper[4672]: I0930 12:40:43.211750 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bghck\" (UniqueName: \"kubernetes.io/projected/0e1e7299-4168-4aba-917d-dea25752e400-kube-api-access-bghck\") on node \"crc\" DevicePath \"\"" Sep 30 12:40:43 crc kubenswrapper[4672]: I0930 12:40:43.250438 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e1e7299-4168-4aba-917d-dea25752e400-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0e1e7299-4168-4aba-917d-dea25752e400" (UID: "0e1e7299-4168-4aba-917d-dea25752e400"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:40:43 crc kubenswrapper[4672]: I0930 12:40:43.266794 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e1e7299-4168-4aba-917d-dea25752e400-config-data" (OuterVolumeSpecName: "config-data") pod "0e1e7299-4168-4aba-917d-dea25752e400" (UID: "0e1e7299-4168-4aba-917d-dea25752e400"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:40:43 crc kubenswrapper[4672]: I0930 12:40:43.297029 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-88sxc" Sep 30 12:40:43 crc kubenswrapper[4672]: I0930 12:40:43.313587 4672 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e1e7299-4168-4aba-917d-dea25752e400-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 12:40:43 crc kubenswrapper[4672]: I0930 12:40:43.313628 4672 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e1e7299-4168-4aba-917d-dea25752e400-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 12:40:43 crc kubenswrapper[4672]: I0930 12:40:43.314416 4672 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Sep 30 12:40:43 crc kubenswrapper[4672]: I0930 12:40:43.314472 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e1e7299-4168-4aba-917d-dea25752e400-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "0e1e7299-4168-4aba-917d-dea25752e400" (UID: "0e1e7299-4168-4aba-917d-dea25752e400"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:40:43 crc kubenswrapper[4672]: I0930 12:40:43.416877 4672 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Sep 30 12:40:43 crc kubenswrapper[4672]: I0930 12:40:43.416913 4672 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0e1e7299-4168-4aba-917d-dea25752e400-public-tls-certs\") on node \"crc\" DevicePath \"\"" Sep 30 12:40:43 crc kubenswrapper[4672]: I0930 12:40:43.623019 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-9gpdk"] Sep 30 12:40:43 crc kubenswrapper[4672]: I0930 12:40:43.821128 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-fz28q"] Sep 30 12:40:43 crc kubenswrapper[4672]: W0930 12:40:43.821426 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod89525b90_403a_4caf_9443_3e041be85426.slice/crio-43de2f83d58bbeba0ded975293898603ce6d15f92a6287521e5f7afd0e4e5cd7 WatchSource:0}: Error finding container 43de2f83d58bbeba0ded975293898603ce6d15f92a6287521e5f7afd0e4e5cd7: Status 404 returned error can't find the container with id 43de2f83d58bbeba0ded975293898603ce6d15f92a6287521e5f7afd0e4e5cd7 Sep 30 12:40:43 crc kubenswrapper[4672]: I0930 12:40:43.875428 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Sep 30 12:40:43 crc kubenswrapper[4672]: I0930 12:40:43.875402 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"0e1e7299-4168-4aba-917d-dea25752e400","Type":"ContainerDied","Data":"3461f6d18872c01899f6e3b26a4fe22b34d4302715ddc7b35eee6a4dc685be53"} Sep 30 12:40:43 crc kubenswrapper[4672]: I0930 12:40:43.875873 4672 scope.go:117] "RemoveContainer" containerID="47016035c37933ccc06c63f168a8b55b6a70d23103e1681faca987a5aaac055d" Sep 30 12:40:43 crc kubenswrapper[4672]: I0930 12:40:43.894186 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-fz28q" event={"ID":"89525b90-403a-4caf-9443-3e041be85426","Type":"ContainerStarted","Data":"43de2f83d58bbeba0ded975293898603ce6d15f92a6287521e5f7afd0e4e5cd7"} Sep 30 12:40:43 crc kubenswrapper[4672]: I0930 12:40:43.896865 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-9gpdk" event={"ID":"dc00fe07-7d49-4ca0-818d-fb129880f480","Type":"ContainerStarted","Data":"2892b2b48724eb1990bc8ac7408f7ba6b20bace5d1c2520ba2c5178d1cde4560"} Sep 30 12:40:43 crc kubenswrapper[4672]: I0930 12:40:43.896914 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-9gpdk" event={"ID":"dc00fe07-7d49-4ca0-818d-fb129880f480","Type":"ContainerStarted","Data":"f220074250a4962e6d3b5df86132380ab30209b2d286644cc0008daaf5eefff6"} Sep 30 12:40:43 crc kubenswrapper[4672]: I0930 12:40:43.900168 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Sep 30 12:40:43 crc kubenswrapper[4672]: I0930 12:40:43.924040 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Sep 30 12:40:43 crc kubenswrapper[4672]: I0930 12:40:43.935486 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-db-create-9gpdk" podStartSLOduration=1.935464597 podStartE2EDuration="1.935464597s" podCreationTimestamp="2025-09-30 12:40:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 12:40:43.91896369 +0000 UTC m=+1135.188201336" watchObservedRunningTime="2025-09-30 12:40:43.935464597 +0000 UTC m=+1135.204702243" Sep 30 12:40:43 crc kubenswrapper[4672]: I0930 12:40:43.945482 4672 scope.go:117] "RemoveContainer" containerID="cf68c535814b18f82b62c7e70459b1f0472be9e598870bcc2076ead00dc819b4" Sep 30 12:40:43 crc kubenswrapper[4672]: I0930 12:40:43.947052 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Sep 30 12:40:43 crc kubenswrapper[4672]: E0930 12:40:43.947557 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e1e7299-4168-4aba-917d-dea25752e400" containerName="glance-log" Sep 30 12:40:43 crc kubenswrapper[4672]: I0930 12:40:43.947580 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e1e7299-4168-4aba-917d-dea25752e400" containerName="glance-log" Sep 30 12:40:43 crc kubenswrapper[4672]: E0930 12:40:43.947600 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e1e7299-4168-4aba-917d-dea25752e400" containerName="glance-httpd" Sep 30 12:40:43 crc kubenswrapper[4672]: I0930 12:40:43.947610 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e1e7299-4168-4aba-917d-dea25752e400" containerName="glance-httpd" Sep 30 12:40:43 crc kubenswrapper[4672]: 
I0930 12:40:43.947831 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e1e7299-4168-4aba-917d-dea25752e400" containerName="glance-log" Sep 30 12:40:43 crc kubenswrapper[4672]: I0930 12:40:43.947854 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e1e7299-4168-4aba-917d-dea25752e400" containerName="glance-httpd" Sep 30 12:40:43 crc kubenswrapper[4672]: I0930 12:40:43.949453 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Sep 30 12:40:43 crc kubenswrapper[4672]: I0930 12:40:43.951516 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Sep 30 12:40:43 crc kubenswrapper[4672]: I0930 12:40:43.957324 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Sep 30 12:40:43 crc kubenswrapper[4672]: I0930 12:40:43.984902 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Sep 30 12:40:44 crc kubenswrapper[4672]: I0930 12:40:44.014745 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-88sxc"] Sep 30 12:40:44 crc kubenswrapper[4672]: I0930 12:40:44.035881 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7983763-9bc4-4528-9cf4-f2d693c42c5f-config-data\") pod \"glance-default-external-api-0\" (UID: \"d7983763-9bc4-4528-9cf4-f2d693c42c5f\") " pod="openstack/glance-default-external-api-0" Sep 30 12:40:44 crc kubenswrapper[4672]: I0930 12:40:44.035954 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d7983763-9bc4-4528-9cf4-f2d693c42c5f-scripts\") pod \"glance-default-external-api-0\" (UID: \"d7983763-9bc4-4528-9cf4-f2d693c42c5f\") " pod="openstack/glance-default-external-api-0" Sep 30 12:40:44 crc kubenswrapper[4672]: I0930 12:40:44.035972 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g4k4b\" (UniqueName: \"kubernetes.io/projected/d7983763-9bc4-4528-9cf4-f2d693c42c5f-kube-api-access-g4k4b\") pod \"glance-default-external-api-0\" (UID: \"d7983763-9bc4-4528-9cf4-f2d693c42c5f\") " pod="openstack/glance-default-external-api-0" Sep 30 12:40:44 crc kubenswrapper[4672]: I0930 12:40:44.035993 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d7983763-9bc4-4528-9cf4-f2d693c42c5f-logs\") pod \"glance-default-external-api-0\" (UID: \"d7983763-9bc4-4528-9cf4-f2d693c42c5f\") " pod="openstack/glance-default-external-api-0" Sep 30 12:40:44 crc kubenswrapper[4672]: I0930 12:40:44.036012 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d7983763-9bc4-4528-9cf4-f2d693c42c5f-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"d7983763-9bc4-4528-9cf4-f2d693c42c5f\") " pod="openstack/glance-default-external-api-0" Sep 30 12:40:44 crc kubenswrapper[4672]: I0930 12:40:44.036037 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d7983763-9bc4-4528-9cf4-f2d693c42c5f-httpd-run\") pod \"glance-default-external-api-0\" (UID: 
\"d7983763-9bc4-4528-9cf4-f2d693c42c5f\") " pod="openstack/glance-default-external-api-0" Sep 30 12:40:44 crc kubenswrapper[4672]: I0930 12:40:44.036052 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7983763-9bc4-4528-9cf4-f2d693c42c5f-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"d7983763-9bc4-4528-9cf4-f2d693c42c5f\") " pod="openstack/glance-default-external-api-0" Sep 30 12:40:44 crc kubenswrapper[4672]: I0930 12:40:44.036072 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"d7983763-9bc4-4528-9cf4-f2d693c42c5f\") " pod="openstack/glance-default-external-api-0" Sep 30 12:40:44 crc kubenswrapper[4672]: I0930 12:40:44.138012 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d7983763-9bc4-4528-9cf4-f2d693c42c5f-scripts\") pod \"glance-default-external-api-0\" (UID: \"d7983763-9bc4-4528-9cf4-f2d693c42c5f\") " pod="openstack/glance-default-external-api-0" Sep 30 12:40:44 crc kubenswrapper[4672]: I0930 12:40:44.138055 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g4k4b\" (UniqueName: \"kubernetes.io/projected/d7983763-9bc4-4528-9cf4-f2d693c42c5f-kube-api-access-g4k4b\") pod \"glance-default-external-api-0\" (UID: \"d7983763-9bc4-4528-9cf4-f2d693c42c5f\") " pod="openstack/glance-default-external-api-0" Sep 30 12:40:44 crc kubenswrapper[4672]: I0930 12:40:44.138084 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d7983763-9bc4-4528-9cf4-f2d693c42c5f-logs\") pod \"glance-default-external-api-0\" (UID: \"d7983763-9bc4-4528-9cf4-f2d693c42c5f\") " pod="openstack/glance-default-external-api-0" Sep 30 12:40:44 crc kubenswrapper[4672]: I0930 12:40:44.138104 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d7983763-9bc4-4528-9cf4-f2d693c42c5f-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"d7983763-9bc4-4528-9cf4-f2d693c42c5f\") " pod="openstack/glance-default-external-api-0" Sep 30 12:40:44 crc kubenswrapper[4672]: I0930 12:40:44.138133 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d7983763-9bc4-4528-9cf4-f2d693c42c5f-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"d7983763-9bc4-4528-9cf4-f2d693c42c5f\") " pod="openstack/glance-default-external-api-0" Sep 30 12:40:44 crc kubenswrapper[4672]: I0930 12:40:44.138148 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7983763-9bc4-4528-9cf4-f2d693c42c5f-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"d7983763-9bc4-4528-9cf4-f2d693c42c5f\") " pod="openstack/glance-default-external-api-0" Sep 30 12:40:44 crc kubenswrapper[4672]: I0930 12:40:44.138164 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"d7983763-9bc4-4528-9cf4-f2d693c42c5f\") " 
pod="openstack/glance-default-external-api-0" Sep 30 12:40:44 crc kubenswrapper[4672]: I0930 12:40:44.138292 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7983763-9bc4-4528-9cf4-f2d693c42c5f-config-data\") pod \"glance-default-external-api-0\" (UID: \"d7983763-9bc4-4528-9cf4-f2d693c42c5f\") " pod="openstack/glance-default-external-api-0" Sep 30 12:40:44 crc kubenswrapper[4672]: I0930 12:40:44.140343 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d7983763-9bc4-4528-9cf4-f2d693c42c5f-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"d7983763-9bc4-4528-9cf4-f2d693c42c5f\") " pod="openstack/glance-default-external-api-0" Sep 30 12:40:44 crc kubenswrapper[4672]: I0930 12:40:44.140672 4672 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"d7983763-9bc4-4528-9cf4-f2d693c42c5f\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/glance-default-external-api-0" Sep 30 12:40:44 crc kubenswrapper[4672]: I0930 12:40:44.143872 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d7983763-9bc4-4528-9cf4-f2d693c42c5f-logs\") pod \"glance-default-external-api-0\" (UID: \"d7983763-9bc4-4528-9cf4-f2d693c42c5f\") " pod="openstack/glance-default-external-api-0" Sep 30 12:40:44 crc kubenswrapper[4672]: I0930 12:40:44.145216 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7983763-9bc4-4528-9cf4-f2d693c42c5f-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"d7983763-9bc4-4528-9cf4-f2d693c42c5f\") " pod="openstack/glance-default-external-api-0" Sep 30 12:40:44 crc kubenswrapper[4672]: I0930 12:40:44.146589 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d7983763-9bc4-4528-9cf4-f2d693c42c5f-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"d7983763-9bc4-4528-9cf4-f2d693c42c5f\") " pod="openstack/glance-default-external-api-0" Sep 30 12:40:44 crc kubenswrapper[4672]: I0930 12:40:44.146729 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d7983763-9bc4-4528-9cf4-f2d693c42c5f-scripts\") pod \"glance-default-external-api-0\" (UID: \"d7983763-9bc4-4528-9cf4-f2d693c42c5f\") " pod="openstack/glance-default-external-api-0" Sep 30 12:40:44 crc kubenswrapper[4672]: I0930 12:40:44.150471 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7983763-9bc4-4528-9cf4-f2d693c42c5f-config-data\") pod \"glance-default-external-api-0\" (UID: \"d7983763-9bc4-4528-9cf4-f2d693c42c5f\") " pod="openstack/glance-default-external-api-0" Sep 30 12:40:44 crc kubenswrapper[4672]: I0930 12:40:44.164245 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g4k4b\" (UniqueName: \"kubernetes.io/projected/d7983763-9bc4-4528-9cf4-f2d693c42c5f-kube-api-access-g4k4b\") pod \"glance-default-external-api-0\" (UID: \"d7983763-9bc4-4528-9cf4-f2d693c42c5f\") " pod="openstack/glance-default-external-api-0" Sep 30 12:40:44 crc kubenswrapper[4672]: I0930 12:40:44.202849 4672 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"d7983763-9bc4-4528-9cf4-f2d693c42c5f\") " pod="openstack/glance-default-external-api-0" Sep 30 12:40:44 crc kubenswrapper[4672]: I0930 12:40:44.299903 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Sep 30 12:40:44 crc kubenswrapper[4672]: I0930 12:40:44.728591 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Sep 30 12:40:44 crc kubenswrapper[4672]: I0930 12:40:44.862230 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c6a15e81-6934-4bba-818b-df43d93cbd0e-scripts\") pod \"c6a15e81-6934-4bba-818b-df43d93cbd0e\" (UID: \"c6a15e81-6934-4bba-818b-df43d93cbd0e\") " Sep 30 12:40:44 crc kubenswrapper[4672]: I0930 12:40:44.863451 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7v6p2\" (UniqueName: \"kubernetes.io/projected/c6a15e81-6934-4bba-818b-df43d93cbd0e-kube-api-access-7v6p2\") pod \"c6a15e81-6934-4bba-818b-df43d93cbd0e\" (UID: \"c6a15e81-6934-4bba-818b-df43d93cbd0e\") " Sep 30 12:40:44 crc kubenswrapper[4672]: I0930 12:40:44.863684 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c6a15e81-6934-4bba-818b-df43d93cbd0e-config-data\") pod \"c6a15e81-6934-4bba-818b-df43d93cbd0e\" (UID: \"c6a15e81-6934-4bba-818b-df43d93cbd0e\") " Sep 30 12:40:44 crc kubenswrapper[4672]: I0930 12:40:44.863854 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c6a15e81-6934-4bba-818b-df43d93cbd0e-run-httpd\") pod \"c6a15e81-6934-4bba-818b-df43d93cbd0e\" (UID: \"c6a15e81-6934-4bba-818b-df43d93cbd0e\") " Sep 30 12:40:44 crc kubenswrapper[4672]: I0930 12:40:44.864008 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c6a15e81-6934-4bba-818b-df43d93cbd0e-sg-core-conf-yaml\") pod \"c6a15e81-6934-4bba-818b-df43d93cbd0e\" (UID: \"c6a15e81-6934-4bba-818b-df43d93cbd0e\") " Sep 30 12:40:44 crc kubenswrapper[4672]: I0930 12:40:44.864112 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6a15e81-6934-4bba-818b-df43d93cbd0e-combined-ca-bundle\") pod \"c6a15e81-6934-4bba-818b-df43d93cbd0e\" (UID: \"c6a15e81-6934-4bba-818b-df43d93cbd0e\") " Sep 30 12:40:44 crc kubenswrapper[4672]: I0930 12:40:44.864251 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c6a15e81-6934-4bba-818b-df43d93cbd0e-log-httpd\") pod \"c6a15e81-6934-4bba-818b-df43d93cbd0e\" (UID: \"c6a15e81-6934-4bba-818b-df43d93cbd0e\") " Sep 30 12:40:44 crc kubenswrapper[4672]: I0930 12:40:44.865645 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c6a15e81-6934-4bba-818b-df43d93cbd0e-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "c6a15e81-6934-4bba-818b-df43d93cbd0e" (UID: "c6a15e81-6934-4bba-818b-df43d93cbd0e"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 12:40:44 crc kubenswrapper[4672]: I0930 12:40:44.902198 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c6a15e81-6934-4bba-818b-df43d93cbd0e-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "c6a15e81-6934-4bba-818b-df43d93cbd0e" (UID: "c6a15e81-6934-4bba-818b-df43d93cbd0e"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 12:40:44 crc kubenswrapper[4672]: I0930 12:40:44.907583 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c6a15e81-6934-4bba-818b-df43d93cbd0e-scripts" (OuterVolumeSpecName: "scripts") pod "c6a15e81-6934-4bba-818b-df43d93cbd0e" (UID: "c6a15e81-6934-4bba-818b-df43d93cbd0e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:40:44 crc kubenswrapper[4672]: I0930 12:40:44.907685 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c6a15e81-6934-4bba-818b-df43d93cbd0e-kube-api-access-7v6p2" (OuterVolumeSpecName: "kube-api-access-7v6p2") pod "c6a15e81-6934-4bba-818b-df43d93cbd0e" (UID: "c6a15e81-6934-4bba-818b-df43d93cbd0e"). InnerVolumeSpecName "kube-api-access-7v6p2". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 12:40:44 crc kubenswrapper[4672]: I0930 12:40:44.966621 4672 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c6a15e81-6934-4bba-818b-df43d93cbd0e-log-httpd\") on node \"crc\" DevicePath \"\"" Sep 30 12:40:44 crc kubenswrapper[4672]: I0930 12:40:44.966645 4672 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c6a15e81-6934-4bba-818b-df43d93cbd0e-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 12:40:44 crc kubenswrapper[4672]: I0930 12:40:44.966654 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7v6p2\" (UniqueName: \"kubernetes.io/projected/c6a15e81-6934-4bba-818b-df43d93cbd0e-kube-api-access-7v6p2\") on node \"crc\" DevicePath \"\"" Sep 30 12:40:44 crc kubenswrapper[4672]: I0930 12:40:44.966666 4672 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c6a15e81-6934-4bba-818b-df43d93cbd0e-run-httpd\") on node \"crc\" DevicePath \"\"" Sep 30 12:40:44 crc kubenswrapper[4672]: I0930 12:40:44.970609 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c6a15e81-6934-4bba-818b-df43d93cbd0e-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "c6a15e81-6934-4bba-818b-df43d93cbd0e" (UID: "c6a15e81-6934-4bba-818b-df43d93cbd0e"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:40:44 crc kubenswrapper[4672]: I0930 12:40:44.974503 4672 generic.go:334] "Generic (PLEG): container finished" podID="c6a15e81-6934-4bba-818b-df43d93cbd0e" containerID="2ea8295d08c2ad82e72e9839f2f51a033a8f1a727945f7163f4efcf15500fd8c" exitCode=0 Sep 30 12:40:44 crc kubenswrapper[4672]: I0930 12:40:44.974667 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c6a15e81-6934-4bba-818b-df43d93cbd0e","Type":"ContainerDied","Data":"2ea8295d08c2ad82e72e9839f2f51a033a8f1a727945f7163f4efcf15500fd8c"} Sep 30 12:40:44 crc kubenswrapper[4672]: I0930 12:40:44.974709 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Sep 30 12:40:44 crc kubenswrapper[4672]: I0930 12:40:44.974730 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c6a15e81-6934-4bba-818b-df43d93cbd0e","Type":"ContainerDied","Data":"6743952ad65a1c6c29961c011f5bd7fec59d05677d87779bec0f79c84bf8253f"} Sep 30 12:40:44 crc kubenswrapper[4672]: I0930 12:40:44.974761 4672 scope.go:117] "RemoveContainer" containerID="55a7c1f3d44527ef207b2df72b4544080e8c1c4779d50cb372d70b244531b39a" Sep 30 12:40:44 crc kubenswrapper[4672]: I0930 12:40:44.977361 4672 generic.go:334] "Generic (PLEG): container finished" podID="89525b90-403a-4caf-9443-3e041be85426" containerID="edb9672464a24ea19910f76e267ccd6e0599d16dc31244b6113ec39e94c2f348" exitCode=0 Sep 30 12:40:44 crc kubenswrapper[4672]: I0930 12:40:44.977441 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-fz28q" event={"ID":"89525b90-403a-4caf-9443-3e041be85426","Type":"ContainerDied","Data":"edb9672464a24ea19910f76e267ccd6e0599d16dc31244b6113ec39e94c2f348"} Sep 30 12:40:44 crc kubenswrapper[4672]: I0930 12:40:44.982869 4672 generic.go:334] "Generic (PLEG): container finished" podID="f4260988-bcca-456e-a745-ee5789116a84" containerID="40a4bdd26eb215a80d023f5bc8f7474f038dd5c1192cea14e18c5bddfe8626f0" exitCode=0 Sep 30 12:40:44 crc kubenswrapper[4672]: I0930 12:40:44.982967 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-88sxc" event={"ID":"f4260988-bcca-456e-a745-ee5789116a84","Type":"ContainerDied","Data":"40a4bdd26eb215a80d023f5bc8f7474f038dd5c1192cea14e18c5bddfe8626f0"} Sep 30 12:40:44 crc kubenswrapper[4672]: I0930 12:40:44.983014 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-88sxc" event={"ID":"f4260988-bcca-456e-a745-ee5789116a84","Type":"ContainerStarted","Data":"efcb59ebe36c89cf952ce49387ac7cffd2c097508e772fe0feb30027778345fd"} Sep 30 12:40:44 crc kubenswrapper[4672]: I0930 12:40:44.986931 4672 generic.go:334] "Generic (PLEG): container finished" podID="dc00fe07-7d49-4ca0-818d-fb129880f480" containerID="2892b2b48724eb1990bc8ac7408f7ba6b20bace5d1c2520ba2c5178d1cde4560" exitCode=0 Sep 30 12:40:44 crc kubenswrapper[4672]: I0930 12:40:44.986982 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-9gpdk" event={"ID":"dc00fe07-7d49-4ca0-818d-fb129880f480","Type":"ContainerDied","Data":"2892b2b48724eb1990bc8ac7408f7ba6b20bace5d1c2520ba2c5178d1cde4560"} Sep 30 12:40:45 crc kubenswrapper[4672]: I0930 12:40:45.047507 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c6a15e81-6934-4bba-818b-df43d93cbd0e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c6a15e81-6934-4bba-818b-df43d93cbd0e" (UID: "c6a15e81-6934-4bba-818b-df43d93cbd0e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:40:45 crc kubenswrapper[4672]: I0930 12:40:45.076388 4672 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c6a15e81-6934-4bba-818b-df43d93cbd0e-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Sep 30 12:40:45 crc kubenswrapper[4672]: I0930 12:40:45.076420 4672 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6a15e81-6934-4bba-818b-df43d93cbd0e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 12:40:45 crc kubenswrapper[4672]: I0930 12:40:45.083400 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Sep 30 12:40:45 crc kubenswrapper[4672]: I0930 12:40:45.090451 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c6a15e81-6934-4bba-818b-df43d93cbd0e-config-data" (OuterVolumeSpecName: "config-data") pod "c6a15e81-6934-4bba-818b-df43d93cbd0e" (UID: "c6a15e81-6934-4bba-818b-df43d93cbd0e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:40:45 crc kubenswrapper[4672]: I0930 12:40:45.178195 4672 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c6a15e81-6934-4bba-818b-df43d93cbd0e-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 12:40:45 crc kubenswrapper[4672]: I0930 12:40:45.186534 4672 scope.go:117] "RemoveContainer" containerID="66c976c0fd55d478e431c6cd55789e719ed88e1bef409c95fabc754feca39303" Sep 30 12:40:45 crc kubenswrapper[4672]: I0930 12:40:45.208101 4672 scope.go:117] "RemoveContainer" containerID="34b1d86c515b99caba9c873f8c732fb65a6702d3f8185e0577f9eabc26c1c21a" Sep 30 12:40:45 crc kubenswrapper[4672]: I0930 12:40:45.226578 4672 scope.go:117] "RemoveContainer" containerID="2ea8295d08c2ad82e72e9839f2f51a033a8f1a727945f7163f4efcf15500fd8c" Sep 30 12:40:45 crc kubenswrapper[4672]: I0930 12:40:45.292450 4672 scope.go:117] "RemoveContainer" containerID="55a7c1f3d44527ef207b2df72b4544080e8c1c4779d50cb372d70b244531b39a" Sep 30 12:40:45 crc kubenswrapper[4672]: E0930 12:40:45.293447 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"55a7c1f3d44527ef207b2df72b4544080e8c1c4779d50cb372d70b244531b39a\": container with ID starting with 55a7c1f3d44527ef207b2df72b4544080e8c1c4779d50cb372d70b244531b39a not found: ID does not exist" containerID="55a7c1f3d44527ef207b2df72b4544080e8c1c4779d50cb372d70b244531b39a" Sep 30 12:40:45 crc kubenswrapper[4672]: I0930 12:40:45.293507 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"55a7c1f3d44527ef207b2df72b4544080e8c1c4779d50cb372d70b244531b39a"} err="failed to get container status \"55a7c1f3d44527ef207b2df72b4544080e8c1c4779d50cb372d70b244531b39a\": rpc error: code = NotFound desc = could not find container \"55a7c1f3d44527ef207b2df72b4544080e8c1c4779d50cb372d70b244531b39a\": container with ID starting with 55a7c1f3d44527ef207b2df72b4544080e8c1c4779d50cb372d70b244531b39a not found: ID does not exist" Sep 30 12:40:45 crc kubenswrapper[4672]: I0930 12:40:45.293528 4672 scope.go:117] "RemoveContainer" containerID="66c976c0fd55d478e431c6cd55789e719ed88e1bef409c95fabc754feca39303" Sep 30 12:40:45 crc kubenswrapper[4672]: E0930 12:40:45.293848 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = 
NotFound desc = could not find container \"66c976c0fd55d478e431c6cd55789e719ed88e1bef409c95fabc754feca39303\": container with ID starting with 66c976c0fd55d478e431c6cd55789e719ed88e1bef409c95fabc754feca39303 not found: ID does not exist" containerID="66c976c0fd55d478e431c6cd55789e719ed88e1bef409c95fabc754feca39303" Sep 30 12:40:45 crc kubenswrapper[4672]: I0930 12:40:45.293878 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"66c976c0fd55d478e431c6cd55789e719ed88e1bef409c95fabc754feca39303"} err="failed to get container status \"66c976c0fd55d478e431c6cd55789e719ed88e1bef409c95fabc754feca39303\": rpc error: code = NotFound desc = could not find container \"66c976c0fd55d478e431c6cd55789e719ed88e1bef409c95fabc754feca39303\": container with ID starting with 66c976c0fd55d478e431c6cd55789e719ed88e1bef409c95fabc754feca39303 not found: ID does not exist" Sep 30 12:40:45 crc kubenswrapper[4672]: I0930 12:40:45.293899 4672 scope.go:117] "RemoveContainer" containerID="34b1d86c515b99caba9c873f8c732fb65a6702d3f8185e0577f9eabc26c1c21a" Sep 30 12:40:45 crc kubenswrapper[4672]: E0930 12:40:45.294165 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"34b1d86c515b99caba9c873f8c732fb65a6702d3f8185e0577f9eabc26c1c21a\": container with ID starting with 34b1d86c515b99caba9c873f8c732fb65a6702d3f8185e0577f9eabc26c1c21a not found: ID does not exist" containerID="34b1d86c515b99caba9c873f8c732fb65a6702d3f8185e0577f9eabc26c1c21a" Sep 30 12:40:45 crc kubenswrapper[4672]: I0930 12:40:45.294215 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"34b1d86c515b99caba9c873f8c732fb65a6702d3f8185e0577f9eabc26c1c21a"} err="failed to get container status \"34b1d86c515b99caba9c873f8c732fb65a6702d3f8185e0577f9eabc26c1c21a\": rpc error: code = NotFound desc = could not find container \"34b1d86c515b99caba9c873f8c732fb65a6702d3f8185e0577f9eabc26c1c21a\": container with ID starting with 34b1d86c515b99caba9c873f8c732fb65a6702d3f8185e0577f9eabc26c1c21a not found: ID does not exist" Sep 30 12:40:45 crc kubenswrapper[4672]: I0930 12:40:45.294248 4672 scope.go:117] "RemoveContainer" containerID="2ea8295d08c2ad82e72e9839f2f51a033a8f1a727945f7163f4efcf15500fd8c" Sep 30 12:40:45 crc kubenswrapper[4672]: E0930 12:40:45.294581 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2ea8295d08c2ad82e72e9839f2f51a033a8f1a727945f7163f4efcf15500fd8c\": container with ID starting with 2ea8295d08c2ad82e72e9839f2f51a033a8f1a727945f7163f4efcf15500fd8c not found: ID does not exist" containerID="2ea8295d08c2ad82e72e9839f2f51a033a8f1a727945f7163f4efcf15500fd8c" Sep 30 12:40:45 crc kubenswrapper[4672]: I0930 12:40:45.294615 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ea8295d08c2ad82e72e9839f2f51a033a8f1a727945f7163f4efcf15500fd8c"} err="failed to get container status \"2ea8295d08c2ad82e72e9839f2f51a033a8f1a727945f7163f4efcf15500fd8c\": rpc error: code = NotFound desc = could not find container \"2ea8295d08c2ad82e72e9839f2f51a033a8f1a727945f7163f4efcf15500fd8c\": container with ID starting with 2ea8295d08c2ad82e72e9839f2f51a033a8f1a727945f7163f4efcf15500fd8c not found: ID does not exist" Sep 30 12:40:45 crc kubenswrapper[4672]: I0930 12:40:45.345358 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Sep 30 12:40:45 crc 
kubenswrapper[4672]: I0930 12:40:45.357834 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Sep 30 12:40:45 crc kubenswrapper[4672]: I0930 12:40:45.368353 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Sep 30 12:40:45 crc kubenswrapper[4672]: E0930 12:40:45.368866 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6a15e81-6934-4bba-818b-df43d93cbd0e" containerName="sg-core" Sep 30 12:40:45 crc kubenswrapper[4672]: I0930 12:40:45.368889 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6a15e81-6934-4bba-818b-df43d93cbd0e" containerName="sg-core" Sep 30 12:40:45 crc kubenswrapper[4672]: E0930 12:40:45.368923 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6a15e81-6934-4bba-818b-df43d93cbd0e" containerName="ceilometer-notification-agent" Sep 30 12:40:45 crc kubenswrapper[4672]: I0930 12:40:45.368933 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6a15e81-6934-4bba-818b-df43d93cbd0e" containerName="ceilometer-notification-agent" Sep 30 12:40:45 crc kubenswrapper[4672]: E0930 12:40:45.368944 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6a15e81-6934-4bba-818b-df43d93cbd0e" containerName="ceilometer-central-agent" Sep 30 12:40:45 crc kubenswrapper[4672]: I0930 12:40:45.368951 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6a15e81-6934-4bba-818b-df43d93cbd0e" containerName="ceilometer-central-agent" Sep 30 12:40:45 crc kubenswrapper[4672]: E0930 12:40:45.368970 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6a15e81-6934-4bba-818b-df43d93cbd0e" containerName="proxy-httpd" Sep 30 12:40:45 crc kubenswrapper[4672]: I0930 12:40:45.368976 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6a15e81-6934-4bba-818b-df43d93cbd0e" containerName="proxy-httpd" Sep 30 12:40:45 crc kubenswrapper[4672]: I0930 12:40:45.369155 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="c6a15e81-6934-4bba-818b-df43d93cbd0e" containerName="sg-core" Sep 30 12:40:45 crc kubenswrapper[4672]: I0930 12:40:45.369168 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="c6a15e81-6934-4bba-818b-df43d93cbd0e" containerName="proxy-httpd" Sep 30 12:40:45 crc kubenswrapper[4672]: I0930 12:40:45.369180 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="c6a15e81-6934-4bba-818b-df43d93cbd0e" containerName="ceilometer-central-agent" Sep 30 12:40:45 crc kubenswrapper[4672]: I0930 12:40:45.369190 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="c6a15e81-6934-4bba-818b-df43d93cbd0e" containerName="ceilometer-notification-agent" Sep 30 12:40:45 crc kubenswrapper[4672]: I0930 12:40:45.371148 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Sep 30 12:40:45 crc kubenswrapper[4672]: I0930 12:40:45.374471 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Sep 30 12:40:45 crc kubenswrapper[4672]: I0930 12:40:45.374789 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Sep 30 12:40:45 crc kubenswrapper[4672]: I0930 12:40:45.382526 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0" Sep 30 12:40:45 crc kubenswrapper[4672]: I0930 12:40:45.390430 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Sep 30 12:40:45 crc kubenswrapper[4672]: I0930 12:40:45.448949 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0e1e7299-4168-4aba-917d-dea25752e400" path="/var/lib/kubelet/pods/0e1e7299-4168-4aba-917d-dea25752e400/volumes" Sep 30 12:40:45 crc kubenswrapper[4672]: I0930 12:40:45.450860 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c6a15e81-6934-4bba-818b-df43d93cbd0e" path="/var/lib/kubelet/pods/c6a15e81-6934-4bba-818b-df43d93cbd0e/volumes" Sep 30 12:40:45 crc kubenswrapper[4672]: I0930 12:40:45.451641 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-decision-engine-0" Sep 30 12:40:45 crc kubenswrapper[4672]: I0930 12:40:45.485871 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dc8b475b-410d-43fb-8d3f-971557bd9421-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"dc8b475b-410d-43fb-8d3f-971557bd9421\") " pod="openstack/ceilometer-0" Sep 30 12:40:45 crc kubenswrapper[4672]: I0930 12:40:45.485981 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc8b475b-410d-43fb-8d3f-971557bd9421-config-data\") pod \"ceilometer-0\" (UID: \"dc8b475b-410d-43fb-8d3f-971557bd9421\") " pod="openstack/ceilometer-0" Sep 30 12:40:45 crc kubenswrapper[4672]: I0930 12:40:45.486022 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dc8b475b-410d-43fb-8d3f-971557bd9421-log-httpd\") pod \"ceilometer-0\" (UID: \"dc8b475b-410d-43fb-8d3f-971557bd9421\") " pod="openstack/ceilometer-0" Sep 30 12:40:45 crc kubenswrapper[4672]: I0930 12:40:45.486042 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc8b475b-410d-43fb-8d3f-971557bd9421-scripts\") pod \"ceilometer-0\" (UID: \"dc8b475b-410d-43fb-8d3f-971557bd9421\") " pod="openstack/ceilometer-0" Sep 30 12:40:45 crc kubenswrapper[4672]: I0930 12:40:45.486063 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5xrd9\" (UniqueName: \"kubernetes.io/projected/dc8b475b-410d-43fb-8d3f-971557bd9421-kube-api-access-5xrd9\") pod \"ceilometer-0\" (UID: \"dc8b475b-410d-43fb-8d3f-971557bd9421\") " pod="openstack/ceilometer-0" Sep 30 12:40:45 crc kubenswrapper[4672]: I0930 12:40:45.486082 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc8b475b-410d-43fb-8d3f-971557bd9421-combined-ca-bundle\") pod \"ceilometer-0\" 
(UID: \"dc8b475b-410d-43fb-8d3f-971557bd9421\") " pod="openstack/ceilometer-0" Sep 30 12:40:45 crc kubenswrapper[4672]: I0930 12:40:45.486106 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dc8b475b-410d-43fb-8d3f-971557bd9421-run-httpd\") pod \"ceilometer-0\" (UID: \"dc8b475b-410d-43fb-8d3f-971557bd9421\") " pod="openstack/ceilometer-0" Sep 30 12:40:45 crc kubenswrapper[4672]: I0930 12:40:45.587738 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dc8b475b-410d-43fb-8d3f-971557bd9421-log-httpd\") pod \"ceilometer-0\" (UID: \"dc8b475b-410d-43fb-8d3f-971557bd9421\") " pod="openstack/ceilometer-0" Sep 30 12:40:45 crc kubenswrapper[4672]: I0930 12:40:45.587785 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc8b475b-410d-43fb-8d3f-971557bd9421-scripts\") pod \"ceilometer-0\" (UID: \"dc8b475b-410d-43fb-8d3f-971557bd9421\") " pod="openstack/ceilometer-0" Sep 30 12:40:45 crc kubenswrapper[4672]: I0930 12:40:45.587829 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5xrd9\" (UniqueName: \"kubernetes.io/projected/dc8b475b-410d-43fb-8d3f-971557bd9421-kube-api-access-5xrd9\") pod \"ceilometer-0\" (UID: \"dc8b475b-410d-43fb-8d3f-971557bd9421\") " pod="openstack/ceilometer-0" Sep 30 12:40:45 crc kubenswrapper[4672]: I0930 12:40:45.587853 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc8b475b-410d-43fb-8d3f-971557bd9421-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"dc8b475b-410d-43fb-8d3f-971557bd9421\") " pod="openstack/ceilometer-0" Sep 30 12:40:45 crc kubenswrapper[4672]: I0930 12:40:45.587894 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dc8b475b-410d-43fb-8d3f-971557bd9421-run-httpd\") pod \"ceilometer-0\" (UID: \"dc8b475b-410d-43fb-8d3f-971557bd9421\") " pod="openstack/ceilometer-0" Sep 30 12:40:45 crc kubenswrapper[4672]: I0930 12:40:45.587990 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dc8b475b-410d-43fb-8d3f-971557bd9421-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"dc8b475b-410d-43fb-8d3f-971557bd9421\") " pod="openstack/ceilometer-0" Sep 30 12:40:45 crc kubenswrapper[4672]: I0930 12:40:45.588074 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc8b475b-410d-43fb-8d3f-971557bd9421-config-data\") pod \"ceilometer-0\" (UID: \"dc8b475b-410d-43fb-8d3f-971557bd9421\") " pod="openstack/ceilometer-0" Sep 30 12:40:45 crc kubenswrapper[4672]: I0930 12:40:45.589274 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dc8b475b-410d-43fb-8d3f-971557bd9421-run-httpd\") pod \"ceilometer-0\" (UID: \"dc8b475b-410d-43fb-8d3f-971557bd9421\") " pod="openstack/ceilometer-0" Sep 30 12:40:45 crc kubenswrapper[4672]: I0930 12:40:45.589331 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dc8b475b-410d-43fb-8d3f-971557bd9421-log-httpd\") pod \"ceilometer-0\" (UID: 
\"dc8b475b-410d-43fb-8d3f-971557bd9421\") " pod="openstack/ceilometer-0" Sep 30 12:40:45 crc kubenswrapper[4672]: I0930 12:40:45.592706 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc8b475b-410d-43fb-8d3f-971557bd9421-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"dc8b475b-410d-43fb-8d3f-971557bd9421\") " pod="openstack/ceilometer-0" Sep 30 12:40:45 crc kubenswrapper[4672]: I0930 12:40:45.593687 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc8b475b-410d-43fb-8d3f-971557bd9421-config-data\") pod \"ceilometer-0\" (UID: \"dc8b475b-410d-43fb-8d3f-971557bd9421\") " pod="openstack/ceilometer-0" Sep 30 12:40:45 crc kubenswrapper[4672]: I0930 12:40:45.593935 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dc8b475b-410d-43fb-8d3f-971557bd9421-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"dc8b475b-410d-43fb-8d3f-971557bd9421\") " pod="openstack/ceilometer-0" Sep 30 12:40:45 crc kubenswrapper[4672]: I0930 12:40:45.601906 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc8b475b-410d-43fb-8d3f-971557bd9421-scripts\") pod \"ceilometer-0\" (UID: \"dc8b475b-410d-43fb-8d3f-971557bd9421\") " pod="openstack/ceilometer-0" Sep 30 12:40:45 crc kubenswrapper[4672]: I0930 12:40:45.609049 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5xrd9\" (UniqueName: \"kubernetes.io/projected/dc8b475b-410d-43fb-8d3f-971557bd9421-kube-api-access-5xrd9\") pod \"ceilometer-0\" (UID: \"dc8b475b-410d-43fb-8d3f-971557bd9421\") " pod="openstack/ceilometer-0" Sep 30 12:40:45 crc kubenswrapper[4672]: I0930 12:40:45.707528 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Sep 30 12:40:45 crc kubenswrapper[4672]: I0930 12:40:45.708856 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Sep 30 12:40:45 crc kubenswrapper[4672]: I0930 12:40:45.711755 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Sep 30 12:40:45 crc kubenswrapper[4672]: I0930 12:40:45.754922 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Sep 30 12:40:45 crc kubenswrapper[4672]: I0930 12:40:45.763691 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Sep 30 12:40:46 crc kubenswrapper[4672]: I0930 12:40:46.024093 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"d7983763-9bc4-4528-9cf4-f2d693c42c5f","Type":"ContainerStarted","Data":"2a63d6a1875bcedecf87c10dc5e03922020ab17b952218a43b35ea9a19f97053"} Sep 30 12:40:46 crc kubenswrapper[4672]: I0930 12:40:46.024403 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"d7983763-9bc4-4528-9cf4-f2d693c42c5f","Type":"ContainerStarted","Data":"5a7d25c1cd2b0063575e71ad42c404b96b74097c31c883e861d1c24ff96a42af"} Sep 30 12:40:46 crc kubenswrapper[4672]: I0930 12:40:46.024568 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Sep 30 12:40:46 crc kubenswrapper[4672]: I0930 12:40:46.024659 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Sep 30 12:40:46 crc kubenswrapper[4672]: I0930 12:40:46.024673 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-decision-engine-0" Sep 30 12:40:46 crc kubenswrapper[4672]: I0930 12:40:46.111330 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-decision-engine-0" Sep 30 12:40:46 crc kubenswrapper[4672]: I0930 12:40:46.214896 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Sep 30 12:40:46 crc kubenswrapper[4672]: W0930 12:40:46.218617 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddc8b475b_410d_43fb_8d3f_971557bd9421.slice/crio-4e2598af90b7dfc44541f0261a0f37980750288518cfa22d88ad1dcdb98d25f8 WatchSource:0}: Error finding container 4e2598af90b7dfc44541f0261a0f37980750288518cfa22d88ad1dcdb98d25f8: Status 404 returned error can't find the container with id 4e2598af90b7dfc44541f0261a0f37980750288518cfa22d88ad1dcdb98d25f8 Sep 30 12:40:46 crc kubenswrapper[4672]: I0930 12:40:46.585420 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-9gpdk" Sep 30 12:40:46 crc kubenswrapper[4672]: I0930 12:40:46.713052 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5dfv5\" (UniqueName: \"kubernetes.io/projected/dc00fe07-7d49-4ca0-818d-fb129880f480-kube-api-access-5dfv5\") pod \"dc00fe07-7d49-4ca0-818d-fb129880f480\" (UID: \"dc00fe07-7d49-4ca0-818d-fb129880f480\") " Sep 30 12:40:46 crc kubenswrapper[4672]: I0930 12:40:46.725617 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc00fe07-7d49-4ca0-818d-fb129880f480-kube-api-access-5dfv5" (OuterVolumeSpecName: "kube-api-access-5dfv5") pod "dc00fe07-7d49-4ca0-818d-fb129880f480" (UID: "dc00fe07-7d49-4ca0-818d-fb129880f480"). InnerVolumeSpecName "kube-api-access-5dfv5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 12:40:46 crc kubenswrapper[4672]: I0930 12:40:46.734606 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5dfv5\" (UniqueName: \"kubernetes.io/projected/dc00fe07-7d49-4ca0-818d-fb129880f480-kube-api-access-5dfv5\") on node \"crc\" DevicePath \"\"" Sep 30 12:40:46 crc kubenswrapper[4672]: I0930 12:40:46.812898 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-fz28q" Sep 30 12:40:46 crc kubenswrapper[4672]: I0930 12:40:46.870047 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-88sxc" Sep 30 12:40:46 crc kubenswrapper[4672]: I0930 12:40:46.939537 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8mjmq\" (UniqueName: \"kubernetes.io/projected/f4260988-bcca-456e-a745-ee5789116a84-kube-api-access-8mjmq\") pod \"f4260988-bcca-456e-a745-ee5789116a84\" (UID: \"f4260988-bcca-456e-a745-ee5789116a84\") " Sep 30 12:40:46 crc kubenswrapper[4672]: I0930 12:40:46.939882 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9stpj\" (UniqueName: \"kubernetes.io/projected/89525b90-403a-4caf-9443-3e041be85426-kube-api-access-9stpj\") pod \"89525b90-403a-4caf-9443-3e041be85426\" (UID: \"89525b90-403a-4caf-9443-3e041be85426\") " Sep 30 12:40:46 crc kubenswrapper[4672]: I0930 12:40:46.943405 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f4260988-bcca-456e-a745-ee5789116a84-kube-api-access-8mjmq" (OuterVolumeSpecName: "kube-api-access-8mjmq") pod "f4260988-bcca-456e-a745-ee5789116a84" (UID: "f4260988-bcca-456e-a745-ee5789116a84"). InnerVolumeSpecName "kube-api-access-8mjmq". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 12:40:46 crc kubenswrapper[4672]: I0930 12:40:46.944066 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/89525b90-403a-4caf-9443-3e041be85426-kube-api-access-9stpj" (OuterVolumeSpecName: "kube-api-access-9stpj") pod "89525b90-403a-4caf-9443-3e041be85426" (UID: "89525b90-403a-4caf-9443-3e041be85426"). InnerVolumeSpecName "kube-api-access-9stpj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 12:40:47 crc kubenswrapper[4672]: I0930 12:40:47.043754 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9stpj\" (UniqueName: \"kubernetes.io/projected/89525b90-403a-4caf-9443-3e041be85426-kube-api-access-9stpj\") on node \"crc\" DevicePath \"\"" Sep 30 12:40:47 crc kubenswrapper[4672]: I0930 12:40:47.043800 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8mjmq\" (UniqueName: \"kubernetes.io/projected/f4260988-bcca-456e-a745-ee5789116a84-kube-api-access-8mjmq\") on node \"crc\" DevicePath \"\"" Sep 30 12:40:47 crc kubenswrapper[4672]: I0930 12:40:47.049158 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dc8b475b-410d-43fb-8d3f-971557bd9421","Type":"ContainerStarted","Data":"8d67694a6eee3a0a082827055f75bb86ef41d68b8136220390ca0d17bf8fd125"} Sep 30 12:40:47 crc kubenswrapper[4672]: I0930 12:40:47.049214 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dc8b475b-410d-43fb-8d3f-971557bd9421","Type":"ContainerStarted","Data":"4e2598af90b7dfc44541f0261a0f37980750288518cfa22d88ad1dcdb98d25f8"} Sep 30 12:40:47 crc kubenswrapper[4672]: I0930 12:40:47.056436 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-fz28q" Sep 30 12:40:47 crc kubenswrapper[4672]: I0930 12:40:47.057310 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-fz28q" event={"ID":"89525b90-403a-4caf-9443-3e041be85426","Type":"ContainerDied","Data":"43de2f83d58bbeba0ded975293898603ce6d15f92a6287521e5f7afd0e4e5cd7"} Sep 30 12:40:47 crc kubenswrapper[4672]: I0930 12:40:47.057354 4672 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="43de2f83d58bbeba0ded975293898603ce6d15f92a6287521e5f7afd0e4e5cd7" Sep 30 12:40:47 crc kubenswrapper[4672]: I0930 12:40:47.060735 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-88sxc" event={"ID":"f4260988-bcca-456e-a745-ee5789116a84","Type":"ContainerDied","Data":"efcb59ebe36c89cf952ce49387ac7cffd2c097508e772fe0feb30027778345fd"} Sep 30 12:40:47 crc kubenswrapper[4672]: I0930 12:40:47.060768 4672 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="efcb59ebe36c89cf952ce49387ac7cffd2c097508e772fe0feb30027778345fd" Sep 30 12:40:47 crc kubenswrapper[4672]: I0930 12:40:47.060825 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-88sxc" Sep 30 12:40:47 crc kubenswrapper[4672]: I0930 12:40:47.088891 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"d7983763-9bc4-4528-9cf4-f2d693c42c5f","Type":"ContainerStarted","Data":"6212fe07d31002ed207097ae81ce8e1c9e5c47a7965d1b38ee27dcfea0b97e08"} Sep 30 12:40:47 crc kubenswrapper[4672]: I0930 12:40:47.099711 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-9gpdk" event={"ID":"dc00fe07-7d49-4ca0-818d-fb129880f480","Type":"ContainerDied","Data":"f220074250a4962e6d3b5df86132380ab30209b2d286644cc0008daaf5eefff6"} Sep 30 12:40:47 crc kubenswrapper[4672]: I0930 12:40:47.099782 4672 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f220074250a4962e6d3b5df86132380ab30209b2d286644cc0008daaf5eefff6" Sep 30 12:40:47 crc kubenswrapper[4672]: I0930 12:40:47.099918 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-9gpdk" Sep 30 12:40:47 crc kubenswrapper[4672]: I0930 12:40:47.140216 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=4.14019838 podStartE2EDuration="4.14019838s" podCreationTimestamp="2025-09-30 12:40:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 12:40:47.138898647 +0000 UTC m=+1138.408136313" watchObservedRunningTime="2025-09-30 12:40:47.14019838 +0000 UTC m=+1138.409436026" Sep 30 12:40:48 crc kubenswrapper[4672]: I0930 12:40:48.113126 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dc8b475b-410d-43fb-8d3f-971557bd9421","Type":"ContainerStarted","Data":"80ef8a8d1b61c69770f5c4ff52ef2efd17ef3e52618d910a34907fbe09fec386"} Sep 30 12:40:48 crc kubenswrapper[4672]: I0930 12:40:48.585333 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Sep 30 12:40:48 crc kubenswrapper[4672]: I0930 12:40:48.585711 4672 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 30 12:40:48 crc kubenswrapper[4672]: I0930 12:40:48.618515 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Sep 30 12:40:49 crc kubenswrapper[4672]: I0930 12:40:49.125642 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dc8b475b-410d-43fb-8d3f-971557bd9421","Type":"ContainerStarted","Data":"c0dddcb4e32d4d41d5f92cc24d72df126bd9c5c4689b0e15796a3dc2dd789ee0"} Sep 30 12:40:50 crc kubenswrapper[4672]: I0930 12:40:50.138097 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dc8b475b-410d-43fb-8d3f-971557bd9421","Type":"ContainerStarted","Data":"a512cec3646d8110356e46f2160e1d55e53fe96d19683bdc46249aadf8698dd4"} Sep 30 12:40:50 crc kubenswrapper[4672]: I0930 12:40:50.138434 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Sep 30 12:40:50 crc kubenswrapper[4672]: I0930 12:40:50.160122 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.923055647 podStartE2EDuration="5.160103906s" podCreationTimestamp="2025-09-30 12:40:45 +0000 UTC" firstStartedPulling="2025-09-30 
12:40:46.228383148 +0000 UTC m=+1137.497620784" lastFinishedPulling="2025-09-30 12:40:49.465431387 +0000 UTC m=+1140.734669043" observedRunningTime="2025-09-30 12:40:50.156034243 +0000 UTC m=+1141.425271889" watchObservedRunningTime="2025-09-30 12:40:50.160103906 +0000 UTC m=+1141.429341552" Sep 30 12:40:52 crc kubenswrapper[4672]: I0930 12:40:52.907369 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-21c1-account-create-bsmfm"] Sep 30 12:40:52 crc kubenswrapper[4672]: E0930 12:40:52.908848 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc00fe07-7d49-4ca0-818d-fb129880f480" containerName="mariadb-database-create" Sep 30 12:40:52 crc kubenswrapper[4672]: I0930 12:40:52.908862 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc00fe07-7d49-4ca0-818d-fb129880f480" containerName="mariadb-database-create" Sep 30 12:40:52 crc kubenswrapper[4672]: E0930 12:40:52.908876 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89525b90-403a-4caf-9443-3e041be85426" containerName="mariadb-database-create" Sep 30 12:40:52 crc kubenswrapper[4672]: I0930 12:40:52.908882 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="89525b90-403a-4caf-9443-3e041be85426" containerName="mariadb-database-create" Sep 30 12:40:52 crc kubenswrapper[4672]: E0930 12:40:52.908908 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4260988-bcca-456e-a745-ee5789116a84" containerName="mariadb-database-create" Sep 30 12:40:52 crc kubenswrapper[4672]: I0930 12:40:52.908915 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4260988-bcca-456e-a745-ee5789116a84" containerName="mariadb-database-create" Sep 30 12:40:52 crc kubenswrapper[4672]: I0930 12:40:52.909323 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc00fe07-7d49-4ca0-818d-fb129880f480" containerName="mariadb-database-create" Sep 30 12:40:52 crc kubenswrapper[4672]: I0930 12:40:52.909528 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="89525b90-403a-4caf-9443-3e041be85426" containerName="mariadb-database-create" Sep 30 12:40:52 crc kubenswrapper[4672]: I0930 12:40:52.909548 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4260988-bcca-456e-a745-ee5789116a84" containerName="mariadb-database-create" Sep 30 12:40:52 crc kubenswrapper[4672]: I0930 12:40:52.910934 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-21c1-account-create-bsmfm" Sep 30 12:40:52 crc kubenswrapper[4672]: I0930 12:40:52.917054 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Sep 30 12:40:52 crc kubenswrapper[4672]: I0930 12:40:52.919374 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-21c1-account-create-bsmfm"] Sep 30 12:40:53 crc kubenswrapper[4672]: I0930 12:40:53.056820 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z2n54\" (UniqueName: \"kubernetes.io/projected/b0dd29d3-91bd-4225-bf8b-c1eb7889b373-kube-api-access-z2n54\") pod \"nova-api-21c1-account-create-bsmfm\" (UID: \"b0dd29d3-91bd-4225-bf8b-c1eb7889b373\") " pod="openstack/nova-api-21c1-account-create-bsmfm" Sep 30 12:40:53 crc kubenswrapper[4672]: I0930 12:40:53.076864 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-6afc-account-create-tbkp4"] Sep 30 12:40:53 crc kubenswrapper[4672]: I0930 12:40:53.078068 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-6afc-account-create-tbkp4" Sep 30 12:40:53 crc kubenswrapper[4672]: I0930 12:40:53.081632 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Sep 30 12:40:53 crc kubenswrapper[4672]: I0930 12:40:53.088243 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-6afc-account-create-tbkp4"] Sep 30 12:40:53 crc kubenswrapper[4672]: I0930 12:40:53.158647 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z2n54\" (UniqueName: \"kubernetes.io/projected/b0dd29d3-91bd-4225-bf8b-c1eb7889b373-kube-api-access-z2n54\") pod \"nova-api-21c1-account-create-bsmfm\" (UID: \"b0dd29d3-91bd-4225-bf8b-c1eb7889b373\") " pod="openstack/nova-api-21c1-account-create-bsmfm" Sep 30 12:40:53 crc kubenswrapper[4672]: I0930 12:40:53.158828 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ljbn2\" (UniqueName: \"kubernetes.io/projected/9274280d-7ccf-42c3-a808-ee313827e49f-kube-api-access-ljbn2\") pod \"nova-cell0-6afc-account-create-tbkp4\" (UID: \"9274280d-7ccf-42c3-a808-ee313827e49f\") " pod="openstack/nova-cell0-6afc-account-create-tbkp4" Sep 30 12:40:53 crc kubenswrapper[4672]: I0930 12:40:53.187175 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z2n54\" (UniqueName: \"kubernetes.io/projected/b0dd29d3-91bd-4225-bf8b-c1eb7889b373-kube-api-access-z2n54\") pod \"nova-api-21c1-account-create-bsmfm\" (UID: \"b0dd29d3-91bd-4225-bf8b-c1eb7889b373\") " pod="openstack/nova-api-21c1-account-create-bsmfm" Sep 30 12:40:53 crc kubenswrapper[4672]: I0930 12:40:53.241665 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-21c1-account-create-bsmfm" Sep 30 12:40:53 crc kubenswrapper[4672]: I0930 12:40:53.260893 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ljbn2\" (UniqueName: \"kubernetes.io/projected/9274280d-7ccf-42c3-a808-ee313827e49f-kube-api-access-ljbn2\") pod \"nova-cell0-6afc-account-create-tbkp4\" (UID: \"9274280d-7ccf-42c3-a808-ee313827e49f\") " pod="openstack/nova-cell0-6afc-account-create-tbkp4" Sep 30 12:40:53 crc kubenswrapper[4672]: I0930 12:40:53.292294 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ljbn2\" (UniqueName: \"kubernetes.io/projected/9274280d-7ccf-42c3-a808-ee313827e49f-kube-api-access-ljbn2\") pod \"nova-cell0-6afc-account-create-tbkp4\" (UID: \"9274280d-7ccf-42c3-a808-ee313827e49f\") " pod="openstack/nova-cell0-6afc-account-create-tbkp4" Sep 30 12:40:53 crc kubenswrapper[4672]: I0930 12:40:53.297592 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-fe4a-account-create-hllt5"] Sep 30 12:40:53 crc kubenswrapper[4672]: I0930 12:40:53.301354 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-fe4a-account-create-hllt5" Sep 30 12:40:53 crc kubenswrapper[4672]: I0930 12:40:53.305005 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Sep 30 12:40:53 crc kubenswrapper[4672]: I0930 12:40:53.334226 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-fe4a-account-create-hllt5"] Sep 30 12:40:53 crc kubenswrapper[4672]: I0930 12:40:53.398375 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-6afc-account-create-tbkp4" Sep 30 12:40:53 crc kubenswrapper[4672]: I0930 12:40:53.465645 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zrl9r\" (UniqueName: \"kubernetes.io/projected/89348083-7def-4e69-9d40-01729f2c4dfb-kube-api-access-zrl9r\") pod \"nova-cell1-fe4a-account-create-hllt5\" (UID: \"89348083-7def-4e69-9d40-01729f2c4dfb\") " pod="openstack/nova-cell1-fe4a-account-create-hllt5" Sep 30 12:40:53 crc kubenswrapper[4672]: I0930 12:40:53.567702 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zrl9r\" (UniqueName: \"kubernetes.io/projected/89348083-7def-4e69-9d40-01729f2c4dfb-kube-api-access-zrl9r\") pod \"nova-cell1-fe4a-account-create-hllt5\" (UID: \"89348083-7def-4e69-9d40-01729f2c4dfb\") " pod="openstack/nova-cell1-fe4a-account-create-hllt5" Sep 30 12:40:53 crc kubenswrapper[4672]: I0930 12:40:53.586948 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zrl9r\" (UniqueName: \"kubernetes.io/projected/89348083-7def-4e69-9d40-01729f2c4dfb-kube-api-access-zrl9r\") pod \"nova-cell1-fe4a-account-create-hllt5\" (UID: \"89348083-7def-4e69-9d40-01729f2c4dfb\") " pod="openstack/nova-cell1-fe4a-account-create-hllt5" Sep 30 12:40:53 crc kubenswrapper[4672]: I0930 12:40:53.689782 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-fe4a-account-create-hllt5" Sep 30 12:40:53 crc kubenswrapper[4672]: I0930 12:40:53.731149 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-21c1-account-create-bsmfm"] Sep 30 12:40:53 crc kubenswrapper[4672]: I0930 12:40:53.881230 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-6afc-account-create-tbkp4"] Sep 30 12:40:53 crc kubenswrapper[4672]: W0930 12:40:53.892648 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9274280d_7ccf_42c3_a808_ee313827e49f.slice/crio-3d48f675bf3b42200d3e652a6bcc98bb1f9c35d8c638dd4d3221da2cd7eaa140 WatchSource:0}: Error finding container 3d48f675bf3b42200d3e652a6bcc98bb1f9c35d8c638dd4d3221da2cd7eaa140: Status 404 returned error can't find the container with id 3d48f675bf3b42200d3e652a6bcc98bb1f9c35d8c638dd4d3221da2cd7eaa140 Sep 30 12:40:54 crc kubenswrapper[4672]: I0930 12:40:54.137216 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-fe4a-account-create-hllt5"] Sep 30 12:40:54 crc kubenswrapper[4672]: I0930 12:40:54.184881 4672 generic.go:334] "Generic (PLEG): container finished" podID="b0dd29d3-91bd-4225-bf8b-c1eb7889b373" containerID="42c89d97336c23da4d29a0127fb5c57aa98d8925028c7cd367a3ac75cf1c484f" exitCode=0 Sep 30 12:40:54 crc kubenswrapper[4672]: I0930 12:40:54.185023 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-21c1-account-create-bsmfm" event={"ID":"b0dd29d3-91bd-4225-bf8b-c1eb7889b373","Type":"ContainerDied","Data":"42c89d97336c23da4d29a0127fb5c57aa98d8925028c7cd367a3ac75cf1c484f"} Sep 30 12:40:54 crc kubenswrapper[4672]: I0930 12:40:54.185091 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-21c1-account-create-bsmfm" event={"ID":"b0dd29d3-91bd-4225-bf8b-c1eb7889b373","Type":"ContainerStarted","Data":"dd4ffd6cf5dd2d6523eacfe5b8cf392abe664cf90a88afcab467d59c586e679d"} Sep 30 12:40:54 crc kubenswrapper[4672]: I0930 12:40:54.187126 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-fe4a-account-create-hllt5" event={"ID":"89348083-7def-4e69-9d40-01729f2c4dfb","Type":"ContainerStarted","Data":"859e75da25d56047a5eabe4a17425db9743f68a224922105827f04523e7be8e2"} Sep 30 12:40:54 crc kubenswrapper[4672]: I0930 12:40:54.189247 4672 generic.go:334] "Generic (PLEG): container finished" podID="9274280d-7ccf-42c3-a808-ee313827e49f" containerID="5f7f4e9ac114726fc90f4ad308528ac4dbe89f619c61fc2f62de94c83c3c02de" exitCode=0 Sep 30 12:40:54 crc kubenswrapper[4672]: I0930 12:40:54.189309 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-6afc-account-create-tbkp4" event={"ID":"9274280d-7ccf-42c3-a808-ee313827e49f","Type":"ContainerDied","Data":"5f7f4e9ac114726fc90f4ad308528ac4dbe89f619c61fc2f62de94c83c3c02de"} Sep 30 12:40:54 crc kubenswrapper[4672]: I0930 12:40:54.189403 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-6afc-account-create-tbkp4" event={"ID":"9274280d-7ccf-42c3-a808-ee313827e49f","Type":"ContainerStarted","Data":"3d48f675bf3b42200d3e652a6bcc98bb1f9c35d8c638dd4d3221da2cd7eaa140"} Sep 30 12:40:54 crc kubenswrapper[4672]: I0930 12:40:54.300362 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Sep 30 12:40:54 crc kubenswrapper[4672]: I0930 12:40:54.300410 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openstack/glance-default-external-api-0" Sep 30 12:40:54 crc kubenswrapper[4672]: I0930 12:40:54.355149 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Sep 30 12:40:54 crc kubenswrapper[4672]: I0930 12:40:54.391703 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Sep 30 12:40:55 crc kubenswrapper[4672]: I0930 12:40:55.209092 4672 generic.go:334] "Generic (PLEG): container finished" podID="89348083-7def-4e69-9d40-01729f2c4dfb" containerID="3bb34e6aadfc75f62d8e46b67fd7d4741bac832693be48a91af15fa2e6e8490e" exitCode=0 Sep 30 12:40:55 crc kubenswrapper[4672]: I0930 12:40:55.209237 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-fe4a-account-create-hllt5" event={"ID":"89348083-7def-4e69-9d40-01729f2c4dfb","Type":"ContainerDied","Data":"3bb34e6aadfc75f62d8e46b67fd7d4741bac832693be48a91af15fa2e6e8490e"} Sep 30 12:40:55 crc kubenswrapper[4672]: I0930 12:40:55.209425 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Sep 30 12:40:55 crc kubenswrapper[4672]: I0930 12:40:55.209508 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Sep 30 12:40:55 crc kubenswrapper[4672]: I0930 12:40:55.825444 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-6afc-account-create-tbkp4" Sep 30 12:40:55 crc kubenswrapper[4672]: I0930 12:40:55.837532 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-21c1-account-create-bsmfm" Sep 30 12:40:55 crc kubenswrapper[4672]: I0930 12:40:55.917424 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z2n54\" (UniqueName: \"kubernetes.io/projected/b0dd29d3-91bd-4225-bf8b-c1eb7889b373-kube-api-access-z2n54\") pod \"b0dd29d3-91bd-4225-bf8b-c1eb7889b373\" (UID: \"b0dd29d3-91bd-4225-bf8b-c1eb7889b373\") " Sep 30 12:40:55 crc kubenswrapper[4672]: I0930 12:40:55.917730 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ljbn2\" (UniqueName: \"kubernetes.io/projected/9274280d-7ccf-42c3-a808-ee313827e49f-kube-api-access-ljbn2\") pod \"9274280d-7ccf-42c3-a808-ee313827e49f\" (UID: \"9274280d-7ccf-42c3-a808-ee313827e49f\") " Sep 30 12:40:55 crc kubenswrapper[4672]: I0930 12:40:55.922253 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b0dd29d3-91bd-4225-bf8b-c1eb7889b373-kube-api-access-z2n54" (OuterVolumeSpecName: "kube-api-access-z2n54") pod "b0dd29d3-91bd-4225-bf8b-c1eb7889b373" (UID: "b0dd29d3-91bd-4225-bf8b-c1eb7889b373"). InnerVolumeSpecName "kube-api-access-z2n54". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 12:40:55 crc kubenswrapper[4672]: I0930 12:40:55.924380 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9274280d-7ccf-42c3-a808-ee313827e49f-kube-api-access-ljbn2" (OuterVolumeSpecName: "kube-api-access-ljbn2") pod "9274280d-7ccf-42c3-a808-ee313827e49f" (UID: "9274280d-7ccf-42c3-a808-ee313827e49f"). InnerVolumeSpecName "kube-api-access-ljbn2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 12:40:56 crc kubenswrapper[4672]: I0930 12:40:56.020164 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z2n54\" (UniqueName: \"kubernetes.io/projected/b0dd29d3-91bd-4225-bf8b-c1eb7889b373-kube-api-access-z2n54\") on node \"crc\" DevicePath \"\"" Sep 30 12:40:56 crc kubenswrapper[4672]: I0930 12:40:56.020198 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ljbn2\" (UniqueName: \"kubernetes.io/projected/9274280d-7ccf-42c3-a808-ee313827e49f-kube-api-access-ljbn2\") on node \"crc\" DevicePath \"\"" Sep 30 12:40:56 crc kubenswrapper[4672]: I0930 12:40:56.220029 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-21c1-account-create-bsmfm" event={"ID":"b0dd29d3-91bd-4225-bf8b-c1eb7889b373","Type":"ContainerDied","Data":"dd4ffd6cf5dd2d6523eacfe5b8cf392abe664cf90a88afcab467d59c586e679d"} Sep 30 12:40:56 crc kubenswrapper[4672]: I0930 12:40:56.220781 4672 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dd4ffd6cf5dd2d6523eacfe5b8cf392abe664cf90a88afcab467d59c586e679d" Sep 30 12:40:56 crc kubenswrapper[4672]: I0930 12:40:56.220066 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-21c1-account-create-bsmfm" Sep 30 12:40:56 crc kubenswrapper[4672]: I0930 12:40:56.221605 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-6afc-account-create-tbkp4" event={"ID":"9274280d-7ccf-42c3-a808-ee313827e49f","Type":"ContainerDied","Data":"3d48f675bf3b42200d3e652a6bcc98bb1f9c35d8c638dd4d3221da2cd7eaa140"} Sep 30 12:40:56 crc kubenswrapper[4672]: I0930 12:40:56.221637 4672 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3d48f675bf3b42200d3e652a6bcc98bb1f9c35d8c638dd4d3221da2cd7eaa140" Sep 30 12:40:56 crc kubenswrapper[4672]: I0930 12:40:56.221735 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-6afc-account-create-tbkp4" Sep 30 12:40:56 crc kubenswrapper[4672]: I0930 12:40:56.629579 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-fe4a-account-create-hllt5" Sep 30 12:40:56 crc kubenswrapper[4672]: I0930 12:40:56.732454 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zrl9r\" (UniqueName: \"kubernetes.io/projected/89348083-7def-4e69-9d40-01729f2c4dfb-kube-api-access-zrl9r\") pod \"89348083-7def-4e69-9d40-01729f2c4dfb\" (UID: \"89348083-7def-4e69-9d40-01729f2c4dfb\") " Sep 30 12:40:56 crc kubenswrapper[4672]: I0930 12:40:56.736438 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/89348083-7def-4e69-9d40-01729f2c4dfb-kube-api-access-zrl9r" (OuterVolumeSpecName: "kube-api-access-zrl9r") pod "89348083-7def-4e69-9d40-01729f2c4dfb" (UID: "89348083-7def-4e69-9d40-01729f2c4dfb"). InnerVolumeSpecName "kube-api-access-zrl9r". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 12:40:56 crc kubenswrapper[4672]: I0930 12:40:56.834599 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zrl9r\" (UniqueName: \"kubernetes.io/projected/89348083-7def-4e69-9d40-01729f2c4dfb-kube-api-access-zrl9r\") on node \"crc\" DevicePath \"\"" Sep 30 12:40:57 crc kubenswrapper[4672]: I0930 12:40:57.231765 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-fe4a-account-create-hllt5" event={"ID":"89348083-7def-4e69-9d40-01729f2c4dfb","Type":"ContainerDied","Data":"859e75da25d56047a5eabe4a17425db9743f68a224922105827f04523e7be8e2"} Sep 30 12:40:57 crc kubenswrapper[4672]: I0930 12:40:57.231809 4672 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="859e75da25d56047a5eabe4a17425db9743f68a224922105827f04523e7be8e2" Sep 30 12:40:57 crc kubenswrapper[4672]: I0930 12:40:57.231834 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-fe4a-account-create-hllt5" Sep 30 12:40:57 crc kubenswrapper[4672]: I0930 12:40:57.287398 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Sep 30 12:40:57 crc kubenswrapper[4672]: I0930 12:40:57.287980 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="dc8b475b-410d-43fb-8d3f-971557bd9421" containerName="ceilometer-central-agent" containerID="cri-o://8d67694a6eee3a0a082827055f75bb86ef41d68b8136220390ca0d17bf8fd125" gracePeriod=30 Sep 30 12:40:57 crc kubenswrapper[4672]: I0930 12:40:57.288387 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="dc8b475b-410d-43fb-8d3f-971557bd9421" containerName="proxy-httpd" containerID="cri-o://a512cec3646d8110356e46f2160e1d55e53fe96d19683bdc46249aadf8698dd4" gracePeriod=30 Sep 30 12:40:57 crc kubenswrapper[4672]: I0930 12:40:57.288495 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="dc8b475b-410d-43fb-8d3f-971557bd9421" containerName="ceilometer-notification-agent" containerID="cri-o://80ef8a8d1b61c69770f5c4ff52ef2efd17ef3e52618d910a34907fbe09fec386" gracePeriod=30 Sep 30 12:40:57 crc kubenswrapper[4672]: I0930 12:40:57.288588 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="dc8b475b-410d-43fb-8d3f-971557bd9421" containerName="sg-core" containerID="cri-o://c0dddcb4e32d4d41d5f92cc24d72df126bd9c5c4689b0e15796a3dc2dd789ee0" gracePeriod=30 Sep 30 12:40:57 crc kubenswrapper[4672]: I0930 12:40:57.543862 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Sep 30 12:40:57 crc kubenswrapper[4672]: I0930 12:40:57.544222 4672 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 30 12:40:57 crc kubenswrapper[4672]: I0930 12:40:57.544626 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Sep 30 12:40:58 crc kubenswrapper[4672]: I0930 12:40:58.220048 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-l2tgm"] Sep 30 12:40:58 crc kubenswrapper[4672]: E0930 12:40:58.220550 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89348083-7def-4e69-9d40-01729f2c4dfb" containerName="mariadb-account-create" Sep 30 12:40:58 crc kubenswrapper[4672]: I0930 
12:40:58.220567 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="89348083-7def-4e69-9d40-01729f2c4dfb" containerName="mariadb-account-create" Sep 30 12:40:58 crc kubenswrapper[4672]: E0930 12:40:58.220585 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0dd29d3-91bd-4225-bf8b-c1eb7889b373" containerName="mariadb-account-create" Sep 30 12:40:58 crc kubenswrapper[4672]: I0930 12:40:58.220592 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0dd29d3-91bd-4225-bf8b-c1eb7889b373" containerName="mariadb-account-create" Sep 30 12:40:58 crc kubenswrapper[4672]: E0930 12:40:58.220613 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9274280d-7ccf-42c3-a808-ee313827e49f" containerName="mariadb-account-create" Sep 30 12:40:58 crc kubenswrapper[4672]: I0930 12:40:58.220620 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="9274280d-7ccf-42c3-a808-ee313827e49f" containerName="mariadb-account-create" Sep 30 12:40:58 crc kubenswrapper[4672]: I0930 12:40:58.220795 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0dd29d3-91bd-4225-bf8b-c1eb7889b373" containerName="mariadb-account-create" Sep 30 12:40:58 crc kubenswrapper[4672]: I0930 12:40:58.220815 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="9274280d-7ccf-42c3-a808-ee313827e49f" containerName="mariadb-account-create" Sep 30 12:40:58 crc kubenswrapper[4672]: I0930 12:40:58.220826 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="89348083-7def-4e69-9d40-01729f2c4dfb" containerName="mariadb-account-create" Sep 30 12:40:58 crc kubenswrapper[4672]: I0930 12:40:58.225240 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-l2tgm" Sep 30 12:40:58 crc kubenswrapper[4672]: I0930 12:40:58.230205 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Sep 30 12:40:58 crc kubenswrapper[4672]: I0930 12:40:58.230431 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-kj7v8" Sep 30 12:40:58 crc kubenswrapper[4672]: I0930 12:40:58.230543 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Sep 30 12:40:58 crc kubenswrapper[4672]: I0930 12:40:58.242546 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-l2tgm"] Sep 30 12:40:58 crc kubenswrapper[4672]: I0930 12:40:58.248139 4672 generic.go:334] "Generic (PLEG): container finished" podID="dc8b475b-410d-43fb-8d3f-971557bd9421" containerID="a512cec3646d8110356e46f2160e1d55e53fe96d19683bdc46249aadf8698dd4" exitCode=0 Sep 30 12:40:58 crc kubenswrapper[4672]: I0930 12:40:58.248173 4672 generic.go:334] "Generic (PLEG): container finished" podID="dc8b475b-410d-43fb-8d3f-971557bd9421" containerID="c0dddcb4e32d4d41d5f92cc24d72df126bd9c5c4689b0e15796a3dc2dd789ee0" exitCode=2 Sep 30 12:40:58 crc kubenswrapper[4672]: I0930 12:40:58.248181 4672 generic.go:334] "Generic (PLEG): container finished" podID="dc8b475b-410d-43fb-8d3f-971557bd9421" containerID="8d67694a6eee3a0a082827055f75bb86ef41d68b8136220390ca0d17bf8fd125" exitCode=0 Sep 30 12:40:58 crc kubenswrapper[4672]: I0930 12:40:58.248182 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dc8b475b-410d-43fb-8d3f-971557bd9421","Type":"ContainerDied","Data":"a512cec3646d8110356e46f2160e1d55e53fe96d19683bdc46249aadf8698dd4"} Sep 30 
12:40:58 crc kubenswrapper[4672]: I0930 12:40:58.248231 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dc8b475b-410d-43fb-8d3f-971557bd9421","Type":"ContainerDied","Data":"c0dddcb4e32d4d41d5f92cc24d72df126bd9c5c4689b0e15796a3dc2dd789ee0"} Sep 30 12:40:58 crc kubenswrapper[4672]: I0930 12:40:58.248245 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dc8b475b-410d-43fb-8d3f-971557bd9421","Type":"ContainerDied","Data":"8d67694a6eee3a0a082827055f75bb86ef41d68b8136220390ca0d17bf8fd125"} Sep 30 12:40:58 crc kubenswrapper[4672]: I0930 12:40:58.361147 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a77c51f-e03c-4804-ba8c-90507e73e279-config-data\") pod \"nova-cell0-conductor-db-sync-l2tgm\" (UID: \"7a77c51f-e03c-4804-ba8c-90507e73e279\") " pod="openstack/nova-cell0-conductor-db-sync-l2tgm" Sep 30 12:40:58 crc kubenswrapper[4672]: I0930 12:40:58.361252 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6z2s8\" (UniqueName: \"kubernetes.io/projected/7a77c51f-e03c-4804-ba8c-90507e73e279-kube-api-access-6z2s8\") pod \"nova-cell0-conductor-db-sync-l2tgm\" (UID: \"7a77c51f-e03c-4804-ba8c-90507e73e279\") " pod="openstack/nova-cell0-conductor-db-sync-l2tgm" Sep 30 12:40:58 crc kubenswrapper[4672]: I0930 12:40:58.362079 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7a77c51f-e03c-4804-ba8c-90507e73e279-scripts\") pod \"nova-cell0-conductor-db-sync-l2tgm\" (UID: \"7a77c51f-e03c-4804-ba8c-90507e73e279\") " pod="openstack/nova-cell0-conductor-db-sync-l2tgm" Sep 30 12:40:58 crc kubenswrapper[4672]: I0930 12:40:58.362175 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a77c51f-e03c-4804-ba8c-90507e73e279-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-l2tgm\" (UID: \"7a77c51f-e03c-4804-ba8c-90507e73e279\") " pod="openstack/nova-cell0-conductor-db-sync-l2tgm" Sep 30 12:40:58 crc kubenswrapper[4672]: I0930 12:40:58.463695 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a77c51f-e03c-4804-ba8c-90507e73e279-config-data\") pod \"nova-cell0-conductor-db-sync-l2tgm\" (UID: \"7a77c51f-e03c-4804-ba8c-90507e73e279\") " pod="openstack/nova-cell0-conductor-db-sync-l2tgm" Sep 30 12:40:58 crc kubenswrapper[4672]: I0930 12:40:58.463791 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6z2s8\" (UniqueName: \"kubernetes.io/projected/7a77c51f-e03c-4804-ba8c-90507e73e279-kube-api-access-6z2s8\") pod \"nova-cell0-conductor-db-sync-l2tgm\" (UID: \"7a77c51f-e03c-4804-ba8c-90507e73e279\") " pod="openstack/nova-cell0-conductor-db-sync-l2tgm" Sep 30 12:40:58 crc kubenswrapper[4672]: I0930 12:40:58.464252 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7a77c51f-e03c-4804-ba8c-90507e73e279-scripts\") pod \"nova-cell0-conductor-db-sync-l2tgm\" (UID: \"7a77c51f-e03c-4804-ba8c-90507e73e279\") " pod="openstack/nova-cell0-conductor-db-sync-l2tgm" Sep 30 12:40:58 crc kubenswrapper[4672]: I0930 12:40:58.464712 4672 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a77c51f-e03c-4804-ba8c-90507e73e279-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-l2tgm\" (UID: \"7a77c51f-e03c-4804-ba8c-90507e73e279\") " pod="openstack/nova-cell0-conductor-db-sync-l2tgm" Sep 30 12:40:58 crc kubenswrapper[4672]: I0930 12:40:58.469273 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7a77c51f-e03c-4804-ba8c-90507e73e279-scripts\") pod \"nova-cell0-conductor-db-sync-l2tgm\" (UID: \"7a77c51f-e03c-4804-ba8c-90507e73e279\") " pod="openstack/nova-cell0-conductor-db-sync-l2tgm" Sep 30 12:40:58 crc kubenswrapper[4672]: I0930 12:40:58.470011 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a77c51f-e03c-4804-ba8c-90507e73e279-config-data\") pod \"nova-cell0-conductor-db-sync-l2tgm\" (UID: \"7a77c51f-e03c-4804-ba8c-90507e73e279\") " pod="openstack/nova-cell0-conductor-db-sync-l2tgm" Sep 30 12:40:58 crc kubenswrapper[4672]: I0930 12:40:58.475583 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a77c51f-e03c-4804-ba8c-90507e73e279-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-l2tgm\" (UID: \"7a77c51f-e03c-4804-ba8c-90507e73e279\") " pod="openstack/nova-cell0-conductor-db-sync-l2tgm" Sep 30 12:40:58 crc kubenswrapper[4672]: I0930 12:40:58.487757 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6z2s8\" (UniqueName: \"kubernetes.io/projected/7a77c51f-e03c-4804-ba8c-90507e73e279-kube-api-access-6z2s8\") pod \"nova-cell0-conductor-db-sync-l2tgm\" (UID: \"7a77c51f-e03c-4804-ba8c-90507e73e279\") " pod="openstack/nova-cell0-conductor-db-sync-l2tgm" Sep 30 12:40:58 crc kubenswrapper[4672]: I0930 12:40:58.556917 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-l2tgm" Sep 30 12:40:58 crc kubenswrapper[4672]: I0930 12:40:58.951505 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-l2tgm"] Sep 30 12:40:59 crc kubenswrapper[4672]: I0930 12:40:59.267866 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-l2tgm" event={"ID":"7a77c51f-e03c-4804-ba8c-90507e73e279","Type":"ContainerStarted","Data":"3c5eb3d694758cf87484e85d96430a366a95f15d6f1651ae89a4501764678cf6"} Sep 30 12:41:00 crc kubenswrapper[4672]: I0930 12:41:00.093186 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Sep 30 12:41:00 crc kubenswrapper[4672]: I0930 12:41:00.205029 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc8b475b-410d-43fb-8d3f-971557bd9421-scripts\") pod \"dc8b475b-410d-43fb-8d3f-971557bd9421\" (UID: \"dc8b475b-410d-43fb-8d3f-971557bd9421\") " Sep 30 12:41:00 crc kubenswrapper[4672]: I0930 12:41:00.205212 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dc8b475b-410d-43fb-8d3f-971557bd9421-sg-core-conf-yaml\") pod \"dc8b475b-410d-43fb-8d3f-971557bd9421\" (UID: \"dc8b475b-410d-43fb-8d3f-971557bd9421\") " Sep 30 12:41:00 crc kubenswrapper[4672]: I0930 12:41:00.205300 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5xrd9\" (UniqueName: \"kubernetes.io/projected/dc8b475b-410d-43fb-8d3f-971557bd9421-kube-api-access-5xrd9\") pod \"dc8b475b-410d-43fb-8d3f-971557bd9421\" (UID: \"dc8b475b-410d-43fb-8d3f-971557bd9421\") " Sep 30 12:41:00 crc kubenswrapper[4672]: I0930 12:41:00.205369 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc8b475b-410d-43fb-8d3f-971557bd9421-config-data\") pod \"dc8b475b-410d-43fb-8d3f-971557bd9421\" (UID: \"dc8b475b-410d-43fb-8d3f-971557bd9421\") " Sep 30 12:41:00 crc kubenswrapper[4672]: I0930 12:41:00.205986 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc8b475b-410d-43fb-8d3f-971557bd9421-combined-ca-bundle\") pod \"dc8b475b-410d-43fb-8d3f-971557bd9421\" (UID: \"dc8b475b-410d-43fb-8d3f-971557bd9421\") " Sep 30 12:41:00 crc kubenswrapper[4672]: I0930 12:41:00.206026 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dc8b475b-410d-43fb-8d3f-971557bd9421-log-httpd\") pod \"dc8b475b-410d-43fb-8d3f-971557bd9421\" (UID: \"dc8b475b-410d-43fb-8d3f-971557bd9421\") " Sep 30 12:41:00 crc kubenswrapper[4672]: I0930 12:41:00.206077 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dc8b475b-410d-43fb-8d3f-971557bd9421-run-httpd\") pod \"dc8b475b-410d-43fb-8d3f-971557bd9421\" (UID: \"dc8b475b-410d-43fb-8d3f-971557bd9421\") " Sep 30 12:41:00 crc kubenswrapper[4672]: I0930 12:41:00.207714 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dc8b475b-410d-43fb-8d3f-971557bd9421-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "dc8b475b-410d-43fb-8d3f-971557bd9421" (UID: "dc8b475b-410d-43fb-8d3f-971557bd9421"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 12:41:00 crc kubenswrapper[4672]: I0930 12:41:00.209768 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dc8b475b-410d-43fb-8d3f-971557bd9421-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "dc8b475b-410d-43fb-8d3f-971557bd9421" (UID: "dc8b475b-410d-43fb-8d3f-971557bd9421"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 12:41:00 crc kubenswrapper[4672]: I0930 12:41:00.213646 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc8b475b-410d-43fb-8d3f-971557bd9421-scripts" (OuterVolumeSpecName: "scripts") pod "dc8b475b-410d-43fb-8d3f-971557bd9421" (UID: "dc8b475b-410d-43fb-8d3f-971557bd9421"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:41:00 crc kubenswrapper[4672]: I0930 12:41:00.214777 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc8b475b-410d-43fb-8d3f-971557bd9421-kube-api-access-5xrd9" (OuterVolumeSpecName: "kube-api-access-5xrd9") pod "dc8b475b-410d-43fb-8d3f-971557bd9421" (UID: "dc8b475b-410d-43fb-8d3f-971557bd9421"). InnerVolumeSpecName "kube-api-access-5xrd9". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 12:41:00 crc kubenswrapper[4672]: I0930 12:41:00.250382 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc8b475b-410d-43fb-8d3f-971557bd9421-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "dc8b475b-410d-43fb-8d3f-971557bd9421" (UID: "dc8b475b-410d-43fb-8d3f-971557bd9421"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:41:00 crc kubenswrapper[4672]: I0930 12:41:00.304791 4672 generic.go:334] "Generic (PLEG): container finished" podID="dc8b475b-410d-43fb-8d3f-971557bd9421" containerID="80ef8a8d1b61c69770f5c4ff52ef2efd17ef3e52618d910a34907fbe09fec386" exitCode=0 Sep 30 12:41:00 crc kubenswrapper[4672]: I0930 12:41:00.304848 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Sep 30 12:41:00 crc kubenswrapper[4672]: I0930 12:41:00.304847 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dc8b475b-410d-43fb-8d3f-971557bd9421","Type":"ContainerDied","Data":"80ef8a8d1b61c69770f5c4ff52ef2efd17ef3e52618d910a34907fbe09fec386"} Sep 30 12:41:00 crc kubenswrapper[4672]: I0930 12:41:00.304977 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dc8b475b-410d-43fb-8d3f-971557bd9421","Type":"ContainerDied","Data":"4e2598af90b7dfc44541f0261a0f37980750288518cfa22d88ad1dcdb98d25f8"} Sep 30 12:41:00 crc kubenswrapper[4672]: I0930 12:41:00.305003 4672 scope.go:117] "RemoveContainer" containerID="a512cec3646d8110356e46f2160e1d55e53fe96d19683bdc46249aadf8698dd4" Sep 30 12:41:00 crc kubenswrapper[4672]: I0930 12:41:00.309436 4672 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dc8b475b-410d-43fb-8d3f-971557bd9421-log-httpd\") on node \"crc\" DevicePath \"\"" Sep 30 12:41:00 crc kubenswrapper[4672]: I0930 12:41:00.309473 4672 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dc8b475b-410d-43fb-8d3f-971557bd9421-run-httpd\") on node \"crc\" DevicePath \"\"" Sep 30 12:41:00 crc kubenswrapper[4672]: I0930 12:41:00.309486 4672 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc8b475b-410d-43fb-8d3f-971557bd9421-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 12:41:00 crc kubenswrapper[4672]: I0930 12:41:00.309498 4672 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/dc8b475b-410d-43fb-8d3f-971557bd9421-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Sep 30 12:41:00 crc kubenswrapper[4672]: I0930 12:41:00.309511 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5xrd9\" (UniqueName: \"kubernetes.io/projected/dc8b475b-410d-43fb-8d3f-971557bd9421-kube-api-access-5xrd9\") on node \"crc\" DevicePath \"\"" Sep 30 12:41:00 crc kubenswrapper[4672]: I0930 12:41:00.327570 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc8b475b-410d-43fb-8d3f-971557bd9421-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dc8b475b-410d-43fb-8d3f-971557bd9421" (UID: "dc8b475b-410d-43fb-8d3f-971557bd9421"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:41:00 crc kubenswrapper[4672]: I0930 12:41:00.361832 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc8b475b-410d-43fb-8d3f-971557bd9421-config-data" (OuterVolumeSpecName: "config-data") pod "dc8b475b-410d-43fb-8d3f-971557bd9421" (UID: "dc8b475b-410d-43fb-8d3f-971557bd9421"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:41:00 crc kubenswrapper[4672]: I0930 12:41:00.394053 4672 scope.go:117] "RemoveContainer" containerID="c0dddcb4e32d4d41d5f92cc24d72df126bd9c5c4689b0e15796a3dc2dd789ee0" Sep 30 12:41:00 crc kubenswrapper[4672]: I0930 12:41:00.411527 4672 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc8b475b-410d-43fb-8d3f-971557bd9421-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 12:41:00 crc kubenswrapper[4672]: I0930 12:41:00.411571 4672 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc8b475b-410d-43fb-8d3f-971557bd9421-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 12:41:00 crc kubenswrapper[4672]: I0930 12:41:00.424104 4672 scope.go:117] "RemoveContainer" containerID="80ef8a8d1b61c69770f5c4ff52ef2efd17ef3e52618d910a34907fbe09fec386" Sep 30 12:41:00 crc kubenswrapper[4672]: I0930 12:41:00.457520 4672 scope.go:117] "RemoveContainer" containerID="8d67694a6eee3a0a082827055f75bb86ef41d68b8136220390ca0d17bf8fd125" Sep 30 12:41:00 crc kubenswrapper[4672]: I0930 12:41:00.486542 4672 scope.go:117] "RemoveContainer" containerID="a512cec3646d8110356e46f2160e1d55e53fe96d19683bdc46249aadf8698dd4" Sep 30 12:41:00 crc kubenswrapper[4672]: E0930 12:41:00.486993 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a512cec3646d8110356e46f2160e1d55e53fe96d19683bdc46249aadf8698dd4\": container with ID starting with a512cec3646d8110356e46f2160e1d55e53fe96d19683bdc46249aadf8698dd4 not found: ID does not exist" containerID="a512cec3646d8110356e46f2160e1d55e53fe96d19683bdc46249aadf8698dd4" Sep 30 12:41:00 crc kubenswrapper[4672]: I0930 12:41:00.487036 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a512cec3646d8110356e46f2160e1d55e53fe96d19683bdc46249aadf8698dd4"} err="failed to get container status \"a512cec3646d8110356e46f2160e1d55e53fe96d19683bdc46249aadf8698dd4\": rpc error: code = NotFound desc = could not find container \"a512cec3646d8110356e46f2160e1d55e53fe96d19683bdc46249aadf8698dd4\": container with ID starting with a512cec3646d8110356e46f2160e1d55e53fe96d19683bdc46249aadf8698dd4 not 
found: ID does not exist" Sep 30 12:41:00 crc kubenswrapper[4672]: I0930 12:41:00.487062 4672 scope.go:117] "RemoveContainer" containerID="c0dddcb4e32d4d41d5f92cc24d72df126bd9c5c4689b0e15796a3dc2dd789ee0" Sep 30 12:41:00 crc kubenswrapper[4672]: E0930 12:41:00.487339 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c0dddcb4e32d4d41d5f92cc24d72df126bd9c5c4689b0e15796a3dc2dd789ee0\": container with ID starting with c0dddcb4e32d4d41d5f92cc24d72df126bd9c5c4689b0e15796a3dc2dd789ee0 not found: ID does not exist" containerID="c0dddcb4e32d4d41d5f92cc24d72df126bd9c5c4689b0e15796a3dc2dd789ee0" Sep 30 12:41:00 crc kubenswrapper[4672]: I0930 12:41:00.487364 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0dddcb4e32d4d41d5f92cc24d72df126bd9c5c4689b0e15796a3dc2dd789ee0"} err="failed to get container status \"c0dddcb4e32d4d41d5f92cc24d72df126bd9c5c4689b0e15796a3dc2dd789ee0\": rpc error: code = NotFound desc = could not find container \"c0dddcb4e32d4d41d5f92cc24d72df126bd9c5c4689b0e15796a3dc2dd789ee0\": container with ID starting with c0dddcb4e32d4d41d5f92cc24d72df126bd9c5c4689b0e15796a3dc2dd789ee0 not found: ID does not exist" Sep 30 12:41:00 crc kubenswrapper[4672]: I0930 12:41:00.487376 4672 scope.go:117] "RemoveContainer" containerID="80ef8a8d1b61c69770f5c4ff52ef2efd17ef3e52618d910a34907fbe09fec386" Sep 30 12:41:00 crc kubenswrapper[4672]: E0930 12:41:00.487620 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"80ef8a8d1b61c69770f5c4ff52ef2efd17ef3e52618d910a34907fbe09fec386\": container with ID starting with 80ef8a8d1b61c69770f5c4ff52ef2efd17ef3e52618d910a34907fbe09fec386 not found: ID does not exist" containerID="80ef8a8d1b61c69770f5c4ff52ef2efd17ef3e52618d910a34907fbe09fec386" Sep 30 12:41:00 crc kubenswrapper[4672]: I0930 12:41:00.487639 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"80ef8a8d1b61c69770f5c4ff52ef2efd17ef3e52618d910a34907fbe09fec386"} err="failed to get container status \"80ef8a8d1b61c69770f5c4ff52ef2efd17ef3e52618d910a34907fbe09fec386\": rpc error: code = NotFound desc = could not find container \"80ef8a8d1b61c69770f5c4ff52ef2efd17ef3e52618d910a34907fbe09fec386\": container with ID starting with 80ef8a8d1b61c69770f5c4ff52ef2efd17ef3e52618d910a34907fbe09fec386 not found: ID does not exist" Sep 30 12:41:00 crc kubenswrapper[4672]: I0930 12:41:00.487652 4672 scope.go:117] "RemoveContainer" containerID="8d67694a6eee3a0a082827055f75bb86ef41d68b8136220390ca0d17bf8fd125" Sep 30 12:41:00 crc kubenswrapper[4672]: E0930 12:41:00.487818 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8d67694a6eee3a0a082827055f75bb86ef41d68b8136220390ca0d17bf8fd125\": container with ID starting with 8d67694a6eee3a0a082827055f75bb86ef41d68b8136220390ca0d17bf8fd125 not found: ID does not exist" containerID="8d67694a6eee3a0a082827055f75bb86ef41d68b8136220390ca0d17bf8fd125" Sep 30 12:41:00 crc kubenswrapper[4672]: I0930 12:41:00.487837 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8d67694a6eee3a0a082827055f75bb86ef41d68b8136220390ca0d17bf8fd125"} err="failed to get container status \"8d67694a6eee3a0a082827055f75bb86ef41d68b8136220390ca0d17bf8fd125\": rpc error: code = NotFound desc = could not find container 
\"8d67694a6eee3a0a082827055f75bb86ef41d68b8136220390ca0d17bf8fd125\": container with ID starting with 8d67694a6eee3a0a082827055f75bb86ef41d68b8136220390ca0d17bf8fd125 not found: ID does not exist" Sep 30 12:41:00 crc kubenswrapper[4672]: I0930 12:41:00.634815 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Sep 30 12:41:00 crc kubenswrapper[4672]: I0930 12:41:00.644722 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Sep 30 12:41:00 crc kubenswrapper[4672]: I0930 12:41:00.660690 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Sep 30 12:41:00 crc kubenswrapper[4672]: E0930 12:41:00.661065 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc8b475b-410d-43fb-8d3f-971557bd9421" containerName="ceilometer-notification-agent" Sep 30 12:41:00 crc kubenswrapper[4672]: I0930 12:41:00.661083 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc8b475b-410d-43fb-8d3f-971557bd9421" containerName="ceilometer-notification-agent" Sep 30 12:41:00 crc kubenswrapper[4672]: E0930 12:41:00.661108 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc8b475b-410d-43fb-8d3f-971557bd9421" containerName="proxy-httpd" Sep 30 12:41:00 crc kubenswrapper[4672]: I0930 12:41:00.661115 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc8b475b-410d-43fb-8d3f-971557bd9421" containerName="proxy-httpd" Sep 30 12:41:00 crc kubenswrapper[4672]: E0930 12:41:00.661124 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc8b475b-410d-43fb-8d3f-971557bd9421" containerName="ceilometer-central-agent" Sep 30 12:41:00 crc kubenswrapper[4672]: I0930 12:41:00.661131 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc8b475b-410d-43fb-8d3f-971557bd9421" containerName="ceilometer-central-agent" Sep 30 12:41:00 crc kubenswrapper[4672]: E0930 12:41:00.661162 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc8b475b-410d-43fb-8d3f-971557bd9421" containerName="sg-core" Sep 30 12:41:00 crc kubenswrapper[4672]: I0930 12:41:00.661168 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc8b475b-410d-43fb-8d3f-971557bd9421" containerName="sg-core" Sep 30 12:41:00 crc kubenswrapper[4672]: I0930 12:41:00.661447 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc8b475b-410d-43fb-8d3f-971557bd9421" containerName="proxy-httpd" Sep 30 12:41:00 crc kubenswrapper[4672]: I0930 12:41:00.661480 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc8b475b-410d-43fb-8d3f-971557bd9421" containerName="ceilometer-central-agent" Sep 30 12:41:00 crc kubenswrapper[4672]: I0930 12:41:00.661500 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc8b475b-410d-43fb-8d3f-971557bd9421" containerName="sg-core" Sep 30 12:41:00 crc kubenswrapper[4672]: I0930 12:41:00.661515 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc8b475b-410d-43fb-8d3f-971557bd9421" containerName="ceilometer-notification-agent" Sep 30 12:41:00 crc kubenswrapper[4672]: I0930 12:41:00.663125 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Sep 30 12:41:00 crc kubenswrapper[4672]: I0930 12:41:00.667024 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Sep 30 12:41:00 crc kubenswrapper[4672]: I0930 12:41:00.667236 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Sep 30 12:41:00 crc kubenswrapper[4672]: I0930 12:41:00.685335 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Sep 30 12:41:00 crc kubenswrapper[4672]: I0930 12:41:00.715832 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/34856d78-c924-4724-bbca-9f3b5ab6e9ad-run-httpd\") pod \"ceilometer-0\" (UID: \"34856d78-c924-4724-bbca-9f3b5ab6e9ad\") " pod="openstack/ceilometer-0" Sep 30 12:41:00 crc kubenswrapper[4672]: I0930 12:41:00.715960 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34856d78-c924-4724-bbca-9f3b5ab6e9ad-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"34856d78-c924-4724-bbca-9f3b5ab6e9ad\") " pod="openstack/ceilometer-0" Sep 30 12:41:00 crc kubenswrapper[4672]: I0930 12:41:00.716043 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/34856d78-c924-4724-bbca-9f3b5ab6e9ad-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"34856d78-c924-4724-bbca-9f3b5ab6e9ad\") " pod="openstack/ceilometer-0" Sep 30 12:41:00 crc kubenswrapper[4672]: I0930 12:41:00.716106 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/34856d78-c924-4724-bbca-9f3b5ab6e9ad-log-httpd\") pod \"ceilometer-0\" (UID: \"34856d78-c924-4724-bbca-9f3b5ab6e9ad\") " pod="openstack/ceilometer-0" Sep 30 12:41:00 crc kubenswrapper[4672]: I0930 12:41:00.716128 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34856d78-c924-4724-bbca-9f3b5ab6e9ad-config-data\") pod \"ceilometer-0\" (UID: \"34856d78-c924-4724-bbca-9f3b5ab6e9ad\") " pod="openstack/ceilometer-0" Sep 30 12:41:00 crc kubenswrapper[4672]: I0930 12:41:00.716175 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/34856d78-c924-4724-bbca-9f3b5ab6e9ad-scripts\") pod \"ceilometer-0\" (UID: \"34856d78-c924-4724-bbca-9f3b5ab6e9ad\") " pod="openstack/ceilometer-0" Sep 30 12:41:00 crc kubenswrapper[4672]: I0930 12:41:00.716202 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gw5f6\" (UniqueName: \"kubernetes.io/projected/34856d78-c924-4724-bbca-9f3b5ab6e9ad-kube-api-access-gw5f6\") pod \"ceilometer-0\" (UID: \"34856d78-c924-4724-bbca-9f3b5ab6e9ad\") " pod="openstack/ceilometer-0" Sep 30 12:41:00 crc kubenswrapper[4672]: I0930 12:41:00.817888 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/34856d78-c924-4724-bbca-9f3b5ab6e9ad-run-httpd\") pod \"ceilometer-0\" (UID: \"34856d78-c924-4724-bbca-9f3b5ab6e9ad\") " pod="openstack/ceilometer-0" Sep 30 12:41:00 crc kubenswrapper[4672]: I0930 12:41:00.818002 4672 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34856d78-c924-4724-bbca-9f3b5ab6e9ad-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"34856d78-c924-4724-bbca-9f3b5ab6e9ad\") " pod="openstack/ceilometer-0" Sep 30 12:41:00 crc kubenswrapper[4672]: I0930 12:41:00.818053 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/34856d78-c924-4724-bbca-9f3b5ab6e9ad-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"34856d78-c924-4724-bbca-9f3b5ab6e9ad\") " pod="openstack/ceilometer-0" Sep 30 12:41:00 crc kubenswrapper[4672]: I0930 12:41:00.818103 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34856d78-c924-4724-bbca-9f3b5ab6e9ad-config-data\") pod \"ceilometer-0\" (UID: \"34856d78-c924-4724-bbca-9f3b5ab6e9ad\") " pod="openstack/ceilometer-0" Sep 30 12:41:00 crc kubenswrapper[4672]: I0930 12:41:00.818117 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/34856d78-c924-4724-bbca-9f3b5ab6e9ad-log-httpd\") pod \"ceilometer-0\" (UID: \"34856d78-c924-4724-bbca-9f3b5ab6e9ad\") " pod="openstack/ceilometer-0" Sep 30 12:41:00 crc kubenswrapper[4672]: I0930 12:41:00.818154 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/34856d78-c924-4724-bbca-9f3b5ab6e9ad-scripts\") pod \"ceilometer-0\" (UID: \"34856d78-c924-4724-bbca-9f3b5ab6e9ad\") " pod="openstack/ceilometer-0" Sep 30 12:41:00 crc kubenswrapper[4672]: I0930 12:41:00.818172 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gw5f6\" (UniqueName: \"kubernetes.io/projected/34856d78-c924-4724-bbca-9f3b5ab6e9ad-kube-api-access-gw5f6\") pod \"ceilometer-0\" (UID: \"34856d78-c924-4724-bbca-9f3b5ab6e9ad\") " pod="openstack/ceilometer-0" Sep 30 12:41:00 crc kubenswrapper[4672]: I0930 12:41:00.818502 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/34856d78-c924-4724-bbca-9f3b5ab6e9ad-run-httpd\") pod \"ceilometer-0\" (UID: \"34856d78-c924-4724-bbca-9f3b5ab6e9ad\") " pod="openstack/ceilometer-0" Sep 30 12:41:00 crc kubenswrapper[4672]: I0930 12:41:00.818628 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/34856d78-c924-4724-bbca-9f3b5ab6e9ad-log-httpd\") pod \"ceilometer-0\" (UID: \"34856d78-c924-4724-bbca-9f3b5ab6e9ad\") " pod="openstack/ceilometer-0" Sep 30 12:41:00 crc kubenswrapper[4672]: I0930 12:41:00.822192 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34856d78-c924-4724-bbca-9f3b5ab6e9ad-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"34856d78-c924-4724-bbca-9f3b5ab6e9ad\") " pod="openstack/ceilometer-0" Sep 30 12:41:00 crc kubenswrapper[4672]: I0930 12:41:00.822226 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/34856d78-c924-4724-bbca-9f3b5ab6e9ad-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"34856d78-c924-4724-bbca-9f3b5ab6e9ad\") " pod="openstack/ceilometer-0" Sep 30 12:41:00 crc kubenswrapper[4672]: I0930 12:41:00.822370 4672 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34856d78-c924-4724-bbca-9f3b5ab6e9ad-config-data\") pod \"ceilometer-0\" (UID: \"34856d78-c924-4724-bbca-9f3b5ab6e9ad\") " pod="openstack/ceilometer-0" Sep 30 12:41:00 crc kubenswrapper[4672]: I0930 12:41:00.822710 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/34856d78-c924-4724-bbca-9f3b5ab6e9ad-scripts\") pod \"ceilometer-0\" (UID: \"34856d78-c924-4724-bbca-9f3b5ab6e9ad\") " pod="openstack/ceilometer-0" Sep 30 12:41:00 crc kubenswrapper[4672]: I0930 12:41:00.836040 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gw5f6\" (UniqueName: \"kubernetes.io/projected/34856d78-c924-4724-bbca-9f3b5ab6e9ad-kube-api-access-gw5f6\") pod \"ceilometer-0\" (UID: \"34856d78-c924-4724-bbca-9f3b5ab6e9ad\") " pod="openstack/ceilometer-0" Sep 30 12:41:01 crc kubenswrapper[4672]: I0930 12:41:01.024534 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Sep 30 12:41:01 crc kubenswrapper[4672]: I0930 12:41:01.441731 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dc8b475b-410d-43fb-8d3f-971557bd9421" path="/var/lib/kubelet/pods/dc8b475b-410d-43fb-8d3f-971557bd9421/volumes" Sep 30 12:41:01 crc kubenswrapper[4672]: I0930 12:41:01.506643 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Sep 30 12:41:02 crc kubenswrapper[4672]: I0930 12:41:02.038420 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Sep 30 12:41:02 crc kubenswrapper[4672]: I0930 12:41:02.335219 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"34856d78-c924-4724-bbca-9f3b5ab6e9ad","Type":"ContainerStarted","Data":"c09265842872061a899b0c52b341019d9302b7f667a9a226f5d83645e4109631"} Sep 30 12:41:02 crc kubenswrapper[4672]: I0930 12:41:02.335318 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"34856d78-c924-4724-bbca-9f3b5ab6e9ad","Type":"ContainerStarted","Data":"01d4b897510dd6446effc0c6904cf10d6f870592e795df321b3201576f0dde37"} Sep 30 12:41:02 crc kubenswrapper[4672]: I0930 12:41:02.335334 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"34856d78-c924-4724-bbca-9f3b5ab6e9ad","Type":"ContainerStarted","Data":"32dc50d0fe2239f2d7c406fde0dfa5b66afef338198ee17e3c096fdc83dcdd67"} Sep 30 12:41:03 crc kubenswrapper[4672]: I0930 12:41:03.349397 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"34856d78-c924-4724-bbca-9f3b5ab6e9ad","Type":"ContainerStarted","Data":"34fdc5f5c46ce3eb65e67516212592b95974b63479f736efa193bb7b8a302e5f"} Sep 30 12:41:09 crc kubenswrapper[4672]: I0930 12:41:09.433237 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"34856d78-c924-4724-bbca-9f3b5ab6e9ad","Type":"ContainerStarted","Data":"8104e13c73285dd412b983216ceeb1293cdb470a134e07a7f30a5dcba8888c27"} Sep 30 12:41:09 crc kubenswrapper[4672]: I0930 12:41:09.434007 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="34856d78-c924-4724-bbca-9f3b5ab6e9ad" containerName="ceilometer-central-agent" containerID="cri-o://01d4b897510dd6446effc0c6904cf10d6f870592e795df321b3201576f0dde37" gracePeriod=30 Sep 30 12:41:09 crc kubenswrapper[4672]: I0930 
12:41:09.434243 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Sep 30 12:41:09 crc kubenswrapper[4672]: I0930 12:41:09.434303 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="34856d78-c924-4724-bbca-9f3b5ab6e9ad" containerName="proxy-httpd" containerID="cri-o://8104e13c73285dd412b983216ceeb1293cdb470a134e07a7f30a5dcba8888c27" gracePeriod=30 Sep 30 12:41:09 crc kubenswrapper[4672]: I0930 12:41:09.434359 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="34856d78-c924-4724-bbca-9f3b5ab6e9ad" containerName="sg-core" containerID="cri-o://34fdc5f5c46ce3eb65e67516212592b95974b63479f736efa193bb7b8a302e5f" gracePeriod=30 Sep 30 12:41:09 crc kubenswrapper[4672]: I0930 12:41:09.434401 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="34856d78-c924-4724-bbca-9f3b5ab6e9ad" containerName="ceilometer-notification-agent" containerID="cri-o://c09265842872061a899b0c52b341019d9302b7f667a9a226f5d83645e4109631" gracePeriod=30 Sep 30 12:41:09 crc kubenswrapper[4672]: I0930 12:41:09.437417 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-l2tgm" event={"ID":"7a77c51f-e03c-4804-ba8c-90507e73e279","Type":"ContainerStarted","Data":"7ce3018771e881c787448cfb8aa5e4c68216ae246eeedac5ef5d4dbf1d8c4f22"} Sep 30 12:41:09 crc kubenswrapper[4672]: I0930 12:41:09.485657 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-l2tgm" podStartSLOduration=1.337996516 podStartE2EDuration="11.485632692s" podCreationTimestamp="2025-09-30 12:40:58 +0000 UTC" firstStartedPulling="2025-09-30 12:40:58.948446971 +0000 UTC m=+1150.217684617" lastFinishedPulling="2025-09-30 12:41:09.096083107 +0000 UTC m=+1160.365320793" observedRunningTime="2025-09-30 12:41:09.477011834 +0000 UTC m=+1160.746249480" watchObservedRunningTime="2025-09-30 12:41:09.485632692 +0000 UTC m=+1160.754870338" Sep 30 12:41:09 crc kubenswrapper[4672]: I0930 12:41:09.512424 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.9230571840000001 podStartE2EDuration="9.512402088s" podCreationTimestamp="2025-09-30 12:41:00 +0000 UTC" firstStartedPulling="2025-09-30 12:41:01.504329022 +0000 UTC m=+1152.773566668" lastFinishedPulling="2025-09-30 12:41:09.093673926 +0000 UTC m=+1160.362911572" observedRunningTime="2025-09-30 12:41:09.504248332 +0000 UTC m=+1160.773485988" watchObservedRunningTime="2025-09-30 12:41:09.512402088 +0000 UTC m=+1160.781639734" Sep 30 12:41:10 crc kubenswrapper[4672]: I0930 12:41:10.453571 4672 generic.go:334] "Generic (PLEG): container finished" podID="34856d78-c924-4724-bbca-9f3b5ab6e9ad" containerID="34fdc5f5c46ce3eb65e67516212592b95974b63479f736efa193bb7b8a302e5f" exitCode=2 Sep 30 12:41:10 crc kubenswrapper[4672]: I0930 12:41:10.453695 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"34856d78-c924-4724-bbca-9f3b5ab6e9ad","Type":"ContainerDied","Data":"34fdc5f5c46ce3eb65e67516212592b95974b63479f736efa193bb7b8a302e5f"} Sep 30 12:41:11 crc kubenswrapper[4672]: I0930 12:41:11.468973 4672 generic.go:334] "Generic (PLEG): container finished" podID="34856d78-c924-4724-bbca-9f3b5ab6e9ad" containerID="c09265842872061a899b0c52b341019d9302b7f667a9a226f5d83645e4109631" exitCode=0 Sep 30 12:41:11 
crc kubenswrapper[4672]: I0930 12:41:11.469010 4672 generic.go:334] "Generic (PLEG): container finished" podID="34856d78-c924-4724-bbca-9f3b5ab6e9ad" containerID="01d4b897510dd6446effc0c6904cf10d6f870592e795df321b3201576f0dde37" exitCode=0 Sep 30 12:41:11 crc kubenswrapper[4672]: I0930 12:41:11.469021 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"34856d78-c924-4724-bbca-9f3b5ab6e9ad","Type":"ContainerDied","Data":"c09265842872061a899b0c52b341019d9302b7f667a9a226f5d83645e4109631"} Sep 30 12:41:11 crc kubenswrapper[4672]: I0930 12:41:11.469081 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"34856d78-c924-4724-bbca-9f3b5ab6e9ad","Type":"ContainerDied","Data":"01d4b897510dd6446effc0c6904cf10d6f870592e795df321b3201576f0dde37"} Sep 30 12:41:23 crc kubenswrapper[4672]: I0930 12:41:23.591922 4672 generic.go:334] "Generic (PLEG): container finished" podID="7a77c51f-e03c-4804-ba8c-90507e73e279" containerID="7ce3018771e881c787448cfb8aa5e4c68216ae246eeedac5ef5d4dbf1d8c4f22" exitCode=0 Sep 30 12:41:23 crc kubenswrapper[4672]: I0930 12:41:23.592052 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-l2tgm" event={"ID":"7a77c51f-e03c-4804-ba8c-90507e73e279","Type":"ContainerDied","Data":"7ce3018771e881c787448cfb8aa5e4c68216ae246eeedac5ef5d4dbf1d8c4f22"} Sep 30 12:41:24 crc kubenswrapper[4672]: I0930 12:41:24.739672 4672 patch_prober.go:28] interesting pod/machine-config-daemon-dpqrd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 12:41:24 crc kubenswrapper[4672]: I0930 12:41:24.740794 4672 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 12:41:25 crc kubenswrapper[4672]: I0930 12:41:25.055367 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-l2tgm" Sep 30 12:41:25 crc kubenswrapper[4672]: I0930 12:41:25.195944 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6z2s8\" (UniqueName: \"kubernetes.io/projected/7a77c51f-e03c-4804-ba8c-90507e73e279-kube-api-access-6z2s8\") pod \"7a77c51f-e03c-4804-ba8c-90507e73e279\" (UID: \"7a77c51f-e03c-4804-ba8c-90507e73e279\") " Sep 30 12:41:25 crc kubenswrapper[4672]: I0930 12:41:25.195993 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a77c51f-e03c-4804-ba8c-90507e73e279-config-data\") pod \"7a77c51f-e03c-4804-ba8c-90507e73e279\" (UID: \"7a77c51f-e03c-4804-ba8c-90507e73e279\") " Sep 30 12:41:25 crc kubenswrapper[4672]: I0930 12:41:25.196020 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a77c51f-e03c-4804-ba8c-90507e73e279-combined-ca-bundle\") pod \"7a77c51f-e03c-4804-ba8c-90507e73e279\" (UID: \"7a77c51f-e03c-4804-ba8c-90507e73e279\") " Sep 30 12:41:25 crc kubenswrapper[4672]: I0930 12:41:25.196061 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7a77c51f-e03c-4804-ba8c-90507e73e279-scripts\") pod \"7a77c51f-e03c-4804-ba8c-90507e73e279\" (UID: \"7a77c51f-e03c-4804-ba8c-90507e73e279\") " Sep 30 12:41:25 crc kubenswrapper[4672]: I0930 12:41:25.202249 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a77c51f-e03c-4804-ba8c-90507e73e279-kube-api-access-6z2s8" (OuterVolumeSpecName: "kube-api-access-6z2s8") pod "7a77c51f-e03c-4804-ba8c-90507e73e279" (UID: "7a77c51f-e03c-4804-ba8c-90507e73e279"). InnerVolumeSpecName "kube-api-access-6z2s8". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 12:41:25 crc kubenswrapper[4672]: I0930 12:41:25.204406 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a77c51f-e03c-4804-ba8c-90507e73e279-scripts" (OuterVolumeSpecName: "scripts") pod "7a77c51f-e03c-4804-ba8c-90507e73e279" (UID: "7a77c51f-e03c-4804-ba8c-90507e73e279"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:41:25 crc kubenswrapper[4672]: I0930 12:41:25.233095 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a77c51f-e03c-4804-ba8c-90507e73e279-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7a77c51f-e03c-4804-ba8c-90507e73e279" (UID: "7a77c51f-e03c-4804-ba8c-90507e73e279"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:41:25 crc kubenswrapper[4672]: I0930 12:41:25.235684 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a77c51f-e03c-4804-ba8c-90507e73e279-config-data" (OuterVolumeSpecName: "config-data") pod "7a77c51f-e03c-4804-ba8c-90507e73e279" (UID: "7a77c51f-e03c-4804-ba8c-90507e73e279"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:41:25 crc kubenswrapper[4672]: I0930 12:41:25.299419 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6z2s8\" (UniqueName: \"kubernetes.io/projected/7a77c51f-e03c-4804-ba8c-90507e73e279-kube-api-access-6z2s8\") on node \"crc\" DevicePath \"\"" Sep 30 12:41:25 crc kubenswrapper[4672]: I0930 12:41:25.299489 4672 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a77c51f-e03c-4804-ba8c-90507e73e279-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 12:41:25 crc kubenswrapper[4672]: I0930 12:41:25.299519 4672 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a77c51f-e03c-4804-ba8c-90507e73e279-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 12:41:25 crc kubenswrapper[4672]: I0930 12:41:25.299542 4672 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7a77c51f-e03c-4804-ba8c-90507e73e279-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 12:41:25 crc kubenswrapper[4672]: I0930 12:41:25.617547 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-l2tgm" event={"ID":"7a77c51f-e03c-4804-ba8c-90507e73e279","Type":"ContainerDied","Data":"3c5eb3d694758cf87484e85d96430a366a95f15d6f1651ae89a4501764678cf6"} Sep 30 12:41:25 crc kubenswrapper[4672]: I0930 12:41:25.617590 4672 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3c5eb3d694758cf87484e85d96430a366a95f15d6f1651ae89a4501764678cf6" Sep 30 12:41:25 crc kubenswrapper[4672]: I0930 12:41:25.618005 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-l2tgm" Sep 30 12:41:25 crc kubenswrapper[4672]: I0930 12:41:25.707263 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Sep 30 12:41:25 crc kubenswrapper[4672]: E0930 12:41:25.707944 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a77c51f-e03c-4804-ba8c-90507e73e279" containerName="nova-cell0-conductor-db-sync" Sep 30 12:41:25 crc kubenswrapper[4672]: I0930 12:41:25.707961 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a77c51f-e03c-4804-ba8c-90507e73e279" containerName="nova-cell0-conductor-db-sync" Sep 30 12:41:25 crc kubenswrapper[4672]: I0930 12:41:25.708172 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a77c51f-e03c-4804-ba8c-90507e73e279" containerName="nova-cell0-conductor-db-sync" Sep 30 12:41:25 crc kubenswrapper[4672]: I0930 12:41:25.708784 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Sep 30 12:41:25 crc kubenswrapper[4672]: I0930 12:41:25.714190 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Sep 30 12:41:25 crc kubenswrapper[4672]: I0930 12:41:25.714392 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-kj7v8" Sep 30 12:41:25 crc kubenswrapper[4672]: I0930 12:41:25.733654 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Sep 30 12:41:25 crc kubenswrapper[4672]: I0930 12:41:25.809041 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fdadbc89-4050-4b7f-bf2b-70e405b18974-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"fdadbc89-4050-4b7f-bf2b-70e405b18974\") " pod="openstack/nova-cell0-conductor-0" Sep 30 12:41:25 crc kubenswrapper[4672]: I0930 12:41:25.810365 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bfgzj\" (UniqueName: \"kubernetes.io/projected/fdadbc89-4050-4b7f-bf2b-70e405b18974-kube-api-access-bfgzj\") pod \"nova-cell0-conductor-0\" (UID: \"fdadbc89-4050-4b7f-bf2b-70e405b18974\") " pod="openstack/nova-cell0-conductor-0" Sep 30 12:41:25 crc kubenswrapper[4672]: I0930 12:41:25.810599 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fdadbc89-4050-4b7f-bf2b-70e405b18974-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"fdadbc89-4050-4b7f-bf2b-70e405b18974\") " pod="openstack/nova-cell0-conductor-0" Sep 30 12:41:25 crc kubenswrapper[4672]: I0930 12:41:25.912706 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bfgzj\" (UniqueName: \"kubernetes.io/projected/fdadbc89-4050-4b7f-bf2b-70e405b18974-kube-api-access-bfgzj\") pod \"nova-cell0-conductor-0\" (UID: \"fdadbc89-4050-4b7f-bf2b-70e405b18974\") " pod="openstack/nova-cell0-conductor-0" Sep 30 12:41:25 crc kubenswrapper[4672]: I0930 12:41:25.912814 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fdadbc89-4050-4b7f-bf2b-70e405b18974-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"fdadbc89-4050-4b7f-bf2b-70e405b18974\") " pod="openstack/nova-cell0-conductor-0" Sep 30 12:41:25 crc kubenswrapper[4672]: I0930 12:41:25.912939 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fdadbc89-4050-4b7f-bf2b-70e405b18974-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"fdadbc89-4050-4b7f-bf2b-70e405b18974\") " pod="openstack/nova-cell0-conductor-0" Sep 30 12:41:25 crc kubenswrapper[4672]: I0930 12:41:25.928408 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fdadbc89-4050-4b7f-bf2b-70e405b18974-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"fdadbc89-4050-4b7f-bf2b-70e405b18974\") " pod="openstack/nova-cell0-conductor-0" Sep 30 12:41:25 crc kubenswrapper[4672]: I0930 12:41:25.930035 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fdadbc89-4050-4b7f-bf2b-70e405b18974-config-data\") pod \"nova-cell0-conductor-0\" 
(UID: \"fdadbc89-4050-4b7f-bf2b-70e405b18974\") " pod="openstack/nova-cell0-conductor-0" Sep 30 12:41:25 crc kubenswrapper[4672]: I0930 12:41:25.945016 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bfgzj\" (UniqueName: \"kubernetes.io/projected/fdadbc89-4050-4b7f-bf2b-70e405b18974-kube-api-access-bfgzj\") pod \"nova-cell0-conductor-0\" (UID: \"fdadbc89-4050-4b7f-bf2b-70e405b18974\") " pod="openstack/nova-cell0-conductor-0" Sep 30 12:41:26 crc kubenswrapper[4672]: I0930 12:41:26.039191 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Sep 30 12:41:26 crc kubenswrapper[4672]: I0930 12:41:26.549057 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Sep 30 12:41:26 crc kubenswrapper[4672]: I0930 12:41:26.633327 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"fdadbc89-4050-4b7f-bf2b-70e405b18974","Type":"ContainerStarted","Data":"d4e1468251bea4c3467e532e003d40364070a31c47e08f8ad226abf3f0b56ffc"} Sep 30 12:41:27 crc kubenswrapper[4672]: I0930 12:41:27.648405 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"fdadbc89-4050-4b7f-bf2b-70e405b18974","Type":"ContainerStarted","Data":"e19fc5ca2c366bee93f1b4a94a63a407c8ba0b6adcca63a9c50403ad8d894930"} Sep 30 12:41:27 crc kubenswrapper[4672]: I0930 12:41:27.649145 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Sep 30 12:41:27 crc kubenswrapper[4672]: I0930 12:41:27.669873 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.669854743 podStartE2EDuration="2.669854743s" podCreationTimestamp="2025-09-30 12:41:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 12:41:27.66460062 +0000 UTC m=+1178.933838266" watchObservedRunningTime="2025-09-30 12:41:27.669854743 +0000 UTC m=+1178.939092389" Sep 30 12:41:31 crc kubenswrapper[4672]: I0930 12:41:31.028823 4672 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="34856d78-c924-4724-bbca-9f3b5ab6e9ad" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 503" Sep 30 12:41:31 crc kubenswrapper[4672]: I0930 12:41:31.075081 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Sep 30 12:41:31 crc kubenswrapper[4672]: I0930 12:41:31.591045 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-zmhvt"] Sep 30 12:41:31 crc kubenswrapper[4672]: I0930 12:41:31.593058 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-zmhvt" Sep 30 12:41:31 crc kubenswrapper[4672]: I0930 12:41:31.597340 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Sep 30 12:41:31 crc kubenswrapper[4672]: I0930 12:41:31.602539 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-zmhvt"] Sep 30 12:41:31 crc kubenswrapper[4672]: I0930 12:41:31.604853 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Sep 30 12:41:31 crc kubenswrapper[4672]: I0930 12:41:31.761089 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/645ec6e3-a10c-4651-9df2-a8259bcd51b9-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-zmhvt\" (UID: \"645ec6e3-a10c-4651-9df2-a8259bcd51b9\") " pod="openstack/nova-cell0-cell-mapping-zmhvt" Sep 30 12:41:31 crc kubenswrapper[4672]: I0930 12:41:31.761153 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-45vhb\" (UniqueName: \"kubernetes.io/projected/645ec6e3-a10c-4651-9df2-a8259bcd51b9-kube-api-access-45vhb\") pod \"nova-cell0-cell-mapping-zmhvt\" (UID: \"645ec6e3-a10c-4651-9df2-a8259bcd51b9\") " pod="openstack/nova-cell0-cell-mapping-zmhvt" Sep 30 12:41:31 crc kubenswrapper[4672]: I0930 12:41:31.761243 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/645ec6e3-a10c-4651-9df2-a8259bcd51b9-config-data\") pod \"nova-cell0-cell-mapping-zmhvt\" (UID: \"645ec6e3-a10c-4651-9df2-a8259bcd51b9\") " pod="openstack/nova-cell0-cell-mapping-zmhvt" Sep 30 12:41:31 crc kubenswrapper[4672]: I0930 12:41:31.761404 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/645ec6e3-a10c-4651-9df2-a8259bcd51b9-scripts\") pod \"nova-cell0-cell-mapping-zmhvt\" (UID: \"645ec6e3-a10c-4651-9df2-a8259bcd51b9\") " pod="openstack/nova-cell0-cell-mapping-zmhvt" Sep 30 12:41:31 crc kubenswrapper[4672]: I0930 12:41:31.775737 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Sep 30 12:41:31 crc kubenswrapper[4672]: I0930 12:41:31.777630 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Sep 30 12:41:31 crc kubenswrapper[4672]: I0930 12:41:31.791323 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Sep 30 12:41:31 crc kubenswrapper[4672]: I0930 12:41:31.793068 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Sep 30 12:41:31 crc kubenswrapper[4672]: I0930 12:41:31.797238 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Sep 30 12:41:31 crc kubenswrapper[4672]: I0930 12:41:31.798250 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Sep 30 12:41:31 crc kubenswrapper[4672]: I0930 12:41:31.808313 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Sep 30 12:41:31 crc kubenswrapper[4672]: I0930 12:41:31.845187 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Sep 30 12:41:31 crc kubenswrapper[4672]: I0930 12:41:31.864019 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e28fe302-599f-4788-a17f-78c90cfca670-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"e28fe302-599f-4788-a17f-78c90cfca670\") " pod="openstack/nova-api-0" Sep 30 12:41:31 crc kubenswrapper[4672]: I0930 12:41:31.864097 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6btkf\" (UniqueName: \"kubernetes.io/projected/e28fe302-599f-4788-a17f-78c90cfca670-kube-api-access-6btkf\") pod \"nova-api-0\" (UID: \"e28fe302-599f-4788-a17f-78c90cfca670\") " pod="openstack/nova-api-0" Sep 30 12:41:31 crc kubenswrapper[4672]: I0930 12:41:31.864130 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e28fe302-599f-4788-a17f-78c90cfca670-logs\") pod \"nova-api-0\" (UID: \"e28fe302-599f-4788-a17f-78c90cfca670\") " pod="openstack/nova-api-0" Sep 30 12:41:31 crc kubenswrapper[4672]: I0930 12:41:31.864255 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/645ec6e3-a10c-4651-9df2-a8259bcd51b9-scripts\") pod \"nova-cell0-cell-mapping-zmhvt\" (UID: \"645ec6e3-a10c-4651-9df2-a8259bcd51b9\") " pod="openstack/nova-cell0-cell-mapping-zmhvt" Sep 30 12:41:31 crc kubenswrapper[4672]: I0930 12:41:31.864334 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/645ec6e3-a10c-4651-9df2-a8259bcd51b9-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-zmhvt\" (UID: \"645ec6e3-a10c-4651-9df2-a8259bcd51b9\") " pod="openstack/nova-cell0-cell-mapping-zmhvt" Sep 30 12:41:31 crc kubenswrapper[4672]: I0930 12:41:31.864353 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-45vhb\" (UniqueName: \"kubernetes.io/projected/645ec6e3-a10c-4651-9df2-a8259bcd51b9-kube-api-access-45vhb\") pod \"nova-cell0-cell-mapping-zmhvt\" (UID: \"645ec6e3-a10c-4651-9df2-a8259bcd51b9\") " pod="openstack/nova-cell0-cell-mapping-zmhvt" Sep 30 12:41:31 crc kubenswrapper[4672]: I0930 12:41:31.864446 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e28fe302-599f-4788-a17f-78c90cfca670-config-data\") pod \"nova-api-0\" (UID: \"e28fe302-599f-4788-a17f-78c90cfca670\") " pod="openstack/nova-api-0" Sep 30 12:41:31 crc kubenswrapper[4672]: I0930 12:41:31.864498 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/645ec6e3-a10c-4651-9df2-a8259bcd51b9-config-data\") pod \"nova-cell0-cell-mapping-zmhvt\" (UID: \"645ec6e3-a10c-4651-9df2-a8259bcd51b9\") " pod="openstack/nova-cell0-cell-mapping-zmhvt" Sep 30 12:41:31 crc kubenswrapper[4672]: I0930 12:41:31.884024 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/645ec6e3-a10c-4651-9df2-a8259bcd51b9-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-zmhvt\" (UID: \"645ec6e3-a10c-4651-9df2-a8259bcd51b9\") " pod="openstack/nova-cell0-cell-mapping-zmhvt" Sep 30 12:41:31 crc kubenswrapper[4672]: I0930 12:41:31.884084 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/645ec6e3-a10c-4651-9df2-a8259bcd51b9-config-data\") pod \"nova-cell0-cell-mapping-zmhvt\" (UID: \"645ec6e3-a10c-4651-9df2-a8259bcd51b9\") " pod="openstack/nova-cell0-cell-mapping-zmhvt" Sep 30 12:41:31 crc kubenswrapper[4672]: I0930 12:41:31.901895 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-45vhb\" (UniqueName: \"kubernetes.io/projected/645ec6e3-a10c-4651-9df2-a8259bcd51b9-kube-api-access-45vhb\") pod \"nova-cell0-cell-mapping-zmhvt\" (UID: \"645ec6e3-a10c-4651-9df2-a8259bcd51b9\") " pod="openstack/nova-cell0-cell-mapping-zmhvt" Sep 30 12:41:31 crc kubenswrapper[4672]: I0930 12:41:31.919152 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/645ec6e3-a10c-4651-9df2-a8259bcd51b9-scripts\") pod \"nova-cell0-cell-mapping-zmhvt\" (UID: \"645ec6e3-a10c-4651-9df2-a8259bcd51b9\") " pod="openstack/nova-cell0-cell-mapping-zmhvt" Sep 30 12:41:31 crc kubenswrapper[4672]: I0930 12:41:31.939012 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Sep 30 12:41:31 crc kubenswrapper[4672]: I0930 12:41:31.941687 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Sep 30 12:41:31 crc kubenswrapper[4672]: I0930 12:41:31.955525 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Sep 30 12:41:31 crc kubenswrapper[4672]: I0930 12:41:31.962068 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Sep 30 12:41:31 crc kubenswrapper[4672]: I0930 12:41:31.966896 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e28fe302-599f-4788-a17f-78c90cfca670-config-data\") pod \"nova-api-0\" (UID: \"e28fe302-599f-4788-a17f-78c90cfca670\") " pod="openstack/nova-api-0" Sep 30 12:41:31 crc kubenswrapper[4672]: I0930 12:41:31.966985 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dnjlj\" (UniqueName: \"kubernetes.io/projected/fba6b3f9-5728-4c5b-955a-571d3a8c83f4-kube-api-access-dnjlj\") pod \"nova-cell1-novncproxy-0\" (UID: \"fba6b3f9-5728-4c5b-955a-571d3a8c83f4\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 12:41:31 crc kubenswrapper[4672]: I0930 12:41:31.967018 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e28fe302-599f-4788-a17f-78c90cfca670-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"e28fe302-599f-4788-a17f-78c90cfca670\") " pod="openstack/nova-api-0" Sep 30 12:41:31 crc kubenswrapper[4672]: I0930 12:41:31.967055 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fba6b3f9-5728-4c5b-955a-571d3a8c83f4-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"fba6b3f9-5728-4c5b-955a-571d3a8c83f4\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 12:41:31 crc kubenswrapper[4672]: I0930 12:41:31.967092 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6btkf\" (UniqueName: \"kubernetes.io/projected/e28fe302-599f-4788-a17f-78c90cfca670-kube-api-access-6btkf\") pod \"nova-api-0\" (UID: \"e28fe302-599f-4788-a17f-78c90cfca670\") " pod="openstack/nova-api-0" Sep 30 12:41:31 crc kubenswrapper[4672]: I0930 12:41:31.967122 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e28fe302-599f-4788-a17f-78c90cfca670-logs\") pod \"nova-api-0\" (UID: \"e28fe302-599f-4788-a17f-78c90cfca670\") " pod="openstack/nova-api-0" Sep 30 12:41:31 crc kubenswrapper[4672]: I0930 12:41:31.967167 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fba6b3f9-5728-4c5b-955a-571d3a8c83f4-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"fba6b3f9-5728-4c5b-955a-571d3a8c83f4\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 12:41:31 crc kubenswrapper[4672]: I0930 12:41:31.971153 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e28fe302-599f-4788-a17f-78c90cfca670-logs\") pod \"nova-api-0\" (UID: \"e28fe302-599f-4788-a17f-78c90cfca670\") " pod="openstack/nova-api-0" Sep 30 12:41:31 crc kubenswrapper[4672]: I0930 12:41:31.975603 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/e28fe302-599f-4788-a17f-78c90cfca670-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"e28fe302-599f-4788-a17f-78c90cfca670\") " pod="openstack/nova-api-0" Sep 30 12:41:31 crc kubenswrapper[4672]: I0930 12:41:31.989684 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-zmhvt" Sep 30 12:41:31 crc kubenswrapper[4672]: I0930 12:41:31.993914 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e28fe302-599f-4788-a17f-78c90cfca670-config-data\") pod \"nova-api-0\" (UID: \"e28fe302-599f-4788-a17f-78c90cfca670\") " pod="openstack/nova-api-0" Sep 30 12:41:31 crc kubenswrapper[4672]: I0930 12:41:31.996990 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6btkf\" (UniqueName: \"kubernetes.io/projected/e28fe302-599f-4788-a17f-78c90cfca670-kube-api-access-6btkf\") pod \"nova-api-0\" (UID: \"e28fe302-599f-4788-a17f-78c90cfca670\") " pod="openstack/nova-api-0" Sep 30 12:41:32 crc kubenswrapper[4672]: I0930 12:41:32.026409 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Sep 30 12:41:32 crc kubenswrapper[4672]: I0930 12:41:32.034703 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Sep 30 12:41:32 crc kubenswrapper[4672]: I0930 12:41:32.039778 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Sep 30 12:41:32 crc kubenswrapper[4672]: I0930 12:41:32.080416 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50605b9a-2e73-461e-9151-b47883bb9b9e-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"50605b9a-2e73-461e-9151-b47883bb9b9e\") " pod="openstack/nova-scheduler-0" Sep 30 12:41:32 crc kubenswrapper[4672]: I0930 12:41:32.080486 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dnjlj\" (UniqueName: \"kubernetes.io/projected/fba6b3f9-5728-4c5b-955a-571d3a8c83f4-kube-api-access-dnjlj\") pod \"nova-cell1-novncproxy-0\" (UID: \"fba6b3f9-5728-4c5b-955a-571d3a8c83f4\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 12:41:32 crc kubenswrapper[4672]: I0930 12:41:32.080551 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fba6b3f9-5728-4c5b-955a-571d3a8c83f4-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"fba6b3f9-5728-4c5b-955a-571d3a8c83f4\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 12:41:32 crc kubenswrapper[4672]: I0930 12:41:32.080644 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-96mgb\" (UniqueName: \"kubernetes.io/projected/50605b9a-2e73-461e-9151-b47883bb9b9e-kube-api-access-96mgb\") pod \"nova-scheduler-0\" (UID: \"50605b9a-2e73-461e-9151-b47883bb9b9e\") " pod="openstack/nova-scheduler-0" Sep 30 12:41:32 crc kubenswrapper[4672]: I0930 12:41:32.080679 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fba6b3f9-5728-4c5b-955a-571d3a8c83f4-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"fba6b3f9-5728-4c5b-955a-571d3a8c83f4\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 12:41:32 crc kubenswrapper[4672]: I0930 12:41:32.080759 4672 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/50605b9a-2e73-461e-9151-b47883bb9b9e-config-data\") pod \"nova-scheduler-0\" (UID: \"50605b9a-2e73-461e-9151-b47883bb9b9e\") " pod="openstack/nova-scheduler-0" Sep 30 12:41:32 crc kubenswrapper[4672]: I0930 12:41:32.085068 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Sep 30 12:41:32 crc kubenswrapper[4672]: I0930 12:41:32.092486 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fba6b3f9-5728-4c5b-955a-571d3a8c83f4-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"fba6b3f9-5728-4c5b-955a-571d3a8c83f4\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 12:41:32 crc kubenswrapper[4672]: I0930 12:41:32.093695 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fba6b3f9-5728-4c5b-955a-571d3a8c83f4-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"fba6b3f9-5728-4c5b-955a-571d3a8c83f4\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 12:41:32 crc kubenswrapper[4672]: I0930 12:41:32.098579 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Sep 30 12:41:32 crc kubenswrapper[4672]: I0930 12:41:32.143805 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dnjlj\" (UniqueName: \"kubernetes.io/projected/fba6b3f9-5728-4c5b-955a-571d3a8c83f4-kube-api-access-dnjlj\") pod \"nova-cell1-novncproxy-0\" (UID: \"fba6b3f9-5728-4c5b-955a-571d3a8c83f4\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 12:41:32 crc kubenswrapper[4672]: I0930 12:41:32.159667 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-849fbb457f-6lw7x"] Sep 30 12:41:32 crc kubenswrapper[4672]: I0930 12:41:32.165509 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-849fbb457f-6lw7x" Sep 30 12:41:32 crc kubenswrapper[4672]: I0930 12:41:32.181718 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-849fbb457f-6lw7x"] Sep 30 12:41:32 crc kubenswrapper[4672]: I0930 12:41:32.182032 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lsj52\" (UniqueName: \"kubernetes.io/projected/231eac71-05b0-4761-83f2-c195b6307bd9-kube-api-access-lsj52\") pod \"nova-metadata-0\" (UID: \"231eac71-05b0-4761-83f2-c195b6307bd9\") " pod="openstack/nova-metadata-0" Sep 30 12:41:32 crc kubenswrapper[4672]: I0930 12:41:32.182097 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/231eac71-05b0-4761-83f2-c195b6307bd9-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"231eac71-05b0-4761-83f2-c195b6307bd9\") " pod="openstack/nova-metadata-0" Sep 30 12:41:32 crc kubenswrapper[4672]: I0930 12:41:32.182146 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50605b9a-2e73-461e-9151-b47883bb9b9e-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"50605b9a-2e73-461e-9151-b47883bb9b9e\") " pod="openstack/nova-scheduler-0" Sep 30 12:41:32 crc kubenswrapper[4672]: I0930 12:41:32.182185 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/231eac71-05b0-4761-83f2-c195b6307bd9-logs\") pod \"nova-metadata-0\" (UID: \"231eac71-05b0-4761-83f2-c195b6307bd9\") " pod="openstack/nova-metadata-0" Sep 30 12:41:32 crc kubenswrapper[4672]: I0930 12:41:32.182218 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-96mgb\" (UniqueName: \"kubernetes.io/projected/50605b9a-2e73-461e-9151-b47883bb9b9e-kube-api-access-96mgb\") pod \"nova-scheduler-0\" (UID: \"50605b9a-2e73-461e-9151-b47883bb9b9e\") " pod="openstack/nova-scheduler-0" Sep 30 12:41:32 crc kubenswrapper[4672]: I0930 12:41:32.182257 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/50605b9a-2e73-461e-9151-b47883bb9b9e-config-data\") pod \"nova-scheduler-0\" (UID: \"50605b9a-2e73-461e-9151-b47883bb9b9e\") " pod="openstack/nova-scheduler-0" Sep 30 12:41:32 crc kubenswrapper[4672]: I0930 12:41:32.182308 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/231eac71-05b0-4761-83f2-c195b6307bd9-config-data\") pod \"nova-metadata-0\" (UID: \"231eac71-05b0-4761-83f2-c195b6307bd9\") " pod="openstack/nova-metadata-0" Sep 30 12:41:32 crc kubenswrapper[4672]: I0930 12:41:32.189503 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/50605b9a-2e73-461e-9151-b47883bb9b9e-config-data\") pod \"nova-scheduler-0\" (UID: \"50605b9a-2e73-461e-9151-b47883bb9b9e\") " pod="openstack/nova-scheduler-0" Sep 30 12:41:32 crc kubenswrapper[4672]: I0930 12:41:32.194096 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50605b9a-2e73-461e-9151-b47883bb9b9e-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"50605b9a-2e73-461e-9151-b47883bb9b9e\") " 
pod="openstack/nova-scheduler-0" Sep 30 12:41:32 crc kubenswrapper[4672]: I0930 12:41:32.205912 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-96mgb\" (UniqueName: \"kubernetes.io/projected/50605b9a-2e73-461e-9151-b47883bb9b9e-kube-api-access-96mgb\") pod \"nova-scheduler-0\" (UID: \"50605b9a-2e73-461e-9151-b47883bb9b9e\") " pod="openstack/nova-scheduler-0" Sep 30 12:41:32 crc kubenswrapper[4672]: I0930 12:41:32.284890 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-flnlg\" (UniqueName: \"kubernetes.io/projected/becf3fb1-512b-41ac-bfa1-e0da6204bcda-kube-api-access-flnlg\") pod \"dnsmasq-dns-849fbb457f-6lw7x\" (UID: \"becf3fb1-512b-41ac-bfa1-e0da6204bcda\") " pod="openstack/dnsmasq-dns-849fbb457f-6lw7x" Sep 30 12:41:32 crc kubenswrapper[4672]: I0930 12:41:32.285309 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/becf3fb1-512b-41ac-bfa1-e0da6204bcda-ovsdbserver-sb\") pod \"dnsmasq-dns-849fbb457f-6lw7x\" (UID: \"becf3fb1-512b-41ac-bfa1-e0da6204bcda\") " pod="openstack/dnsmasq-dns-849fbb457f-6lw7x" Sep 30 12:41:32 crc kubenswrapper[4672]: I0930 12:41:32.285337 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/231eac71-05b0-4761-83f2-c195b6307bd9-config-data\") pod \"nova-metadata-0\" (UID: \"231eac71-05b0-4761-83f2-c195b6307bd9\") " pod="openstack/nova-metadata-0" Sep 30 12:41:32 crc kubenswrapper[4672]: I0930 12:41:32.285403 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/becf3fb1-512b-41ac-bfa1-e0da6204bcda-dns-svc\") pod \"dnsmasq-dns-849fbb457f-6lw7x\" (UID: \"becf3fb1-512b-41ac-bfa1-e0da6204bcda\") " pod="openstack/dnsmasq-dns-849fbb457f-6lw7x" Sep 30 12:41:32 crc kubenswrapper[4672]: I0930 12:41:32.285464 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lsj52\" (UniqueName: \"kubernetes.io/projected/231eac71-05b0-4761-83f2-c195b6307bd9-kube-api-access-lsj52\") pod \"nova-metadata-0\" (UID: \"231eac71-05b0-4761-83f2-c195b6307bd9\") " pod="openstack/nova-metadata-0" Sep 30 12:41:32 crc kubenswrapper[4672]: I0930 12:41:32.285484 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/becf3fb1-512b-41ac-bfa1-e0da6204bcda-ovsdbserver-nb\") pod \"dnsmasq-dns-849fbb457f-6lw7x\" (UID: \"becf3fb1-512b-41ac-bfa1-e0da6204bcda\") " pod="openstack/dnsmasq-dns-849fbb457f-6lw7x" Sep 30 12:41:32 crc kubenswrapper[4672]: I0930 12:41:32.285726 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/231eac71-05b0-4761-83f2-c195b6307bd9-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"231eac71-05b0-4761-83f2-c195b6307bd9\") " pod="openstack/nova-metadata-0" Sep 30 12:41:32 crc kubenswrapper[4672]: I0930 12:41:32.285755 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/becf3fb1-512b-41ac-bfa1-e0da6204bcda-dns-swift-storage-0\") pod \"dnsmasq-dns-849fbb457f-6lw7x\" (UID: \"becf3fb1-512b-41ac-bfa1-e0da6204bcda\") " pod="openstack/dnsmasq-dns-849fbb457f-6lw7x" 
Sep 30 12:41:32 crc kubenswrapper[4672]: I0930 12:41:32.285797 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/becf3fb1-512b-41ac-bfa1-e0da6204bcda-config\") pod \"dnsmasq-dns-849fbb457f-6lw7x\" (UID: \"becf3fb1-512b-41ac-bfa1-e0da6204bcda\") " pod="openstack/dnsmasq-dns-849fbb457f-6lw7x"
Sep 30 12:41:32 crc kubenswrapper[4672]: I0930 12:41:32.285939 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/231eac71-05b0-4761-83f2-c195b6307bd9-logs\") pod \"nova-metadata-0\" (UID: \"231eac71-05b0-4761-83f2-c195b6307bd9\") " pod="openstack/nova-metadata-0"
Sep 30 12:41:32 crc kubenswrapper[4672]: I0930 12:41:32.286338 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/231eac71-05b0-4761-83f2-c195b6307bd9-logs\") pod \"nova-metadata-0\" (UID: \"231eac71-05b0-4761-83f2-c195b6307bd9\") " pod="openstack/nova-metadata-0"
Sep 30 12:41:32 crc kubenswrapper[4672]: I0930 12:41:32.293016 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/231eac71-05b0-4761-83f2-c195b6307bd9-config-data\") pod \"nova-metadata-0\" (UID: \"231eac71-05b0-4761-83f2-c195b6307bd9\") " pod="openstack/nova-metadata-0"
Sep 30 12:41:32 crc kubenswrapper[4672]: I0930 12:41:32.299944 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/231eac71-05b0-4761-83f2-c195b6307bd9-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"231eac71-05b0-4761-83f2-c195b6307bd9\") " pod="openstack/nova-metadata-0"
Sep 30 12:41:32 crc kubenswrapper[4672]: I0930 12:41:32.305601 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lsj52\" (UniqueName: \"kubernetes.io/projected/231eac71-05b0-4761-83f2-c195b6307bd9-kube-api-access-lsj52\") pod \"nova-metadata-0\" (UID: \"231eac71-05b0-4761-83f2-c195b6307bd9\") " pod="openstack/nova-metadata-0"
Sep 30 12:41:32 crc kubenswrapper[4672]: I0930 12:41:32.387613 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-flnlg\" (UniqueName: \"kubernetes.io/projected/becf3fb1-512b-41ac-bfa1-e0da6204bcda-kube-api-access-flnlg\") pod \"dnsmasq-dns-849fbb457f-6lw7x\" (UID: \"becf3fb1-512b-41ac-bfa1-e0da6204bcda\") " pod="openstack/dnsmasq-dns-849fbb457f-6lw7x"
Sep 30 12:41:32 crc kubenswrapper[4672]: I0930 12:41:32.387699 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/becf3fb1-512b-41ac-bfa1-e0da6204bcda-ovsdbserver-sb\") pod \"dnsmasq-dns-849fbb457f-6lw7x\" (UID: \"becf3fb1-512b-41ac-bfa1-e0da6204bcda\") " pod="openstack/dnsmasq-dns-849fbb457f-6lw7x"
Sep 30 12:41:32 crc kubenswrapper[4672]: I0930 12:41:32.387748 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/becf3fb1-512b-41ac-bfa1-e0da6204bcda-dns-svc\") pod \"dnsmasq-dns-849fbb457f-6lw7x\" (UID: \"becf3fb1-512b-41ac-bfa1-e0da6204bcda\") " pod="openstack/dnsmasq-dns-849fbb457f-6lw7x"
Sep 30 12:41:32 crc kubenswrapper[4672]: I0930 12:41:32.387776 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/becf3fb1-512b-41ac-bfa1-e0da6204bcda-ovsdbserver-nb\") pod \"dnsmasq-dns-849fbb457f-6lw7x\" (UID: \"becf3fb1-512b-41ac-bfa1-e0da6204bcda\") " pod="openstack/dnsmasq-dns-849fbb457f-6lw7x"
Sep 30 12:41:32 crc kubenswrapper[4672]: I0930 12:41:32.387841 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/becf3fb1-512b-41ac-bfa1-e0da6204bcda-dns-swift-storage-0\") pod \"dnsmasq-dns-849fbb457f-6lw7x\" (UID: \"becf3fb1-512b-41ac-bfa1-e0da6204bcda\") " pod="openstack/dnsmasq-dns-849fbb457f-6lw7x"
Sep 30 12:41:32 crc kubenswrapper[4672]: I0930 12:41:32.387860 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/becf3fb1-512b-41ac-bfa1-e0da6204bcda-config\") pod \"dnsmasq-dns-849fbb457f-6lw7x\" (UID: \"becf3fb1-512b-41ac-bfa1-e0da6204bcda\") " pod="openstack/dnsmasq-dns-849fbb457f-6lw7x"
Sep 30 12:41:32 crc kubenswrapper[4672]: I0930 12:41:32.388957 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/becf3fb1-512b-41ac-bfa1-e0da6204bcda-config\") pod \"dnsmasq-dns-849fbb457f-6lw7x\" (UID: \"becf3fb1-512b-41ac-bfa1-e0da6204bcda\") " pod="openstack/dnsmasq-dns-849fbb457f-6lw7x"
Sep 30 12:41:32 crc kubenswrapper[4672]: I0930 12:41:32.390401 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/becf3fb1-512b-41ac-bfa1-e0da6204bcda-ovsdbserver-sb\") pod \"dnsmasq-dns-849fbb457f-6lw7x\" (UID: \"becf3fb1-512b-41ac-bfa1-e0da6204bcda\") " pod="openstack/dnsmasq-dns-849fbb457f-6lw7x"
Sep 30 12:41:32 crc kubenswrapper[4672]: I0930 12:41:32.390761 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/becf3fb1-512b-41ac-bfa1-e0da6204bcda-dns-swift-storage-0\") pod \"dnsmasq-dns-849fbb457f-6lw7x\" (UID: \"becf3fb1-512b-41ac-bfa1-e0da6204bcda\") " pod="openstack/dnsmasq-dns-849fbb457f-6lw7x"
Sep 30 12:41:32 crc kubenswrapper[4672]: I0930 12:41:32.390796 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/becf3fb1-512b-41ac-bfa1-e0da6204bcda-ovsdbserver-nb\") pod \"dnsmasq-dns-849fbb457f-6lw7x\" (UID: \"becf3fb1-512b-41ac-bfa1-e0da6204bcda\") " pod="openstack/dnsmasq-dns-849fbb457f-6lw7x"
Sep 30 12:41:32 crc kubenswrapper[4672]: I0930 12:41:32.390771 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/becf3fb1-512b-41ac-bfa1-e0da6204bcda-dns-svc\") pod \"dnsmasq-dns-849fbb457f-6lw7x\" (UID: \"becf3fb1-512b-41ac-bfa1-e0da6204bcda\") " pod="openstack/dnsmasq-dns-849fbb457f-6lw7x"
Sep 30 12:41:32 crc kubenswrapper[4672]: I0930 12:41:32.416807 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-flnlg\" (UniqueName: \"kubernetes.io/projected/becf3fb1-512b-41ac-bfa1-e0da6204bcda-kube-api-access-flnlg\") pod \"dnsmasq-dns-849fbb457f-6lw7x\" (UID: \"becf3fb1-512b-41ac-bfa1-e0da6204bcda\") " pod="openstack/dnsmasq-dns-849fbb457f-6lw7x"
Sep 30 12:41:32 crc kubenswrapper[4672]: I0930 12:41:32.421851 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Sep 30 12:41:32 crc kubenswrapper[4672]: I0930 12:41:32.459760 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Sep 30 12:41:32 crc kubenswrapper[4672]: I0930 12:41:32.468872 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Sep 30 12:41:32 crc kubenswrapper[4672]: I0930 12:41:32.515909 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-849fbb457f-6lw7x"
Sep 30 12:41:32 crc kubenswrapper[4672]: I0930 12:41:32.714631 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-zmhvt"]
Sep 30 12:41:32 crc kubenswrapper[4672]: I0930 12:41:32.843121 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Sep 30 12:41:32 crc kubenswrapper[4672]: I0930 12:41:32.986051 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-8hmsx"]
Sep 30 12:41:32 crc kubenswrapper[4672]: I0930 12:41:32.988861 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-8hmsx"
Sep 30 12:41:32 crc kubenswrapper[4672]: I0930 12:41:32.996840 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data"
Sep 30 12:41:32 crc kubenswrapper[4672]: I0930 12:41:32.997104 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts"
Sep 30 12:41:33 crc kubenswrapper[4672]: I0930 12:41:33.020761 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-8hmsx"]
Sep 30 12:41:33 crc kubenswrapper[4672]: I0930 12:41:33.038691 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9858e9f6-3e7a-48e8-8557-eabc1ccfada4-config-data\") pod \"nova-cell1-conductor-db-sync-8hmsx\" (UID: \"9858e9f6-3e7a-48e8-8557-eabc1ccfada4\") " pod="openstack/nova-cell1-conductor-db-sync-8hmsx"
Sep 30 12:41:33 crc kubenswrapper[4672]: I0930 12:41:33.038766 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9858e9f6-3e7a-48e8-8557-eabc1ccfada4-scripts\") pod \"nova-cell1-conductor-db-sync-8hmsx\" (UID: \"9858e9f6-3e7a-48e8-8557-eabc1ccfada4\") " pod="openstack/nova-cell1-conductor-db-sync-8hmsx"
Sep 30 12:41:33 crc kubenswrapper[4672]: I0930 12:41:33.038793 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9858e9f6-3e7a-48e8-8557-eabc1ccfada4-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-8hmsx\" (UID: \"9858e9f6-3e7a-48e8-8557-eabc1ccfada4\") " pod="openstack/nova-cell1-conductor-db-sync-8hmsx"
Sep 30 12:41:33 crc kubenswrapper[4672]: I0930 12:41:33.038909 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dxnxj\" (UniqueName: \"kubernetes.io/projected/9858e9f6-3e7a-48e8-8557-eabc1ccfada4-kube-api-access-dxnxj\") pod \"nova-cell1-conductor-db-sync-8hmsx\" (UID: \"9858e9f6-3e7a-48e8-8557-eabc1ccfada4\") " pod="openstack/nova-cell1-conductor-db-sync-8hmsx"
Sep 30 12:41:33 crc kubenswrapper[4672]: I0930 12:41:33.072532 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Sep 30 12:41:33 crc kubenswrapper[4672]: I0930 12:41:33.140964 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9858e9f6-3e7a-48e8-8557-eabc1ccfada4-config-data\") pod \"nova-cell1-conductor-db-sync-8hmsx\" (UID: \"9858e9f6-3e7a-48e8-8557-eabc1ccfada4\") " pod="openstack/nova-cell1-conductor-db-sync-8hmsx"
Sep 30 12:41:33 crc kubenswrapper[4672]: I0930 12:41:33.141028 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9858e9f6-3e7a-48e8-8557-eabc1ccfada4-scripts\") pod \"nova-cell1-conductor-db-sync-8hmsx\" (UID: \"9858e9f6-3e7a-48e8-8557-eabc1ccfada4\") " pod="openstack/nova-cell1-conductor-db-sync-8hmsx"
Sep 30 12:41:33 crc kubenswrapper[4672]: I0930 12:41:33.141054 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9858e9f6-3e7a-48e8-8557-eabc1ccfada4-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-8hmsx\" (UID: \"9858e9f6-3e7a-48e8-8557-eabc1ccfada4\") " pod="openstack/nova-cell1-conductor-db-sync-8hmsx"
Sep 30 12:41:33 crc kubenswrapper[4672]: I0930 12:41:33.141105 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dxnxj\" (UniqueName: \"kubernetes.io/projected/9858e9f6-3e7a-48e8-8557-eabc1ccfada4-kube-api-access-dxnxj\") pod \"nova-cell1-conductor-db-sync-8hmsx\" (UID: \"9858e9f6-3e7a-48e8-8557-eabc1ccfada4\") " pod="openstack/nova-cell1-conductor-db-sync-8hmsx"
Sep 30 12:41:33 crc kubenswrapper[4672]: I0930 12:41:33.145773 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9858e9f6-3e7a-48e8-8557-eabc1ccfada4-config-data\") pod \"nova-cell1-conductor-db-sync-8hmsx\" (UID: \"9858e9f6-3e7a-48e8-8557-eabc1ccfada4\") " pod="openstack/nova-cell1-conductor-db-sync-8hmsx"
Sep 30 12:41:33 crc kubenswrapper[4672]: I0930 12:41:33.146374 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9858e9f6-3e7a-48e8-8557-eabc1ccfada4-scripts\") pod \"nova-cell1-conductor-db-sync-8hmsx\" (UID: \"9858e9f6-3e7a-48e8-8557-eabc1ccfada4\") " pod="openstack/nova-cell1-conductor-db-sync-8hmsx"
Sep 30 12:41:33 crc kubenswrapper[4672]: I0930 12:41:33.149110 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9858e9f6-3e7a-48e8-8557-eabc1ccfada4-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-8hmsx\" (UID: \"9858e9f6-3e7a-48e8-8557-eabc1ccfada4\") " pod="openstack/nova-cell1-conductor-db-sync-8hmsx"
Sep 30 12:41:33 crc kubenswrapper[4672]: I0930 12:41:33.159448 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dxnxj\" (UniqueName: \"kubernetes.io/projected/9858e9f6-3e7a-48e8-8557-eabc1ccfada4-kube-api-access-dxnxj\") pod \"nova-cell1-conductor-db-sync-8hmsx\" (UID: \"9858e9f6-3e7a-48e8-8557-eabc1ccfada4\") " pod="openstack/nova-cell1-conductor-db-sync-8hmsx"
Sep 30 12:41:33 crc kubenswrapper[4672]: I0930 12:41:33.272206 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Sep 30 12:41:33 crc kubenswrapper[4672]: I0930 12:41:33.292056 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Sep 30 12:41:33 crc kubenswrapper[4672]: I0930 12:41:33.358774 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-8hmsx"
Sep 30 12:41:33 crc kubenswrapper[4672]: W0930 12:41:33.473411 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbecf3fb1_512b_41ac_bfa1_e0da6204bcda.slice/crio-d8b681f7cfc279b7f90d1a590b8bd0749b24919c0fb5bb14a2149445db745e81 WatchSource:0}: Error finding container d8b681f7cfc279b7f90d1a590b8bd0749b24919c0fb5bb14a2149445db745e81: Status 404 returned error can't find the container with id d8b681f7cfc279b7f90d1a590b8bd0749b24919c0fb5bb14a2149445db745e81
Sep 30 12:41:33 crc kubenswrapper[4672]: I0930 12:41:33.499929 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-849fbb457f-6lw7x"]
Sep 30 12:41:33 crc kubenswrapper[4672]: I0930 12:41:33.734094 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"50605b9a-2e73-461e-9151-b47883bb9b9e","Type":"ContainerStarted","Data":"1a995aa135f0fa05fd2376eb2c4f1bc50003d4e6aafc61f74f6dcdaaf05f1e92"}
Sep 30 12:41:33 crc kubenswrapper[4672]: I0930 12:41:33.759332 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-849fbb457f-6lw7x" event={"ID":"becf3fb1-512b-41ac-bfa1-e0da6204bcda","Type":"ContainerStarted","Data":"d8b681f7cfc279b7f90d1a590b8bd0749b24919c0fb5bb14a2149445db745e81"}
Sep 30 12:41:33 crc kubenswrapper[4672]: I0930 12:41:33.768796 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"231eac71-05b0-4761-83f2-c195b6307bd9","Type":"ContainerStarted","Data":"1f252b54167014c0acd1c5c865dc7899d100bb68e188f095de54724e12c06dd9"}
Sep 30 12:41:33 crc kubenswrapper[4672]: I0930 12:41:33.771652 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"fba6b3f9-5728-4c5b-955a-571d3a8c83f4","Type":"ContainerStarted","Data":"b01598d6f664d7bd5113a90aa5af994815d5021c3db250fe3f3b24ec3add0d9f"}
Sep 30 12:41:33 crc kubenswrapper[4672]: I0930 12:41:33.772883 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e28fe302-599f-4788-a17f-78c90cfca670","Type":"ContainerStarted","Data":"9491639a9260cff247244696b45ade581f4040d89bd2baf96ca190ea4c867e9f"}
Sep 30 12:41:33 crc kubenswrapper[4672]: I0930 12:41:33.796001 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-zmhvt" event={"ID":"645ec6e3-a10c-4651-9df2-a8259bcd51b9","Type":"ContainerStarted","Data":"31203d1a1f3ea529a7e5cf481e28fe0ad9c2807c8193e97f31a80a9df1538278"}
Sep 30 12:41:33 crc kubenswrapper[4672]: I0930 12:41:33.796072 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-zmhvt" event={"ID":"645ec6e3-a10c-4651-9df2-a8259bcd51b9","Type":"ContainerStarted","Data":"0a50b1c9388b1ed01291d5b0835fe34f7e631c1d384c3e2468ec92e378ce7cf5"}
Sep 30 12:41:33 crc kubenswrapper[4672]: I0930 12:41:33.819216 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-zmhvt" podStartSLOduration=2.81919771 podStartE2EDuration="2.81919771s" podCreationTimestamp="2025-09-30 12:41:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 12:41:33.810702115 +0000 UTC m=+1185.079939761" watchObservedRunningTime="2025-09-30 12:41:33.81919771 +0000 UTC m=+1185.088435356"
Sep 30 12:41:34 crc kubenswrapper[4672]: I0930 12:41:34.142525 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-8hmsx"]
Sep 30 12:41:34 crc kubenswrapper[4672]: I0930 12:41:34.822327 4672 generic.go:334] "Generic (PLEG): container finished" podID="becf3fb1-512b-41ac-bfa1-e0da6204bcda" containerID="1e4b1d648a001af42c4fc3edcfff3ce6c47e4a4719990f91e36c987ed2aae05e" exitCode=0
Sep 30 12:41:34 crc kubenswrapper[4672]: I0930 12:41:34.822582 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-849fbb457f-6lw7x" event={"ID":"becf3fb1-512b-41ac-bfa1-e0da6204bcda","Type":"ContainerDied","Data":"1e4b1d648a001af42c4fc3edcfff3ce6c47e4a4719990f91e36c987ed2aae05e"}
Sep 30 12:41:35 crc kubenswrapper[4672]: I0930 12:41:35.374333 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Sep 30 12:41:35 crc kubenswrapper[4672]: I0930 12:41:35.390829 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Sep 30 12:41:36 crc kubenswrapper[4672]: I0930 12:41:36.856279 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-8hmsx" event={"ID":"9858e9f6-3e7a-48e8-8557-eabc1ccfada4","Type":"ContainerStarted","Data":"0b2e868a61283fcabc0888a0f8baa7796e9d596441029c23e51dda52ca3a79f7"}
Sep 30 12:41:37 crc kubenswrapper[4672]: I0930 12:41:37.868694 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"231eac71-05b0-4761-83f2-c195b6307bd9","Type":"ContainerStarted","Data":"c0285c1adf649a1aa8ca1fa45cf0dd7cd68e43e7007ebd593146cdccc2aec13d"}
Sep 30 12:41:37 crc kubenswrapper[4672]: I0930 12:41:37.869394 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"231eac71-05b0-4761-83f2-c195b6307bd9","Type":"ContainerStarted","Data":"2aa55b932299e9798d246fbb29fa3dfa58a2665674c8dc81892e565d1e93ad3f"}
Sep 30 12:41:37 crc kubenswrapper[4672]: I0930 12:41:37.868844 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="231eac71-05b0-4761-83f2-c195b6307bd9" containerName="nova-metadata-log" containerID="cri-o://2aa55b932299e9798d246fbb29fa3dfa58a2665674c8dc81892e565d1e93ad3f" gracePeriod=30
Sep 30 12:41:37 crc kubenswrapper[4672]: I0930 12:41:37.868791 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="231eac71-05b0-4761-83f2-c195b6307bd9" containerName="nova-metadata-metadata" containerID="cri-o://c0285c1adf649a1aa8ca1fa45cf0dd7cd68e43e7007ebd593146cdccc2aec13d" gracePeriod=30
Sep 30 12:41:37 crc kubenswrapper[4672]: I0930 12:41:37.871905 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"fba6b3f9-5728-4c5b-955a-571d3a8c83f4","Type":"ContainerStarted","Data":"4170c2e5656526544e148c18c3748b58cc1d0434b43762809d7837636a529769"}
Sep 30 12:41:37 crc kubenswrapper[4672]: I0930 12:41:37.871971 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="fba6b3f9-5728-4c5b-955a-571d3a8c83f4" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://4170c2e5656526544e148c18c3748b58cc1d0434b43762809d7837636a529769" gracePeriod=30
Sep 30 12:41:37 crc kubenswrapper[4672]: I0930 12:41:37.876027 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e28fe302-599f-4788-a17f-78c90cfca670","Type":"ContainerStarted","Data":"325505f1235b6cb8ebe38e2e6ec854dc0b69aece8402b899cbcfd1482667cd98"}
Sep 30 12:41:37 crc kubenswrapper[4672]: I0930 12:41:37.876073 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e28fe302-599f-4788-a17f-78c90cfca670","Type":"ContainerStarted","Data":"7cb2155e7bc3424d896ede691e1ed23b1d989ff67288540ca0fa90e36da4f507"}
Sep 30 12:41:37 crc kubenswrapper[4672]: I0930 12:41:37.878381 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"50605b9a-2e73-461e-9151-b47883bb9b9e","Type":"ContainerStarted","Data":"8be8109889e6855683ea37174d27b8b0fc61cd5a63e1d256daefc33432a460bb"}
Sep 30 12:41:37 crc kubenswrapper[4672]: I0930 12:41:37.880112 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-8hmsx" event={"ID":"9858e9f6-3e7a-48e8-8557-eabc1ccfada4","Type":"ContainerStarted","Data":"8870f5bddbde13fa494016a4544314123d701c79bf980e4762681b821dc4225b"}
Sep 30 12:41:37 crc kubenswrapper[4672]: I0930 12:41:37.892667 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-849fbb457f-6lw7x" event={"ID":"becf3fb1-512b-41ac-bfa1-e0da6204bcda","Type":"ContainerStarted","Data":"727b7ddf43148d767a5094fce03761f89c3e9425add9803090efbd4bf7f2eed1"}
Sep 30 12:41:37 crc kubenswrapper[4672]: I0930 12:41:37.892792 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-849fbb457f-6lw7x"
Sep 30 12:41:37 crc kubenswrapper[4672]: I0930 12:41:37.902903 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.516824443 podStartE2EDuration="6.902877884s" podCreationTimestamp="2025-09-30 12:41:31 +0000 UTC" firstStartedPulling="2025-09-30 12:41:33.319058463 +0000 UTC m=+1184.588296109" lastFinishedPulling="2025-09-30 12:41:36.705111904 +0000 UTC m=+1187.974349550" observedRunningTime="2025-09-30 12:41:37.898257947 +0000 UTC m=+1189.167495593" watchObservedRunningTime="2025-09-30 12:41:37.902877884 +0000 UTC m=+1189.172115540"
Sep 30 12:41:37 crc kubenswrapper[4672]: I0930 12:41:37.923672 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-849fbb457f-6lw7x" podStartSLOduration=5.923647208 podStartE2EDuration="5.923647208s" podCreationTimestamp="2025-09-30 12:41:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 12:41:37.919604166 +0000 UTC m=+1189.188841812" watchObservedRunningTime="2025-09-30 12:41:37.923647208 +0000 UTC m=+1189.192884874"
Sep 30 12:41:37 crc kubenswrapper[4672]: I0930 12:41:37.937747 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.122620151 podStartE2EDuration="6.937723224s" podCreationTimestamp="2025-09-30 12:41:31 +0000 UTC" firstStartedPulling="2025-09-30 12:41:32.889818596 +0000 UTC m=+1184.159056242" lastFinishedPulling="2025-09-30 12:41:36.704921669 +0000 UTC m=+1187.974159315" observedRunningTime="2025-09-30 12:41:37.934846981 +0000 UTC m=+1189.204084627" watchObservedRunningTime="2025-09-30 12:41:37.937723224 +0000 UTC m=+1189.206960880"
Sep 30 12:41:37 crc kubenswrapper[4672]: I0930 12:41:37.959412 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=3.551050268 podStartE2EDuration="6.959391551s" podCreationTimestamp="2025-09-30 12:41:31 +0000 UTC" firstStartedPulling="2025-09-30 12:41:33.294808011 +0000 UTC m=+1184.564045657" lastFinishedPulling="2025-09-30 12:41:36.703149294 +0000 UTC m=+1187.972386940" observedRunningTime="2025-09-30 12:41:37.955618626 +0000 UTC m=+1189.224856272" watchObservedRunningTime="2025-09-30 12:41:37.959391551 +0000 UTC m=+1189.228629197"
Sep 30 12:41:37 crc kubenswrapper[4672]: I0930 12:41:37.995617 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=3.410183601 podStartE2EDuration="6.995595075s" podCreationTimestamp="2025-09-30 12:41:31 +0000 UTC" firstStartedPulling="2025-09-30 12:41:33.080134161 +0000 UTC m=+1184.349371807" lastFinishedPulling="2025-09-30 12:41:36.665545635 +0000 UTC m=+1187.934783281" observedRunningTime="2025-09-30 12:41:37.986752142 +0000 UTC m=+1189.255989788" watchObservedRunningTime="2025-09-30 12:41:37.995595075 +0000 UTC m=+1189.264832741"
Sep 30 12:41:38 crc kubenswrapper[4672]: I0930 12:41:38.015134 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-8hmsx" podStartSLOduration=6.015110608 podStartE2EDuration="6.015110608s" podCreationTimestamp="2025-09-30 12:41:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 12:41:38.005103785 +0000 UTC m=+1189.274341431" watchObservedRunningTime="2025-09-30 12:41:38.015110608 +0000 UTC m=+1189.284348254"
Sep 30 12:41:38 crc kubenswrapper[4672]: I0930 12:41:38.898120 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Sep 30 12:41:38 crc kubenswrapper[4672]: I0930 12:41:38.903868 4672 generic.go:334] "Generic (PLEG): container finished" podID="231eac71-05b0-4761-83f2-c195b6307bd9" containerID="c0285c1adf649a1aa8ca1fa45cf0dd7cd68e43e7007ebd593146cdccc2aec13d" exitCode=0
Sep 30 12:41:38 crc kubenswrapper[4672]: I0930 12:41:38.903913 4672 generic.go:334] "Generic (PLEG): container finished" podID="231eac71-05b0-4761-83f2-c195b6307bd9" containerID="2aa55b932299e9798d246fbb29fa3dfa58a2665674c8dc81892e565d1e93ad3f" exitCode=143
Sep 30 12:41:38 crc kubenswrapper[4672]: I0930 12:41:38.903973 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Need to start a new one" pod="openstack/nova-metadata-0" Sep 30 12:41:38 crc kubenswrapper[4672]: I0930 12:41:38.904043 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"231eac71-05b0-4761-83f2-c195b6307bd9","Type":"ContainerDied","Data":"c0285c1adf649a1aa8ca1fa45cf0dd7cd68e43e7007ebd593146cdccc2aec13d"} Sep 30 12:41:38 crc kubenswrapper[4672]: I0930 12:41:38.904085 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"231eac71-05b0-4761-83f2-c195b6307bd9","Type":"ContainerDied","Data":"2aa55b932299e9798d246fbb29fa3dfa58a2665674c8dc81892e565d1e93ad3f"} Sep 30 12:41:38 crc kubenswrapper[4672]: I0930 12:41:38.904098 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"231eac71-05b0-4761-83f2-c195b6307bd9","Type":"ContainerDied","Data":"1f252b54167014c0acd1c5c865dc7899d100bb68e188f095de54724e12c06dd9"} Sep 30 12:41:38 crc kubenswrapper[4672]: I0930 12:41:38.904123 4672 scope.go:117] "RemoveContainer" containerID="c0285c1adf649a1aa8ca1fa45cf0dd7cd68e43e7007ebd593146cdccc2aec13d" Sep 30 12:41:38 crc kubenswrapper[4672]: I0930 12:41:38.924960 4672 scope.go:117] "RemoveContainer" containerID="2aa55b932299e9798d246fbb29fa3dfa58a2665674c8dc81892e565d1e93ad3f" Sep 30 12:41:38 crc kubenswrapper[4672]: I0930 12:41:38.973143 4672 scope.go:117] "RemoveContainer" containerID="c0285c1adf649a1aa8ca1fa45cf0dd7cd68e43e7007ebd593146cdccc2aec13d" Sep 30 12:41:38 crc kubenswrapper[4672]: E0930 12:41:38.976161 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c0285c1adf649a1aa8ca1fa45cf0dd7cd68e43e7007ebd593146cdccc2aec13d\": container with ID starting with c0285c1adf649a1aa8ca1fa45cf0dd7cd68e43e7007ebd593146cdccc2aec13d not found: ID does not exist" containerID="c0285c1adf649a1aa8ca1fa45cf0dd7cd68e43e7007ebd593146cdccc2aec13d" Sep 30 12:41:38 crc kubenswrapper[4672]: I0930 12:41:38.976203 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0285c1adf649a1aa8ca1fa45cf0dd7cd68e43e7007ebd593146cdccc2aec13d"} err="failed to get container status \"c0285c1adf649a1aa8ca1fa45cf0dd7cd68e43e7007ebd593146cdccc2aec13d\": rpc error: code = NotFound desc = could not find container \"c0285c1adf649a1aa8ca1fa45cf0dd7cd68e43e7007ebd593146cdccc2aec13d\": container with ID starting with c0285c1adf649a1aa8ca1fa45cf0dd7cd68e43e7007ebd593146cdccc2aec13d not found: ID does not exist" Sep 30 12:41:38 crc kubenswrapper[4672]: I0930 12:41:38.976227 4672 scope.go:117] "RemoveContainer" containerID="2aa55b932299e9798d246fbb29fa3dfa58a2665674c8dc81892e565d1e93ad3f" Sep 30 12:41:38 crc kubenswrapper[4672]: E0930 12:41:38.976908 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2aa55b932299e9798d246fbb29fa3dfa58a2665674c8dc81892e565d1e93ad3f\": container with ID starting with 2aa55b932299e9798d246fbb29fa3dfa58a2665674c8dc81892e565d1e93ad3f not found: ID does not exist" containerID="2aa55b932299e9798d246fbb29fa3dfa58a2665674c8dc81892e565d1e93ad3f" Sep 30 12:41:38 crc kubenswrapper[4672]: I0930 12:41:38.976957 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2aa55b932299e9798d246fbb29fa3dfa58a2665674c8dc81892e565d1e93ad3f"} err="failed to get container status \"2aa55b932299e9798d246fbb29fa3dfa58a2665674c8dc81892e565d1e93ad3f\": rpc error: code = 
NotFound desc = could not find container \"2aa55b932299e9798d246fbb29fa3dfa58a2665674c8dc81892e565d1e93ad3f\": container with ID starting with 2aa55b932299e9798d246fbb29fa3dfa58a2665674c8dc81892e565d1e93ad3f not found: ID does not exist" Sep 30 12:41:38 crc kubenswrapper[4672]: I0930 12:41:38.976988 4672 scope.go:117] "RemoveContainer" containerID="c0285c1adf649a1aa8ca1fa45cf0dd7cd68e43e7007ebd593146cdccc2aec13d" Sep 30 12:41:38 crc kubenswrapper[4672]: I0930 12:41:38.977467 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0285c1adf649a1aa8ca1fa45cf0dd7cd68e43e7007ebd593146cdccc2aec13d"} err="failed to get container status \"c0285c1adf649a1aa8ca1fa45cf0dd7cd68e43e7007ebd593146cdccc2aec13d\": rpc error: code = NotFound desc = could not find container \"c0285c1adf649a1aa8ca1fa45cf0dd7cd68e43e7007ebd593146cdccc2aec13d\": container with ID starting with c0285c1adf649a1aa8ca1fa45cf0dd7cd68e43e7007ebd593146cdccc2aec13d not found: ID does not exist" Sep 30 12:41:38 crc kubenswrapper[4672]: I0930 12:41:38.977526 4672 scope.go:117] "RemoveContainer" containerID="2aa55b932299e9798d246fbb29fa3dfa58a2665674c8dc81892e565d1e93ad3f" Sep 30 12:41:38 crc kubenswrapper[4672]: I0930 12:41:38.978371 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2aa55b932299e9798d246fbb29fa3dfa58a2665674c8dc81892e565d1e93ad3f"} err="failed to get container status \"2aa55b932299e9798d246fbb29fa3dfa58a2665674c8dc81892e565d1e93ad3f\": rpc error: code = NotFound desc = could not find container \"2aa55b932299e9798d246fbb29fa3dfa58a2665674c8dc81892e565d1e93ad3f\": container with ID starting with 2aa55b932299e9798d246fbb29fa3dfa58a2665674c8dc81892e565d1e93ad3f not found: ID does not exist" Sep 30 12:41:39 crc kubenswrapper[4672]: I0930 12:41:39.079273 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/231eac71-05b0-4761-83f2-c195b6307bd9-combined-ca-bundle\") pod \"231eac71-05b0-4761-83f2-c195b6307bd9\" (UID: \"231eac71-05b0-4761-83f2-c195b6307bd9\") " Sep 30 12:41:39 crc kubenswrapper[4672]: I0930 12:41:39.079557 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/231eac71-05b0-4761-83f2-c195b6307bd9-config-data\") pod \"231eac71-05b0-4761-83f2-c195b6307bd9\" (UID: \"231eac71-05b0-4761-83f2-c195b6307bd9\") " Sep 30 12:41:39 crc kubenswrapper[4672]: I0930 12:41:39.079724 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lsj52\" (UniqueName: \"kubernetes.io/projected/231eac71-05b0-4761-83f2-c195b6307bd9-kube-api-access-lsj52\") pod \"231eac71-05b0-4761-83f2-c195b6307bd9\" (UID: \"231eac71-05b0-4761-83f2-c195b6307bd9\") " Sep 30 12:41:39 crc kubenswrapper[4672]: I0930 12:41:39.079913 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/231eac71-05b0-4761-83f2-c195b6307bd9-logs\") pod \"231eac71-05b0-4761-83f2-c195b6307bd9\" (UID: \"231eac71-05b0-4761-83f2-c195b6307bd9\") " Sep 30 12:41:39 crc kubenswrapper[4672]: I0930 12:41:39.083555 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/231eac71-05b0-4761-83f2-c195b6307bd9-logs" (OuterVolumeSpecName: "logs") pod "231eac71-05b0-4761-83f2-c195b6307bd9" (UID: "231eac71-05b0-4761-83f2-c195b6307bd9"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 12:41:39 crc kubenswrapper[4672]: I0930 12:41:39.089929 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/231eac71-05b0-4761-83f2-c195b6307bd9-kube-api-access-lsj52" (OuterVolumeSpecName: "kube-api-access-lsj52") pod "231eac71-05b0-4761-83f2-c195b6307bd9" (UID: "231eac71-05b0-4761-83f2-c195b6307bd9"). InnerVolumeSpecName "kube-api-access-lsj52". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 12:41:39 crc kubenswrapper[4672]: I0930 12:41:39.128950 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/231eac71-05b0-4761-83f2-c195b6307bd9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "231eac71-05b0-4761-83f2-c195b6307bd9" (UID: "231eac71-05b0-4761-83f2-c195b6307bd9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:41:39 crc kubenswrapper[4672]: I0930 12:41:39.147339 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/231eac71-05b0-4761-83f2-c195b6307bd9-config-data" (OuterVolumeSpecName: "config-data") pod "231eac71-05b0-4761-83f2-c195b6307bd9" (UID: "231eac71-05b0-4761-83f2-c195b6307bd9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:41:39 crc kubenswrapper[4672]: I0930 12:41:39.182420 4672 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/231eac71-05b0-4761-83f2-c195b6307bd9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 12:41:39 crc kubenswrapper[4672]: I0930 12:41:39.182658 4672 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/231eac71-05b0-4761-83f2-c195b6307bd9-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 12:41:39 crc kubenswrapper[4672]: I0930 12:41:39.182717 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lsj52\" (UniqueName: \"kubernetes.io/projected/231eac71-05b0-4761-83f2-c195b6307bd9-kube-api-access-lsj52\") on node \"crc\" DevicePath \"\"" Sep 30 12:41:39 crc kubenswrapper[4672]: I0930 12:41:39.182795 4672 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/231eac71-05b0-4761-83f2-c195b6307bd9-logs\") on node \"crc\" DevicePath \"\"" Sep 30 12:41:39 crc kubenswrapper[4672]: I0930 12:41:39.245180 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Sep 30 12:41:39 crc kubenswrapper[4672]: I0930 12:41:39.258563 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Sep 30 12:41:39 crc kubenswrapper[4672]: I0930 12:41:39.284420 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Sep 30 12:41:39 crc kubenswrapper[4672]: E0930 12:41:39.284965 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="231eac71-05b0-4761-83f2-c195b6307bd9" containerName="nova-metadata-log" Sep 30 12:41:39 crc kubenswrapper[4672]: I0930 12:41:39.284989 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="231eac71-05b0-4761-83f2-c195b6307bd9" containerName="nova-metadata-log" Sep 30 12:41:39 crc kubenswrapper[4672]: E0930 12:41:39.285016 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="231eac71-05b0-4761-83f2-c195b6307bd9" containerName="nova-metadata-metadata" Sep 30 12:41:39 crc 
kubenswrapper[4672]: I0930 12:41:39.285023 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="231eac71-05b0-4761-83f2-c195b6307bd9" containerName="nova-metadata-metadata" Sep 30 12:41:39 crc kubenswrapper[4672]: I0930 12:41:39.285256 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="231eac71-05b0-4761-83f2-c195b6307bd9" containerName="nova-metadata-metadata" Sep 30 12:41:39 crc kubenswrapper[4672]: I0930 12:41:39.289429 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="231eac71-05b0-4761-83f2-c195b6307bd9" containerName="nova-metadata-log" Sep 30 12:41:39 crc kubenswrapper[4672]: I0930 12:41:39.291534 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Sep 30 12:41:39 crc kubenswrapper[4672]: I0930 12:41:39.295783 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Sep 30 12:41:39 crc kubenswrapper[4672]: I0930 12:41:39.298593 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Sep 30 12:41:39 crc kubenswrapper[4672]: I0930 12:41:39.298642 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Sep 30 12:41:39 crc kubenswrapper[4672]: I0930 12:41:39.386235 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a6d11ec-a673-45d7-8a3a-7fc3eb8dbdf3-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"2a6d11ec-a673-45d7-8a3a-7fc3eb8dbdf3\") " pod="openstack/nova-metadata-0" Sep 30 12:41:39 crc kubenswrapper[4672]: I0930 12:41:39.386388 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2a6d11ec-a673-45d7-8a3a-7fc3eb8dbdf3-logs\") pod \"nova-metadata-0\" (UID: \"2a6d11ec-a673-45d7-8a3a-7fc3eb8dbdf3\") " pod="openstack/nova-metadata-0" Sep 30 12:41:39 crc kubenswrapper[4672]: I0930 12:41:39.386497 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/2a6d11ec-a673-45d7-8a3a-7fc3eb8dbdf3-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"2a6d11ec-a673-45d7-8a3a-7fc3eb8dbdf3\") " pod="openstack/nova-metadata-0" Sep 30 12:41:39 crc kubenswrapper[4672]: I0930 12:41:39.386514 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a6d11ec-a673-45d7-8a3a-7fc3eb8dbdf3-config-data\") pod \"nova-metadata-0\" (UID: \"2a6d11ec-a673-45d7-8a3a-7fc3eb8dbdf3\") " pod="openstack/nova-metadata-0" Sep 30 12:41:39 crc kubenswrapper[4672]: I0930 12:41:39.386559 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t585n\" (UniqueName: \"kubernetes.io/projected/2a6d11ec-a673-45d7-8a3a-7fc3eb8dbdf3-kube-api-access-t585n\") pod \"nova-metadata-0\" (UID: \"2a6d11ec-a673-45d7-8a3a-7fc3eb8dbdf3\") " pod="openstack/nova-metadata-0" Sep 30 12:41:39 crc kubenswrapper[4672]: I0930 12:41:39.433813 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="231eac71-05b0-4761-83f2-c195b6307bd9" path="/var/lib/kubelet/pods/231eac71-05b0-4761-83f2-c195b6307bd9/volumes" Sep 30 12:41:39 crc kubenswrapper[4672]: I0930 12:41:39.488699 4672 reconciler_common.go:218] "operationExecutor.MountVolume 
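The RemoveContainer retries above each end in "rpc error: code = NotFound" from CRI-O, which the kubelet logs and then moves past: a missing container means the deletion goal is already met. A minimal sketch of that tolerate-missing pattern (the runtime stub is hypothetical; only the gRPC status handling mirrors the error in the log, and it requires the google.golang.org/grpc module):

```go
// removecontainer.go — treat NotFound from the runtime as "already deleted".
package main

import (
	"fmt"

	"google.golang.org/grpc/codes"
	"google.golang.org/grpc/status"
)

// removeContainer deletes a container, tolerating "already gone".
func removeContainer(remove func(id string) error, id string) error {
	if err := remove(id); err != nil {
		if status.Code(err) == codes.NotFound {
			fmt.Printf("container %.12s already removed; nothing to do\n", id)
			return nil
		}
		return err // a real failure; surface it
	}
	return nil
}

func main() {
	// Stub runtime answering the way CRI-O did in the entries above.
	stub := func(id string) error {
		return status.Error(codes.NotFound, "could not find container "+id)
	}
	id := "c0285c1adf649a1aa8ca1fa45cf0dd7cd68e43e7007ebd593146cdccc2aec13d"
	if err := removeContainer(stub, id); err != nil {
		panic(err)
	}
}
```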
started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a6d11ec-a673-45d7-8a3a-7fc3eb8dbdf3-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"2a6d11ec-a673-45d7-8a3a-7fc3eb8dbdf3\") " pod="openstack/nova-metadata-0" Sep 30 12:41:39 crc kubenswrapper[4672]: I0930 12:41:39.488783 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2a6d11ec-a673-45d7-8a3a-7fc3eb8dbdf3-logs\") pod \"nova-metadata-0\" (UID: \"2a6d11ec-a673-45d7-8a3a-7fc3eb8dbdf3\") " pod="openstack/nova-metadata-0" Sep 30 12:41:39 crc kubenswrapper[4672]: I0930 12:41:39.488873 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/2a6d11ec-a673-45d7-8a3a-7fc3eb8dbdf3-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"2a6d11ec-a673-45d7-8a3a-7fc3eb8dbdf3\") " pod="openstack/nova-metadata-0" Sep 30 12:41:39 crc kubenswrapper[4672]: I0930 12:41:39.488897 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a6d11ec-a673-45d7-8a3a-7fc3eb8dbdf3-config-data\") pod \"nova-metadata-0\" (UID: \"2a6d11ec-a673-45d7-8a3a-7fc3eb8dbdf3\") " pod="openstack/nova-metadata-0" Sep 30 12:41:39 crc kubenswrapper[4672]: I0930 12:41:39.488936 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t585n\" (UniqueName: \"kubernetes.io/projected/2a6d11ec-a673-45d7-8a3a-7fc3eb8dbdf3-kube-api-access-t585n\") pod \"nova-metadata-0\" (UID: \"2a6d11ec-a673-45d7-8a3a-7fc3eb8dbdf3\") " pod="openstack/nova-metadata-0" Sep 30 12:41:39 crc kubenswrapper[4672]: I0930 12:41:39.489624 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2a6d11ec-a673-45d7-8a3a-7fc3eb8dbdf3-logs\") pod \"nova-metadata-0\" (UID: \"2a6d11ec-a673-45d7-8a3a-7fc3eb8dbdf3\") " pod="openstack/nova-metadata-0" Sep 30 12:41:39 crc kubenswrapper[4672]: I0930 12:41:39.494896 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/2a6d11ec-a673-45d7-8a3a-7fc3eb8dbdf3-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"2a6d11ec-a673-45d7-8a3a-7fc3eb8dbdf3\") " pod="openstack/nova-metadata-0" Sep 30 12:41:39 crc kubenswrapper[4672]: I0930 12:41:39.494908 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a6d11ec-a673-45d7-8a3a-7fc3eb8dbdf3-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"2a6d11ec-a673-45d7-8a3a-7fc3eb8dbdf3\") " pod="openstack/nova-metadata-0" Sep 30 12:41:39 crc kubenswrapper[4672]: I0930 12:41:39.496220 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a6d11ec-a673-45d7-8a3a-7fc3eb8dbdf3-config-data\") pod \"nova-metadata-0\" (UID: \"2a6d11ec-a673-45d7-8a3a-7fc3eb8dbdf3\") " pod="openstack/nova-metadata-0" Sep 30 12:41:39 crc kubenswrapper[4672]: I0930 12:41:39.511980 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t585n\" (UniqueName: \"kubernetes.io/projected/2a6d11ec-a673-45d7-8a3a-7fc3eb8dbdf3-kube-api-access-t585n\") pod \"nova-metadata-0\" (UID: \"2a6d11ec-a673-45d7-8a3a-7fc3eb8dbdf3\") " pod="openstack/nova-metadata-0" Sep 30 12:41:39 crc kubenswrapper[4672]: I0930 12:41:39.616297 4672 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Sep 30 12:41:39 crc kubenswrapper[4672]: I0930 12:41:39.959193 4672 generic.go:334] "Generic (PLEG): container finished" podID="34856d78-c924-4724-bbca-9f3b5ab6e9ad" containerID="8104e13c73285dd412b983216ceeb1293cdb470a134e07a7f30a5dcba8888c27" exitCode=137 Sep 30 12:41:39 crc kubenswrapper[4672]: I0930 12:41:39.959505 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"34856d78-c924-4724-bbca-9f3b5ab6e9ad","Type":"ContainerDied","Data":"8104e13c73285dd412b983216ceeb1293cdb470a134e07a7f30a5dcba8888c27"} Sep 30 12:41:39 crc kubenswrapper[4672]: I0930 12:41:39.959540 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"34856d78-c924-4724-bbca-9f3b5ab6e9ad","Type":"ContainerDied","Data":"32dc50d0fe2239f2d7c406fde0dfa5b66afef338198ee17e3c096fdc83dcdd67"} Sep 30 12:41:39 crc kubenswrapper[4672]: I0930 12:41:39.959554 4672 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="32dc50d0fe2239f2d7c406fde0dfa5b66afef338198ee17e3c096fdc83dcdd67" Sep 30 12:41:39 crc kubenswrapper[4672]: I0930 12:41:39.975747 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Sep 30 12:41:40 crc kubenswrapper[4672]: I0930 12:41:40.103022 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/34856d78-c924-4724-bbca-9f3b5ab6e9ad-sg-core-conf-yaml\") pod \"34856d78-c924-4724-bbca-9f3b5ab6e9ad\" (UID: \"34856d78-c924-4724-bbca-9f3b5ab6e9ad\") " Sep 30 12:41:40 crc kubenswrapper[4672]: I0930 12:41:40.103407 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34856d78-c924-4724-bbca-9f3b5ab6e9ad-combined-ca-bundle\") pod \"34856d78-c924-4724-bbca-9f3b5ab6e9ad\" (UID: \"34856d78-c924-4724-bbca-9f3b5ab6e9ad\") " Sep 30 12:41:40 crc kubenswrapper[4672]: I0930 12:41:40.103470 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/34856d78-c924-4724-bbca-9f3b5ab6e9ad-log-httpd\") pod \"34856d78-c924-4724-bbca-9f3b5ab6e9ad\" (UID: \"34856d78-c924-4724-bbca-9f3b5ab6e9ad\") " Sep 30 12:41:40 crc kubenswrapper[4672]: I0930 12:41:40.103533 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34856d78-c924-4724-bbca-9f3b5ab6e9ad-config-data\") pod \"34856d78-c924-4724-bbca-9f3b5ab6e9ad\" (UID: \"34856d78-c924-4724-bbca-9f3b5ab6e9ad\") " Sep 30 12:41:40 crc kubenswrapper[4672]: I0930 12:41:40.103602 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/34856d78-c924-4724-bbca-9f3b5ab6e9ad-run-httpd\") pod \"34856d78-c924-4724-bbca-9f3b5ab6e9ad\" (UID: \"34856d78-c924-4724-bbca-9f3b5ab6e9ad\") " Sep 30 12:41:40 crc kubenswrapper[4672]: I0930 12:41:40.103687 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gw5f6\" (UniqueName: \"kubernetes.io/projected/34856d78-c924-4724-bbca-9f3b5ab6e9ad-kube-api-access-gw5f6\") pod \"34856d78-c924-4724-bbca-9f3b5ab6e9ad\" (UID: \"34856d78-c924-4724-bbca-9f3b5ab6e9ad\") " Sep 30 12:41:40 crc kubenswrapper[4672]: I0930 12:41:40.103746 4672 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/34856d78-c924-4724-bbca-9f3b5ab6e9ad-scripts\") pod \"34856d78-c924-4724-bbca-9f3b5ab6e9ad\" (UID: \"34856d78-c924-4724-bbca-9f3b5ab6e9ad\") " Sep 30 12:41:40 crc kubenswrapper[4672]: I0930 12:41:40.115060 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/34856d78-c924-4724-bbca-9f3b5ab6e9ad-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "34856d78-c924-4724-bbca-9f3b5ab6e9ad" (UID: "34856d78-c924-4724-bbca-9f3b5ab6e9ad"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 12:41:40 crc kubenswrapper[4672]: I0930 12:41:40.123916 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/34856d78-c924-4724-bbca-9f3b5ab6e9ad-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "34856d78-c924-4724-bbca-9f3b5ab6e9ad" (UID: "34856d78-c924-4724-bbca-9f3b5ab6e9ad"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 12:41:40 crc kubenswrapper[4672]: I0930 12:41:40.127666 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34856d78-c924-4724-bbca-9f3b5ab6e9ad-kube-api-access-gw5f6" (OuterVolumeSpecName: "kube-api-access-gw5f6") pod "34856d78-c924-4724-bbca-9f3b5ab6e9ad" (UID: "34856d78-c924-4724-bbca-9f3b5ab6e9ad"). InnerVolumeSpecName "kube-api-access-gw5f6". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 12:41:40 crc kubenswrapper[4672]: I0930 12:41:40.149430 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34856d78-c924-4724-bbca-9f3b5ab6e9ad-scripts" (OuterVolumeSpecName: "scripts") pod "34856d78-c924-4724-bbca-9f3b5ab6e9ad" (UID: "34856d78-c924-4724-bbca-9f3b5ab6e9ad"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:41:40 crc kubenswrapper[4672]: I0930 12:41:40.188436 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34856d78-c924-4724-bbca-9f3b5ab6e9ad-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "34856d78-c924-4724-bbca-9f3b5ab6e9ad" (UID: "34856d78-c924-4724-bbca-9f3b5ab6e9ad"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:41:40 crc kubenswrapper[4672]: I0930 12:41:40.207332 4672 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/34856d78-c924-4724-bbca-9f3b5ab6e9ad-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 12:41:40 crc kubenswrapper[4672]: I0930 12:41:40.207379 4672 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/34856d78-c924-4724-bbca-9f3b5ab6e9ad-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Sep 30 12:41:40 crc kubenswrapper[4672]: I0930 12:41:40.207394 4672 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/34856d78-c924-4724-bbca-9f3b5ab6e9ad-log-httpd\") on node \"crc\" DevicePath \"\"" Sep 30 12:41:40 crc kubenswrapper[4672]: I0930 12:41:40.207406 4672 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/34856d78-c924-4724-bbca-9f3b5ab6e9ad-run-httpd\") on node \"crc\" DevicePath \"\"" Sep 30 12:41:40 crc kubenswrapper[4672]: I0930 12:41:40.207417 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gw5f6\" (UniqueName: \"kubernetes.io/projected/34856d78-c924-4724-bbca-9f3b5ab6e9ad-kube-api-access-gw5f6\") on node \"crc\" DevicePath \"\"" Sep 30 12:41:40 crc kubenswrapper[4672]: I0930 12:41:40.244666 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34856d78-c924-4724-bbca-9f3b5ab6e9ad-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "34856d78-c924-4724-bbca-9f3b5ab6e9ad" (UID: "34856d78-c924-4724-bbca-9f3b5ab6e9ad"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:41:40 crc kubenswrapper[4672]: I0930 12:41:40.287163 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Sep 30 12:41:40 crc kubenswrapper[4672]: I0930 12:41:40.309238 4672 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34856d78-c924-4724-bbca-9f3b5ab6e9ad-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 12:41:40 crc kubenswrapper[4672]: I0930 12:41:40.323641 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34856d78-c924-4724-bbca-9f3b5ab6e9ad-config-data" (OuterVolumeSpecName: "config-data") pod "34856d78-c924-4724-bbca-9f3b5ab6e9ad" (UID: "34856d78-c924-4724-bbca-9f3b5ab6e9ad"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:41:40 crc kubenswrapper[4672]: I0930 12:41:40.411990 4672 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34856d78-c924-4724-bbca-9f3b5ab6e9ad-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 12:41:40 crc kubenswrapper[4672]: I0930 12:41:40.992114 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2a6d11ec-a673-45d7-8a3a-7fc3eb8dbdf3","Type":"ContainerStarted","Data":"1bf64439775babb1b04baecb3667d7358f97f975671d9212190a30d1fc9e07ac"} Sep 30 12:41:40 crc kubenswrapper[4672]: I0930 12:41:40.992442 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2a6d11ec-a673-45d7-8a3a-7fc3eb8dbdf3","Type":"ContainerStarted","Data":"97a529e8b7a34310e10629cb6cdcafe9b253fe984b71b05a4d8566c0334f229b"} Sep 30 12:41:40 crc kubenswrapper[4672]: I0930 12:41:40.992453 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2a6d11ec-a673-45d7-8a3a-7fc3eb8dbdf3","Type":"ContainerStarted","Data":"a1980569a1c5f6f6be142e28dbd7c08e43267356f377cc4c50dba2c2d8d40060"} Sep 30 12:41:40 crc kubenswrapper[4672]: I0930 12:41:40.992155 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Sep 30 12:41:41 crc kubenswrapper[4672]: I0930 12:41:41.053436 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.053414108 podStartE2EDuration="2.053414108s" podCreationTimestamp="2025-09-30 12:41:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 12:41:41.015981223 +0000 UTC m=+1192.285218889" watchObservedRunningTime="2025-09-30 12:41:41.053414108 +0000 UTC m=+1192.322651774" Sep 30 12:41:41 crc kubenswrapper[4672]: I0930 12:41:41.136403 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Sep 30 12:41:41 crc kubenswrapper[4672]: I0930 12:41:41.146539 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Sep 30 12:41:41 crc kubenswrapper[4672]: I0930 12:41:41.156405 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Sep 30 12:41:41 crc kubenswrapper[4672]: E0930 12:41:41.156953 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34856d78-c924-4724-bbca-9f3b5ab6e9ad" containerName="ceilometer-central-agent" Sep 30 12:41:41 crc kubenswrapper[4672]: I0930 12:41:41.156976 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="34856d78-c924-4724-bbca-9f3b5ab6e9ad" containerName="ceilometer-central-agent" Sep 30 12:41:41 crc kubenswrapper[4672]: E0930 12:41:41.157016 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34856d78-c924-4724-bbca-9f3b5ab6e9ad" containerName="ceilometer-notification-agent" Sep 30 12:41:41 crc kubenswrapper[4672]: I0930 12:41:41.157024 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="34856d78-c924-4724-bbca-9f3b5ab6e9ad" containerName="ceilometer-notification-agent" Sep 30 12:41:41 crc kubenswrapper[4672]: E0930 12:41:41.157040 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34856d78-c924-4724-bbca-9f3b5ab6e9ad" containerName="proxy-httpd" Sep 30 12:41:41 crc kubenswrapper[4672]: I0930 12:41:41.157048 4672 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="34856d78-c924-4724-bbca-9f3b5ab6e9ad" containerName="proxy-httpd" Sep 30 12:41:41 crc kubenswrapper[4672]: E0930 12:41:41.157065 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34856d78-c924-4724-bbca-9f3b5ab6e9ad" containerName="sg-core" Sep 30 12:41:41 crc kubenswrapper[4672]: I0930 12:41:41.157074 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="34856d78-c924-4724-bbca-9f3b5ab6e9ad" containerName="sg-core" Sep 30 12:41:41 crc kubenswrapper[4672]: I0930 12:41:41.157334 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="34856d78-c924-4724-bbca-9f3b5ab6e9ad" containerName="proxy-httpd" Sep 30 12:41:41 crc kubenswrapper[4672]: I0930 12:41:41.157364 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="34856d78-c924-4724-bbca-9f3b5ab6e9ad" containerName="ceilometer-central-agent" Sep 30 12:41:41 crc kubenswrapper[4672]: I0930 12:41:41.157385 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="34856d78-c924-4724-bbca-9f3b5ab6e9ad" containerName="sg-core" Sep 30 12:41:41 crc kubenswrapper[4672]: I0930 12:41:41.157402 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="34856d78-c924-4724-bbca-9f3b5ab6e9ad" containerName="ceilometer-notification-agent" Sep 30 12:41:41 crc kubenswrapper[4672]: I0930 12:41:41.160061 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Sep 30 12:41:41 crc kubenswrapper[4672]: I0930 12:41:41.162653 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Sep 30 12:41:41 crc kubenswrapper[4672]: I0930 12:41:41.163070 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Sep 30 12:41:41 crc kubenswrapper[4672]: I0930 12:41:41.167165 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Sep 30 12:41:41 crc kubenswrapper[4672]: I0930 12:41:41.234152 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9fb1a180-8bbb-41ac-a730-9a491f508d81-log-httpd\") pod \"ceilometer-0\" (UID: \"9fb1a180-8bbb-41ac-a730-9a491f508d81\") " pod="openstack/ceilometer-0" Sep 30 12:41:41 crc kubenswrapper[4672]: I0930 12:41:41.234245 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9fb1a180-8bbb-41ac-a730-9a491f508d81-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9fb1a180-8bbb-41ac-a730-9a491f508d81\") " pod="openstack/ceilometer-0" Sep 30 12:41:41 crc kubenswrapper[4672]: I0930 12:41:41.234298 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9fb1a180-8bbb-41ac-a730-9a491f508d81-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9fb1a180-8bbb-41ac-a730-9a491f508d81\") " pod="openstack/ceilometer-0" Sep 30 12:41:41 crc kubenswrapper[4672]: I0930 12:41:41.234345 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f9x2t\" (UniqueName: \"kubernetes.io/projected/9fb1a180-8bbb-41ac-a730-9a491f508d81-kube-api-access-f9x2t\") pod \"ceilometer-0\" (UID: \"9fb1a180-8bbb-41ac-a730-9a491f508d81\") " pod="openstack/ceilometer-0" Sep 30 12:41:41 crc kubenswrapper[4672]: I0930 12:41:41.234394 4672 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9fb1a180-8bbb-41ac-a730-9a491f508d81-run-httpd\") pod \"ceilometer-0\" (UID: \"9fb1a180-8bbb-41ac-a730-9a491f508d81\") " pod="openstack/ceilometer-0" Sep 30 12:41:41 crc kubenswrapper[4672]: I0930 12:41:41.234420 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9fb1a180-8bbb-41ac-a730-9a491f508d81-scripts\") pod \"ceilometer-0\" (UID: \"9fb1a180-8bbb-41ac-a730-9a491f508d81\") " pod="openstack/ceilometer-0" Sep 30 12:41:41 crc kubenswrapper[4672]: I0930 12:41:41.234442 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9fb1a180-8bbb-41ac-a730-9a491f508d81-config-data\") pod \"ceilometer-0\" (UID: \"9fb1a180-8bbb-41ac-a730-9a491f508d81\") " pod="openstack/ceilometer-0" Sep 30 12:41:41 crc kubenswrapper[4672]: I0930 12:41:41.336212 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9fb1a180-8bbb-41ac-a730-9a491f508d81-log-httpd\") pod \"ceilometer-0\" (UID: \"9fb1a180-8bbb-41ac-a730-9a491f508d81\") " pod="openstack/ceilometer-0" Sep 30 12:41:41 crc kubenswrapper[4672]: I0930 12:41:41.336568 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9fb1a180-8bbb-41ac-a730-9a491f508d81-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9fb1a180-8bbb-41ac-a730-9a491f508d81\") " pod="openstack/ceilometer-0" Sep 30 12:41:41 crc kubenswrapper[4672]: I0930 12:41:41.336593 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9fb1a180-8bbb-41ac-a730-9a491f508d81-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9fb1a180-8bbb-41ac-a730-9a491f508d81\") " pod="openstack/ceilometer-0" Sep 30 12:41:41 crc kubenswrapper[4672]: I0930 12:41:41.336639 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f9x2t\" (UniqueName: \"kubernetes.io/projected/9fb1a180-8bbb-41ac-a730-9a491f508d81-kube-api-access-f9x2t\") pod \"ceilometer-0\" (UID: \"9fb1a180-8bbb-41ac-a730-9a491f508d81\") " pod="openstack/ceilometer-0" Sep 30 12:41:41 crc kubenswrapper[4672]: I0930 12:41:41.336684 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9fb1a180-8bbb-41ac-a730-9a491f508d81-run-httpd\") pod \"ceilometer-0\" (UID: \"9fb1a180-8bbb-41ac-a730-9a491f508d81\") " pod="openstack/ceilometer-0" Sep 30 12:41:41 crc kubenswrapper[4672]: I0930 12:41:41.336705 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9fb1a180-8bbb-41ac-a730-9a491f508d81-scripts\") pod \"ceilometer-0\" (UID: \"9fb1a180-8bbb-41ac-a730-9a491f508d81\") " pod="openstack/ceilometer-0" Sep 30 12:41:41 crc kubenswrapper[4672]: I0930 12:41:41.336708 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9fb1a180-8bbb-41ac-a730-9a491f508d81-log-httpd\") pod \"ceilometer-0\" (UID: \"9fb1a180-8bbb-41ac-a730-9a491f508d81\") " pod="openstack/ceilometer-0" Sep 30 12:41:41 crc kubenswrapper[4672]: I0930 12:41:41.336723 4672 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9fb1a180-8bbb-41ac-a730-9a491f508d81-config-data\") pod \"ceilometer-0\" (UID: \"9fb1a180-8bbb-41ac-a730-9a491f508d81\") " pod="openstack/ceilometer-0" Sep 30 12:41:41 crc kubenswrapper[4672]: I0930 12:41:41.337636 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9fb1a180-8bbb-41ac-a730-9a491f508d81-run-httpd\") pod \"ceilometer-0\" (UID: \"9fb1a180-8bbb-41ac-a730-9a491f508d81\") " pod="openstack/ceilometer-0" Sep 30 12:41:41 crc kubenswrapper[4672]: I0930 12:41:41.342065 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9fb1a180-8bbb-41ac-a730-9a491f508d81-scripts\") pod \"ceilometer-0\" (UID: \"9fb1a180-8bbb-41ac-a730-9a491f508d81\") " pod="openstack/ceilometer-0" Sep 30 12:41:41 crc kubenswrapper[4672]: I0930 12:41:41.342173 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9fb1a180-8bbb-41ac-a730-9a491f508d81-config-data\") pod \"ceilometer-0\" (UID: \"9fb1a180-8bbb-41ac-a730-9a491f508d81\") " pod="openstack/ceilometer-0" Sep 30 12:41:41 crc kubenswrapper[4672]: I0930 12:41:41.342932 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9fb1a180-8bbb-41ac-a730-9a491f508d81-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9fb1a180-8bbb-41ac-a730-9a491f508d81\") " pod="openstack/ceilometer-0" Sep 30 12:41:41 crc kubenswrapper[4672]: I0930 12:41:41.343547 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9fb1a180-8bbb-41ac-a730-9a491f508d81-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9fb1a180-8bbb-41ac-a730-9a491f508d81\") " pod="openstack/ceilometer-0" Sep 30 12:41:41 crc kubenswrapper[4672]: I0930 12:41:41.362209 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f9x2t\" (UniqueName: \"kubernetes.io/projected/9fb1a180-8bbb-41ac-a730-9a491f508d81-kube-api-access-f9x2t\") pod \"ceilometer-0\" (UID: \"9fb1a180-8bbb-41ac-a730-9a491f508d81\") " pod="openstack/ceilometer-0" Sep 30 12:41:41 crc kubenswrapper[4672]: I0930 12:41:41.431724 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="34856d78-c924-4724-bbca-9f3b5ab6e9ad" path="/var/lib/kubelet/pods/34856d78-c924-4724-bbca-9f3b5ab6e9ad/volumes" Sep 30 12:41:41 crc kubenswrapper[4672]: I0930 12:41:41.481485 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Sep 30 12:41:41 crc kubenswrapper[4672]: I0930 12:41:41.954355 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Sep 30 12:41:42 crc kubenswrapper[4672]: I0930 12:41:42.002161 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9fb1a180-8bbb-41ac-a730-9a491f508d81","Type":"ContainerStarted","Data":"37c84de619db694733ae4f4ae4ec2288c13e3a865fb0ec2e8fa29dc4feb3b364"} Sep 30 12:41:42 crc kubenswrapper[4672]: I0930 12:41:42.099448 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Sep 30 12:41:42 crc kubenswrapper[4672]: I0930 12:41:42.099519 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Sep 30 12:41:42 crc kubenswrapper[4672]: I0930 12:41:42.422806 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Sep 30 12:41:42 crc kubenswrapper[4672]: I0930 12:41:42.461164 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Sep 30 12:41:42 crc kubenswrapper[4672]: I0930 12:41:42.461217 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Sep 30 12:41:42 crc kubenswrapper[4672]: I0930 12:41:42.504794 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Sep 30 12:41:42 crc kubenswrapper[4672]: I0930 12:41:42.520409 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-849fbb457f-6lw7x" Sep 30 12:41:42 crc kubenswrapper[4672]: I0930 12:41:42.592601 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7cd88f5d9f-4b79l"] Sep 30 12:41:42 crc kubenswrapper[4672]: I0930 12:41:42.593069 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7cd88f5d9f-4b79l" podUID="794ccc31-97c8-40b7-b6ab-cc0d3a9946d7" containerName="dnsmasq-dns" containerID="cri-o://16ae50728b02ce642ba9cc52fa14ff05e2bf4e7d5c66837e67a183bf7e1ae30e" gracePeriod=10 Sep 30 12:41:43 crc kubenswrapper[4672]: I0930 12:41:43.021462 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9fb1a180-8bbb-41ac-a730-9a491f508d81","Type":"ContainerStarted","Data":"5bf9b830470b30a39480da91d0973c1d82601b9757e3322063d56cf01581c295"} Sep 30 12:41:43 crc kubenswrapper[4672]: I0930 12:41:43.022137 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9fb1a180-8bbb-41ac-a730-9a491f508d81","Type":"ContainerStarted","Data":"cdaf9886c5d1c9ffbfcfdf42d291c7908cf697c5dce1f93ccdd7c84f75629d52"} Sep 30 12:41:43 crc kubenswrapper[4672]: I0930 12:41:43.024996 4672 generic.go:334] "Generic (PLEG): container finished" podID="645ec6e3-a10c-4651-9df2-a8259bcd51b9" containerID="31203d1a1f3ea529a7e5cf481e28fe0ad9c2807c8193e97f31a80a9df1538278" exitCode=0 Sep 30 12:41:43 crc kubenswrapper[4672]: I0930 12:41:43.025099 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-zmhvt" event={"ID":"645ec6e3-a10c-4651-9df2-a8259bcd51b9","Type":"ContainerDied","Data":"31203d1a1f3ea529a7e5cf481e28fe0ad9c2807c8193e97f31a80a9df1538278"} Sep 30 12:41:43 crc kubenswrapper[4672]: I0930 12:41:43.034003 4672 generic.go:334] "Generic (PLEG): container finished" podID="794ccc31-97c8-40b7-b6ab-cc0d3a9946d7" 
containerID="16ae50728b02ce642ba9cc52fa14ff05e2bf4e7d5c66837e67a183bf7e1ae30e" exitCode=0 Sep 30 12:41:43 crc kubenswrapper[4672]: I0930 12:41:43.034513 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7cd88f5d9f-4b79l" event={"ID":"794ccc31-97c8-40b7-b6ab-cc0d3a9946d7","Type":"ContainerDied","Data":"16ae50728b02ce642ba9cc52fa14ff05e2bf4e7d5c66837e67a183bf7e1ae30e"} Sep 30 12:41:43 crc kubenswrapper[4672]: I0930 12:41:43.120749 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Sep 30 12:41:43 crc kubenswrapper[4672]: I0930 12:41:43.182444 4672 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="e28fe302-599f-4788-a17f-78c90cfca670" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.203:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Sep 30 12:41:43 crc kubenswrapper[4672]: I0930 12:41:43.182696 4672 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="e28fe302-599f-4788-a17f-78c90cfca670" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.203:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Sep 30 12:41:43 crc kubenswrapper[4672]: I0930 12:41:43.287327 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7cd88f5d9f-4b79l" Sep 30 12:41:43 crc kubenswrapper[4672]: I0930 12:41:43.394715 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/794ccc31-97c8-40b7-b6ab-cc0d3a9946d7-dns-svc\") pod \"794ccc31-97c8-40b7-b6ab-cc0d3a9946d7\" (UID: \"794ccc31-97c8-40b7-b6ab-cc0d3a9946d7\") " Sep 30 12:41:43 crc kubenswrapper[4672]: I0930 12:41:43.394786 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/794ccc31-97c8-40b7-b6ab-cc0d3a9946d7-ovsdbserver-nb\") pod \"794ccc31-97c8-40b7-b6ab-cc0d3a9946d7\" (UID: \"794ccc31-97c8-40b7-b6ab-cc0d3a9946d7\") " Sep 30 12:41:43 crc kubenswrapper[4672]: I0930 12:41:43.394814 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/794ccc31-97c8-40b7-b6ab-cc0d3a9946d7-ovsdbserver-sb\") pod \"794ccc31-97c8-40b7-b6ab-cc0d3a9946d7\" (UID: \"794ccc31-97c8-40b7-b6ab-cc0d3a9946d7\") " Sep 30 12:41:43 crc kubenswrapper[4672]: I0930 12:41:43.394855 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-djd8z\" (UniqueName: \"kubernetes.io/projected/794ccc31-97c8-40b7-b6ab-cc0d3a9946d7-kube-api-access-djd8z\") pod \"794ccc31-97c8-40b7-b6ab-cc0d3a9946d7\" (UID: \"794ccc31-97c8-40b7-b6ab-cc0d3a9946d7\") " Sep 30 12:41:43 crc kubenswrapper[4672]: I0930 12:41:43.394953 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/794ccc31-97c8-40b7-b6ab-cc0d3a9946d7-config\") pod \"794ccc31-97c8-40b7-b6ab-cc0d3a9946d7\" (UID: \"794ccc31-97c8-40b7-b6ab-cc0d3a9946d7\") " Sep 30 12:41:43 crc kubenswrapper[4672]: I0930 12:41:43.395126 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/794ccc31-97c8-40b7-b6ab-cc0d3a9946d7-dns-swift-storage-0\") pod \"794ccc31-97c8-40b7-b6ab-cc0d3a9946d7\" 
(UID: \"794ccc31-97c8-40b7-b6ab-cc0d3a9946d7\") " Sep 30 12:41:43 crc kubenswrapper[4672]: I0930 12:41:43.401471 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/794ccc31-97c8-40b7-b6ab-cc0d3a9946d7-kube-api-access-djd8z" (OuterVolumeSpecName: "kube-api-access-djd8z") pod "794ccc31-97c8-40b7-b6ab-cc0d3a9946d7" (UID: "794ccc31-97c8-40b7-b6ab-cc0d3a9946d7"). InnerVolumeSpecName "kube-api-access-djd8z". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 12:41:43 crc kubenswrapper[4672]: I0930 12:41:43.469130 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/794ccc31-97c8-40b7-b6ab-cc0d3a9946d7-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "794ccc31-97c8-40b7-b6ab-cc0d3a9946d7" (UID: "794ccc31-97c8-40b7-b6ab-cc0d3a9946d7"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 12:41:43 crc kubenswrapper[4672]: I0930 12:41:43.473645 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/794ccc31-97c8-40b7-b6ab-cc0d3a9946d7-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "794ccc31-97c8-40b7-b6ab-cc0d3a9946d7" (UID: "794ccc31-97c8-40b7-b6ab-cc0d3a9946d7"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 12:41:43 crc kubenswrapper[4672]: I0930 12:41:43.476863 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/794ccc31-97c8-40b7-b6ab-cc0d3a9946d7-config" (OuterVolumeSpecName: "config") pod "794ccc31-97c8-40b7-b6ab-cc0d3a9946d7" (UID: "794ccc31-97c8-40b7-b6ab-cc0d3a9946d7"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 12:41:43 crc kubenswrapper[4672]: I0930 12:41:43.482881 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/794ccc31-97c8-40b7-b6ab-cc0d3a9946d7-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "794ccc31-97c8-40b7-b6ab-cc0d3a9946d7" (UID: "794ccc31-97c8-40b7-b6ab-cc0d3a9946d7"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 12:41:43 crc kubenswrapper[4672]: I0930 12:41:43.490113 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/794ccc31-97c8-40b7-b6ab-cc0d3a9946d7-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "794ccc31-97c8-40b7-b6ab-cc0d3a9946d7" (UID: "794ccc31-97c8-40b7-b6ab-cc0d3a9946d7"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 12:41:43 crc kubenswrapper[4672]: I0930 12:41:43.497523 4672 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/794ccc31-97c8-40b7-b6ab-cc0d3a9946d7-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Sep 30 12:41:43 crc kubenswrapper[4672]: I0930 12:41:43.497554 4672 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/794ccc31-97c8-40b7-b6ab-cc0d3a9946d7-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 30 12:41:43 crc kubenswrapper[4672]: I0930 12:41:43.497564 4672 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/794ccc31-97c8-40b7-b6ab-cc0d3a9946d7-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Sep 30 12:41:43 crc kubenswrapper[4672]: I0930 12:41:43.497574 4672 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/794ccc31-97c8-40b7-b6ab-cc0d3a9946d7-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Sep 30 12:41:43 crc kubenswrapper[4672]: I0930 12:41:43.497583 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-djd8z\" (UniqueName: \"kubernetes.io/projected/794ccc31-97c8-40b7-b6ab-cc0d3a9946d7-kube-api-access-djd8z\") on node \"crc\" DevicePath \"\"" Sep 30 12:41:43 crc kubenswrapper[4672]: I0930 12:41:43.497593 4672 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/794ccc31-97c8-40b7-b6ab-cc0d3a9946d7-config\") on node \"crc\" DevicePath \"\"" Sep 30 12:41:44 crc kubenswrapper[4672]: I0930 12:41:44.083997 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9fb1a180-8bbb-41ac-a730-9a491f508d81","Type":"ContainerStarted","Data":"7795aad7551b72ab414af3bca9c6937329b40c1631a005a6db340d097567bbeb"} Sep 30 12:41:44 crc kubenswrapper[4672]: I0930 12:41:44.092769 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7cd88f5d9f-4b79l" event={"ID":"794ccc31-97c8-40b7-b6ab-cc0d3a9946d7","Type":"ContainerDied","Data":"0c619d2604d71f142a2c83ca5f492252a7b5169f65add151279214d7ee4fe269"} Sep 30 12:41:44 crc kubenswrapper[4672]: I0930 12:41:44.093013 4672 scope.go:117] "RemoveContainer" containerID="16ae50728b02ce642ba9cc52fa14ff05e2bf4e7d5c66837e67a183bf7e1ae30e" Sep 30 12:41:44 crc kubenswrapper[4672]: I0930 12:41:44.092814 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7cd88f5d9f-4b79l" Sep 30 12:41:44 crc kubenswrapper[4672]: I0930 12:41:44.153058 4672 scope.go:117] "RemoveContainer" containerID="c046827976c63bd11dc57bea69df9b0a26a7ec36ffc84fc6ef05711ae83c32d3" Sep 30 12:41:44 crc kubenswrapper[4672]: I0930 12:41:44.154002 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7cd88f5d9f-4b79l"] Sep 30 12:41:44 crc kubenswrapper[4672]: I0930 12:41:44.163663 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7cd88f5d9f-4b79l"] Sep 30 12:41:44 crc kubenswrapper[4672]: I0930 12:41:44.599542 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-zmhvt" Sep 30 12:41:44 crc kubenswrapper[4672]: I0930 12:41:44.616754 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Sep 30 12:41:44 crc kubenswrapper[4672]: I0930 12:41:44.616806 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Sep 30 12:41:44 crc kubenswrapper[4672]: I0930 12:41:44.735120 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/645ec6e3-a10c-4651-9df2-a8259bcd51b9-config-data\") pod \"645ec6e3-a10c-4651-9df2-a8259bcd51b9\" (UID: \"645ec6e3-a10c-4651-9df2-a8259bcd51b9\") " Sep 30 12:41:44 crc kubenswrapper[4672]: I0930 12:41:44.735242 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/645ec6e3-a10c-4651-9df2-a8259bcd51b9-combined-ca-bundle\") pod \"645ec6e3-a10c-4651-9df2-a8259bcd51b9\" (UID: \"645ec6e3-a10c-4651-9df2-a8259bcd51b9\") " Sep 30 12:41:44 crc kubenswrapper[4672]: I0930 12:41:44.735371 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-45vhb\" (UniqueName: \"kubernetes.io/projected/645ec6e3-a10c-4651-9df2-a8259bcd51b9-kube-api-access-45vhb\") pod \"645ec6e3-a10c-4651-9df2-a8259bcd51b9\" (UID: \"645ec6e3-a10c-4651-9df2-a8259bcd51b9\") " Sep 30 12:41:44 crc kubenswrapper[4672]: I0930 12:41:44.735411 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/645ec6e3-a10c-4651-9df2-a8259bcd51b9-scripts\") pod \"645ec6e3-a10c-4651-9df2-a8259bcd51b9\" (UID: \"645ec6e3-a10c-4651-9df2-a8259bcd51b9\") " Sep 30 12:41:44 crc kubenswrapper[4672]: I0930 12:41:44.741156 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/645ec6e3-a10c-4651-9df2-a8259bcd51b9-kube-api-access-45vhb" (OuterVolumeSpecName: "kube-api-access-45vhb") pod "645ec6e3-a10c-4651-9df2-a8259bcd51b9" (UID: "645ec6e3-a10c-4651-9df2-a8259bcd51b9"). InnerVolumeSpecName "kube-api-access-45vhb". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 12:41:44 crc kubenswrapper[4672]: I0930 12:41:44.745101 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/645ec6e3-a10c-4651-9df2-a8259bcd51b9-scripts" (OuterVolumeSpecName: "scripts") pod "645ec6e3-a10c-4651-9df2-a8259bcd51b9" (UID: "645ec6e3-a10c-4651-9df2-a8259bcd51b9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:41:44 crc kubenswrapper[4672]: I0930 12:41:44.767387 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/645ec6e3-a10c-4651-9df2-a8259bcd51b9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "645ec6e3-a10c-4651-9df2-a8259bcd51b9" (UID: "645ec6e3-a10c-4651-9df2-a8259bcd51b9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:41:44 crc kubenswrapper[4672]: I0930 12:41:44.779559 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/645ec6e3-a10c-4651-9df2-a8259bcd51b9-config-data" (OuterVolumeSpecName: "config-data") pod "645ec6e3-a10c-4651-9df2-a8259bcd51b9" (UID: "645ec6e3-a10c-4651-9df2-a8259bcd51b9"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:41:44 crc kubenswrapper[4672]: I0930 12:41:44.838353 4672 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/645ec6e3-a10c-4651-9df2-a8259bcd51b9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 12:41:44 crc kubenswrapper[4672]: I0930 12:41:44.838395 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-45vhb\" (UniqueName: \"kubernetes.io/projected/645ec6e3-a10c-4651-9df2-a8259bcd51b9-kube-api-access-45vhb\") on node \"crc\" DevicePath \"\"" Sep 30 12:41:44 crc kubenswrapper[4672]: I0930 12:41:44.838414 4672 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/645ec6e3-a10c-4651-9df2-a8259bcd51b9-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 12:41:44 crc kubenswrapper[4672]: I0930 12:41:44.838426 4672 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/645ec6e3-a10c-4651-9df2-a8259bcd51b9-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 12:41:45 crc kubenswrapper[4672]: I0930 12:41:45.100851 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-zmhvt" event={"ID":"645ec6e3-a10c-4651-9df2-a8259bcd51b9","Type":"ContainerDied","Data":"0a50b1c9388b1ed01291d5b0835fe34f7e631c1d384c3e2468ec92e378ce7cf5"} Sep 30 12:41:45 crc kubenswrapper[4672]: I0930 12:41:45.100885 4672 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0a50b1c9388b1ed01291d5b0835fe34f7e631c1d384c3e2468ec92e378ce7cf5" Sep 30 12:41:45 crc kubenswrapper[4672]: I0930 12:41:45.100936 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-zmhvt" Sep 30 12:41:45 crc kubenswrapper[4672]: I0930 12:41:45.221958 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Sep 30 12:41:45 crc kubenswrapper[4672]: I0930 12:41:45.222252 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="e28fe302-599f-4788-a17f-78c90cfca670" containerName="nova-api-log" containerID="cri-o://7cb2155e7bc3424d896ede691e1ed23b1d989ff67288540ca0fa90e36da4f507" gracePeriod=30 Sep 30 12:41:45 crc kubenswrapper[4672]: I0930 12:41:45.222322 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="e28fe302-599f-4788-a17f-78c90cfca670" containerName="nova-api-api" containerID="cri-o://325505f1235b6cb8ebe38e2e6ec854dc0b69aece8402b899cbcfd1482667cd98" gracePeriod=30 Sep 30 12:41:45 crc kubenswrapper[4672]: I0930 12:41:45.235210 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Sep 30 12:41:45 crc kubenswrapper[4672]: I0930 12:41:45.235414 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="50605b9a-2e73-461e-9151-b47883bb9b9e" containerName="nova-scheduler-scheduler" containerID="cri-o://8be8109889e6855683ea37174d27b8b0fc61cd5a63e1d256daefc33432a460bb" gracePeriod=30 Sep 30 12:41:45 crc kubenswrapper[4672]: I0930 12:41:45.274250 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Sep 30 12:41:45 crc kubenswrapper[4672]: I0930 12:41:45.274504 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="2a6d11ec-a673-45d7-8a3a-7fc3eb8dbdf3" 
containerName="nova-metadata-log" containerID="cri-o://97a529e8b7a34310e10629cb6cdcafe9b253fe984b71b05a4d8566c0334f229b" gracePeriod=30 Sep 30 12:41:45 crc kubenswrapper[4672]: I0930 12:41:45.274736 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="2a6d11ec-a673-45d7-8a3a-7fc3eb8dbdf3" containerName="nova-metadata-metadata" containerID="cri-o://1bf64439775babb1b04baecb3667d7358f97f975671d9212190a30d1fc9e07ac" gracePeriod=30 Sep 30 12:41:45 crc kubenswrapper[4672]: I0930 12:41:45.428227 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="794ccc31-97c8-40b7-b6ab-cc0d3a9946d7" path="/var/lib/kubelet/pods/794ccc31-97c8-40b7-b6ab-cc0d3a9946d7/volumes" Sep 30 12:41:45 crc kubenswrapper[4672]: I0930 12:41:45.863361 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Sep 30 12:41:45 crc kubenswrapper[4672]: I0930 12:41:45.993581 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a6d11ec-a673-45d7-8a3a-7fc3eb8dbdf3-config-data\") pod \"2a6d11ec-a673-45d7-8a3a-7fc3eb8dbdf3\" (UID: \"2a6d11ec-a673-45d7-8a3a-7fc3eb8dbdf3\") " Sep 30 12:41:45 crc kubenswrapper[4672]: I0930 12:41:45.994036 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/2a6d11ec-a673-45d7-8a3a-7fc3eb8dbdf3-nova-metadata-tls-certs\") pod \"2a6d11ec-a673-45d7-8a3a-7fc3eb8dbdf3\" (UID: \"2a6d11ec-a673-45d7-8a3a-7fc3eb8dbdf3\") " Sep 30 12:41:45 crc kubenswrapper[4672]: I0930 12:41:45.994481 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t585n\" (UniqueName: \"kubernetes.io/projected/2a6d11ec-a673-45d7-8a3a-7fc3eb8dbdf3-kube-api-access-t585n\") pod \"2a6d11ec-a673-45d7-8a3a-7fc3eb8dbdf3\" (UID: \"2a6d11ec-a673-45d7-8a3a-7fc3eb8dbdf3\") " Sep 30 12:41:45 crc kubenswrapper[4672]: I0930 12:41:45.994799 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a6d11ec-a673-45d7-8a3a-7fc3eb8dbdf3-combined-ca-bundle\") pod \"2a6d11ec-a673-45d7-8a3a-7fc3eb8dbdf3\" (UID: \"2a6d11ec-a673-45d7-8a3a-7fc3eb8dbdf3\") " Sep 30 12:41:45 crc kubenswrapper[4672]: I0930 12:41:45.994896 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2a6d11ec-a673-45d7-8a3a-7fc3eb8dbdf3-logs\") pod \"2a6d11ec-a673-45d7-8a3a-7fc3eb8dbdf3\" (UID: \"2a6d11ec-a673-45d7-8a3a-7fc3eb8dbdf3\") " Sep 30 12:41:45 crc kubenswrapper[4672]: I0930 12:41:45.995582 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2a6d11ec-a673-45d7-8a3a-7fc3eb8dbdf3-logs" (OuterVolumeSpecName: "logs") pod "2a6d11ec-a673-45d7-8a3a-7fc3eb8dbdf3" (UID: "2a6d11ec-a673-45d7-8a3a-7fc3eb8dbdf3"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 12:41:46 crc kubenswrapper[4672]: I0930 12:41:46.004878 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a6d11ec-a673-45d7-8a3a-7fc3eb8dbdf3-kube-api-access-t585n" (OuterVolumeSpecName: "kube-api-access-t585n") pod "2a6d11ec-a673-45d7-8a3a-7fc3eb8dbdf3" (UID: "2a6d11ec-a673-45d7-8a3a-7fc3eb8dbdf3"). InnerVolumeSpecName "kube-api-access-t585n". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 12:41:46 crc kubenswrapper[4672]: I0930 12:41:46.021610 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t585n\" (UniqueName: \"kubernetes.io/projected/2a6d11ec-a673-45d7-8a3a-7fc3eb8dbdf3-kube-api-access-t585n\") on node \"crc\" DevicePath \"\"" Sep 30 12:41:46 crc kubenswrapper[4672]: I0930 12:41:46.029109 4672 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2a6d11ec-a673-45d7-8a3a-7fc3eb8dbdf3-logs\") on node \"crc\" DevicePath \"\"" Sep 30 12:41:46 crc kubenswrapper[4672]: I0930 12:41:46.029417 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a6d11ec-a673-45d7-8a3a-7fc3eb8dbdf3-config-data" (OuterVolumeSpecName: "config-data") pod "2a6d11ec-a673-45d7-8a3a-7fc3eb8dbdf3" (UID: "2a6d11ec-a673-45d7-8a3a-7fc3eb8dbdf3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:41:46 crc kubenswrapper[4672]: I0930 12:41:46.068727 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a6d11ec-a673-45d7-8a3a-7fc3eb8dbdf3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2a6d11ec-a673-45d7-8a3a-7fc3eb8dbdf3" (UID: "2a6d11ec-a673-45d7-8a3a-7fc3eb8dbdf3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:41:46 crc kubenswrapper[4672]: I0930 12:41:46.085421 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a6d11ec-a673-45d7-8a3a-7fc3eb8dbdf3-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "2a6d11ec-a673-45d7-8a3a-7fc3eb8dbdf3" (UID: "2a6d11ec-a673-45d7-8a3a-7fc3eb8dbdf3"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:41:46 crc kubenswrapper[4672]: I0930 12:41:46.122860 4672 generic.go:334] "Generic (PLEG): container finished" podID="e28fe302-599f-4788-a17f-78c90cfca670" containerID="7cb2155e7bc3424d896ede691e1ed23b1d989ff67288540ca0fa90e36da4f507" exitCode=143 Sep 30 12:41:46 crc kubenswrapper[4672]: I0930 12:41:46.122933 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e28fe302-599f-4788-a17f-78c90cfca670","Type":"ContainerDied","Data":"7cb2155e7bc3424d896ede691e1ed23b1d989ff67288540ca0fa90e36da4f507"} Sep 30 12:41:46 crc kubenswrapper[4672]: I0930 12:41:46.125860 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9fb1a180-8bbb-41ac-a730-9a491f508d81","Type":"ContainerStarted","Data":"419feab501e9fdb7ccf2ebe0ff91da151e26fdc849e6bf79bd4d66990d0843e2"} Sep 30 12:41:46 crc kubenswrapper[4672]: I0930 12:41:46.127217 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Sep 30 12:41:46 crc kubenswrapper[4672]: I0930 12:41:46.129668 4672 generic.go:334] "Generic (PLEG): container finished" podID="2a6d11ec-a673-45d7-8a3a-7fc3eb8dbdf3" containerID="1bf64439775babb1b04baecb3667d7358f97f975671d9212190a30d1fc9e07ac" exitCode=0 Sep 30 12:41:46 crc kubenswrapper[4672]: I0930 12:41:46.129892 4672 generic.go:334] "Generic (PLEG): container finished" podID="2a6d11ec-a673-45d7-8a3a-7fc3eb8dbdf3" containerID="97a529e8b7a34310e10629cb6cdcafe9b253fe984b71b05a4d8566c0334f229b" exitCode=143 Sep 30 12:41:46 crc kubenswrapper[4672]: I0930 12:41:46.129746 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2a6d11ec-a673-45d7-8a3a-7fc3eb8dbdf3","Type":"ContainerDied","Data":"1bf64439775babb1b04baecb3667d7358f97f975671d9212190a30d1fc9e07ac"} Sep 30 12:41:46 crc kubenswrapper[4672]: I0930 12:41:46.130148 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2a6d11ec-a673-45d7-8a3a-7fc3eb8dbdf3","Type":"ContainerDied","Data":"97a529e8b7a34310e10629cb6cdcafe9b253fe984b71b05a4d8566c0334f229b"} Sep 30 12:41:46 crc kubenswrapper[4672]: I0930 12:41:46.130233 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2a6d11ec-a673-45d7-8a3a-7fc3eb8dbdf3","Type":"ContainerDied","Data":"a1980569a1c5f6f6be142e28dbd7c08e43267356f377cc4c50dba2c2d8d40060"} Sep 30 12:41:46 crc kubenswrapper[4672]: I0930 12:41:46.130357 4672 scope.go:117] "RemoveContainer" containerID="1bf64439775babb1b04baecb3667d7358f97f975671d9212190a30d1fc9e07ac" Sep 30 12:41:46 crc kubenswrapper[4672]: I0930 12:41:46.129730 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Sep 30 12:41:46 crc kubenswrapper[4672]: I0930 12:41:46.143755 4672 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a6d11ec-a673-45d7-8a3a-7fc3eb8dbdf3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 12:41:46 crc kubenswrapper[4672]: I0930 12:41:46.144337 4672 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a6d11ec-a673-45d7-8a3a-7fc3eb8dbdf3-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 12:41:46 crc kubenswrapper[4672]: I0930 12:41:46.144637 4672 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/2a6d11ec-a673-45d7-8a3a-7fc3eb8dbdf3-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Sep 30 12:41:46 crc kubenswrapper[4672]: I0930 12:41:46.157862 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.567682371 podStartE2EDuration="5.157836834s" podCreationTimestamp="2025-09-30 12:41:41 +0000 UTC" firstStartedPulling="2025-09-30 12:41:41.963811964 +0000 UTC m=+1193.233049610" lastFinishedPulling="2025-09-30 12:41:45.553966427 +0000 UTC m=+1196.823204073" observedRunningTime="2025-09-30 12:41:46.148413726 +0000 UTC m=+1197.417651372" watchObservedRunningTime="2025-09-30 12:41:46.157836834 +0000 UTC m=+1197.427074480" Sep 30 12:41:46 crc kubenswrapper[4672]: I0930 12:41:46.331164 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Sep 30 12:41:46 crc kubenswrapper[4672]: I0930 12:41:46.342683 4672 scope.go:117] "RemoveContainer" containerID="97a529e8b7a34310e10629cb6cdcafe9b253fe984b71b05a4d8566c0334f229b" Sep 30 12:41:46 crc kubenswrapper[4672]: I0930 12:41:46.349042 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Sep 30 12:41:46 crc kubenswrapper[4672]: I0930 12:41:46.360672 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Sep 30 12:41:46 crc kubenswrapper[4672]: E0930 12:41:46.361107 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="794ccc31-97c8-40b7-b6ab-cc0d3a9946d7" containerName="init" Sep 30 12:41:46 crc kubenswrapper[4672]: I0930 12:41:46.361119 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="794ccc31-97c8-40b7-b6ab-cc0d3a9946d7" containerName="init" Sep 30 12:41:46 crc kubenswrapper[4672]: E0930 12:41:46.361137 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="794ccc31-97c8-40b7-b6ab-cc0d3a9946d7" containerName="dnsmasq-dns" Sep 30 12:41:46 crc kubenswrapper[4672]: I0930 12:41:46.361143 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="794ccc31-97c8-40b7-b6ab-cc0d3a9946d7" containerName="dnsmasq-dns" Sep 30 12:41:46 crc kubenswrapper[4672]: E0930 12:41:46.361155 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="645ec6e3-a10c-4651-9df2-a8259bcd51b9" containerName="nova-manage" Sep 30 12:41:46 crc kubenswrapper[4672]: I0930 12:41:46.361164 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="645ec6e3-a10c-4651-9df2-a8259bcd51b9" containerName="nova-manage" Sep 30 12:41:46 crc kubenswrapper[4672]: E0930 12:41:46.361185 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a6d11ec-a673-45d7-8a3a-7fc3eb8dbdf3" containerName="nova-metadata-log" Sep 30 12:41:46 crc kubenswrapper[4672]: I0930 12:41:46.361192 4672 
state_mem.go:107] "Deleted CPUSet assignment" podUID="2a6d11ec-a673-45d7-8a3a-7fc3eb8dbdf3" containerName="nova-metadata-log" Sep 30 12:41:46 crc kubenswrapper[4672]: E0930 12:41:46.361208 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a6d11ec-a673-45d7-8a3a-7fc3eb8dbdf3" containerName="nova-metadata-metadata" Sep 30 12:41:46 crc kubenswrapper[4672]: I0930 12:41:46.361215 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a6d11ec-a673-45d7-8a3a-7fc3eb8dbdf3" containerName="nova-metadata-metadata" Sep 30 12:41:46 crc kubenswrapper[4672]: I0930 12:41:46.361441 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a6d11ec-a673-45d7-8a3a-7fc3eb8dbdf3" containerName="nova-metadata-log" Sep 30 12:41:46 crc kubenswrapper[4672]: I0930 12:41:46.361456 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a6d11ec-a673-45d7-8a3a-7fc3eb8dbdf3" containerName="nova-metadata-metadata" Sep 30 12:41:46 crc kubenswrapper[4672]: I0930 12:41:46.361468 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="794ccc31-97c8-40b7-b6ab-cc0d3a9946d7" containerName="dnsmasq-dns" Sep 30 12:41:46 crc kubenswrapper[4672]: I0930 12:41:46.361488 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="645ec6e3-a10c-4651-9df2-a8259bcd51b9" containerName="nova-manage" Sep 30 12:41:46 crc kubenswrapper[4672]: I0930 12:41:46.362956 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Sep 30 12:41:46 crc kubenswrapper[4672]: I0930 12:41:46.368823 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Sep 30 12:41:46 crc kubenswrapper[4672]: I0930 12:41:46.369044 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Sep 30 12:41:46 crc kubenswrapper[4672]: I0930 12:41:46.372284 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Sep 30 12:41:46 crc kubenswrapper[4672]: I0930 12:41:46.385253 4672 scope.go:117] "RemoveContainer" containerID="1bf64439775babb1b04baecb3667d7358f97f975671d9212190a30d1fc9e07ac" Sep 30 12:41:46 crc kubenswrapper[4672]: E0930 12:41:46.385713 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1bf64439775babb1b04baecb3667d7358f97f975671d9212190a30d1fc9e07ac\": container with ID starting with 1bf64439775babb1b04baecb3667d7358f97f975671d9212190a30d1fc9e07ac not found: ID does not exist" containerID="1bf64439775babb1b04baecb3667d7358f97f975671d9212190a30d1fc9e07ac" Sep 30 12:41:46 crc kubenswrapper[4672]: I0930 12:41:46.385748 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1bf64439775babb1b04baecb3667d7358f97f975671d9212190a30d1fc9e07ac"} err="failed to get container status \"1bf64439775babb1b04baecb3667d7358f97f975671d9212190a30d1fc9e07ac\": rpc error: code = NotFound desc = could not find container \"1bf64439775babb1b04baecb3667d7358f97f975671d9212190a30d1fc9e07ac\": container with ID starting with 1bf64439775babb1b04baecb3667d7358f97f975671d9212190a30d1fc9e07ac not found: ID does not exist" Sep 30 12:41:46 crc kubenswrapper[4672]: I0930 12:41:46.386015 4672 scope.go:117] "RemoveContainer" containerID="97a529e8b7a34310e10629cb6cdcafe9b253fe984b71b05a4d8566c0334f229b" Sep 30 12:41:46 crc kubenswrapper[4672]: E0930 12:41:46.386257 4672 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"97a529e8b7a34310e10629cb6cdcafe9b253fe984b71b05a4d8566c0334f229b\": container with ID starting with 97a529e8b7a34310e10629cb6cdcafe9b253fe984b71b05a4d8566c0334f229b not found: ID does not exist" containerID="97a529e8b7a34310e10629cb6cdcafe9b253fe984b71b05a4d8566c0334f229b" Sep 30 12:41:46 crc kubenswrapper[4672]: I0930 12:41:46.386297 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"97a529e8b7a34310e10629cb6cdcafe9b253fe984b71b05a4d8566c0334f229b"} err="failed to get container status \"97a529e8b7a34310e10629cb6cdcafe9b253fe984b71b05a4d8566c0334f229b\": rpc error: code = NotFound desc = could not find container \"97a529e8b7a34310e10629cb6cdcafe9b253fe984b71b05a4d8566c0334f229b\": container with ID starting with 97a529e8b7a34310e10629cb6cdcafe9b253fe984b71b05a4d8566c0334f229b not found: ID does not exist" Sep 30 12:41:46 crc kubenswrapper[4672]: I0930 12:41:46.386365 4672 scope.go:117] "RemoveContainer" containerID="1bf64439775babb1b04baecb3667d7358f97f975671d9212190a30d1fc9e07ac" Sep 30 12:41:46 crc kubenswrapper[4672]: I0930 12:41:46.388159 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1bf64439775babb1b04baecb3667d7358f97f975671d9212190a30d1fc9e07ac"} err="failed to get container status \"1bf64439775babb1b04baecb3667d7358f97f975671d9212190a30d1fc9e07ac\": rpc error: code = NotFound desc = could not find container \"1bf64439775babb1b04baecb3667d7358f97f975671d9212190a30d1fc9e07ac\": container with ID starting with 1bf64439775babb1b04baecb3667d7358f97f975671d9212190a30d1fc9e07ac not found: ID does not exist" Sep 30 12:41:46 crc kubenswrapper[4672]: I0930 12:41:46.388192 4672 scope.go:117] "RemoveContainer" containerID="97a529e8b7a34310e10629cb6cdcafe9b253fe984b71b05a4d8566c0334f229b" Sep 30 12:41:46 crc kubenswrapper[4672]: I0930 12:41:46.389421 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"97a529e8b7a34310e10629cb6cdcafe9b253fe984b71b05a4d8566c0334f229b"} err="failed to get container status \"97a529e8b7a34310e10629cb6cdcafe9b253fe984b71b05a4d8566c0334f229b\": rpc error: code = NotFound desc = could not find container \"97a529e8b7a34310e10629cb6cdcafe9b253fe984b71b05a4d8566c0334f229b\": container with ID starting with 97a529e8b7a34310e10629cb6cdcafe9b253fe984b71b05a4d8566c0334f229b not found: ID does not exist" Sep 30 12:41:46 crc kubenswrapper[4672]: I0930 12:41:46.451797 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/a7d72ea6-c355-4e07-99da-88f9ff5cd342-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"a7d72ea6-c355-4e07-99da-88f9ff5cd342\") " pod="openstack/nova-metadata-0" Sep 30 12:41:46 crc kubenswrapper[4672]: I0930 12:41:46.451915 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7d72ea6-c355-4e07-99da-88f9ff5cd342-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"a7d72ea6-c355-4e07-99da-88f9ff5cd342\") " pod="openstack/nova-metadata-0" Sep 30 12:41:46 crc kubenswrapper[4672]: I0930 12:41:46.452001 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z7xvw\" (UniqueName: 
\"kubernetes.io/projected/a7d72ea6-c355-4e07-99da-88f9ff5cd342-kube-api-access-z7xvw\") pod \"nova-metadata-0\" (UID: \"a7d72ea6-c355-4e07-99da-88f9ff5cd342\") " pod="openstack/nova-metadata-0" Sep 30 12:41:46 crc kubenswrapper[4672]: I0930 12:41:46.452090 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7d72ea6-c355-4e07-99da-88f9ff5cd342-config-data\") pod \"nova-metadata-0\" (UID: \"a7d72ea6-c355-4e07-99da-88f9ff5cd342\") " pod="openstack/nova-metadata-0" Sep 30 12:41:46 crc kubenswrapper[4672]: I0930 12:41:46.452116 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a7d72ea6-c355-4e07-99da-88f9ff5cd342-logs\") pod \"nova-metadata-0\" (UID: \"a7d72ea6-c355-4e07-99da-88f9ff5cd342\") " pod="openstack/nova-metadata-0" Sep 30 12:41:46 crc kubenswrapper[4672]: I0930 12:41:46.553930 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7d72ea6-c355-4e07-99da-88f9ff5cd342-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"a7d72ea6-c355-4e07-99da-88f9ff5cd342\") " pod="openstack/nova-metadata-0" Sep 30 12:41:46 crc kubenswrapper[4672]: I0930 12:41:46.554070 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z7xvw\" (UniqueName: \"kubernetes.io/projected/a7d72ea6-c355-4e07-99da-88f9ff5cd342-kube-api-access-z7xvw\") pod \"nova-metadata-0\" (UID: \"a7d72ea6-c355-4e07-99da-88f9ff5cd342\") " pod="openstack/nova-metadata-0" Sep 30 12:41:46 crc kubenswrapper[4672]: I0930 12:41:46.554566 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7d72ea6-c355-4e07-99da-88f9ff5cd342-config-data\") pod \"nova-metadata-0\" (UID: \"a7d72ea6-c355-4e07-99da-88f9ff5cd342\") " pod="openstack/nova-metadata-0" Sep 30 12:41:46 crc kubenswrapper[4672]: I0930 12:41:46.554600 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a7d72ea6-c355-4e07-99da-88f9ff5cd342-logs\") pod \"nova-metadata-0\" (UID: \"a7d72ea6-c355-4e07-99da-88f9ff5cd342\") " pod="openstack/nova-metadata-0" Sep 30 12:41:46 crc kubenswrapper[4672]: I0930 12:41:46.554988 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/a7d72ea6-c355-4e07-99da-88f9ff5cd342-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"a7d72ea6-c355-4e07-99da-88f9ff5cd342\") " pod="openstack/nova-metadata-0" Sep 30 12:41:46 crc kubenswrapper[4672]: I0930 12:41:46.555213 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a7d72ea6-c355-4e07-99da-88f9ff5cd342-logs\") pod \"nova-metadata-0\" (UID: \"a7d72ea6-c355-4e07-99da-88f9ff5cd342\") " pod="openstack/nova-metadata-0" Sep 30 12:41:46 crc kubenswrapper[4672]: I0930 12:41:46.559219 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/a7d72ea6-c355-4e07-99da-88f9ff5cd342-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"a7d72ea6-c355-4e07-99da-88f9ff5cd342\") " pod="openstack/nova-metadata-0" Sep 30 12:41:46 crc kubenswrapper[4672]: I0930 12:41:46.559749 4672 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7d72ea6-c355-4e07-99da-88f9ff5cd342-config-data\") pod \"nova-metadata-0\" (UID: \"a7d72ea6-c355-4e07-99da-88f9ff5cd342\") " pod="openstack/nova-metadata-0" Sep 30 12:41:46 crc kubenswrapper[4672]: I0930 12:41:46.560646 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7d72ea6-c355-4e07-99da-88f9ff5cd342-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"a7d72ea6-c355-4e07-99da-88f9ff5cd342\") " pod="openstack/nova-metadata-0" Sep 30 12:41:46 crc kubenswrapper[4672]: I0930 12:41:46.576606 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Sep 30 12:41:46 crc kubenswrapper[4672]: I0930 12:41:46.580193 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z7xvw\" (UniqueName: \"kubernetes.io/projected/a7d72ea6-c355-4e07-99da-88f9ff5cd342-kube-api-access-z7xvw\") pod \"nova-metadata-0\" (UID: \"a7d72ea6-c355-4e07-99da-88f9ff5cd342\") " pod="openstack/nova-metadata-0" Sep 30 12:41:46 crc kubenswrapper[4672]: I0930 12:41:46.656944 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/50605b9a-2e73-461e-9151-b47883bb9b9e-config-data\") pod \"50605b9a-2e73-461e-9151-b47883bb9b9e\" (UID: \"50605b9a-2e73-461e-9151-b47883bb9b9e\") " Sep 30 12:41:46 crc kubenswrapper[4672]: I0930 12:41:46.656990 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50605b9a-2e73-461e-9151-b47883bb9b9e-combined-ca-bundle\") pod \"50605b9a-2e73-461e-9151-b47883bb9b9e\" (UID: \"50605b9a-2e73-461e-9151-b47883bb9b9e\") " Sep 30 12:41:46 crc kubenswrapper[4672]: I0930 12:41:46.657013 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-96mgb\" (UniqueName: \"kubernetes.io/projected/50605b9a-2e73-461e-9151-b47883bb9b9e-kube-api-access-96mgb\") pod \"50605b9a-2e73-461e-9151-b47883bb9b9e\" (UID: \"50605b9a-2e73-461e-9151-b47883bb9b9e\") " Sep 30 12:41:46 crc kubenswrapper[4672]: I0930 12:41:46.661741 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/50605b9a-2e73-461e-9151-b47883bb9b9e-kube-api-access-96mgb" (OuterVolumeSpecName: "kube-api-access-96mgb") pod "50605b9a-2e73-461e-9151-b47883bb9b9e" (UID: "50605b9a-2e73-461e-9151-b47883bb9b9e"). InnerVolumeSpecName "kube-api-access-96mgb". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 12:41:46 crc kubenswrapper[4672]: I0930 12:41:46.691729 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Sep 30 12:41:46 crc kubenswrapper[4672]: I0930 12:41:46.693361 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50605b9a-2e73-461e-9151-b47883bb9b9e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "50605b9a-2e73-461e-9151-b47883bb9b9e" (UID: "50605b9a-2e73-461e-9151-b47883bb9b9e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:41:46 crc kubenswrapper[4672]: I0930 12:41:46.707545 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50605b9a-2e73-461e-9151-b47883bb9b9e-config-data" (OuterVolumeSpecName: "config-data") pod "50605b9a-2e73-461e-9151-b47883bb9b9e" (UID: "50605b9a-2e73-461e-9151-b47883bb9b9e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:41:46 crc kubenswrapper[4672]: I0930 12:41:46.759839 4672 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/50605b9a-2e73-461e-9151-b47883bb9b9e-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 12:41:46 crc kubenswrapper[4672]: I0930 12:41:46.759872 4672 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50605b9a-2e73-461e-9151-b47883bb9b9e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 12:41:46 crc kubenswrapper[4672]: I0930 12:41:46.759899 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-96mgb\" (UniqueName: \"kubernetes.io/projected/50605b9a-2e73-461e-9151-b47883bb9b9e-kube-api-access-96mgb\") on node \"crc\" DevicePath \"\"" Sep 30 12:41:47 crc kubenswrapper[4672]: I0930 12:41:47.143617 4672 generic.go:334] "Generic (PLEG): container finished" podID="50605b9a-2e73-461e-9151-b47883bb9b9e" containerID="8be8109889e6855683ea37174d27b8b0fc61cd5a63e1d256daefc33432a460bb" exitCode=0 Sep 30 12:41:47 crc kubenswrapper[4672]: I0930 12:41:47.143687 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"50605b9a-2e73-461e-9151-b47883bb9b9e","Type":"ContainerDied","Data":"8be8109889e6855683ea37174d27b8b0fc61cd5a63e1d256daefc33432a460bb"} Sep 30 12:41:47 crc kubenswrapper[4672]: I0930 12:41:47.143717 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"50605b9a-2e73-461e-9151-b47883bb9b9e","Type":"ContainerDied","Data":"1a995aa135f0fa05fd2376eb2c4f1bc50003d4e6aafc61f74f6dcdaaf05f1e92"} Sep 30 12:41:47 crc kubenswrapper[4672]: I0930 12:41:47.143738 4672 scope.go:117] "RemoveContainer" containerID="8be8109889e6855683ea37174d27b8b0fc61cd5a63e1d256daefc33432a460bb" Sep 30 12:41:47 crc kubenswrapper[4672]: I0930 12:41:47.143881 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Sep 30 12:41:47 crc kubenswrapper[4672]: I0930 12:41:47.218972 4672 scope.go:117] "RemoveContainer" containerID="8be8109889e6855683ea37174d27b8b0fc61cd5a63e1d256daefc33432a460bb" Sep 30 12:41:47 crc kubenswrapper[4672]: E0930 12:41:47.220129 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8be8109889e6855683ea37174d27b8b0fc61cd5a63e1d256daefc33432a460bb\": container with ID starting with 8be8109889e6855683ea37174d27b8b0fc61cd5a63e1d256daefc33432a460bb not found: ID does not exist" containerID="8be8109889e6855683ea37174d27b8b0fc61cd5a63e1d256daefc33432a460bb" Sep 30 12:41:47 crc kubenswrapper[4672]: I0930 12:41:47.220249 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8be8109889e6855683ea37174d27b8b0fc61cd5a63e1d256daefc33432a460bb"} err="failed to get container status \"8be8109889e6855683ea37174d27b8b0fc61cd5a63e1d256daefc33432a460bb\": rpc error: code = NotFound desc = could not find container \"8be8109889e6855683ea37174d27b8b0fc61cd5a63e1d256daefc33432a460bb\": container with ID starting with 8be8109889e6855683ea37174d27b8b0fc61cd5a63e1d256daefc33432a460bb not found: ID does not exist" Sep 30 12:41:47 crc kubenswrapper[4672]: I0930 12:41:47.227595 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Sep 30 12:41:47 crc kubenswrapper[4672]: I0930 12:41:47.246213 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Sep 30 12:41:47 crc kubenswrapper[4672]: I0930 12:41:47.257062 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Sep 30 12:41:47 crc kubenswrapper[4672]: I0930 12:41:47.266303 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Sep 30 12:41:47 crc kubenswrapper[4672]: E0930 12:41:47.266746 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50605b9a-2e73-461e-9151-b47883bb9b9e" containerName="nova-scheduler-scheduler" Sep 30 12:41:47 crc kubenswrapper[4672]: I0930 12:41:47.266809 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="50605b9a-2e73-461e-9151-b47883bb9b9e" containerName="nova-scheduler-scheduler" Sep 30 12:41:47 crc kubenswrapper[4672]: I0930 12:41:47.267205 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="50605b9a-2e73-461e-9151-b47883bb9b9e" containerName="nova-scheduler-scheduler" Sep 30 12:41:47 crc kubenswrapper[4672]: I0930 12:41:47.270454 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Sep 30 12:41:47 crc kubenswrapper[4672]: I0930 12:41:47.274960 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Sep 30 12:41:47 crc kubenswrapper[4672]: I0930 12:41:47.277718 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Sep 30 12:41:47 crc kubenswrapper[4672]: I0930 12:41:47.378176 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fbd3d6af-41ae-43fe-8d4b-10b98894fd99-config-data\") pod \"nova-scheduler-0\" (UID: \"fbd3d6af-41ae-43fe-8d4b-10b98894fd99\") " pod="openstack/nova-scheduler-0" Sep 30 12:41:47 crc kubenswrapper[4672]: I0930 12:41:47.378225 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbd3d6af-41ae-43fe-8d4b-10b98894fd99-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"fbd3d6af-41ae-43fe-8d4b-10b98894fd99\") " pod="openstack/nova-scheduler-0" Sep 30 12:41:47 crc kubenswrapper[4672]: I0930 12:41:47.378351 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hpfbl\" (UniqueName: \"kubernetes.io/projected/fbd3d6af-41ae-43fe-8d4b-10b98894fd99-kube-api-access-hpfbl\") pod \"nova-scheduler-0\" (UID: \"fbd3d6af-41ae-43fe-8d4b-10b98894fd99\") " pod="openstack/nova-scheduler-0" Sep 30 12:41:47 crc kubenswrapper[4672]: I0930 12:41:47.430578 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2a6d11ec-a673-45d7-8a3a-7fc3eb8dbdf3" path="/var/lib/kubelet/pods/2a6d11ec-a673-45d7-8a3a-7fc3eb8dbdf3/volumes" Sep 30 12:41:47 crc kubenswrapper[4672]: I0930 12:41:47.431603 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="50605b9a-2e73-461e-9151-b47883bb9b9e" path="/var/lib/kubelet/pods/50605b9a-2e73-461e-9151-b47883bb9b9e/volumes" Sep 30 12:41:47 crc kubenswrapper[4672]: I0930 12:41:47.480130 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fbd3d6af-41ae-43fe-8d4b-10b98894fd99-config-data\") pod \"nova-scheduler-0\" (UID: \"fbd3d6af-41ae-43fe-8d4b-10b98894fd99\") " pod="openstack/nova-scheduler-0" Sep 30 12:41:47 crc kubenswrapper[4672]: I0930 12:41:47.480185 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbd3d6af-41ae-43fe-8d4b-10b98894fd99-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"fbd3d6af-41ae-43fe-8d4b-10b98894fd99\") " pod="openstack/nova-scheduler-0" Sep 30 12:41:47 crc kubenswrapper[4672]: I0930 12:41:47.480243 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hpfbl\" (UniqueName: \"kubernetes.io/projected/fbd3d6af-41ae-43fe-8d4b-10b98894fd99-kube-api-access-hpfbl\") pod \"nova-scheduler-0\" (UID: \"fbd3d6af-41ae-43fe-8d4b-10b98894fd99\") " pod="openstack/nova-scheduler-0" Sep 30 12:41:47 crc kubenswrapper[4672]: I0930 12:41:47.487277 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbd3d6af-41ae-43fe-8d4b-10b98894fd99-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"fbd3d6af-41ae-43fe-8d4b-10b98894fd99\") " pod="openstack/nova-scheduler-0" Sep 30 12:41:47 crc kubenswrapper[4672]: 
I0930 12:41:47.487322 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fbd3d6af-41ae-43fe-8d4b-10b98894fd99-config-data\") pod \"nova-scheduler-0\" (UID: \"fbd3d6af-41ae-43fe-8d4b-10b98894fd99\") " pod="openstack/nova-scheduler-0" Sep 30 12:41:47 crc kubenswrapper[4672]: I0930 12:41:47.500934 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hpfbl\" (UniqueName: \"kubernetes.io/projected/fbd3d6af-41ae-43fe-8d4b-10b98894fd99-kube-api-access-hpfbl\") pod \"nova-scheduler-0\" (UID: \"fbd3d6af-41ae-43fe-8d4b-10b98894fd99\") " pod="openstack/nova-scheduler-0" Sep 30 12:41:47 crc kubenswrapper[4672]: I0930 12:41:47.597081 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Sep 30 12:41:48 crc kubenswrapper[4672]: I0930 12:41:48.098250 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Sep 30 12:41:48 crc kubenswrapper[4672]: I0930 12:41:48.170374 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a7d72ea6-c355-4e07-99da-88f9ff5cd342","Type":"ContainerStarted","Data":"c5f69a7f98cb6207d4cf5bbdf763dc1203c2b91cbd71fe07f3f081680a77212d"} Sep 30 12:41:48 crc kubenswrapper[4672]: I0930 12:41:48.170421 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a7d72ea6-c355-4e07-99da-88f9ff5cd342","Type":"ContainerStarted","Data":"cf2e758e9698a4f192d2d69821cf139f155e8e4ddf7b3aa543cb91b4a1e7823a"} Sep 30 12:41:48 crc kubenswrapper[4672]: I0930 12:41:48.170432 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a7d72ea6-c355-4e07-99da-88f9ff5cd342","Type":"ContainerStarted","Data":"f3d73a1dbf607e16d377f77f9d48876eb71e451c9871e6779a271f8b5bc30b27"} Sep 30 12:41:48 crc kubenswrapper[4672]: I0930 12:41:48.172766 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"fbd3d6af-41ae-43fe-8d4b-10b98894fd99","Type":"ContainerStarted","Data":"d8a0480183553602603bb389a71d555b9cd94219c1a1197740f19834edd4203e"} Sep 30 12:41:48 crc kubenswrapper[4672]: I0930 12:41:48.173947 4672 generic.go:334] "Generic (PLEG): container finished" podID="9858e9f6-3e7a-48e8-8557-eabc1ccfada4" containerID="8870f5bddbde13fa494016a4544314123d701c79bf980e4762681b821dc4225b" exitCode=0 Sep 30 12:41:48 crc kubenswrapper[4672]: I0930 12:41:48.174867 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-8hmsx" event={"ID":"9858e9f6-3e7a-48e8-8557-eabc1ccfada4","Type":"ContainerDied","Data":"8870f5bddbde13fa494016a4544314123d701c79bf980e4762681b821dc4225b"} Sep 30 12:41:48 crc kubenswrapper[4672]: I0930 12:41:48.204398 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.204384474 podStartE2EDuration="2.204384474s" podCreationTimestamp="2025-09-30 12:41:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 12:41:48.198100625 +0000 UTC m=+1199.467338291" watchObservedRunningTime="2025-09-30 12:41:48.204384474 +0000 UTC m=+1199.473622120" Sep 30 12:41:49 crc kubenswrapper[4672]: I0930 12:41:49.190297 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" 
event={"ID":"fbd3d6af-41ae-43fe-8d4b-10b98894fd99","Type":"ContainerStarted","Data":"3433eecdf433182af87a890ac552ffe7464a6c021d8b9e26a8efc58d1a7578fc"} Sep 30 12:41:49 crc kubenswrapper[4672]: I0930 12:41:49.208227 4672 generic.go:334] "Generic (PLEG): container finished" podID="e28fe302-599f-4788-a17f-78c90cfca670" containerID="325505f1235b6cb8ebe38e2e6ec854dc0b69aece8402b899cbcfd1482667cd98" exitCode=0 Sep 30 12:41:49 crc kubenswrapper[4672]: I0930 12:41:49.209760 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e28fe302-599f-4788-a17f-78c90cfca670","Type":"ContainerDied","Data":"325505f1235b6cb8ebe38e2e6ec854dc0b69aece8402b899cbcfd1482667cd98"} Sep 30 12:41:49 crc kubenswrapper[4672]: I0930 12:41:49.226132 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.226107211 podStartE2EDuration="2.226107211s" podCreationTimestamp="2025-09-30 12:41:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 12:41:49.205236694 +0000 UTC m=+1200.474474380" watchObservedRunningTime="2025-09-30 12:41:49.226107211 +0000 UTC m=+1200.495344897" Sep 30 12:41:49 crc kubenswrapper[4672]: I0930 12:41:49.569605 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Sep 30 12:41:49 crc kubenswrapper[4672]: I0930 12:41:49.629583 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e28fe302-599f-4788-a17f-78c90cfca670-logs\") pod \"e28fe302-599f-4788-a17f-78c90cfca670\" (UID: \"e28fe302-599f-4788-a17f-78c90cfca670\") " Sep 30 12:41:49 crc kubenswrapper[4672]: I0930 12:41:49.629705 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6btkf\" (UniqueName: \"kubernetes.io/projected/e28fe302-599f-4788-a17f-78c90cfca670-kube-api-access-6btkf\") pod \"e28fe302-599f-4788-a17f-78c90cfca670\" (UID: \"e28fe302-599f-4788-a17f-78c90cfca670\") " Sep 30 12:41:49 crc kubenswrapper[4672]: I0930 12:41:49.629811 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e28fe302-599f-4788-a17f-78c90cfca670-config-data\") pod \"e28fe302-599f-4788-a17f-78c90cfca670\" (UID: \"e28fe302-599f-4788-a17f-78c90cfca670\") " Sep 30 12:41:49 crc kubenswrapper[4672]: I0930 12:41:49.629907 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e28fe302-599f-4788-a17f-78c90cfca670-combined-ca-bundle\") pod \"e28fe302-599f-4788-a17f-78c90cfca670\" (UID: \"e28fe302-599f-4788-a17f-78c90cfca670\") " Sep 30 12:41:49 crc kubenswrapper[4672]: I0930 12:41:49.635906 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e28fe302-599f-4788-a17f-78c90cfca670-logs" (OuterVolumeSpecName: "logs") pod "e28fe302-599f-4788-a17f-78c90cfca670" (UID: "e28fe302-599f-4788-a17f-78c90cfca670"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 12:41:49 crc kubenswrapper[4672]: I0930 12:41:49.645753 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e28fe302-599f-4788-a17f-78c90cfca670-kube-api-access-6btkf" (OuterVolumeSpecName: "kube-api-access-6btkf") pod "e28fe302-599f-4788-a17f-78c90cfca670" (UID: "e28fe302-599f-4788-a17f-78c90cfca670"). InnerVolumeSpecName "kube-api-access-6btkf". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 12:41:49 crc kubenswrapper[4672]: I0930 12:41:49.669399 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e28fe302-599f-4788-a17f-78c90cfca670-config-data" (OuterVolumeSpecName: "config-data") pod "e28fe302-599f-4788-a17f-78c90cfca670" (UID: "e28fe302-599f-4788-a17f-78c90cfca670"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:41:49 crc kubenswrapper[4672]: I0930 12:41:49.686594 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e28fe302-599f-4788-a17f-78c90cfca670-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e28fe302-599f-4788-a17f-78c90cfca670" (UID: "e28fe302-599f-4788-a17f-78c90cfca670"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:41:49 crc kubenswrapper[4672]: I0930 12:41:49.723129 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-8hmsx" Sep 30 12:41:49 crc kubenswrapper[4672]: I0930 12:41:49.732812 4672 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e28fe302-599f-4788-a17f-78c90cfca670-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 12:41:49 crc kubenswrapper[4672]: I0930 12:41:49.732853 4672 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e28fe302-599f-4788-a17f-78c90cfca670-logs\") on node \"crc\" DevicePath \"\"" Sep 30 12:41:49 crc kubenswrapper[4672]: I0930 12:41:49.732866 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6btkf\" (UniqueName: \"kubernetes.io/projected/e28fe302-599f-4788-a17f-78c90cfca670-kube-api-access-6btkf\") on node \"crc\" DevicePath \"\"" Sep 30 12:41:49 crc kubenswrapper[4672]: I0930 12:41:49.732880 4672 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e28fe302-599f-4788-a17f-78c90cfca670-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 12:41:49 crc kubenswrapper[4672]: I0930 12:41:49.834436 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9858e9f6-3e7a-48e8-8557-eabc1ccfada4-combined-ca-bundle\") pod \"9858e9f6-3e7a-48e8-8557-eabc1ccfada4\" (UID: \"9858e9f6-3e7a-48e8-8557-eabc1ccfada4\") " Sep 30 12:41:49 crc kubenswrapper[4672]: I0930 12:41:49.834596 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9858e9f6-3e7a-48e8-8557-eabc1ccfada4-config-data\") pod \"9858e9f6-3e7a-48e8-8557-eabc1ccfada4\" (UID: \"9858e9f6-3e7a-48e8-8557-eabc1ccfada4\") " Sep 30 12:41:49 crc kubenswrapper[4672]: I0930 12:41:49.834671 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dxnxj\" (UniqueName: 
\"kubernetes.io/projected/9858e9f6-3e7a-48e8-8557-eabc1ccfada4-kube-api-access-dxnxj\") pod \"9858e9f6-3e7a-48e8-8557-eabc1ccfada4\" (UID: \"9858e9f6-3e7a-48e8-8557-eabc1ccfada4\") " Sep 30 12:41:49 crc kubenswrapper[4672]: I0930 12:41:49.834743 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9858e9f6-3e7a-48e8-8557-eabc1ccfada4-scripts\") pod \"9858e9f6-3e7a-48e8-8557-eabc1ccfada4\" (UID: \"9858e9f6-3e7a-48e8-8557-eabc1ccfada4\") " Sep 30 12:41:49 crc kubenswrapper[4672]: I0930 12:41:49.840037 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9858e9f6-3e7a-48e8-8557-eabc1ccfada4-kube-api-access-dxnxj" (OuterVolumeSpecName: "kube-api-access-dxnxj") pod "9858e9f6-3e7a-48e8-8557-eabc1ccfada4" (UID: "9858e9f6-3e7a-48e8-8557-eabc1ccfada4"). InnerVolumeSpecName "kube-api-access-dxnxj". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 12:41:49 crc kubenswrapper[4672]: I0930 12:41:49.840241 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9858e9f6-3e7a-48e8-8557-eabc1ccfada4-scripts" (OuterVolumeSpecName: "scripts") pod "9858e9f6-3e7a-48e8-8557-eabc1ccfada4" (UID: "9858e9f6-3e7a-48e8-8557-eabc1ccfada4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:41:49 crc kubenswrapper[4672]: I0930 12:41:49.864507 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9858e9f6-3e7a-48e8-8557-eabc1ccfada4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9858e9f6-3e7a-48e8-8557-eabc1ccfada4" (UID: "9858e9f6-3e7a-48e8-8557-eabc1ccfada4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:41:49 crc kubenswrapper[4672]: I0930 12:41:49.868219 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9858e9f6-3e7a-48e8-8557-eabc1ccfada4-config-data" (OuterVolumeSpecName: "config-data") pod "9858e9f6-3e7a-48e8-8557-eabc1ccfada4" (UID: "9858e9f6-3e7a-48e8-8557-eabc1ccfada4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:41:49 crc kubenswrapper[4672]: I0930 12:41:49.936822 4672 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9858e9f6-3e7a-48e8-8557-eabc1ccfada4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 12:41:49 crc kubenswrapper[4672]: I0930 12:41:49.936864 4672 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9858e9f6-3e7a-48e8-8557-eabc1ccfada4-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 12:41:49 crc kubenswrapper[4672]: I0930 12:41:49.936881 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dxnxj\" (UniqueName: \"kubernetes.io/projected/9858e9f6-3e7a-48e8-8557-eabc1ccfada4-kube-api-access-dxnxj\") on node \"crc\" DevicePath \"\"" Sep 30 12:41:49 crc kubenswrapper[4672]: I0930 12:41:49.936894 4672 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9858e9f6-3e7a-48e8-8557-eabc1ccfada4-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 12:41:50 crc kubenswrapper[4672]: I0930 12:41:50.225250 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-8hmsx" Sep 30 12:41:50 crc kubenswrapper[4672]: I0930 12:41:50.225238 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-8hmsx" event={"ID":"9858e9f6-3e7a-48e8-8557-eabc1ccfada4","Type":"ContainerDied","Data":"0b2e868a61283fcabc0888a0f8baa7796e9d596441029c23e51dda52ca3a79f7"} Sep 30 12:41:50 crc kubenswrapper[4672]: I0930 12:41:50.225434 4672 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0b2e868a61283fcabc0888a0f8baa7796e9d596441029c23e51dda52ca3a79f7" Sep 30 12:41:50 crc kubenswrapper[4672]: I0930 12:41:50.227641 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e28fe302-599f-4788-a17f-78c90cfca670","Type":"ContainerDied","Data":"9491639a9260cff247244696b45ade581f4040d89bd2baf96ca190ea4c867e9f"} Sep 30 12:41:50 crc kubenswrapper[4672]: I0930 12:41:50.227694 4672 scope.go:117] "RemoveContainer" containerID="325505f1235b6cb8ebe38e2e6ec854dc0b69aece8402b899cbcfd1482667cd98" Sep 30 12:41:50 crc kubenswrapper[4672]: I0930 12:41:50.227883 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Sep 30 12:41:50 crc kubenswrapper[4672]: I0930 12:41:50.314289 4672 scope.go:117] "RemoveContainer" containerID="7cb2155e7bc3424d896ede691e1ed23b1d989ff67288540ca0fa90e36da4f507" Sep 30 12:41:50 crc kubenswrapper[4672]: I0930 12:41:50.316446 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Sep 30 12:41:50 crc kubenswrapper[4672]: E0930 12:41:50.316861 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e28fe302-599f-4788-a17f-78c90cfca670" containerName="nova-api-api" Sep 30 12:41:50 crc kubenswrapper[4672]: I0930 12:41:50.316877 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="e28fe302-599f-4788-a17f-78c90cfca670" containerName="nova-api-api" Sep 30 12:41:50 crc kubenswrapper[4672]: E0930 12:41:50.316904 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9858e9f6-3e7a-48e8-8557-eabc1ccfada4" containerName="nova-cell1-conductor-db-sync" Sep 30 12:41:50 crc kubenswrapper[4672]: I0930 12:41:50.316911 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="9858e9f6-3e7a-48e8-8557-eabc1ccfada4" containerName="nova-cell1-conductor-db-sync" Sep 30 12:41:50 crc kubenswrapper[4672]: E0930 12:41:50.316940 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e28fe302-599f-4788-a17f-78c90cfca670" containerName="nova-api-log" Sep 30 12:41:50 crc kubenswrapper[4672]: I0930 12:41:50.316946 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="e28fe302-599f-4788-a17f-78c90cfca670" containerName="nova-api-log" Sep 30 12:41:50 crc kubenswrapper[4672]: I0930 12:41:50.317133 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="9858e9f6-3e7a-48e8-8557-eabc1ccfada4" containerName="nova-cell1-conductor-db-sync" Sep 30 12:41:50 crc kubenswrapper[4672]: I0930 12:41:50.317152 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="e28fe302-599f-4788-a17f-78c90cfca670" containerName="nova-api-api" Sep 30 12:41:50 crc kubenswrapper[4672]: I0930 12:41:50.317169 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="e28fe302-599f-4788-a17f-78c90cfca670" containerName="nova-api-log" Sep 30 12:41:50 crc kubenswrapper[4672]: I0930 12:41:50.317873 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Sep 30 12:41:50 crc kubenswrapper[4672]: I0930 12:41:50.338910 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Sep 30 12:41:50 crc kubenswrapper[4672]: I0930 12:41:50.340461 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Sep 30 12:41:50 crc kubenswrapper[4672]: I0930 12:41:50.357818 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Sep 30 12:41:50 crc kubenswrapper[4672]: I0930 12:41:50.362995 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Sep 30 12:41:50 crc kubenswrapper[4672]: I0930 12:41:50.367981 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Sep 30 12:41:50 crc kubenswrapper[4672]: I0930 12:41:50.371665 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Sep 30 12:41:50 crc kubenswrapper[4672]: I0930 12:41:50.374001 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Sep 30 12:41:50 crc kubenswrapper[4672]: I0930 12:41:50.401539 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Sep 30 12:41:50 crc kubenswrapper[4672]: I0930 12:41:50.450406 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wm899\" (UniqueName: \"kubernetes.io/projected/5017308f-acf6-406c-8c75-3f6b550f8190-kube-api-access-wm899\") pod \"nova-cell1-conductor-0\" (UID: \"5017308f-acf6-406c-8c75-3f6b550f8190\") " pod="openstack/nova-cell1-conductor-0" Sep 30 12:41:50 crc kubenswrapper[4672]: I0930 12:41:50.450472 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ppm2c\" (UniqueName: \"kubernetes.io/projected/d61895ae-6f7c-4a90-8a4a-36c0fb09fcb2-kube-api-access-ppm2c\") pod \"nova-api-0\" (UID: \"d61895ae-6f7c-4a90-8a4a-36c0fb09fcb2\") " pod="openstack/nova-api-0" Sep 30 12:41:50 crc kubenswrapper[4672]: I0930 12:41:50.450551 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d61895ae-6f7c-4a90-8a4a-36c0fb09fcb2-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d61895ae-6f7c-4a90-8a4a-36c0fb09fcb2\") " pod="openstack/nova-api-0" Sep 30 12:41:50 crc kubenswrapper[4672]: I0930 12:41:50.450646 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d61895ae-6f7c-4a90-8a4a-36c0fb09fcb2-config-data\") pod \"nova-api-0\" (UID: \"d61895ae-6f7c-4a90-8a4a-36c0fb09fcb2\") " pod="openstack/nova-api-0" Sep 30 12:41:50 crc kubenswrapper[4672]: I0930 12:41:50.450674 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5017308f-acf6-406c-8c75-3f6b550f8190-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"5017308f-acf6-406c-8c75-3f6b550f8190\") " pod="openstack/nova-cell1-conductor-0" Sep 30 12:41:50 crc kubenswrapper[4672]: I0930 12:41:50.450699 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5017308f-acf6-406c-8c75-3f6b550f8190-config-data\") pod \"nova-cell1-conductor-0\" (UID: 
\"5017308f-acf6-406c-8c75-3f6b550f8190\") " pod="openstack/nova-cell1-conductor-0" Sep 30 12:41:50 crc kubenswrapper[4672]: I0930 12:41:50.450743 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d61895ae-6f7c-4a90-8a4a-36c0fb09fcb2-logs\") pod \"nova-api-0\" (UID: \"d61895ae-6f7c-4a90-8a4a-36c0fb09fcb2\") " pod="openstack/nova-api-0" Sep 30 12:41:50 crc kubenswrapper[4672]: I0930 12:41:50.553011 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wm899\" (UniqueName: \"kubernetes.io/projected/5017308f-acf6-406c-8c75-3f6b550f8190-kube-api-access-wm899\") pod \"nova-cell1-conductor-0\" (UID: \"5017308f-acf6-406c-8c75-3f6b550f8190\") " pod="openstack/nova-cell1-conductor-0" Sep 30 12:41:50 crc kubenswrapper[4672]: I0930 12:41:50.553114 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ppm2c\" (UniqueName: \"kubernetes.io/projected/d61895ae-6f7c-4a90-8a4a-36c0fb09fcb2-kube-api-access-ppm2c\") pod \"nova-api-0\" (UID: \"d61895ae-6f7c-4a90-8a4a-36c0fb09fcb2\") " pod="openstack/nova-api-0" Sep 30 12:41:50 crc kubenswrapper[4672]: I0930 12:41:50.553210 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d61895ae-6f7c-4a90-8a4a-36c0fb09fcb2-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d61895ae-6f7c-4a90-8a4a-36c0fb09fcb2\") " pod="openstack/nova-api-0" Sep 30 12:41:50 crc kubenswrapper[4672]: I0930 12:41:50.553294 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d61895ae-6f7c-4a90-8a4a-36c0fb09fcb2-config-data\") pod \"nova-api-0\" (UID: \"d61895ae-6f7c-4a90-8a4a-36c0fb09fcb2\") " pod="openstack/nova-api-0" Sep 30 12:41:50 crc kubenswrapper[4672]: I0930 12:41:50.553677 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5017308f-acf6-406c-8c75-3f6b550f8190-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"5017308f-acf6-406c-8c75-3f6b550f8190\") " pod="openstack/nova-cell1-conductor-0" Sep 30 12:41:50 crc kubenswrapper[4672]: I0930 12:41:50.553718 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5017308f-acf6-406c-8c75-3f6b550f8190-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"5017308f-acf6-406c-8c75-3f6b550f8190\") " pod="openstack/nova-cell1-conductor-0" Sep 30 12:41:50 crc kubenswrapper[4672]: I0930 12:41:50.553792 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d61895ae-6f7c-4a90-8a4a-36c0fb09fcb2-logs\") pod \"nova-api-0\" (UID: \"d61895ae-6f7c-4a90-8a4a-36c0fb09fcb2\") " pod="openstack/nova-api-0" Sep 30 12:41:50 crc kubenswrapper[4672]: I0930 12:41:50.556459 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d61895ae-6f7c-4a90-8a4a-36c0fb09fcb2-logs\") pod \"nova-api-0\" (UID: \"d61895ae-6f7c-4a90-8a4a-36c0fb09fcb2\") " pod="openstack/nova-api-0" Sep 30 12:41:50 crc kubenswrapper[4672]: I0930 12:41:50.558349 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d61895ae-6f7c-4a90-8a4a-36c0fb09fcb2-config-data\") pod 
\"nova-api-0\" (UID: \"d61895ae-6f7c-4a90-8a4a-36c0fb09fcb2\") " pod="openstack/nova-api-0" Sep 30 12:41:50 crc kubenswrapper[4672]: I0930 12:41:50.559311 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5017308f-acf6-406c-8c75-3f6b550f8190-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"5017308f-acf6-406c-8c75-3f6b550f8190\") " pod="openstack/nova-cell1-conductor-0" Sep 30 12:41:50 crc kubenswrapper[4672]: I0930 12:41:50.566453 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5017308f-acf6-406c-8c75-3f6b550f8190-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"5017308f-acf6-406c-8c75-3f6b550f8190\") " pod="openstack/nova-cell1-conductor-0" Sep 30 12:41:50 crc kubenswrapper[4672]: I0930 12:41:50.569703 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d61895ae-6f7c-4a90-8a4a-36c0fb09fcb2-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d61895ae-6f7c-4a90-8a4a-36c0fb09fcb2\") " pod="openstack/nova-api-0" Sep 30 12:41:50 crc kubenswrapper[4672]: I0930 12:41:50.572008 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ppm2c\" (UniqueName: \"kubernetes.io/projected/d61895ae-6f7c-4a90-8a4a-36c0fb09fcb2-kube-api-access-ppm2c\") pod \"nova-api-0\" (UID: \"d61895ae-6f7c-4a90-8a4a-36c0fb09fcb2\") " pod="openstack/nova-api-0" Sep 30 12:41:50 crc kubenswrapper[4672]: I0930 12:41:50.577548 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wm899\" (UniqueName: \"kubernetes.io/projected/5017308f-acf6-406c-8c75-3f6b550f8190-kube-api-access-wm899\") pod \"nova-cell1-conductor-0\" (UID: \"5017308f-acf6-406c-8c75-3f6b550f8190\") " pod="openstack/nova-cell1-conductor-0" Sep 30 12:41:50 crc kubenswrapper[4672]: I0930 12:41:50.651091 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Sep 30 12:41:50 crc kubenswrapper[4672]: I0930 12:41:50.701843 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Sep 30 12:41:51 crc kubenswrapper[4672]: I0930 12:41:51.135556 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Sep 30 12:41:51 crc kubenswrapper[4672]: I0930 12:41:51.224193 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Sep 30 12:41:51 crc kubenswrapper[4672]: I0930 12:41:51.256145 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"5017308f-acf6-406c-8c75-3f6b550f8190","Type":"ContainerStarted","Data":"c8d817aa9c013efae633e226a0b4952b9ffb8e3549fbefcf2c9feb9bb141f4cc"} Sep 30 12:41:51 crc kubenswrapper[4672]: I0930 12:41:51.440177 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e28fe302-599f-4788-a17f-78c90cfca670" path="/var/lib/kubelet/pods/e28fe302-599f-4788-a17f-78c90cfca670/volumes" Sep 30 12:41:51 crc kubenswrapper[4672]: I0930 12:41:51.692780 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Sep 30 12:41:51 crc kubenswrapper[4672]: I0930 12:41:51.692849 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Sep 30 12:41:52 crc kubenswrapper[4672]: I0930 12:41:52.274997 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"5017308f-acf6-406c-8c75-3f6b550f8190","Type":"ContainerStarted","Data":"f44bea9635e0538ce41fe716c685e3f4905cfde598ec7e0473f472a1ac69ffe9"} Sep 30 12:41:52 crc kubenswrapper[4672]: I0930 12:41:52.275111 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Sep 30 12:41:52 crc kubenswrapper[4672]: I0930 12:41:52.276450 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d61895ae-6f7c-4a90-8a4a-36c0fb09fcb2","Type":"ContainerStarted","Data":"0a804eb630446c62f6f3a6aa97da1e1c639984140d374d30fc951165ee3ed56f"} Sep 30 12:41:52 crc kubenswrapper[4672]: I0930 12:41:52.276476 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d61895ae-6f7c-4a90-8a4a-36c0fb09fcb2","Type":"ContainerStarted","Data":"d65d62c2763af66b3ade4ee5ff31c4d209f29c59eaa3edf361c312e723e283cb"} Sep 30 12:41:52 crc kubenswrapper[4672]: I0930 12:41:52.276490 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d61895ae-6f7c-4a90-8a4a-36c0fb09fcb2","Type":"ContainerStarted","Data":"2179940a592de9ba758dd137f9c8f65d96296dc7ed358b71ab6e32fb2111db22"} Sep 30 12:41:52 crc kubenswrapper[4672]: I0930 12:41:52.296610 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.2965882029999998 podStartE2EDuration="2.296588203s" podCreationTimestamp="2025-09-30 12:41:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 12:41:52.287092013 +0000 UTC m=+1203.556329679" watchObservedRunningTime="2025-09-30 12:41:52.296588203 +0000 UTC m=+1203.565825859" Sep 30 12:41:52 crc kubenswrapper[4672]: I0930 12:41:52.304523 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.304509613 podStartE2EDuration="2.304509613s" podCreationTimestamp="2025-09-30 12:41:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 
00:00:00 +0000 UTC" observedRunningTime="2025-09-30 12:41:52.303439746 +0000 UTC m=+1203.572677412" watchObservedRunningTime="2025-09-30 12:41:52.304509613 +0000 UTC m=+1203.573747269" Sep 30 12:41:52 crc kubenswrapper[4672]: I0930 12:41:52.601318 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Sep 30 12:41:54 crc kubenswrapper[4672]: I0930 12:41:54.739694 4672 patch_prober.go:28] interesting pod/machine-config-daemon-dpqrd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 12:41:54 crc kubenswrapper[4672]: I0930 12:41:54.741352 4672 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 12:41:56 crc kubenswrapper[4672]: I0930 12:41:56.692815 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Sep 30 12:41:56 crc kubenswrapper[4672]: I0930 12:41:56.693194 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Sep 30 12:41:57 crc kubenswrapper[4672]: I0930 12:41:57.598009 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Sep 30 12:41:57 crc kubenswrapper[4672]: I0930 12:41:57.712564 4672 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="a7d72ea6-c355-4e07-99da-88f9ff5cd342" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.211:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Sep 30 12:41:57 crc kubenswrapper[4672]: I0930 12:41:57.712630 4672 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="a7d72ea6-c355-4e07-99da-88f9ff5cd342" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.211:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Sep 30 12:41:57 crc kubenswrapper[4672]: I0930 12:41:57.731662 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Sep 30 12:41:58 crc kubenswrapper[4672]: I0930 12:41:58.368313 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Sep 30 12:42:00 crc kubenswrapper[4672]: I0930 12:42:00.693636 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Sep 30 12:42:00 crc kubenswrapper[4672]: I0930 12:42:00.703098 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Sep 30 12:42:00 crc kubenswrapper[4672]: I0930 12:42:00.703553 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Sep 30 12:42:01 crc kubenswrapper[4672]: I0930 12:42:01.785566 4672 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="d61895ae-6f7c-4a90-8a4a-36c0fb09fcb2" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.214:8774/\": context deadline exceeded (Client.Timeout exceeded while 
awaiting headers)" Sep 30 12:42:01 crc kubenswrapper[4672]: I0930 12:42:01.785588 4672 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="d61895ae-6f7c-4a90-8a4a-36c0fb09fcb2" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.214:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Sep 30 12:42:06 crc kubenswrapper[4672]: I0930 12:42:06.702974 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Sep 30 12:42:06 crc kubenswrapper[4672]: I0930 12:42:06.703887 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Sep 30 12:42:06 crc kubenswrapper[4672]: I0930 12:42:06.714360 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Sep 30 12:42:06 crc kubenswrapper[4672]: I0930 12:42:06.715508 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Sep 30 12:42:08 crc kubenswrapper[4672]: I0930 12:42:08.319019 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Sep 30 12:42:08 crc kubenswrapper[4672]: I0930 12:42:08.439916 4672 generic.go:334] "Generic (PLEG): container finished" podID="fba6b3f9-5728-4c5b-955a-571d3a8c83f4" containerID="4170c2e5656526544e148c18c3748b58cc1d0434b43762809d7837636a529769" exitCode=137 Sep 30 12:42:08 crc kubenswrapper[4672]: I0930 12:42:08.440014 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"fba6b3f9-5728-4c5b-955a-571d3a8c83f4","Type":"ContainerDied","Data":"4170c2e5656526544e148c18c3748b58cc1d0434b43762809d7837636a529769"} Sep 30 12:42:08 crc kubenswrapper[4672]: I0930 12:42:08.440051 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"fba6b3f9-5728-4c5b-955a-571d3a8c83f4","Type":"ContainerDied","Data":"b01598d6f664d7bd5113a90aa5af994815d5021c3db250fe3f3b24ec3add0d9f"} Sep 30 12:42:08 crc kubenswrapper[4672]: I0930 12:42:08.440068 4672 scope.go:117] "RemoveContainer" containerID="4170c2e5656526544e148c18c3748b58cc1d0434b43762809d7837636a529769" Sep 30 12:42:08 crc kubenswrapper[4672]: I0930 12:42:08.440113 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Sep 30 12:42:08 crc kubenswrapper[4672]: I0930 12:42:08.445084 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fba6b3f9-5728-4c5b-955a-571d3a8c83f4-combined-ca-bundle\") pod \"fba6b3f9-5728-4c5b-955a-571d3a8c83f4\" (UID: \"fba6b3f9-5728-4c5b-955a-571d3a8c83f4\") " Sep 30 12:42:08 crc kubenswrapper[4672]: I0930 12:42:08.445279 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fba6b3f9-5728-4c5b-955a-571d3a8c83f4-config-data\") pod \"fba6b3f9-5728-4c5b-955a-571d3a8c83f4\" (UID: \"fba6b3f9-5728-4c5b-955a-571d3a8c83f4\") " Sep 30 12:42:08 crc kubenswrapper[4672]: I0930 12:42:08.445317 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dnjlj\" (UniqueName: \"kubernetes.io/projected/fba6b3f9-5728-4c5b-955a-571d3a8c83f4-kube-api-access-dnjlj\") pod \"fba6b3f9-5728-4c5b-955a-571d3a8c83f4\" (UID: \"fba6b3f9-5728-4c5b-955a-571d3a8c83f4\") " Sep 30 12:42:08 crc kubenswrapper[4672]: I0930 12:42:08.452161 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fba6b3f9-5728-4c5b-955a-571d3a8c83f4-kube-api-access-dnjlj" (OuterVolumeSpecName: "kube-api-access-dnjlj") pod "fba6b3f9-5728-4c5b-955a-571d3a8c83f4" (UID: "fba6b3f9-5728-4c5b-955a-571d3a8c83f4"). InnerVolumeSpecName "kube-api-access-dnjlj". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 12:42:08 crc kubenswrapper[4672]: I0930 12:42:08.462988 4672 scope.go:117] "RemoveContainer" containerID="4170c2e5656526544e148c18c3748b58cc1d0434b43762809d7837636a529769" Sep 30 12:42:08 crc kubenswrapper[4672]: E0930 12:42:08.464362 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4170c2e5656526544e148c18c3748b58cc1d0434b43762809d7837636a529769\": container with ID starting with 4170c2e5656526544e148c18c3748b58cc1d0434b43762809d7837636a529769 not found: ID does not exist" containerID="4170c2e5656526544e148c18c3748b58cc1d0434b43762809d7837636a529769" Sep 30 12:42:08 crc kubenswrapper[4672]: I0930 12:42:08.464408 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4170c2e5656526544e148c18c3748b58cc1d0434b43762809d7837636a529769"} err="failed to get container status \"4170c2e5656526544e148c18c3748b58cc1d0434b43762809d7837636a529769\": rpc error: code = NotFound desc = could not find container \"4170c2e5656526544e148c18c3748b58cc1d0434b43762809d7837636a529769\": container with ID starting with 4170c2e5656526544e148c18c3748b58cc1d0434b43762809d7837636a529769 not found: ID does not exist" Sep 30 12:42:08 crc kubenswrapper[4672]: I0930 12:42:08.475684 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fba6b3f9-5728-4c5b-955a-571d3a8c83f4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fba6b3f9-5728-4c5b-955a-571d3a8c83f4" (UID: "fba6b3f9-5728-4c5b-955a-571d3a8c83f4"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:42:08 crc kubenswrapper[4672]: I0930 12:42:08.479408 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fba6b3f9-5728-4c5b-955a-571d3a8c83f4-config-data" (OuterVolumeSpecName: "config-data") pod "fba6b3f9-5728-4c5b-955a-571d3a8c83f4" (UID: "fba6b3f9-5728-4c5b-955a-571d3a8c83f4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:42:08 crc kubenswrapper[4672]: I0930 12:42:08.548411 4672 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fba6b3f9-5728-4c5b-955a-571d3a8c83f4-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 12:42:08 crc kubenswrapper[4672]: I0930 12:42:08.548458 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dnjlj\" (UniqueName: \"kubernetes.io/projected/fba6b3f9-5728-4c5b-955a-571d3a8c83f4-kube-api-access-dnjlj\") on node \"crc\" DevicePath \"\"" Sep 30 12:42:08 crc kubenswrapper[4672]: I0930 12:42:08.548474 4672 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fba6b3f9-5728-4c5b-955a-571d3a8c83f4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 12:42:08 crc kubenswrapper[4672]: I0930 12:42:08.798315 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Sep 30 12:42:08 crc kubenswrapper[4672]: I0930 12:42:08.810170 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Sep 30 12:42:08 crc kubenswrapper[4672]: I0930 12:42:08.836816 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Sep 30 12:42:08 crc kubenswrapper[4672]: E0930 12:42:08.837597 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fba6b3f9-5728-4c5b-955a-571d3a8c83f4" containerName="nova-cell1-novncproxy-novncproxy" Sep 30 12:42:08 crc kubenswrapper[4672]: I0930 12:42:08.837704 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="fba6b3f9-5728-4c5b-955a-571d3a8c83f4" containerName="nova-cell1-novncproxy-novncproxy" Sep 30 12:42:08 crc kubenswrapper[4672]: I0930 12:42:08.838086 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="fba6b3f9-5728-4c5b-955a-571d3a8c83f4" containerName="nova-cell1-novncproxy-novncproxy" Sep 30 12:42:08 crc kubenswrapper[4672]: I0930 12:42:08.838995 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Sep 30 12:42:08 crc kubenswrapper[4672]: I0930 12:42:08.844078 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Sep 30 12:42:08 crc kubenswrapper[4672]: I0930 12:42:08.844955 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Sep 30 12:42:08 crc kubenswrapper[4672]: I0930 12:42:08.846654 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Sep 30 12:42:08 crc kubenswrapper[4672]: I0930 12:42:08.852170 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Sep 30 12:42:08 crc kubenswrapper[4672]: I0930 12:42:08.956060 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/92487d22-391c-44e2-8179-1e523ab07026-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"92487d22-391c-44e2-8179-1e523ab07026\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 12:42:08 crc kubenswrapper[4672]: I0930 12:42:08.956535 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92487d22-391c-44e2-8179-1e523ab07026-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"92487d22-391c-44e2-8179-1e523ab07026\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 12:42:08 crc kubenswrapper[4672]: I0930 12:42:08.956792 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2qf94\" (UniqueName: \"kubernetes.io/projected/92487d22-391c-44e2-8179-1e523ab07026-kube-api-access-2qf94\") pod \"nova-cell1-novncproxy-0\" (UID: \"92487d22-391c-44e2-8179-1e523ab07026\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 12:42:08 crc kubenswrapper[4672]: I0930 12:42:08.956921 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92487d22-391c-44e2-8179-1e523ab07026-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"92487d22-391c-44e2-8179-1e523ab07026\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 12:42:08 crc kubenswrapper[4672]: I0930 12:42:08.957009 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/92487d22-391c-44e2-8179-1e523ab07026-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"92487d22-391c-44e2-8179-1e523ab07026\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 12:42:09 crc kubenswrapper[4672]: I0930 12:42:09.059251 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/92487d22-391c-44e2-8179-1e523ab07026-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"92487d22-391c-44e2-8179-1e523ab07026\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 12:42:09 crc kubenswrapper[4672]: I0930 12:42:09.059325 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/92487d22-391c-44e2-8179-1e523ab07026-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"92487d22-391c-44e2-8179-1e523ab07026\") " 
pod="openstack/nova-cell1-novncproxy-0" Sep 30 12:42:09 crc kubenswrapper[4672]: I0930 12:42:09.059484 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92487d22-391c-44e2-8179-1e523ab07026-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"92487d22-391c-44e2-8179-1e523ab07026\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 12:42:09 crc kubenswrapper[4672]: I0930 12:42:09.059553 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2qf94\" (UniqueName: \"kubernetes.io/projected/92487d22-391c-44e2-8179-1e523ab07026-kube-api-access-2qf94\") pod \"nova-cell1-novncproxy-0\" (UID: \"92487d22-391c-44e2-8179-1e523ab07026\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 12:42:09 crc kubenswrapper[4672]: I0930 12:42:09.059597 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92487d22-391c-44e2-8179-1e523ab07026-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"92487d22-391c-44e2-8179-1e523ab07026\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 12:42:09 crc kubenswrapper[4672]: I0930 12:42:09.063345 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/92487d22-391c-44e2-8179-1e523ab07026-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"92487d22-391c-44e2-8179-1e523ab07026\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 12:42:09 crc kubenswrapper[4672]: I0930 12:42:09.064984 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/92487d22-391c-44e2-8179-1e523ab07026-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"92487d22-391c-44e2-8179-1e523ab07026\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 12:42:09 crc kubenswrapper[4672]: I0930 12:42:09.066168 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92487d22-391c-44e2-8179-1e523ab07026-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"92487d22-391c-44e2-8179-1e523ab07026\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 12:42:09 crc kubenswrapper[4672]: I0930 12:42:09.070882 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92487d22-391c-44e2-8179-1e523ab07026-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"92487d22-391c-44e2-8179-1e523ab07026\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 12:42:09 crc kubenswrapper[4672]: I0930 12:42:09.081475 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2qf94\" (UniqueName: \"kubernetes.io/projected/92487d22-391c-44e2-8179-1e523ab07026-kube-api-access-2qf94\") pod \"nova-cell1-novncproxy-0\" (UID: \"92487d22-391c-44e2-8179-1e523ab07026\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 12:42:09 crc kubenswrapper[4672]: I0930 12:42:09.176458 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Sep 30 12:42:09 crc kubenswrapper[4672]: I0930 12:42:09.431566 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fba6b3f9-5728-4c5b-955a-571d3a8c83f4" path="/var/lib/kubelet/pods/fba6b3f9-5728-4c5b-955a-571d3a8c83f4/volumes" Sep 30 12:42:09 crc kubenswrapper[4672]: W0930 12:42:09.631024 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod92487d22_391c_44e2_8179_1e523ab07026.slice/crio-3834a3ef708b5e8848855780cfe57818921c996a419ed64009be94630e4b3aa0 WatchSource:0}: Error finding container 3834a3ef708b5e8848855780cfe57818921c996a419ed64009be94630e4b3aa0: Status 404 returned error can't find the container with id 3834a3ef708b5e8848855780cfe57818921c996a419ed64009be94630e4b3aa0 Sep 30 12:42:09 crc kubenswrapper[4672]: I0930 12:42:09.646661 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Sep 30 12:42:10 crc kubenswrapper[4672]: I0930 12:42:10.464849 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"92487d22-391c-44e2-8179-1e523ab07026","Type":"ContainerStarted","Data":"bb84987cbcd90f345ee7c5abc116ac871145b1358be5f336295a0d8bf8d291b3"} Sep 30 12:42:10 crc kubenswrapper[4672]: I0930 12:42:10.465170 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"92487d22-391c-44e2-8179-1e523ab07026","Type":"ContainerStarted","Data":"3834a3ef708b5e8848855780cfe57818921c996a419ed64009be94630e4b3aa0"} Sep 30 12:42:10 crc kubenswrapper[4672]: I0930 12:42:10.489242 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.489223496 podStartE2EDuration="2.489223496s" podCreationTimestamp="2025-09-30 12:42:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 12:42:10.487785929 +0000 UTC m=+1221.757023575" watchObservedRunningTime="2025-09-30 12:42:10.489223496 +0000 UTC m=+1221.758461152" Sep 30 12:42:10 crc kubenswrapper[4672]: I0930 12:42:10.708836 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Sep 30 12:42:10 crc kubenswrapper[4672]: I0930 12:42:10.709530 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Sep 30 12:42:10 crc kubenswrapper[4672]: I0930 12:42:10.709664 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Sep 30 12:42:10 crc kubenswrapper[4672]: I0930 12:42:10.715670 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Sep 30 12:42:11 crc kubenswrapper[4672]: I0930 12:42:11.477458 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Sep 30 12:42:11 crc kubenswrapper[4672]: I0930 12:42:11.483913 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Sep 30 12:42:11 crc kubenswrapper[4672]: I0930 12:42:11.493992 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Sep 30 12:42:11 crc kubenswrapper[4672]: I0930 12:42:11.699643 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-8b9c78747-bt7cg"] Sep 30 12:42:11 crc kubenswrapper[4672]: I0930 
12:42:11.701825 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8b9c78747-bt7cg" Sep 30 12:42:11 crc kubenswrapper[4672]: I0930 12:42:11.731337 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8b9c78747-bt7cg"] Sep 30 12:42:11 crc kubenswrapper[4672]: I0930 12:42:11.814279 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cbc1011f-55f1-4518-9117-215b69ae7590-ovsdbserver-sb\") pod \"dnsmasq-dns-8b9c78747-bt7cg\" (UID: \"cbc1011f-55f1-4518-9117-215b69ae7590\") " pod="openstack/dnsmasq-dns-8b9c78747-bt7cg" Sep 30 12:42:11 crc kubenswrapper[4672]: I0930 12:42:11.814560 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x42qk\" (UniqueName: \"kubernetes.io/projected/cbc1011f-55f1-4518-9117-215b69ae7590-kube-api-access-x42qk\") pod \"dnsmasq-dns-8b9c78747-bt7cg\" (UID: \"cbc1011f-55f1-4518-9117-215b69ae7590\") " pod="openstack/dnsmasq-dns-8b9c78747-bt7cg" Sep 30 12:42:11 crc kubenswrapper[4672]: I0930 12:42:11.814657 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cbc1011f-55f1-4518-9117-215b69ae7590-ovsdbserver-nb\") pod \"dnsmasq-dns-8b9c78747-bt7cg\" (UID: \"cbc1011f-55f1-4518-9117-215b69ae7590\") " pod="openstack/dnsmasq-dns-8b9c78747-bt7cg" Sep 30 12:42:11 crc kubenswrapper[4672]: I0930 12:42:11.814692 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cbc1011f-55f1-4518-9117-215b69ae7590-dns-svc\") pod \"dnsmasq-dns-8b9c78747-bt7cg\" (UID: \"cbc1011f-55f1-4518-9117-215b69ae7590\") " pod="openstack/dnsmasq-dns-8b9c78747-bt7cg" Sep 30 12:42:11 crc kubenswrapper[4672]: I0930 12:42:11.814775 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cbc1011f-55f1-4518-9117-215b69ae7590-config\") pod \"dnsmasq-dns-8b9c78747-bt7cg\" (UID: \"cbc1011f-55f1-4518-9117-215b69ae7590\") " pod="openstack/dnsmasq-dns-8b9c78747-bt7cg" Sep 30 12:42:11 crc kubenswrapper[4672]: I0930 12:42:11.814935 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cbc1011f-55f1-4518-9117-215b69ae7590-dns-swift-storage-0\") pod \"dnsmasq-dns-8b9c78747-bt7cg\" (UID: \"cbc1011f-55f1-4518-9117-215b69ae7590\") " pod="openstack/dnsmasq-dns-8b9c78747-bt7cg" Sep 30 12:42:11 crc kubenswrapper[4672]: I0930 12:42:11.917026 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cbc1011f-55f1-4518-9117-215b69ae7590-ovsdbserver-sb\") pod \"dnsmasq-dns-8b9c78747-bt7cg\" (UID: \"cbc1011f-55f1-4518-9117-215b69ae7590\") " pod="openstack/dnsmasq-dns-8b9c78747-bt7cg" Sep 30 12:42:11 crc kubenswrapper[4672]: I0930 12:42:11.917172 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x42qk\" (UniqueName: \"kubernetes.io/projected/cbc1011f-55f1-4518-9117-215b69ae7590-kube-api-access-x42qk\") pod \"dnsmasq-dns-8b9c78747-bt7cg\" (UID: \"cbc1011f-55f1-4518-9117-215b69ae7590\") " pod="openstack/dnsmasq-dns-8b9c78747-bt7cg" Sep 30 12:42:11 crc 
kubenswrapper[4672]: I0930 12:42:11.917208 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cbc1011f-55f1-4518-9117-215b69ae7590-ovsdbserver-nb\") pod \"dnsmasq-dns-8b9c78747-bt7cg\" (UID: \"cbc1011f-55f1-4518-9117-215b69ae7590\") " pod="openstack/dnsmasq-dns-8b9c78747-bt7cg" Sep 30 12:42:11 crc kubenswrapper[4672]: I0930 12:42:11.917231 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cbc1011f-55f1-4518-9117-215b69ae7590-dns-svc\") pod \"dnsmasq-dns-8b9c78747-bt7cg\" (UID: \"cbc1011f-55f1-4518-9117-215b69ae7590\") " pod="openstack/dnsmasq-dns-8b9c78747-bt7cg" Sep 30 12:42:11 crc kubenswrapper[4672]: I0930 12:42:11.917284 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cbc1011f-55f1-4518-9117-215b69ae7590-config\") pod \"dnsmasq-dns-8b9c78747-bt7cg\" (UID: \"cbc1011f-55f1-4518-9117-215b69ae7590\") " pod="openstack/dnsmasq-dns-8b9c78747-bt7cg" Sep 30 12:42:11 crc kubenswrapper[4672]: I0930 12:42:11.917344 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cbc1011f-55f1-4518-9117-215b69ae7590-dns-swift-storage-0\") pod \"dnsmasq-dns-8b9c78747-bt7cg\" (UID: \"cbc1011f-55f1-4518-9117-215b69ae7590\") " pod="openstack/dnsmasq-dns-8b9c78747-bt7cg" Sep 30 12:42:11 crc kubenswrapper[4672]: I0930 12:42:11.918119 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cbc1011f-55f1-4518-9117-215b69ae7590-ovsdbserver-sb\") pod \"dnsmasq-dns-8b9c78747-bt7cg\" (UID: \"cbc1011f-55f1-4518-9117-215b69ae7590\") " pod="openstack/dnsmasq-dns-8b9c78747-bt7cg" Sep 30 12:42:11 crc kubenswrapper[4672]: I0930 12:42:11.918142 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cbc1011f-55f1-4518-9117-215b69ae7590-dns-svc\") pod \"dnsmasq-dns-8b9c78747-bt7cg\" (UID: \"cbc1011f-55f1-4518-9117-215b69ae7590\") " pod="openstack/dnsmasq-dns-8b9c78747-bt7cg" Sep 30 12:42:11 crc kubenswrapper[4672]: I0930 12:42:11.918213 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cbc1011f-55f1-4518-9117-215b69ae7590-ovsdbserver-nb\") pod \"dnsmasq-dns-8b9c78747-bt7cg\" (UID: \"cbc1011f-55f1-4518-9117-215b69ae7590\") " pod="openstack/dnsmasq-dns-8b9c78747-bt7cg" Sep 30 12:42:11 crc kubenswrapper[4672]: I0930 12:42:11.918424 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cbc1011f-55f1-4518-9117-215b69ae7590-config\") pod \"dnsmasq-dns-8b9c78747-bt7cg\" (UID: \"cbc1011f-55f1-4518-9117-215b69ae7590\") " pod="openstack/dnsmasq-dns-8b9c78747-bt7cg" Sep 30 12:42:11 crc kubenswrapper[4672]: I0930 12:42:11.918584 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cbc1011f-55f1-4518-9117-215b69ae7590-dns-swift-storage-0\") pod \"dnsmasq-dns-8b9c78747-bt7cg\" (UID: \"cbc1011f-55f1-4518-9117-215b69ae7590\") " pod="openstack/dnsmasq-dns-8b9c78747-bt7cg" Sep 30 12:42:11 crc kubenswrapper[4672]: I0930 12:42:11.944524 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x42qk\" 
(UniqueName: \"kubernetes.io/projected/cbc1011f-55f1-4518-9117-215b69ae7590-kube-api-access-x42qk\") pod \"dnsmasq-dns-8b9c78747-bt7cg\" (UID: \"cbc1011f-55f1-4518-9117-215b69ae7590\") " pod="openstack/dnsmasq-dns-8b9c78747-bt7cg" Sep 30 12:42:12 crc kubenswrapper[4672]: I0930 12:42:12.023868 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8b9c78747-bt7cg" Sep 30 12:42:12 crc kubenswrapper[4672]: I0930 12:42:12.540200 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8b9c78747-bt7cg"] Sep 30 12:42:12 crc kubenswrapper[4672]: W0930 12:42:12.542237 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcbc1011f_55f1_4518_9117_215b69ae7590.slice/crio-ea5341280da5e524d64656d359a2700f361c6819b79d45b2e0e7aa41468d6b4f WatchSource:0}: Error finding container ea5341280da5e524d64656d359a2700f361c6819b79d45b2e0e7aa41468d6b4f: Status 404 returned error can't find the container with id ea5341280da5e524d64656d359a2700f361c6819b79d45b2e0e7aa41468d6b4f Sep 30 12:42:13 crc kubenswrapper[4672]: I0930 12:42:13.498020 4672 generic.go:334] "Generic (PLEG): container finished" podID="cbc1011f-55f1-4518-9117-215b69ae7590" containerID="af367e360c60e6b5545eac7e4808d4c62a5ed7f8e784811667a897709cae0d4c" exitCode=0 Sep 30 12:42:13 crc kubenswrapper[4672]: I0930 12:42:13.498122 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8b9c78747-bt7cg" event={"ID":"cbc1011f-55f1-4518-9117-215b69ae7590","Type":"ContainerDied","Data":"af367e360c60e6b5545eac7e4808d4c62a5ed7f8e784811667a897709cae0d4c"} Sep 30 12:42:13 crc kubenswrapper[4672]: I0930 12:42:13.498622 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8b9c78747-bt7cg" event={"ID":"cbc1011f-55f1-4518-9117-215b69ae7590","Type":"ContainerStarted","Data":"ea5341280da5e524d64656d359a2700f361c6819b79d45b2e0e7aa41468d6b4f"} Sep 30 12:42:13 crc kubenswrapper[4672]: I0930 12:42:13.725433 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Sep 30 12:42:13 crc kubenswrapper[4672]: I0930 12:42:13.725732 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9fb1a180-8bbb-41ac-a730-9a491f508d81" containerName="ceilometer-central-agent" containerID="cri-o://cdaf9886c5d1c9ffbfcfdf42d291c7908cf697c5dce1f93ccdd7c84f75629d52" gracePeriod=30 Sep 30 12:42:13 crc kubenswrapper[4672]: I0930 12:42:13.725797 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9fb1a180-8bbb-41ac-a730-9a491f508d81" containerName="sg-core" containerID="cri-o://7795aad7551b72ab414af3bca9c6937329b40c1631a005a6db340d097567bbeb" gracePeriod=30 Sep 30 12:42:13 crc kubenswrapper[4672]: I0930 12:42:13.725791 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9fb1a180-8bbb-41ac-a730-9a491f508d81" containerName="proxy-httpd" containerID="cri-o://419feab501e9fdb7ccf2ebe0ff91da151e26fdc849e6bf79bd4d66990d0843e2" gracePeriod=30 Sep 30 12:42:13 crc kubenswrapper[4672]: I0930 12:42:13.725889 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9fb1a180-8bbb-41ac-a730-9a491f508d81" containerName="ceilometer-notification-agent" containerID="cri-o://5bf9b830470b30a39480da91d0973c1d82601b9757e3322063d56cf01581c295" gracePeriod=30 Sep 30 
12:42:13 crc kubenswrapper[4672]: I0930 12:42:13.884991 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Sep 30 12:42:14 crc kubenswrapper[4672]: I0930 12:42:14.176733 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Sep 30 12:42:14 crc kubenswrapper[4672]: I0930 12:42:14.511423 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8b9c78747-bt7cg" event={"ID":"cbc1011f-55f1-4518-9117-215b69ae7590","Type":"ContainerStarted","Data":"d9ca6d0f522889de7a5d1cb71d356ae0d4954863954bbdaa86bc80571fe5dae6"} Sep 30 12:42:14 crc kubenswrapper[4672]: I0930 12:42:14.511602 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-8b9c78747-bt7cg" Sep 30 12:42:14 crc kubenswrapper[4672]: I0930 12:42:14.514786 4672 generic.go:334] "Generic (PLEG): container finished" podID="9fb1a180-8bbb-41ac-a730-9a491f508d81" containerID="419feab501e9fdb7ccf2ebe0ff91da151e26fdc849e6bf79bd4d66990d0843e2" exitCode=0 Sep 30 12:42:14 crc kubenswrapper[4672]: I0930 12:42:14.514815 4672 generic.go:334] "Generic (PLEG): container finished" podID="9fb1a180-8bbb-41ac-a730-9a491f508d81" containerID="7795aad7551b72ab414af3bca9c6937329b40c1631a005a6db340d097567bbeb" exitCode=2 Sep 30 12:42:14 crc kubenswrapper[4672]: I0930 12:42:14.514824 4672 generic.go:334] "Generic (PLEG): container finished" podID="9fb1a180-8bbb-41ac-a730-9a491f508d81" containerID="cdaf9886c5d1c9ffbfcfdf42d291c7908cf697c5dce1f93ccdd7c84f75629d52" exitCode=0 Sep 30 12:42:14 crc kubenswrapper[4672]: I0930 12:42:14.514821 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9fb1a180-8bbb-41ac-a730-9a491f508d81","Type":"ContainerDied","Data":"419feab501e9fdb7ccf2ebe0ff91da151e26fdc849e6bf79bd4d66990d0843e2"} Sep 30 12:42:14 crc kubenswrapper[4672]: I0930 12:42:14.514857 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9fb1a180-8bbb-41ac-a730-9a491f508d81","Type":"ContainerDied","Data":"7795aad7551b72ab414af3bca9c6937329b40c1631a005a6db340d097567bbeb"} Sep 30 12:42:14 crc kubenswrapper[4672]: I0930 12:42:14.514874 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9fb1a180-8bbb-41ac-a730-9a491f508d81","Type":"ContainerDied","Data":"cdaf9886c5d1c9ffbfcfdf42d291c7908cf697c5dce1f93ccdd7c84f75629d52"} Sep 30 12:42:14 crc kubenswrapper[4672]: I0930 12:42:14.515022 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="d61895ae-6f7c-4a90-8a4a-36c0fb09fcb2" containerName="nova-api-log" containerID="cri-o://d65d62c2763af66b3ade4ee5ff31c4d209f29c59eaa3edf361c312e723e283cb" gracePeriod=30 Sep 30 12:42:14 crc kubenswrapper[4672]: I0930 12:42:14.515113 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="d61895ae-6f7c-4a90-8a4a-36c0fb09fcb2" containerName="nova-api-api" containerID="cri-o://0a804eb630446c62f6f3a6aa97da1e1c639984140d374d30fc951165ee3ed56f" gracePeriod=30 Sep 30 12:42:14 crc kubenswrapper[4672]: I0930 12:42:14.540151 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-8b9c78747-bt7cg" podStartSLOduration=3.540125472 podStartE2EDuration="3.540125472s" podCreationTimestamp="2025-09-30 12:42:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-09-30 12:42:14.53173389 +0000 UTC m=+1225.800971536" watchObservedRunningTime="2025-09-30 12:42:14.540125472 +0000 UTC m=+1225.809363118" Sep 30 12:42:15 crc kubenswrapper[4672]: I0930 12:42:15.532211 4672 generic.go:334] "Generic (PLEG): container finished" podID="d61895ae-6f7c-4a90-8a4a-36c0fb09fcb2" containerID="d65d62c2763af66b3ade4ee5ff31c4d209f29c59eaa3edf361c312e723e283cb" exitCode=143 Sep 30 12:42:15 crc kubenswrapper[4672]: I0930 12:42:15.532501 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d61895ae-6f7c-4a90-8a4a-36c0fb09fcb2","Type":"ContainerDied","Data":"d65d62c2763af66b3ade4ee5ff31c4d209f29c59eaa3edf361c312e723e283cb"} Sep 30 12:42:15 crc kubenswrapper[4672]: I0930 12:42:15.539513 4672 generic.go:334] "Generic (PLEG): container finished" podID="9fb1a180-8bbb-41ac-a730-9a491f508d81" containerID="5bf9b830470b30a39480da91d0973c1d82601b9757e3322063d56cf01581c295" exitCode=0 Sep 30 12:42:15 crc kubenswrapper[4672]: I0930 12:42:15.540467 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9fb1a180-8bbb-41ac-a730-9a491f508d81","Type":"ContainerDied","Data":"5bf9b830470b30a39480da91d0973c1d82601b9757e3322063d56cf01581c295"} Sep 30 12:42:15 crc kubenswrapper[4672]: I0930 12:42:15.875107 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Sep 30 12:42:16 crc kubenswrapper[4672]: I0930 12:42:16.025303 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9fb1a180-8bbb-41ac-a730-9a491f508d81-config-data\") pod \"9fb1a180-8bbb-41ac-a730-9a491f508d81\" (UID: \"9fb1a180-8bbb-41ac-a730-9a491f508d81\") " Sep 30 12:42:16 crc kubenswrapper[4672]: I0930 12:42:16.026091 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9fb1a180-8bbb-41ac-a730-9a491f508d81-combined-ca-bundle\") pod \"9fb1a180-8bbb-41ac-a730-9a491f508d81\" (UID: \"9fb1a180-8bbb-41ac-a730-9a491f508d81\") " Sep 30 12:42:16 crc kubenswrapper[4672]: I0930 12:42:16.026156 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9fb1a180-8bbb-41ac-a730-9a491f508d81-log-httpd\") pod \"9fb1a180-8bbb-41ac-a730-9a491f508d81\" (UID: \"9fb1a180-8bbb-41ac-a730-9a491f508d81\") " Sep 30 12:42:16 crc kubenswrapper[4672]: I0930 12:42:16.026253 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9fb1a180-8bbb-41ac-a730-9a491f508d81-run-httpd\") pod \"9fb1a180-8bbb-41ac-a730-9a491f508d81\" (UID: \"9fb1a180-8bbb-41ac-a730-9a491f508d81\") " Sep 30 12:42:16 crc kubenswrapper[4672]: I0930 12:42:16.026299 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9fb1a180-8bbb-41ac-a730-9a491f508d81-scripts\") pod \"9fb1a180-8bbb-41ac-a730-9a491f508d81\" (UID: \"9fb1a180-8bbb-41ac-a730-9a491f508d81\") " Sep 30 12:42:16 crc kubenswrapper[4672]: I0930 12:42:16.026372 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9fb1a180-8bbb-41ac-a730-9a491f508d81-sg-core-conf-yaml\") pod \"9fb1a180-8bbb-41ac-a730-9a491f508d81\" (UID: \"9fb1a180-8bbb-41ac-a730-9a491f508d81\") " Sep 30 12:42:16 crc 
kubenswrapper[4672]: I0930 12:42:16.026486 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f9x2t\" (UniqueName: \"kubernetes.io/projected/9fb1a180-8bbb-41ac-a730-9a491f508d81-kube-api-access-f9x2t\") pod \"9fb1a180-8bbb-41ac-a730-9a491f508d81\" (UID: \"9fb1a180-8bbb-41ac-a730-9a491f508d81\") " Sep 30 12:42:16 crc kubenswrapper[4672]: I0930 12:42:16.026620 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9fb1a180-8bbb-41ac-a730-9a491f508d81-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "9fb1a180-8bbb-41ac-a730-9a491f508d81" (UID: "9fb1a180-8bbb-41ac-a730-9a491f508d81"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 12:42:16 crc kubenswrapper[4672]: I0930 12:42:16.026700 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9fb1a180-8bbb-41ac-a730-9a491f508d81-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "9fb1a180-8bbb-41ac-a730-9a491f508d81" (UID: "9fb1a180-8bbb-41ac-a730-9a491f508d81"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 12:42:16 crc kubenswrapper[4672]: I0930 12:42:16.027043 4672 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9fb1a180-8bbb-41ac-a730-9a491f508d81-log-httpd\") on node \"crc\" DevicePath \"\"" Sep 30 12:42:16 crc kubenswrapper[4672]: I0930 12:42:16.027066 4672 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9fb1a180-8bbb-41ac-a730-9a491f508d81-run-httpd\") on node \"crc\" DevicePath \"\"" Sep 30 12:42:16 crc kubenswrapper[4672]: I0930 12:42:16.030771 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9fb1a180-8bbb-41ac-a730-9a491f508d81-scripts" (OuterVolumeSpecName: "scripts") pod "9fb1a180-8bbb-41ac-a730-9a491f508d81" (UID: "9fb1a180-8bbb-41ac-a730-9a491f508d81"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:42:16 crc kubenswrapper[4672]: I0930 12:42:16.031158 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9fb1a180-8bbb-41ac-a730-9a491f508d81-kube-api-access-f9x2t" (OuterVolumeSpecName: "kube-api-access-f9x2t") pod "9fb1a180-8bbb-41ac-a730-9a491f508d81" (UID: "9fb1a180-8bbb-41ac-a730-9a491f508d81"). InnerVolumeSpecName "kube-api-access-f9x2t". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 12:42:16 crc kubenswrapper[4672]: I0930 12:42:16.062780 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9fb1a180-8bbb-41ac-a730-9a491f508d81-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "9fb1a180-8bbb-41ac-a730-9a491f508d81" (UID: "9fb1a180-8bbb-41ac-a730-9a491f508d81"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:42:16 crc kubenswrapper[4672]: I0930 12:42:16.107052 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9fb1a180-8bbb-41ac-a730-9a491f508d81-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9fb1a180-8bbb-41ac-a730-9a491f508d81" (UID: "9fb1a180-8bbb-41ac-a730-9a491f508d81"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:42:16 crc kubenswrapper[4672]: I0930 12:42:16.128851 4672 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9fb1a180-8bbb-41ac-a730-9a491f508d81-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 12:42:16 crc kubenswrapper[4672]: I0930 12:42:16.128889 4672 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9fb1a180-8bbb-41ac-a730-9a491f508d81-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 12:42:16 crc kubenswrapper[4672]: I0930 12:42:16.128903 4672 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9fb1a180-8bbb-41ac-a730-9a491f508d81-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Sep 30 12:42:16 crc kubenswrapper[4672]: I0930 12:42:16.128914 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f9x2t\" (UniqueName: \"kubernetes.io/projected/9fb1a180-8bbb-41ac-a730-9a491f508d81-kube-api-access-f9x2t\") on node \"crc\" DevicePath \"\"" Sep 30 12:42:16 crc kubenswrapper[4672]: I0930 12:42:16.136202 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9fb1a180-8bbb-41ac-a730-9a491f508d81-config-data" (OuterVolumeSpecName: "config-data") pod "9fb1a180-8bbb-41ac-a730-9a491f508d81" (UID: "9fb1a180-8bbb-41ac-a730-9a491f508d81"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:42:16 crc kubenswrapper[4672]: I0930 12:42:16.230574 4672 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9fb1a180-8bbb-41ac-a730-9a491f508d81-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 12:42:16 crc kubenswrapper[4672]: I0930 12:42:16.551357 4672 generic.go:334] "Generic (PLEG): container finished" podID="d61895ae-6f7c-4a90-8a4a-36c0fb09fcb2" containerID="0a804eb630446c62f6f3a6aa97da1e1c639984140d374d30fc951165ee3ed56f" exitCode=0 Sep 30 12:42:16 crc kubenswrapper[4672]: I0930 12:42:16.551386 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d61895ae-6f7c-4a90-8a4a-36c0fb09fcb2","Type":"ContainerDied","Data":"0a804eb630446c62f6f3a6aa97da1e1c639984140d374d30fc951165ee3ed56f"} Sep 30 12:42:16 crc kubenswrapper[4672]: I0930 12:42:16.551726 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d61895ae-6f7c-4a90-8a4a-36c0fb09fcb2","Type":"ContainerDied","Data":"2179940a592de9ba758dd137f9c8f65d96296dc7ed358b71ab6e32fb2111db22"} Sep 30 12:42:16 crc kubenswrapper[4672]: I0930 12:42:16.551771 4672 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2179940a592de9ba758dd137f9c8f65d96296dc7ed358b71ab6e32fb2111db22" Sep 30 12:42:16 crc kubenswrapper[4672]: I0930 12:42:16.554562 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9fb1a180-8bbb-41ac-a730-9a491f508d81","Type":"ContainerDied","Data":"37c84de619db694733ae4f4ae4ec2288c13e3a865fb0ec2e8fa29dc4feb3b364"} Sep 30 12:42:16 crc kubenswrapper[4672]: I0930 12:42:16.554611 4672 scope.go:117] "RemoveContainer" containerID="419feab501e9fdb7ccf2ebe0ff91da151e26fdc849e6bf79bd4d66990d0843e2" Sep 30 12:42:16 crc kubenswrapper[4672]: I0930 12:42:16.554655 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Sep 30 12:42:16 crc kubenswrapper[4672]: I0930 12:42:16.658250 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Sep 30 12:42:16 crc kubenswrapper[4672]: I0930 12:42:16.671710 4672 scope.go:117] "RemoveContainer" containerID="7795aad7551b72ab414af3bca9c6937329b40c1631a005a6db340d097567bbeb" Sep 30 12:42:16 crc kubenswrapper[4672]: I0930 12:42:16.687427 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Sep 30 12:42:16 crc kubenswrapper[4672]: I0930 12:42:16.713181 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Sep 30 12:42:16 crc kubenswrapper[4672]: I0930 12:42:16.719594 4672 scope.go:117] "RemoveContainer" containerID="5bf9b830470b30a39480da91d0973c1d82601b9757e3322063d56cf01581c295" Sep 30 12:42:16 crc kubenswrapper[4672]: I0930 12:42:16.729213 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Sep 30 12:42:16 crc kubenswrapper[4672]: E0930 12:42:16.731684 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9fb1a180-8bbb-41ac-a730-9a491f508d81" containerName="proxy-httpd" Sep 30 12:42:16 crc kubenswrapper[4672]: I0930 12:42:16.731707 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="9fb1a180-8bbb-41ac-a730-9a491f508d81" containerName="proxy-httpd" Sep 30 12:42:16 crc kubenswrapper[4672]: E0930 12:42:16.731722 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9fb1a180-8bbb-41ac-a730-9a491f508d81" containerName="sg-core" Sep 30 12:42:16 crc kubenswrapper[4672]: I0930 12:42:16.731730 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="9fb1a180-8bbb-41ac-a730-9a491f508d81" containerName="sg-core" Sep 30 12:42:16 crc kubenswrapper[4672]: E0930 12:42:16.731744 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9fb1a180-8bbb-41ac-a730-9a491f508d81" containerName="ceilometer-notification-agent" Sep 30 12:42:16 crc kubenswrapper[4672]: I0930 12:42:16.731753 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="9fb1a180-8bbb-41ac-a730-9a491f508d81" containerName="ceilometer-notification-agent" Sep 30 12:42:16 crc kubenswrapper[4672]: E0930 12:42:16.731764 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d61895ae-6f7c-4a90-8a4a-36c0fb09fcb2" containerName="nova-api-api" Sep 30 12:42:16 crc kubenswrapper[4672]: I0930 12:42:16.731770 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="d61895ae-6f7c-4a90-8a4a-36c0fb09fcb2" containerName="nova-api-api" Sep 30 12:42:16 crc kubenswrapper[4672]: E0930 12:42:16.731782 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d61895ae-6f7c-4a90-8a4a-36c0fb09fcb2" containerName="nova-api-log" Sep 30 12:42:16 crc kubenswrapper[4672]: I0930 12:42:16.731789 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="d61895ae-6f7c-4a90-8a4a-36c0fb09fcb2" containerName="nova-api-log" Sep 30 12:42:16 crc kubenswrapper[4672]: E0930 12:42:16.731814 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9fb1a180-8bbb-41ac-a730-9a491f508d81" containerName="ceilometer-central-agent" Sep 30 12:42:16 crc kubenswrapper[4672]: I0930 12:42:16.731821 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="9fb1a180-8bbb-41ac-a730-9a491f508d81" containerName="ceilometer-central-agent" Sep 30 12:42:16 crc kubenswrapper[4672]: I0930 12:42:16.732018 4672 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="9fb1a180-8bbb-41ac-a730-9a491f508d81" containerName="sg-core" Sep 30 12:42:16 crc kubenswrapper[4672]: I0930 12:42:16.732029 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="d61895ae-6f7c-4a90-8a4a-36c0fb09fcb2" containerName="nova-api-log" Sep 30 12:42:16 crc kubenswrapper[4672]: I0930 12:42:16.732042 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="d61895ae-6f7c-4a90-8a4a-36c0fb09fcb2" containerName="nova-api-api" Sep 30 12:42:16 crc kubenswrapper[4672]: I0930 12:42:16.732049 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="9fb1a180-8bbb-41ac-a730-9a491f508d81" containerName="proxy-httpd" Sep 30 12:42:16 crc kubenswrapper[4672]: I0930 12:42:16.732058 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="9fb1a180-8bbb-41ac-a730-9a491f508d81" containerName="ceilometer-notification-agent" Sep 30 12:42:16 crc kubenswrapper[4672]: I0930 12:42:16.732066 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="9fb1a180-8bbb-41ac-a730-9a491f508d81" containerName="ceilometer-central-agent" Sep 30 12:42:16 crc kubenswrapper[4672]: I0930 12:42:16.734909 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Sep 30 12:42:16 crc kubenswrapper[4672]: I0930 12:42:16.737195 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Sep 30 12:42:16 crc kubenswrapper[4672]: I0930 12:42:16.739628 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Sep 30 12:42:16 crc kubenswrapper[4672]: I0930 12:42:16.742857 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d61895ae-6f7c-4a90-8a4a-36c0fb09fcb2-combined-ca-bundle\") pod \"d61895ae-6f7c-4a90-8a4a-36c0fb09fcb2\" (UID: \"d61895ae-6f7c-4a90-8a4a-36c0fb09fcb2\") " Sep 30 12:42:16 crc kubenswrapper[4672]: I0930 12:42:16.742997 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d61895ae-6f7c-4a90-8a4a-36c0fb09fcb2-config-data\") pod \"d61895ae-6f7c-4a90-8a4a-36c0fb09fcb2\" (UID: \"d61895ae-6f7c-4a90-8a4a-36c0fb09fcb2\") " Sep 30 12:42:16 crc kubenswrapper[4672]: I0930 12:42:16.743198 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d61895ae-6f7c-4a90-8a4a-36c0fb09fcb2-logs\") pod \"d61895ae-6f7c-4a90-8a4a-36c0fb09fcb2\" (UID: \"d61895ae-6f7c-4a90-8a4a-36c0fb09fcb2\") " Sep 30 12:42:16 crc kubenswrapper[4672]: I0930 12:42:16.743404 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ppm2c\" (UniqueName: \"kubernetes.io/projected/d61895ae-6f7c-4a90-8a4a-36c0fb09fcb2-kube-api-access-ppm2c\") pod \"d61895ae-6f7c-4a90-8a4a-36c0fb09fcb2\" (UID: \"d61895ae-6f7c-4a90-8a4a-36c0fb09fcb2\") " Sep 30 12:42:16 crc kubenswrapper[4672]: I0930 12:42:16.744243 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d61895ae-6f7c-4a90-8a4a-36c0fb09fcb2-logs" (OuterVolumeSpecName: "logs") pod "d61895ae-6f7c-4a90-8a4a-36c0fb09fcb2" (UID: "d61895ae-6f7c-4a90-8a4a-36c0fb09fcb2"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 12:42:16 crc kubenswrapper[4672]: I0930 12:42:16.757167 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Sep 30 12:42:16 crc kubenswrapper[4672]: I0930 12:42:16.768407 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d61895ae-6f7c-4a90-8a4a-36c0fb09fcb2-kube-api-access-ppm2c" (OuterVolumeSpecName: "kube-api-access-ppm2c") pod "d61895ae-6f7c-4a90-8a4a-36c0fb09fcb2" (UID: "d61895ae-6f7c-4a90-8a4a-36c0fb09fcb2"). InnerVolumeSpecName "kube-api-access-ppm2c". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 12:42:16 crc kubenswrapper[4672]: I0930 12:42:16.772994 4672 scope.go:117] "RemoveContainer" containerID="cdaf9886c5d1c9ffbfcfdf42d291c7908cf697c5dce1f93ccdd7c84f75629d52" Sep 30 12:42:16 crc kubenswrapper[4672]: I0930 12:42:16.784478 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d61895ae-6f7c-4a90-8a4a-36c0fb09fcb2-config-data" (OuterVolumeSpecName: "config-data") pod "d61895ae-6f7c-4a90-8a4a-36c0fb09fcb2" (UID: "d61895ae-6f7c-4a90-8a4a-36c0fb09fcb2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:42:16 crc kubenswrapper[4672]: I0930 12:42:16.827494 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d61895ae-6f7c-4a90-8a4a-36c0fb09fcb2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d61895ae-6f7c-4a90-8a4a-36c0fb09fcb2" (UID: "d61895ae-6f7c-4a90-8a4a-36c0fb09fcb2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:42:16 crc kubenswrapper[4672]: I0930 12:42:16.846954 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tgm7c\" (UniqueName: \"kubernetes.io/projected/4301988f-bce9-466c-aff1-88d33d37a9cd-kube-api-access-tgm7c\") pod \"ceilometer-0\" (UID: \"4301988f-bce9-466c-aff1-88d33d37a9cd\") " pod="openstack/ceilometer-0" Sep 30 12:42:16 crc kubenswrapper[4672]: I0930 12:42:16.851557 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4301988f-bce9-466c-aff1-88d33d37a9cd-log-httpd\") pod \"ceilometer-0\" (UID: \"4301988f-bce9-466c-aff1-88d33d37a9cd\") " pod="openstack/ceilometer-0" Sep 30 12:42:16 crc kubenswrapper[4672]: I0930 12:42:16.851673 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4301988f-bce9-466c-aff1-88d33d37a9cd-scripts\") pod \"ceilometer-0\" (UID: \"4301988f-bce9-466c-aff1-88d33d37a9cd\") " pod="openstack/ceilometer-0" Sep 30 12:42:16 crc kubenswrapper[4672]: I0930 12:42:16.851721 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4301988f-bce9-466c-aff1-88d33d37a9cd-config-data\") pod \"ceilometer-0\" (UID: \"4301988f-bce9-466c-aff1-88d33d37a9cd\") " pod="openstack/ceilometer-0" Sep 30 12:42:16 crc kubenswrapper[4672]: I0930 12:42:16.851774 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4301988f-bce9-466c-aff1-88d33d37a9cd-run-httpd\") pod \"ceilometer-0\" (UID: \"4301988f-bce9-466c-aff1-88d33d37a9cd\") " 
pod="openstack/ceilometer-0" Sep 30 12:42:16 crc kubenswrapper[4672]: I0930 12:42:16.851932 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4301988f-bce9-466c-aff1-88d33d37a9cd-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4301988f-bce9-466c-aff1-88d33d37a9cd\") " pod="openstack/ceilometer-0" Sep 30 12:42:16 crc kubenswrapper[4672]: I0930 12:42:16.852190 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4301988f-bce9-466c-aff1-88d33d37a9cd-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4301988f-bce9-466c-aff1-88d33d37a9cd\") " pod="openstack/ceilometer-0" Sep 30 12:42:16 crc kubenswrapper[4672]: I0930 12:42:16.852362 4672 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d61895ae-6f7c-4a90-8a4a-36c0fb09fcb2-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 12:42:16 crc kubenswrapper[4672]: I0930 12:42:16.852378 4672 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d61895ae-6f7c-4a90-8a4a-36c0fb09fcb2-logs\") on node \"crc\" DevicePath \"\"" Sep 30 12:42:16 crc kubenswrapper[4672]: I0930 12:42:16.852389 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ppm2c\" (UniqueName: \"kubernetes.io/projected/d61895ae-6f7c-4a90-8a4a-36c0fb09fcb2-kube-api-access-ppm2c\") on node \"crc\" DevicePath \"\"" Sep 30 12:42:16 crc kubenswrapper[4672]: I0930 12:42:16.852400 4672 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d61895ae-6f7c-4a90-8a4a-36c0fb09fcb2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 12:42:16 crc kubenswrapper[4672]: I0930 12:42:16.954384 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4301988f-bce9-466c-aff1-88d33d37a9cd-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4301988f-bce9-466c-aff1-88d33d37a9cd\") " pod="openstack/ceilometer-0" Sep 30 12:42:16 crc kubenswrapper[4672]: I0930 12:42:16.955097 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tgm7c\" (UniqueName: \"kubernetes.io/projected/4301988f-bce9-466c-aff1-88d33d37a9cd-kube-api-access-tgm7c\") pod \"ceilometer-0\" (UID: \"4301988f-bce9-466c-aff1-88d33d37a9cd\") " pod="openstack/ceilometer-0" Sep 30 12:42:16 crc kubenswrapper[4672]: I0930 12:42:16.955148 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4301988f-bce9-466c-aff1-88d33d37a9cd-log-httpd\") pod \"ceilometer-0\" (UID: \"4301988f-bce9-466c-aff1-88d33d37a9cd\") " pod="openstack/ceilometer-0" Sep 30 12:42:16 crc kubenswrapper[4672]: I0930 12:42:16.955182 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4301988f-bce9-466c-aff1-88d33d37a9cd-scripts\") pod \"ceilometer-0\" (UID: \"4301988f-bce9-466c-aff1-88d33d37a9cd\") " pod="openstack/ceilometer-0" Sep 30 12:42:16 crc kubenswrapper[4672]: I0930 12:42:16.955212 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4301988f-bce9-466c-aff1-88d33d37a9cd-config-data\") pod \"ceilometer-0\" (UID: 
\"4301988f-bce9-466c-aff1-88d33d37a9cd\") " pod="openstack/ceilometer-0" Sep 30 12:42:16 crc kubenswrapper[4672]: I0930 12:42:16.955245 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4301988f-bce9-466c-aff1-88d33d37a9cd-run-httpd\") pod \"ceilometer-0\" (UID: \"4301988f-bce9-466c-aff1-88d33d37a9cd\") " pod="openstack/ceilometer-0" Sep 30 12:42:16 crc kubenswrapper[4672]: I0930 12:42:16.955339 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4301988f-bce9-466c-aff1-88d33d37a9cd-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4301988f-bce9-466c-aff1-88d33d37a9cd\") " pod="openstack/ceilometer-0" Sep 30 12:42:16 crc kubenswrapper[4672]: I0930 12:42:16.955987 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4301988f-bce9-466c-aff1-88d33d37a9cd-log-httpd\") pod \"ceilometer-0\" (UID: \"4301988f-bce9-466c-aff1-88d33d37a9cd\") " pod="openstack/ceilometer-0" Sep 30 12:42:16 crc kubenswrapper[4672]: I0930 12:42:16.956432 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4301988f-bce9-466c-aff1-88d33d37a9cd-run-httpd\") pod \"ceilometer-0\" (UID: \"4301988f-bce9-466c-aff1-88d33d37a9cd\") " pod="openstack/ceilometer-0" Sep 30 12:42:16 crc kubenswrapper[4672]: I0930 12:42:16.959746 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4301988f-bce9-466c-aff1-88d33d37a9cd-scripts\") pod \"ceilometer-0\" (UID: \"4301988f-bce9-466c-aff1-88d33d37a9cd\") " pod="openstack/ceilometer-0" Sep 30 12:42:16 crc kubenswrapper[4672]: I0930 12:42:16.959967 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4301988f-bce9-466c-aff1-88d33d37a9cd-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4301988f-bce9-466c-aff1-88d33d37a9cd\") " pod="openstack/ceilometer-0" Sep 30 12:42:16 crc kubenswrapper[4672]: I0930 12:42:16.960131 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4301988f-bce9-466c-aff1-88d33d37a9cd-config-data\") pod \"ceilometer-0\" (UID: \"4301988f-bce9-466c-aff1-88d33d37a9cd\") " pod="openstack/ceilometer-0" Sep 30 12:42:16 crc kubenswrapper[4672]: I0930 12:42:16.960340 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4301988f-bce9-466c-aff1-88d33d37a9cd-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4301988f-bce9-466c-aff1-88d33d37a9cd\") " pod="openstack/ceilometer-0" Sep 30 12:42:16 crc kubenswrapper[4672]: I0930 12:42:16.975897 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tgm7c\" (UniqueName: \"kubernetes.io/projected/4301988f-bce9-466c-aff1-88d33d37a9cd-kube-api-access-tgm7c\") pod \"ceilometer-0\" (UID: \"4301988f-bce9-466c-aff1-88d33d37a9cd\") " pod="openstack/ceilometer-0" Sep 30 12:42:17 crc kubenswrapper[4672]: I0930 12:42:17.164876 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Sep 30 12:42:17 crc kubenswrapper[4672]: I0930 12:42:17.434141 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9fb1a180-8bbb-41ac-a730-9a491f508d81" path="/var/lib/kubelet/pods/9fb1a180-8bbb-41ac-a730-9a491f508d81/volumes" Sep 30 12:42:17 crc kubenswrapper[4672]: I0930 12:42:17.567604 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Sep 30 12:42:17 crc kubenswrapper[4672]: I0930 12:42:17.619692 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Sep 30 12:42:17 crc kubenswrapper[4672]: I0930 12:42:17.658977 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Sep 30 12:42:17 crc kubenswrapper[4672]: W0930 12:42:17.668405 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4301988f_bce9_466c_aff1_88d33d37a9cd.slice/crio-a284d04037fc82b693d6cf7811dc2baa7ee7acf23c85bc38fc4e0521c7872e59 WatchSource:0}: Error finding container a284d04037fc82b693d6cf7811dc2baa7ee7acf23c85bc38fc4e0521c7872e59: Status 404 returned error can't find the container with id a284d04037fc82b693d6cf7811dc2baa7ee7acf23c85bc38fc4e0521c7872e59 Sep 30 12:42:17 crc kubenswrapper[4672]: I0930 12:42:17.673659 4672 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Sep 30 12:42:17 crc kubenswrapper[4672]: I0930 12:42:17.681840 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Sep 30 12:42:17 crc kubenswrapper[4672]: I0930 12:42:17.686835 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Sep 30 12:42:17 crc kubenswrapper[4672]: I0930 12:42:17.690041 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Sep 30 12:42:17 crc kubenswrapper[4672]: I0930 12:42:17.691040 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Sep 30 12:42:17 crc kubenswrapper[4672]: I0930 12:42:17.693108 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Sep 30 12:42:17 crc kubenswrapper[4672]: I0930 12:42:17.696787 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Sep 30 12:42:17 crc kubenswrapper[4672]: I0930 12:42:17.700134 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Sep 30 12:42:17 crc kubenswrapper[4672]: I0930 12:42:17.769748 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/555aaf1f-ce58-4bf6-bade-9424a52fda72-logs\") pod \"nova-api-0\" (UID: \"555aaf1f-ce58-4bf6-bade-9424a52fda72\") " pod="openstack/nova-api-0" Sep 30 12:42:17 crc kubenswrapper[4672]: I0930 12:42:17.769994 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/555aaf1f-ce58-4bf6-bade-9424a52fda72-config-data\") pod \"nova-api-0\" (UID: \"555aaf1f-ce58-4bf6-bade-9424a52fda72\") " pod="openstack/nova-api-0" Sep 30 12:42:17 crc kubenswrapper[4672]: I0930 12:42:17.770205 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/555aaf1f-ce58-4bf6-bade-9424a52fda72-public-tls-certs\") pod \"nova-api-0\" (UID: \"555aaf1f-ce58-4bf6-bade-9424a52fda72\") " pod="openstack/nova-api-0" Sep 30 12:42:17 crc kubenswrapper[4672]: I0930 12:42:17.770358 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2lndz\" (UniqueName: \"kubernetes.io/projected/555aaf1f-ce58-4bf6-bade-9424a52fda72-kube-api-access-2lndz\") pod \"nova-api-0\" (UID: \"555aaf1f-ce58-4bf6-bade-9424a52fda72\") " pod="openstack/nova-api-0" Sep 30 12:42:17 crc kubenswrapper[4672]: I0930 12:42:17.770398 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/555aaf1f-ce58-4bf6-bade-9424a52fda72-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"555aaf1f-ce58-4bf6-bade-9424a52fda72\") " pod="openstack/nova-api-0" Sep 30 12:42:17 crc kubenswrapper[4672]: I0930 12:42:17.770503 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/555aaf1f-ce58-4bf6-bade-9424a52fda72-internal-tls-certs\") pod \"nova-api-0\" (UID: \"555aaf1f-ce58-4bf6-bade-9424a52fda72\") " pod="openstack/nova-api-0" Sep 30 12:42:17 crc kubenswrapper[4672]: I0930 12:42:17.871941 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/555aaf1f-ce58-4bf6-bade-9424a52fda72-config-data\") pod \"nova-api-0\" (UID: \"555aaf1f-ce58-4bf6-bade-9424a52fda72\") " pod="openstack/nova-api-0" Sep 30 12:42:17 crc kubenswrapper[4672]: I0930 12:42:17.872052 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/555aaf1f-ce58-4bf6-bade-9424a52fda72-public-tls-certs\") pod \"nova-api-0\" (UID: \"555aaf1f-ce58-4bf6-bade-9424a52fda72\") " pod="openstack/nova-api-0" Sep 30 12:42:17 crc kubenswrapper[4672]: I0930 12:42:17.872137 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2lndz\" (UniqueName: \"kubernetes.io/projected/555aaf1f-ce58-4bf6-bade-9424a52fda72-kube-api-access-2lndz\") pod \"nova-api-0\" (UID: \"555aaf1f-ce58-4bf6-bade-9424a52fda72\") " pod="openstack/nova-api-0" Sep 30 12:42:17 crc kubenswrapper[4672]: I0930 12:42:17.872165 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/555aaf1f-ce58-4bf6-bade-9424a52fda72-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"555aaf1f-ce58-4bf6-bade-9424a52fda72\") " pod="openstack/nova-api-0" Sep 30 12:42:17 crc kubenswrapper[4672]: I0930 12:42:17.872224 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/555aaf1f-ce58-4bf6-bade-9424a52fda72-internal-tls-certs\") pod \"nova-api-0\" (UID: \"555aaf1f-ce58-4bf6-bade-9424a52fda72\") " pod="openstack/nova-api-0" Sep 30 12:42:17 crc kubenswrapper[4672]: I0930 12:42:17.872247 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/555aaf1f-ce58-4bf6-bade-9424a52fda72-logs\") pod \"nova-api-0\" (UID: \"555aaf1f-ce58-4bf6-bade-9424a52fda72\") " pod="openstack/nova-api-0" Sep 30 12:42:17 crc kubenswrapper[4672]: I0930 12:42:17.873150 4672 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/555aaf1f-ce58-4bf6-bade-9424a52fda72-logs\") pod \"nova-api-0\" (UID: \"555aaf1f-ce58-4bf6-bade-9424a52fda72\") " pod="openstack/nova-api-0" Sep 30 12:42:17 crc kubenswrapper[4672]: I0930 12:42:17.876583 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/555aaf1f-ce58-4bf6-bade-9424a52fda72-public-tls-certs\") pod \"nova-api-0\" (UID: \"555aaf1f-ce58-4bf6-bade-9424a52fda72\") " pod="openstack/nova-api-0" Sep 30 12:42:17 crc kubenswrapper[4672]: I0930 12:42:17.876941 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/555aaf1f-ce58-4bf6-bade-9424a52fda72-config-data\") pod \"nova-api-0\" (UID: \"555aaf1f-ce58-4bf6-bade-9424a52fda72\") " pod="openstack/nova-api-0" Sep 30 12:42:17 crc kubenswrapper[4672]: I0930 12:42:17.878370 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/555aaf1f-ce58-4bf6-bade-9424a52fda72-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"555aaf1f-ce58-4bf6-bade-9424a52fda72\") " pod="openstack/nova-api-0" Sep 30 12:42:17 crc kubenswrapper[4672]: I0930 12:42:17.879325 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/555aaf1f-ce58-4bf6-bade-9424a52fda72-internal-tls-certs\") pod \"nova-api-0\" (UID: \"555aaf1f-ce58-4bf6-bade-9424a52fda72\") " pod="openstack/nova-api-0" Sep 30 12:42:17 crc kubenswrapper[4672]: I0930 12:42:17.892737 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2lndz\" (UniqueName: \"kubernetes.io/projected/555aaf1f-ce58-4bf6-bade-9424a52fda72-kube-api-access-2lndz\") pod \"nova-api-0\" (UID: \"555aaf1f-ce58-4bf6-bade-9424a52fda72\") " pod="openstack/nova-api-0" Sep 30 12:42:18 crc kubenswrapper[4672]: I0930 12:42:18.032241 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Sep 30 12:42:18 crc kubenswrapper[4672]: W0930 12:42:18.493749 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod555aaf1f_ce58_4bf6_bade_9424a52fda72.slice/crio-8a0df1249ffb21d85aa536501281b9cd6915e73a2e4dced43a5f54007a1b882f WatchSource:0}: Error finding container 8a0df1249ffb21d85aa536501281b9cd6915e73a2e4dced43a5f54007a1b882f: Status 404 returned error can't find the container with id 8a0df1249ffb21d85aa536501281b9cd6915e73a2e4dced43a5f54007a1b882f Sep 30 12:42:18 crc kubenswrapper[4672]: I0930 12:42:18.494371 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Sep 30 12:42:18 crc kubenswrapper[4672]: I0930 12:42:18.586336 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4301988f-bce9-466c-aff1-88d33d37a9cd","Type":"ContainerStarted","Data":"f5005a50b2dd647d69fedc86c99a8aeb6afa84fb8dfb0fb273380dad6efec93d"} Sep 30 12:42:18 crc kubenswrapper[4672]: I0930 12:42:18.586392 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4301988f-bce9-466c-aff1-88d33d37a9cd","Type":"ContainerStarted","Data":"4f71f37b27aa15dc7ac9c333178f5322ed9af961d05739baea2738149c5d62f6"} Sep 30 12:42:18 crc kubenswrapper[4672]: I0930 12:42:18.586402 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4301988f-bce9-466c-aff1-88d33d37a9cd","Type":"ContainerStarted","Data":"a284d04037fc82b693d6cf7811dc2baa7ee7acf23c85bc38fc4e0521c7872e59"} Sep 30 12:42:18 crc kubenswrapper[4672]: I0930 12:42:18.592396 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"555aaf1f-ce58-4bf6-bade-9424a52fda72","Type":"ContainerStarted","Data":"8a0df1249ffb21d85aa536501281b9cd6915e73a2e4dced43a5f54007a1b882f"} Sep 30 12:42:19 crc kubenswrapper[4672]: I0930 12:42:19.177566 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Sep 30 12:42:19 crc kubenswrapper[4672]: I0930 12:42:19.202684 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Sep 30 12:42:19 crc kubenswrapper[4672]: I0930 12:42:19.434716 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d61895ae-6f7c-4a90-8a4a-36c0fb09fcb2" path="/var/lib/kubelet/pods/d61895ae-6f7c-4a90-8a4a-36c0fb09fcb2/volumes" Sep 30 12:42:19 crc kubenswrapper[4672]: I0930 12:42:19.606124 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"555aaf1f-ce58-4bf6-bade-9424a52fda72","Type":"ContainerStarted","Data":"301195ab898db5d99ed204951efcb863e1d71c7471682a73aa1af10a6a2eb25b"} Sep 30 12:42:19 crc kubenswrapper[4672]: I0930 12:42:19.606165 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"555aaf1f-ce58-4bf6-bade-9424a52fda72","Type":"ContainerStarted","Data":"312b39ae3736a98c292a0ddd0552e6d41d45ac1f475f54518554962adf2ab8e6"} Sep 30 12:42:19 crc kubenswrapper[4672]: I0930 12:42:19.612605 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4301988f-bce9-466c-aff1-88d33d37a9cd","Type":"ContainerStarted","Data":"a147b49ca00be20407da7f1d8104ae66a59b7b7ec4762b75d66a6ae096807c4d"} Sep 30 12:42:19 crc kubenswrapper[4672]: I0930 12:42:19.625711 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openstack/nova-cell1-novncproxy-0" Sep 30 12:42:19 crc kubenswrapper[4672]: I0930 12:42:19.632904 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.632883884 podStartE2EDuration="2.632883884s" podCreationTimestamp="2025-09-30 12:42:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 12:42:19.628688908 +0000 UTC m=+1230.897926574" watchObservedRunningTime="2025-09-30 12:42:19.632883884 +0000 UTC m=+1230.902121560" Sep 30 12:42:19 crc kubenswrapper[4672]: I0930 12:42:19.789184 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-z4wwh"] Sep 30 12:42:19 crc kubenswrapper[4672]: I0930 12:42:19.790460 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-z4wwh" Sep 30 12:42:19 crc kubenswrapper[4672]: I0930 12:42:19.793797 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Sep 30 12:42:19 crc kubenswrapper[4672]: I0930 12:42:19.794012 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Sep 30 12:42:19 crc kubenswrapper[4672]: I0930 12:42:19.810631 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-z4wwh"] Sep 30 12:42:19 crc kubenswrapper[4672]: I0930 12:42:19.916684 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/abf258f0-0d20-49a2-8a89-dd52cfdda97e-scripts\") pod \"nova-cell1-cell-mapping-z4wwh\" (UID: \"abf258f0-0d20-49a2-8a89-dd52cfdda97e\") " pod="openstack/nova-cell1-cell-mapping-z4wwh" Sep 30 12:42:19 crc kubenswrapper[4672]: I0930 12:42:19.917090 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abf258f0-0d20-49a2-8a89-dd52cfdda97e-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-z4wwh\" (UID: \"abf258f0-0d20-49a2-8a89-dd52cfdda97e\") " pod="openstack/nova-cell1-cell-mapping-z4wwh" Sep 30 12:42:19 crc kubenswrapper[4672]: I0930 12:42:19.917152 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rhhf8\" (UniqueName: \"kubernetes.io/projected/abf258f0-0d20-49a2-8a89-dd52cfdda97e-kube-api-access-rhhf8\") pod \"nova-cell1-cell-mapping-z4wwh\" (UID: \"abf258f0-0d20-49a2-8a89-dd52cfdda97e\") " pod="openstack/nova-cell1-cell-mapping-z4wwh" Sep 30 12:42:19 crc kubenswrapper[4672]: I0930 12:42:19.917307 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/abf258f0-0d20-49a2-8a89-dd52cfdda97e-config-data\") pod \"nova-cell1-cell-mapping-z4wwh\" (UID: \"abf258f0-0d20-49a2-8a89-dd52cfdda97e\") " pod="openstack/nova-cell1-cell-mapping-z4wwh" Sep 30 12:42:20 crc kubenswrapper[4672]: I0930 12:42:20.018981 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abf258f0-0d20-49a2-8a89-dd52cfdda97e-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-z4wwh\" (UID: \"abf258f0-0d20-49a2-8a89-dd52cfdda97e\") " pod="openstack/nova-cell1-cell-mapping-z4wwh" Sep 30 12:42:20 crc kubenswrapper[4672]: I0930 12:42:20.019033 
4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rhhf8\" (UniqueName: \"kubernetes.io/projected/abf258f0-0d20-49a2-8a89-dd52cfdda97e-kube-api-access-rhhf8\") pod \"nova-cell1-cell-mapping-z4wwh\" (UID: \"abf258f0-0d20-49a2-8a89-dd52cfdda97e\") " pod="openstack/nova-cell1-cell-mapping-z4wwh" Sep 30 12:42:20 crc kubenswrapper[4672]: I0930 12:42:20.019515 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/abf258f0-0d20-49a2-8a89-dd52cfdda97e-config-data\") pod \"nova-cell1-cell-mapping-z4wwh\" (UID: \"abf258f0-0d20-49a2-8a89-dd52cfdda97e\") " pod="openstack/nova-cell1-cell-mapping-z4wwh" Sep 30 12:42:20 crc kubenswrapper[4672]: I0930 12:42:20.019642 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/abf258f0-0d20-49a2-8a89-dd52cfdda97e-scripts\") pod \"nova-cell1-cell-mapping-z4wwh\" (UID: \"abf258f0-0d20-49a2-8a89-dd52cfdda97e\") " pod="openstack/nova-cell1-cell-mapping-z4wwh" Sep 30 12:42:20 crc kubenswrapper[4672]: I0930 12:42:20.024700 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/abf258f0-0d20-49a2-8a89-dd52cfdda97e-scripts\") pod \"nova-cell1-cell-mapping-z4wwh\" (UID: \"abf258f0-0d20-49a2-8a89-dd52cfdda97e\") " pod="openstack/nova-cell1-cell-mapping-z4wwh" Sep 30 12:42:20 crc kubenswrapper[4672]: I0930 12:42:20.024897 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abf258f0-0d20-49a2-8a89-dd52cfdda97e-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-z4wwh\" (UID: \"abf258f0-0d20-49a2-8a89-dd52cfdda97e\") " pod="openstack/nova-cell1-cell-mapping-z4wwh" Sep 30 12:42:20 crc kubenswrapper[4672]: I0930 12:42:20.032022 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/abf258f0-0d20-49a2-8a89-dd52cfdda97e-config-data\") pod \"nova-cell1-cell-mapping-z4wwh\" (UID: \"abf258f0-0d20-49a2-8a89-dd52cfdda97e\") " pod="openstack/nova-cell1-cell-mapping-z4wwh" Sep 30 12:42:20 crc kubenswrapper[4672]: I0930 12:42:20.058252 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rhhf8\" (UniqueName: \"kubernetes.io/projected/abf258f0-0d20-49a2-8a89-dd52cfdda97e-kube-api-access-rhhf8\") pod \"nova-cell1-cell-mapping-z4wwh\" (UID: \"abf258f0-0d20-49a2-8a89-dd52cfdda97e\") " pod="openstack/nova-cell1-cell-mapping-z4wwh" Sep 30 12:42:20 crc kubenswrapper[4672]: I0930 12:42:20.119472 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-z4wwh" Sep 30 12:42:20 crc kubenswrapper[4672]: I0930 12:42:20.615308 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-z4wwh"] Sep 30 12:42:21 crc kubenswrapper[4672]: I0930 12:42:21.655922 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-z4wwh" event={"ID":"abf258f0-0d20-49a2-8a89-dd52cfdda97e","Type":"ContainerStarted","Data":"14803b06d879f0811e4ee0ec40174d5baed3839d680e7c6d0030e652fec94f14"} Sep 30 12:42:21 crc kubenswrapper[4672]: I0930 12:42:21.656479 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-z4wwh" event={"ID":"abf258f0-0d20-49a2-8a89-dd52cfdda97e","Type":"ContainerStarted","Data":"0dd7e6ecd9191abfc3e3d8c734efefa093bcf125a5907e57c4c0812db4b423a7"} Sep 30 12:42:21 crc kubenswrapper[4672]: I0930 12:42:21.662322 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4301988f-bce9-466c-aff1-88d33d37a9cd","Type":"ContainerStarted","Data":"ea3e9e574077158ada36b43bf98401faf2e3d0690fb9256caf14981e0fcf2dbb"} Sep 30 12:42:21 crc kubenswrapper[4672]: I0930 12:42:21.662536 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Sep 30 12:42:21 crc kubenswrapper[4672]: I0930 12:42:21.682376 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-z4wwh" podStartSLOduration=2.682358208 podStartE2EDuration="2.682358208s" podCreationTimestamp="2025-09-30 12:42:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 12:42:21.677669499 +0000 UTC m=+1232.946907155" watchObservedRunningTime="2025-09-30 12:42:21.682358208 +0000 UTC m=+1232.951595854" Sep 30 12:42:21 crc kubenswrapper[4672]: I0930 12:42:21.713438 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.653754552 podStartE2EDuration="5.713420972s" podCreationTimestamp="2025-09-30 12:42:16 +0000 UTC" firstStartedPulling="2025-09-30 12:42:17.673288467 +0000 UTC m=+1228.942526113" lastFinishedPulling="2025-09-30 12:42:20.732954887 +0000 UTC m=+1232.002192533" observedRunningTime="2025-09-30 12:42:21.712462788 +0000 UTC m=+1232.981700434" watchObservedRunningTime="2025-09-30 12:42:21.713420972 +0000 UTC m=+1232.982658638" Sep 30 12:42:22 crc kubenswrapper[4672]: I0930 12:42:22.025601 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-8b9c78747-bt7cg" Sep 30 12:42:22 crc kubenswrapper[4672]: I0930 12:42:22.106097 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-849fbb457f-6lw7x"] Sep 30 12:42:22 crc kubenswrapper[4672]: I0930 12:42:22.106446 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-849fbb457f-6lw7x" podUID="becf3fb1-512b-41ac-bfa1-e0da6204bcda" containerName="dnsmasq-dns" containerID="cri-o://727b7ddf43148d767a5094fce03761f89c3e9425add9803090efbd4bf7f2eed1" gracePeriod=10 Sep 30 12:42:22 crc kubenswrapper[4672]: I0930 12:42:22.683350 4672 generic.go:334] "Generic (PLEG): container finished" podID="becf3fb1-512b-41ac-bfa1-e0da6204bcda" containerID="727b7ddf43148d767a5094fce03761f89c3e9425add9803090efbd4bf7f2eed1" exitCode=0 Sep 30 12:42:22 crc kubenswrapper[4672]: I0930 12:42:22.683496 4672 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-849fbb457f-6lw7x" event={"ID":"becf3fb1-512b-41ac-bfa1-e0da6204bcda","Type":"ContainerDied","Data":"727b7ddf43148d767a5094fce03761f89c3e9425add9803090efbd4bf7f2eed1"} Sep 30 12:42:22 crc kubenswrapper[4672]: I0930 12:42:22.684116 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-849fbb457f-6lw7x" event={"ID":"becf3fb1-512b-41ac-bfa1-e0da6204bcda","Type":"ContainerDied","Data":"d8b681f7cfc279b7f90d1a590b8bd0749b24919c0fb5bb14a2149445db745e81"} Sep 30 12:42:22 crc kubenswrapper[4672]: I0930 12:42:22.684144 4672 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d8b681f7cfc279b7f90d1a590b8bd0749b24919c0fb5bb14a2149445db745e81" Sep 30 12:42:22 crc kubenswrapper[4672]: I0930 12:42:22.779781 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-849fbb457f-6lw7x" Sep 30 12:42:22 crc kubenswrapper[4672]: I0930 12:42:22.880199 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-flnlg\" (UniqueName: \"kubernetes.io/projected/becf3fb1-512b-41ac-bfa1-e0da6204bcda-kube-api-access-flnlg\") pod \"becf3fb1-512b-41ac-bfa1-e0da6204bcda\" (UID: \"becf3fb1-512b-41ac-bfa1-e0da6204bcda\") " Sep 30 12:42:22 crc kubenswrapper[4672]: I0930 12:42:22.880258 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/becf3fb1-512b-41ac-bfa1-e0da6204bcda-config\") pod \"becf3fb1-512b-41ac-bfa1-e0da6204bcda\" (UID: \"becf3fb1-512b-41ac-bfa1-e0da6204bcda\") " Sep 30 12:42:22 crc kubenswrapper[4672]: I0930 12:42:22.880373 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/becf3fb1-512b-41ac-bfa1-e0da6204bcda-ovsdbserver-nb\") pod \"becf3fb1-512b-41ac-bfa1-e0da6204bcda\" (UID: \"becf3fb1-512b-41ac-bfa1-e0da6204bcda\") " Sep 30 12:42:22 crc kubenswrapper[4672]: I0930 12:42:22.880466 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/becf3fb1-512b-41ac-bfa1-e0da6204bcda-dns-svc\") pod \"becf3fb1-512b-41ac-bfa1-e0da6204bcda\" (UID: \"becf3fb1-512b-41ac-bfa1-e0da6204bcda\") " Sep 30 12:42:22 crc kubenswrapper[4672]: I0930 12:42:22.880515 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/becf3fb1-512b-41ac-bfa1-e0da6204bcda-dns-swift-storage-0\") pod \"becf3fb1-512b-41ac-bfa1-e0da6204bcda\" (UID: \"becf3fb1-512b-41ac-bfa1-e0da6204bcda\") " Sep 30 12:42:22 crc kubenswrapper[4672]: I0930 12:42:22.880565 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/becf3fb1-512b-41ac-bfa1-e0da6204bcda-ovsdbserver-sb\") pod \"becf3fb1-512b-41ac-bfa1-e0da6204bcda\" (UID: \"becf3fb1-512b-41ac-bfa1-e0da6204bcda\") " Sep 30 12:42:22 crc kubenswrapper[4672]: I0930 12:42:22.911985 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/becf3fb1-512b-41ac-bfa1-e0da6204bcda-kube-api-access-flnlg" (OuterVolumeSpecName: "kube-api-access-flnlg") pod "becf3fb1-512b-41ac-bfa1-e0da6204bcda" (UID: "becf3fb1-512b-41ac-bfa1-e0da6204bcda"). InnerVolumeSpecName "kube-api-access-flnlg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 12:42:22 crc kubenswrapper[4672]: I0930 12:42:22.953020 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/becf3fb1-512b-41ac-bfa1-e0da6204bcda-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "becf3fb1-512b-41ac-bfa1-e0da6204bcda" (UID: "becf3fb1-512b-41ac-bfa1-e0da6204bcda"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 12:42:22 crc kubenswrapper[4672]: I0930 12:42:22.983219 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-flnlg\" (UniqueName: \"kubernetes.io/projected/becf3fb1-512b-41ac-bfa1-e0da6204bcda-kube-api-access-flnlg\") on node \"crc\" DevicePath \"\"" Sep 30 12:42:22 crc kubenswrapper[4672]: I0930 12:42:22.983356 4672 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/becf3fb1-512b-41ac-bfa1-e0da6204bcda-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 30 12:42:22 crc kubenswrapper[4672]: I0930 12:42:22.986381 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/becf3fb1-512b-41ac-bfa1-e0da6204bcda-config" (OuterVolumeSpecName: "config") pod "becf3fb1-512b-41ac-bfa1-e0da6204bcda" (UID: "becf3fb1-512b-41ac-bfa1-e0da6204bcda"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 12:42:22 crc kubenswrapper[4672]: I0930 12:42:22.996070 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/becf3fb1-512b-41ac-bfa1-e0da6204bcda-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "becf3fb1-512b-41ac-bfa1-e0da6204bcda" (UID: "becf3fb1-512b-41ac-bfa1-e0da6204bcda"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 12:42:22 crc kubenswrapper[4672]: I0930 12:42:22.996786 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/becf3fb1-512b-41ac-bfa1-e0da6204bcda-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "becf3fb1-512b-41ac-bfa1-e0da6204bcda" (UID: "becf3fb1-512b-41ac-bfa1-e0da6204bcda"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 12:42:23 crc kubenswrapper[4672]: I0930 12:42:23.006961 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/becf3fb1-512b-41ac-bfa1-e0da6204bcda-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "becf3fb1-512b-41ac-bfa1-e0da6204bcda" (UID: "becf3fb1-512b-41ac-bfa1-e0da6204bcda"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 12:42:23 crc kubenswrapper[4672]: I0930 12:42:23.088579 4672 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/becf3fb1-512b-41ac-bfa1-e0da6204bcda-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Sep 30 12:42:23 crc kubenswrapper[4672]: I0930 12:42:23.088615 4672 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/becf3fb1-512b-41ac-bfa1-e0da6204bcda-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Sep 30 12:42:23 crc kubenswrapper[4672]: I0930 12:42:23.088627 4672 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/becf3fb1-512b-41ac-bfa1-e0da6204bcda-config\") on node \"crc\" DevicePath \"\"" Sep 30 12:42:23 crc kubenswrapper[4672]: I0930 12:42:23.088636 4672 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/becf3fb1-512b-41ac-bfa1-e0da6204bcda-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Sep 30 12:42:23 crc kubenswrapper[4672]: I0930 12:42:23.691711 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-849fbb457f-6lw7x" Sep 30 12:42:23 crc kubenswrapper[4672]: I0930 12:42:23.719368 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-849fbb457f-6lw7x"] Sep 30 12:42:23 crc kubenswrapper[4672]: I0930 12:42:23.729079 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-849fbb457f-6lw7x"] Sep 30 12:42:24 crc kubenswrapper[4672]: I0930 12:42:24.739226 4672 patch_prober.go:28] interesting pod/machine-config-daemon-dpqrd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 12:42:24 crc kubenswrapper[4672]: I0930 12:42:24.739592 4672 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 12:42:24 crc kubenswrapper[4672]: I0930 12:42:24.739635 4672 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" Sep 30 12:42:24 crc kubenswrapper[4672]: I0930 12:42:24.740334 4672 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c2e0fa5817adc74311f8929edf2f7fe8a5d38b2926c430c80278916d7abc9d3a"} pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 30 12:42:24 crc kubenswrapper[4672]: I0930 12:42:24.740381 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" containerName="machine-config-daemon" containerID="cri-o://c2e0fa5817adc74311f8929edf2f7fe8a5d38b2926c430c80278916d7abc9d3a" gracePeriod=600 Sep 30 12:42:25 crc kubenswrapper[4672]: I0930 12:42:25.430733 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="becf3fb1-512b-41ac-bfa1-e0da6204bcda" path="/var/lib/kubelet/pods/becf3fb1-512b-41ac-bfa1-e0da6204bcda/volumes" Sep 30 12:42:25 crc kubenswrapper[4672]: I0930 12:42:25.711538 4672 generic.go:334] "Generic (PLEG): container finished" podID="95794952-d817-48f2-8956-f7a310f8d1d9" containerID="c2e0fa5817adc74311f8929edf2f7fe8a5d38b2926c430c80278916d7abc9d3a" exitCode=0 Sep 30 12:42:25 crc kubenswrapper[4672]: I0930 12:42:25.711583 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" event={"ID":"95794952-d817-48f2-8956-f7a310f8d1d9","Type":"ContainerDied","Data":"c2e0fa5817adc74311f8929edf2f7fe8a5d38b2926c430c80278916d7abc9d3a"} Sep 30 12:42:25 crc kubenswrapper[4672]: I0930 12:42:25.711893 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" event={"ID":"95794952-d817-48f2-8956-f7a310f8d1d9","Type":"ContainerStarted","Data":"dae9c6261457b92ffd63bbf3a8283d8d78ab2aabf36d75dffc0e7ed0851a4fe4"} Sep 30 12:42:25 crc kubenswrapper[4672]: I0930 12:42:25.711918 4672 scope.go:117] "RemoveContainer" containerID="1712933e94420da648f449968b73ced3cfbd2790d2d92518ca79624030de9f70" Sep 30 12:42:27 crc kubenswrapper[4672]: I0930 12:42:27.519451 4672 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-849fbb457f-6lw7x" podUID="becf3fb1-512b-41ac-bfa1-e0da6204bcda" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.207:5353: i/o timeout" Sep 30 12:42:27 crc kubenswrapper[4672]: I0930 12:42:27.736585 4672 generic.go:334] "Generic (PLEG): container finished" podID="abf258f0-0d20-49a2-8a89-dd52cfdda97e" containerID="14803b06d879f0811e4ee0ec40174d5baed3839d680e7c6d0030e652fec94f14" exitCode=0 Sep 30 12:42:27 crc kubenswrapper[4672]: I0930 12:42:27.736624 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-z4wwh" event={"ID":"abf258f0-0d20-49a2-8a89-dd52cfdda97e","Type":"ContainerDied","Data":"14803b06d879f0811e4ee0ec40174d5baed3839d680e7c6d0030e652fec94f14"} Sep 30 12:42:28 crc kubenswrapper[4672]: I0930 12:42:28.033420 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Sep 30 12:42:28 crc kubenswrapper[4672]: I0930 12:42:28.033735 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Sep 30 12:42:29 crc kubenswrapper[4672]: I0930 12:42:29.053459 4672 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="555aaf1f-ce58-4bf6-bade-9424a52fda72" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.218:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Sep 30 12:42:29 crc kubenswrapper[4672]: I0930 12:42:29.053458 4672 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="555aaf1f-ce58-4bf6-bade-9424a52fda72" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.218:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Sep 30 12:42:29 crc kubenswrapper[4672]: I0930 12:42:29.183998 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-z4wwh" Sep 30 12:42:29 crc kubenswrapper[4672]: I0930 12:42:29.334603 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abf258f0-0d20-49a2-8a89-dd52cfdda97e-combined-ca-bundle\") pod \"abf258f0-0d20-49a2-8a89-dd52cfdda97e\" (UID: \"abf258f0-0d20-49a2-8a89-dd52cfdda97e\") " Sep 30 12:42:29 crc kubenswrapper[4672]: I0930 12:42:29.334716 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/abf258f0-0d20-49a2-8a89-dd52cfdda97e-config-data\") pod \"abf258f0-0d20-49a2-8a89-dd52cfdda97e\" (UID: \"abf258f0-0d20-49a2-8a89-dd52cfdda97e\") " Sep 30 12:42:29 crc kubenswrapper[4672]: I0930 12:42:29.334898 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rhhf8\" (UniqueName: \"kubernetes.io/projected/abf258f0-0d20-49a2-8a89-dd52cfdda97e-kube-api-access-rhhf8\") pod \"abf258f0-0d20-49a2-8a89-dd52cfdda97e\" (UID: \"abf258f0-0d20-49a2-8a89-dd52cfdda97e\") " Sep 30 12:42:29 crc kubenswrapper[4672]: I0930 12:42:29.334957 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/abf258f0-0d20-49a2-8a89-dd52cfdda97e-scripts\") pod \"abf258f0-0d20-49a2-8a89-dd52cfdda97e\" (UID: \"abf258f0-0d20-49a2-8a89-dd52cfdda97e\") " Sep 30 12:42:29 crc kubenswrapper[4672]: I0930 12:42:29.362023 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/abf258f0-0d20-49a2-8a89-dd52cfdda97e-scripts" (OuterVolumeSpecName: "scripts") pod "abf258f0-0d20-49a2-8a89-dd52cfdda97e" (UID: "abf258f0-0d20-49a2-8a89-dd52cfdda97e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:42:29 crc kubenswrapper[4672]: I0930 12:42:29.364619 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/abf258f0-0d20-49a2-8a89-dd52cfdda97e-kube-api-access-rhhf8" (OuterVolumeSpecName: "kube-api-access-rhhf8") pod "abf258f0-0d20-49a2-8a89-dd52cfdda97e" (UID: "abf258f0-0d20-49a2-8a89-dd52cfdda97e"). InnerVolumeSpecName "kube-api-access-rhhf8". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 12:42:29 crc kubenswrapper[4672]: I0930 12:42:29.368406 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/abf258f0-0d20-49a2-8a89-dd52cfdda97e-config-data" (OuterVolumeSpecName: "config-data") pod "abf258f0-0d20-49a2-8a89-dd52cfdda97e" (UID: "abf258f0-0d20-49a2-8a89-dd52cfdda97e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:42:29 crc kubenswrapper[4672]: I0930 12:42:29.368731 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/abf258f0-0d20-49a2-8a89-dd52cfdda97e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "abf258f0-0d20-49a2-8a89-dd52cfdda97e" (UID: "abf258f0-0d20-49a2-8a89-dd52cfdda97e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:42:29 crc kubenswrapper[4672]: I0930 12:42:29.437599 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rhhf8\" (UniqueName: \"kubernetes.io/projected/abf258f0-0d20-49a2-8a89-dd52cfdda97e-kube-api-access-rhhf8\") on node \"crc\" DevicePath \"\"" Sep 30 12:42:29 crc kubenswrapper[4672]: I0930 12:42:29.437638 4672 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/abf258f0-0d20-49a2-8a89-dd52cfdda97e-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 12:42:29 crc kubenswrapper[4672]: I0930 12:42:29.437652 4672 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abf258f0-0d20-49a2-8a89-dd52cfdda97e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 12:42:29 crc kubenswrapper[4672]: I0930 12:42:29.437666 4672 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/abf258f0-0d20-49a2-8a89-dd52cfdda97e-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 12:42:29 crc kubenswrapper[4672]: I0930 12:42:29.763221 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-z4wwh" event={"ID":"abf258f0-0d20-49a2-8a89-dd52cfdda97e","Type":"ContainerDied","Data":"0dd7e6ecd9191abfc3e3d8c734efefa093bcf125a5907e57c4c0812db4b423a7"} Sep 30 12:42:29 crc kubenswrapper[4672]: I0930 12:42:29.763291 4672 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0dd7e6ecd9191abfc3e3d8c734efefa093bcf125a5907e57c4c0812db4b423a7" Sep 30 12:42:29 crc kubenswrapper[4672]: I0930 12:42:29.763351 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-z4wwh" Sep 30 12:42:29 crc kubenswrapper[4672]: I0930 12:42:29.955099 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Sep 30 12:42:29 crc kubenswrapper[4672]: I0930 12:42:29.955307 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="555aaf1f-ce58-4bf6-bade-9424a52fda72" containerName="nova-api-log" containerID="cri-o://312b39ae3736a98c292a0ddd0552e6d41d45ac1f475f54518554962adf2ab8e6" gracePeriod=30 Sep 30 12:42:29 crc kubenswrapper[4672]: I0930 12:42:29.955636 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="555aaf1f-ce58-4bf6-bade-9424a52fda72" containerName="nova-api-api" containerID="cri-o://301195ab898db5d99ed204951efcb863e1d71c7471682a73aa1af10a6a2eb25b" gracePeriod=30 Sep 30 12:42:29 crc kubenswrapper[4672]: I0930 12:42:29.995604 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Sep 30 12:42:29 crc kubenswrapper[4672]: I0930 12:42:29.995864 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="a7d72ea6-c355-4e07-99da-88f9ff5cd342" containerName="nova-metadata-log" containerID="cri-o://cf2e758e9698a4f192d2d69821cf139f155e8e4ddf7b3aa543cb91b4a1e7823a" gracePeriod=30 Sep 30 12:42:29 crc kubenswrapper[4672]: I0930 12:42:29.996305 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="a7d72ea6-c355-4e07-99da-88f9ff5cd342" containerName="nova-metadata-metadata" containerID="cri-o://c5f69a7f98cb6207d4cf5bbdf763dc1203c2b91cbd71fe07f3f081680a77212d" gracePeriod=30 Sep 30 12:42:30 crc 
kubenswrapper[4672]: I0930 12:42:30.007467 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Sep 30 12:42:30 crc kubenswrapper[4672]: I0930 12:42:30.007833 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="fbd3d6af-41ae-43fe-8d4b-10b98894fd99" containerName="nova-scheduler-scheduler" containerID="cri-o://3433eecdf433182af87a890ac552ffe7464a6c021d8b9e26a8efc58d1a7578fc" gracePeriod=30 Sep 30 12:42:30 crc kubenswrapper[4672]: I0930 12:42:30.776865 4672 generic.go:334] "Generic (PLEG): container finished" podID="a7d72ea6-c355-4e07-99da-88f9ff5cd342" containerID="cf2e758e9698a4f192d2d69821cf139f155e8e4ddf7b3aa543cb91b4a1e7823a" exitCode=143 Sep 30 12:42:30 crc kubenswrapper[4672]: I0930 12:42:30.776926 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a7d72ea6-c355-4e07-99da-88f9ff5cd342","Type":"ContainerDied","Data":"cf2e758e9698a4f192d2d69821cf139f155e8e4ddf7b3aa543cb91b4a1e7823a"} Sep 30 12:42:30 crc kubenswrapper[4672]: I0930 12:42:30.778288 4672 generic.go:334] "Generic (PLEG): container finished" podID="555aaf1f-ce58-4bf6-bade-9424a52fda72" containerID="312b39ae3736a98c292a0ddd0552e6d41d45ac1f475f54518554962adf2ab8e6" exitCode=143 Sep 30 12:42:30 crc kubenswrapper[4672]: I0930 12:42:30.778339 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"555aaf1f-ce58-4bf6-bade-9424a52fda72","Type":"ContainerDied","Data":"312b39ae3736a98c292a0ddd0552e6d41d45ac1f475f54518554962adf2ab8e6"} Sep 30 12:42:31 crc kubenswrapper[4672]: I0930 12:42:31.356043 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Sep 30 12:42:31 crc kubenswrapper[4672]: I0930 12:42:31.482691 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a7d72ea6-c355-4e07-99da-88f9ff5cd342-logs\") pod \"a7d72ea6-c355-4e07-99da-88f9ff5cd342\" (UID: \"a7d72ea6-c355-4e07-99da-88f9ff5cd342\") " Sep 30 12:42:31 crc kubenswrapper[4672]: I0930 12:42:31.482762 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z7xvw\" (UniqueName: \"kubernetes.io/projected/a7d72ea6-c355-4e07-99da-88f9ff5cd342-kube-api-access-z7xvw\") pod \"a7d72ea6-c355-4e07-99da-88f9ff5cd342\" (UID: \"a7d72ea6-c355-4e07-99da-88f9ff5cd342\") " Sep 30 12:42:31 crc kubenswrapper[4672]: I0930 12:42:31.482840 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/a7d72ea6-c355-4e07-99da-88f9ff5cd342-nova-metadata-tls-certs\") pod \"a7d72ea6-c355-4e07-99da-88f9ff5cd342\" (UID: \"a7d72ea6-c355-4e07-99da-88f9ff5cd342\") " Sep 30 12:42:31 crc kubenswrapper[4672]: I0930 12:42:31.482897 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7d72ea6-c355-4e07-99da-88f9ff5cd342-combined-ca-bundle\") pod \"a7d72ea6-c355-4e07-99da-88f9ff5cd342\" (UID: \"a7d72ea6-c355-4e07-99da-88f9ff5cd342\") " Sep 30 12:42:31 crc kubenswrapper[4672]: I0930 12:42:31.482943 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7d72ea6-c355-4e07-99da-88f9ff5cd342-config-data\") pod \"a7d72ea6-c355-4e07-99da-88f9ff5cd342\" (UID: \"a7d72ea6-c355-4e07-99da-88f9ff5cd342\") " 
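[Annotation] The records above show the standard kubelet shutdown pattern: an API "SyncLoop DELETE", one "Killing container with a grace period" per container, then a PLEG "container finished" event once the runtime reports the exit (exitCode=143 is SIGTERM; 0 is a clean stop). A rough sketch that pairs kill and finish records by container ID to see how much of each grace period was actually used; field names and ordering are taken from the samples in this capture:

    #!/usr/bin/env python3
    # Pair "Killing container with a grace period" with the matching PLEG
    # "container finished" record and report how long each stop took.
    import re
    import sys
    from datetime import datetime

    KILL = re.compile(r'"Killing container with a grace period".*?pod="(?P<pod>[^"]+)"'
                      r'.*?containerID="cri-o://(?P<cid>[0-9a-f]+)".*?gracePeriod=(?P<grace>\d+)')
    DONE = re.compile(r'"Generic \(PLEG\): container finished".*?'
                      r'containerID="(?P<cid>[0-9a-f]+)".*?exitCode=(?P<code>-?\d+)')
    STAMP = re.compile(r'^(\w{3} +\d+ \d{2}:\d{2}:\d{2})')

    def stamp(line, year=2025):  # journald omits the year; 2025 assumed from context
        return datetime.strptime(f'{year} {STAMP.match(line).group(1)}', '%Y %b %d %H:%M:%S')

    pending = {}
    for line in open(sys.argv[1]):
        if (m := KILL.search(line)):
            pending[m['cid']] = (m['pod'], int(m['grace']), stamp(line))
        elif (m := DONE.search(line)) and m['cid'] in pending:
            pod, grace, t0 = pending.pop(m['cid'])
            took = (stamp(line) - t0).total_seconds()
            print(f"{pod}: exit {m['code']} after {took:.0f}s (grace {grace}s)")

Run against this capture it would report, for example, dnsmasq-dns stopping with exit 0 well inside its 10s grace period, while nova-api-log and nova-metadata-log exit 143 (SIGTERM) about a second after their kills.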
Sep 30 12:42:31 crc kubenswrapper[4672]: I0930 12:42:31.483584 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a7d72ea6-c355-4e07-99da-88f9ff5cd342-logs" (OuterVolumeSpecName: "logs") pod "a7d72ea6-c355-4e07-99da-88f9ff5cd342" (UID: "a7d72ea6-c355-4e07-99da-88f9ff5cd342"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 12:42:31 crc kubenswrapper[4672]: I0930 12:42:31.504525 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a7d72ea6-c355-4e07-99da-88f9ff5cd342-kube-api-access-z7xvw" (OuterVolumeSpecName: "kube-api-access-z7xvw") pod "a7d72ea6-c355-4e07-99da-88f9ff5cd342" (UID: "a7d72ea6-c355-4e07-99da-88f9ff5cd342"). InnerVolumeSpecName "kube-api-access-z7xvw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 12:42:31 crc kubenswrapper[4672]: I0930 12:42:31.522695 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a7d72ea6-c355-4e07-99da-88f9ff5cd342-config-data" (OuterVolumeSpecName: "config-data") pod "a7d72ea6-c355-4e07-99da-88f9ff5cd342" (UID: "a7d72ea6-c355-4e07-99da-88f9ff5cd342"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 12:42:31 crc kubenswrapper[4672]: I0930 12:42:31.532617 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a7d72ea6-c355-4e07-99da-88f9ff5cd342-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a7d72ea6-c355-4e07-99da-88f9ff5cd342" (UID: "a7d72ea6-c355-4e07-99da-88f9ff5cd342"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 12:42:31 crc kubenswrapper[4672]: I0930 12:42:31.545499 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a7d72ea6-c355-4e07-99da-88f9ff5cd342-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "a7d72ea6-c355-4e07-99da-88f9ff5cd342" (UID: "a7d72ea6-c355-4e07-99da-88f9ff5cd342"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 12:42:31 crc kubenswrapper[4672]: I0930 12:42:31.585620 4672 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7d72ea6-c355-4e07-99da-88f9ff5cd342-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Sep 30 12:42:31 crc kubenswrapper[4672]: I0930 12:42:31.585849 4672 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7d72ea6-c355-4e07-99da-88f9ff5cd342-config-data\") on node \"crc\" DevicePath \"\""
Sep 30 12:42:31 crc kubenswrapper[4672]: I0930 12:42:31.585943 4672 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a7d72ea6-c355-4e07-99da-88f9ff5cd342-logs\") on node \"crc\" DevicePath \"\""
Sep 30 12:42:31 crc kubenswrapper[4672]: I0930 12:42:31.586030 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z7xvw\" (UniqueName: \"kubernetes.io/projected/a7d72ea6-c355-4e07-99da-88f9ff5cd342-kube-api-access-z7xvw\") on node \"crc\" DevicePath \"\""
Sep 30 12:42:31 crc kubenswrapper[4672]: I0930 12:42:31.586083 4672 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/a7d72ea6-c355-4e07-99da-88f9ff5cd342-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\""
Sep 30 12:42:31 crc kubenswrapper[4672]: I0930 12:42:31.792314 4672 generic.go:334] "Generic (PLEG): container finished" podID="a7d72ea6-c355-4e07-99da-88f9ff5cd342" containerID="c5f69a7f98cb6207d4cf5bbdf763dc1203c2b91cbd71fe07f3f081680a77212d" exitCode=0
Sep 30 12:42:31 crc kubenswrapper[4672]: I0930 12:42:31.792364 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a7d72ea6-c355-4e07-99da-88f9ff5cd342","Type":"ContainerDied","Data":"c5f69a7f98cb6207d4cf5bbdf763dc1203c2b91cbd71fe07f3f081680a77212d"}
Sep 30 12:42:31 crc kubenswrapper[4672]: I0930 12:42:31.792425 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a7d72ea6-c355-4e07-99da-88f9ff5cd342","Type":"ContainerDied","Data":"f3d73a1dbf607e16d377f77f9d48876eb71e451c9871e6779a271f8b5bc30b27"}
Sep 30 12:42:31 crc kubenswrapper[4672]: I0930 12:42:31.792460 4672 scope.go:117] "RemoveContainer" containerID="c5f69a7f98cb6207d4cf5bbdf763dc1203c2b91cbd71fe07f3f081680a77212d"
Sep 30 12:42:31 crc kubenswrapper[4672]: I0930 12:42:31.792474 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Sep 30 12:42:31 crc kubenswrapper[4672]: I0930 12:42:31.911418 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Sep 30 12:42:31 crc kubenswrapper[4672]: I0930 12:42:31.921360 4672 scope.go:117] "RemoveContainer" containerID="cf2e758e9698a4f192d2d69821cf139f155e8e4ddf7b3aa543cb91b4a1e7823a"
Sep 30 12:42:31 crc kubenswrapper[4672]: I0930 12:42:31.921850 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"]
Sep 30 12:42:31 crc kubenswrapper[4672]: I0930 12:42:31.952570 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Sep 30 12:42:31 crc kubenswrapper[4672]: E0930 12:42:31.952972 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7d72ea6-c355-4e07-99da-88f9ff5cd342" containerName="nova-metadata-metadata"
Sep 30 12:42:31 crc kubenswrapper[4672]: I0930 12:42:31.952994 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7d72ea6-c355-4e07-99da-88f9ff5cd342" containerName="nova-metadata-metadata"
Sep 30 12:42:31 crc kubenswrapper[4672]: E0930 12:42:31.953015 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="becf3fb1-512b-41ac-bfa1-e0da6204bcda" containerName="dnsmasq-dns"
Sep 30 12:42:31 crc kubenswrapper[4672]: I0930 12:42:31.953023 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="becf3fb1-512b-41ac-bfa1-e0da6204bcda" containerName="dnsmasq-dns"
Sep 30 12:42:31 crc kubenswrapper[4672]: E0930 12:42:31.953037 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7d72ea6-c355-4e07-99da-88f9ff5cd342" containerName="nova-metadata-log"
Sep 30 12:42:31 crc kubenswrapper[4672]: I0930 12:42:31.953043 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7d72ea6-c355-4e07-99da-88f9ff5cd342" containerName="nova-metadata-log"
Sep 30 12:42:31 crc kubenswrapper[4672]: E0930 12:42:31.953057 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="becf3fb1-512b-41ac-bfa1-e0da6204bcda" containerName="init"
Sep 30 12:42:31 crc kubenswrapper[4672]: I0930 12:42:31.953066 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="becf3fb1-512b-41ac-bfa1-e0da6204bcda" containerName="init"
Sep 30 12:42:31 crc kubenswrapper[4672]: E0930 12:42:31.953098 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="abf258f0-0d20-49a2-8a89-dd52cfdda97e" containerName="nova-manage"
Sep 30 12:42:31 crc kubenswrapper[4672]: I0930 12:42:31.953106 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="abf258f0-0d20-49a2-8a89-dd52cfdda97e" containerName="nova-manage"
Sep 30 12:42:31 crc kubenswrapper[4672]: I0930 12:42:31.953339 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="abf258f0-0d20-49a2-8a89-dd52cfdda97e" containerName="nova-manage"
Sep 30 12:42:31 crc kubenswrapper[4672]: I0930 12:42:31.953366 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="becf3fb1-512b-41ac-bfa1-e0da6204bcda" containerName="dnsmasq-dns"
Sep 30 12:42:31 crc kubenswrapper[4672]: I0930 12:42:31.953381 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7d72ea6-c355-4e07-99da-88f9ff5cd342" containerName="nova-metadata-log"
Sep 30 12:42:31 crc kubenswrapper[4672]: I0930 12:42:31.953393 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7d72ea6-c355-4e07-99da-88f9ff5cd342" containerName="nova-metadata-metadata"
Sep 30 12:42:31 crc kubenswrapper[4672]: I0930 12:42:31.954502 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Sep 30 12:42:31 crc kubenswrapper[4672]: I0930 12:42:31.962438 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Sep 30 12:42:31 crc kubenswrapper[4672]: I0930 12:42:31.963565 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc"
Sep 30 12:42:31 crc kubenswrapper[4672]: I0930 12:42:31.979423 4672 scope.go:117] "RemoveContainer" containerID="c5f69a7f98cb6207d4cf5bbdf763dc1203c2b91cbd71fe07f3f081680a77212d"
Sep 30 12:42:31 crc kubenswrapper[4672]: E0930 12:42:31.980177 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c5f69a7f98cb6207d4cf5bbdf763dc1203c2b91cbd71fe07f3f081680a77212d\": container with ID starting with c5f69a7f98cb6207d4cf5bbdf763dc1203c2b91cbd71fe07f3f081680a77212d not found: ID does not exist" containerID="c5f69a7f98cb6207d4cf5bbdf763dc1203c2b91cbd71fe07f3f081680a77212d"
Sep 30 12:42:31 crc kubenswrapper[4672]: I0930 12:42:31.980206 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c5f69a7f98cb6207d4cf5bbdf763dc1203c2b91cbd71fe07f3f081680a77212d"} err="failed to get container status \"c5f69a7f98cb6207d4cf5bbdf763dc1203c2b91cbd71fe07f3f081680a77212d\": rpc error: code = NotFound desc = could not find container \"c5f69a7f98cb6207d4cf5bbdf763dc1203c2b91cbd71fe07f3f081680a77212d\": container with ID starting with c5f69a7f98cb6207d4cf5bbdf763dc1203c2b91cbd71fe07f3f081680a77212d not found: ID does not exist"
Sep 30 12:42:31 crc kubenswrapper[4672]: I0930 12:42:31.980225 4672 scope.go:117] "RemoveContainer" containerID="cf2e758e9698a4f192d2d69821cf139f155e8e4ddf7b3aa543cb91b4a1e7823a"
Sep 30 12:42:31 crc kubenswrapper[4672]: I0930 12:42:31.980251 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Sep 30 12:42:31 crc kubenswrapper[4672]: E0930 12:42:31.983128 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cf2e758e9698a4f192d2d69821cf139f155e8e4ddf7b3aa543cb91b4a1e7823a\": container with ID starting with cf2e758e9698a4f192d2d69821cf139f155e8e4ddf7b3aa543cb91b4a1e7823a not found: ID does not exist" containerID="cf2e758e9698a4f192d2d69821cf139f155e8e4ddf7b3aa543cb91b4a1e7823a"
Sep 30 12:42:31 crc kubenswrapper[4672]: I0930 12:42:31.983189 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf2e758e9698a4f192d2d69821cf139f155e8e4ddf7b3aa543cb91b4a1e7823a"} err="failed to get container status \"cf2e758e9698a4f192d2d69821cf139f155e8e4ddf7b3aa543cb91b4a1e7823a\": rpc error: code = NotFound desc = could not find container \"cf2e758e9698a4f192d2d69821cf139f155e8e4ddf7b3aa543cb91b4a1e7823a\": container with ID starting with cf2e758e9698a4f192d2d69821cf139f155e8e4ddf7b3aa543cb91b4a1e7823a not found: ID does not exist"
Sep 30 12:42:32 crc kubenswrapper[4672]: I0930 12:42:32.102571 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db8c9818-e7bc-471f-b7d6-b097f3657451-config-data\") pod \"nova-metadata-0\" (UID: \"db8c9818-e7bc-471f-b7d6-b097f3657451\") " pod="openstack/nova-metadata-0"
Sep 30 12:42:32 crc kubenswrapper[4672]: I0930 12:42:32.102678 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/db8c9818-e7bc-471f-b7d6-b097f3657451-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"db8c9818-e7bc-471f-b7d6-b097f3657451\") " pod="openstack/nova-metadata-0"
Sep 30 12:42:32 crc kubenswrapper[4672]: I0930 12:42:32.102754 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-68dp4\" (UniqueName: \"kubernetes.io/projected/db8c9818-e7bc-471f-b7d6-b097f3657451-kube-api-access-68dp4\") pod \"nova-metadata-0\" (UID: \"db8c9818-e7bc-471f-b7d6-b097f3657451\") " pod="openstack/nova-metadata-0"
Sep 30 12:42:32 crc kubenswrapper[4672]: I0930 12:42:32.102858 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/db8c9818-e7bc-471f-b7d6-b097f3657451-logs\") pod \"nova-metadata-0\" (UID: \"db8c9818-e7bc-471f-b7d6-b097f3657451\") " pod="openstack/nova-metadata-0"
Sep 30 12:42:32 crc kubenswrapper[4672]: I0930 12:42:32.102959 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db8c9818-e7bc-471f-b7d6-b097f3657451-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"db8c9818-e7bc-471f-b7d6-b097f3657451\") " pod="openstack/nova-metadata-0"
Sep 30 12:42:32 crc kubenswrapper[4672]: I0930 12:42:32.175522 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Sep 30 12:42:32 crc kubenswrapper[4672]: I0930 12:42:32.204629 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/db8c9818-e7bc-471f-b7d6-b097f3657451-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"db8c9818-e7bc-471f-b7d6-b097f3657451\") " pod="openstack/nova-metadata-0"
Sep 30 12:42:32 crc kubenswrapper[4672]: I0930 12:42:32.204726 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-68dp4\" (UniqueName: \"kubernetes.io/projected/db8c9818-e7bc-471f-b7d6-b097f3657451-kube-api-access-68dp4\") pod \"nova-metadata-0\" (UID: \"db8c9818-e7bc-471f-b7d6-b097f3657451\") " pod="openstack/nova-metadata-0"
Sep 30 12:42:32 crc kubenswrapper[4672]: I0930 12:42:32.205061 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/db8c9818-e7bc-471f-b7d6-b097f3657451-logs\") pod \"nova-metadata-0\" (UID: \"db8c9818-e7bc-471f-b7d6-b097f3657451\") " pod="openstack/nova-metadata-0"
Sep 30 12:42:32 crc kubenswrapper[4672]: I0930 12:42:32.205185 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db8c9818-e7bc-471f-b7d6-b097f3657451-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"db8c9818-e7bc-471f-b7d6-b097f3657451\") " pod="openstack/nova-metadata-0"
Sep 30 12:42:32 crc kubenswrapper[4672]: I0930 12:42:32.205282 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db8c9818-e7bc-471f-b7d6-b097f3657451-config-data\") pod \"nova-metadata-0\" (UID: \"db8c9818-e7bc-471f-b7d6-b097f3657451\") " pod="openstack/nova-metadata-0"
Sep 30 12:42:32 crc kubenswrapper[4672]: I0930 12:42:32.205767 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/db8c9818-e7bc-471f-b7d6-b097f3657451-logs\") pod \"nova-metadata-0\" (UID: \"db8c9818-e7bc-471f-b7d6-b097f3657451\") " pod="openstack/nova-metadata-0"
Sep 30 12:42:32 crc kubenswrapper[4672]: I0930 12:42:32.215820 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db8c9818-e7bc-471f-b7d6-b097f3657451-config-data\") pod \"nova-metadata-0\" (UID: \"db8c9818-e7bc-471f-b7d6-b097f3657451\") " pod="openstack/nova-metadata-0"
Sep 30 12:42:32 crc kubenswrapper[4672]: I0930 12:42:32.224944 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/db8c9818-e7bc-471f-b7d6-b097f3657451-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"db8c9818-e7bc-471f-b7d6-b097f3657451\") " pod="openstack/nova-metadata-0"
Sep 30 12:42:32 crc kubenswrapper[4672]: I0930 12:42:32.226681 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db8c9818-e7bc-471f-b7d6-b097f3657451-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"db8c9818-e7bc-471f-b7d6-b097f3657451\") " pod="openstack/nova-metadata-0"
Sep 30 12:42:32 crc kubenswrapper[4672]: I0930 12:42:32.226700 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-68dp4\" (UniqueName: \"kubernetes.io/projected/db8c9818-e7bc-471f-b7d6-b097f3657451-kube-api-access-68dp4\") pod \"nova-metadata-0\" (UID: \"db8c9818-e7bc-471f-b7d6-b097f3657451\") " pod="openstack/nova-metadata-0"
Sep 30 12:42:32 crc kubenswrapper[4672]: I0930 12:42:32.285216 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Sep 30 12:42:32 crc kubenswrapper[4672]: I0930 12:42:32.308004 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbd3d6af-41ae-43fe-8d4b-10b98894fd99-combined-ca-bundle\") pod \"fbd3d6af-41ae-43fe-8d4b-10b98894fd99\" (UID: \"fbd3d6af-41ae-43fe-8d4b-10b98894fd99\") "
Sep 30 12:42:32 crc kubenswrapper[4672]: I0930 12:42:32.308061 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hpfbl\" (UniqueName: \"kubernetes.io/projected/fbd3d6af-41ae-43fe-8d4b-10b98894fd99-kube-api-access-hpfbl\") pod \"fbd3d6af-41ae-43fe-8d4b-10b98894fd99\" (UID: \"fbd3d6af-41ae-43fe-8d4b-10b98894fd99\") "
Sep 30 12:42:32 crc kubenswrapper[4672]: I0930 12:42:32.308683 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fbd3d6af-41ae-43fe-8d4b-10b98894fd99-config-data\") pod \"fbd3d6af-41ae-43fe-8d4b-10b98894fd99\" (UID: \"fbd3d6af-41ae-43fe-8d4b-10b98894fd99\") "
Sep 30 12:42:32 crc kubenswrapper[4672]: I0930 12:42:32.312576 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fbd3d6af-41ae-43fe-8d4b-10b98894fd99-kube-api-access-hpfbl" (OuterVolumeSpecName: "kube-api-access-hpfbl") pod "fbd3d6af-41ae-43fe-8d4b-10b98894fd99" (UID: "fbd3d6af-41ae-43fe-8d4b-10b98894fd99"). InnerVolumeSpecName "kube-api-access-hpfbl". PluginName "kubernetes.io/projected", VolumeGidValue ""
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 12:42:32 crc kubenswrapper[4672]: I0930 12:42:32.338007 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fbd3d6af-41ae-43fe-8d4b-10b98894fd99-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fbd3d6af-41ae-43fe-8d4b-10b98894fd99" (UID: "fbd3d6af-41ae-43fe-8d4b-10b98894fd99"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:42:32 crc kubenswrapper[4672]: I0930 12:42:32.356408 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fbd3d6af-41ae-43fe-8d4b-10b98894fd99-config-data" (OuterVolumeSpecName: "config-data") pod "fbd3d6af-41ae-43fe-8d4b-10b98894fd99" (UID: "fbd3d6af-41ae-43fe-8d4b-10b98894fd99"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:42:32 crc kubenswrapper[4672]: I0930 12:42:32.412408 4672 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbd3d6af-41ae-43fe-8d4b-10b98894fd99-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 12:42:32 crc kubenswrapper[4672]: I0930 12:42:32.412448 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hpfbl\" (UniqueName: \"kubernetes.io/projected/fbd3d6af-41ae-43fe-8d4b-10b98894fd99-kube-api-access-hpfbl\") on node \"crc\" DevicePath \"\"" Sep 30 12:42:32 crc kubenswrapper[4672]: I0930 12:42:32.412464 4672 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fbd3d6af-41ae-43fe-8d4b-10b98894fd99-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 12:42:32 crc kubenswrapper[4672]: W0930 12:42:32.733953 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddb8c9818_e7bc_471f_b7d6_b097f3657451.slice/crio-1e012d6813ddfca92c1b114262b17360ef6f2c98bdfacac10b214cdeeb4ad336 WatchSource:0}: Error finding container 1e012d6813ddfca92c1b114262b17360ef6f2c98bdfacac10b214cdeeb4ad336: Status 404 returned error can't find the container with id 1e012d6813ddfca92c1b114262b17360ef6f2c98bdfacac10b214cdeeb4ad336 Sep 30 12:42:32 crc kubenswrapper[4672]: I0930 12:42:32.734181 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Sep 30 12:42:32 crc kubenswrapper[4672]: I0930 12:42:32.800287 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"db8c9818-e7bc-471f-b7d6-b097f3657451","Type":"ContainerStarted","Data":"1e012d6813ddfca92c1b114262b17360ef6f2c98bdfacac10b214cdeeb4ad336"} Sep 30 12:42:32 crc kubenswrapper[4672]: I0930 12:42:32.803527 4672 generic.go:334] "Generic (PLEG): container finished" podID="fbd3d6af-41ae-43fe-8d4b-10b98894fd99" containerID="3433eecdf433182af87a890ac552ffe7464a6c021d8b9e26a8efc58d1a7578fc" exitCode=0 Sep 30 12:42:32 crc kubenswrapper[4672]: I0930 12:42:32.803554 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"fbd3d6af-41ae-43fe-8d4b-10b98894fd99","Type":"ContainerDied","Data":"3433eecdf433182af87a890ac552ffe7464a6c021d8b9e26a8efc58d1a7578fc"} Sep 30 12:42:32 crc kubenswrapper[4672]: I0930 12:42:32.803570 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" 
event={"ID":"fbd3d6af-41ae-43fe-8d4b-10b98894fd99","Type":"ContainerDied","Data":"d8a0480183553602603bb389a71d555b9cd94219c1a1197740f19834edd4203e"} Sep 30 12:42:32 crc kubenswrapper[4672]: I0930 12:42:32.803586 4672 scope.go:117] "RemoveContainer" containerID="3433eecdf433182af87a890ac552ffe7464a6c021d8b9e26a8efc58d1a7578fc" Sep 30 12:42:32 crc kubenswrapper[4672]: I0930 12:42:32.803682 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Sep 30 12:42:32 crc kubenswrapper[4672]: I0930 12:42:32.843791 4672 scope.go:117] "RemoveContainer" containerID="3433eecdf433182af87a890ac552ffe7464a6c021d8b9e26a8efc58d1a7578fc" Sep 30 12:42:32 crc kubenswrapper[4672]: E0930 12:42:32.844105 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3433eecdf433182af87a890ac552ffe7464a6c021d8b9e26a8efc58d1a7578fc\": container with ID starting with 3433eecdf433182af87a890ac552ffe7464a6c021d8b9e26a8efc58d1a7578fc not found: ID does not exist" containerID="3433eecdf433182af87a890ac552ffe7464a6c021d8b9e26a8efc58d1a7578fc" Sep 30 12:42:32 crc kubenswrapper[4672]: I0930 12:42:32.844132 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3433eecdf433182af87a890ac552ffe7464a6c021d8b9e26a8efc58d1a7578fc"} err="failed to get container status \"3433eecdf433182af87a890ac552ffe7464a6c021d8b9e26a8efc58d1a7578fc\": rpc error: code = NotFound desc = could not find container \"3433eecdf433182af87a890ac552ffe7464a6c021d8b9e26a8efc58d1a7578fc\": container with ID starting with 3433eecdf433182af87a890ac552ffe7464a6c021d8b9e26a8efc58d1a7578fc not found: ID does not exist" Sep 30 12:42:32 crc kubenswrapper[4672]: I0930 12:42:32.881068 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Sep 30 12:42:32 crc kubenswrapper[4672]: I0930 12:42:32.900391 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Sep 30 12:42:32 crc kubenswrapper[4672]: I0930 12:42:32.909982 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Sep 30 12:42:32 crc kubenswrapper[4672]: E0930 12:42:32.910456 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fbd3d6af-41ae-43fe-8d4b-10b98894fd99" containerName="nova-scheduler-scheduler" Sep 30 12:42:32 crc kubenswrapper[4672]: I0930 12:42:32.910483 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="fbd3d6af-41ae-43fe-8d4b-10b98894fd99" containerName="nova-scheduler-scheduler" Sep 30 12:42:32 crc kubenswrapper[4672]: I0930 12:42:32.910716 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="fbd3d6af-41ae-43fe-8d4b-10b98894fd99" containerName="nova-scheduler-scheduler" Sep 30 12:42:32 crc kubenswrapper[4672]: I0930 12:42:32.911410 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Sep 30 12:42:32 crc kubenswrapper[4672]: I0930 12:42:32.913901 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Sep 30 12:42:32 crc kubenswrapper[4672]: I0930 12:42:32.919646 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Sep 30 12:42:33 crc kubenswrapper[4672]: I0930 12:42:33.023538 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9e19e97-1725-4846-93ef-b00bf092908b-config-data\") pod \"nova-scheduler-0\" (UID: \"e9e19e97-1725-4846-93ef-b00bf092908b\") " pod="openstack/nova-scheduler-0" Sep 30 12:42:33 crc kubenswrapper[4672]: I0930 12:42:33.023609 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2njr6\" (UniqueName: \"kubernetes.io/projected/e9e19e97-1725-4846-93ef-b00bf092908b-kube-api-access-2njr6\") pod \"nova-scheduler-0\" (UID: \"e9e19e97-1725-4846-93ef-b00bf092908b\") " pod="openstack/nova-scheduler-0" Sep 30 12:42:33 crc kubenswrapper[4672]: I0930 12:42:33.023649 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9e19e97-1725-4846-93ef-b00bf092908b-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"e9e19e97-1725-4846-93ef-b00bf092908b\") " pod="openstack/nova-scheduler-0" Sep 30 12:42:33 crc kubenswrapper[4672]: I0930 12:42:33.125544 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9e19e97-1725-4846-93ef-b00bf092908b-config-data\") pod \"nova-scheduler-0\" (UID: \"e9e19e97-1725-4846-93ef-b00bf092908b\") " pod="openstack/nova-scheduler-0" Sep 30 12:42:33 crc kubenswrapper[4672]: I0930 12:42:33.125924 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2njr6\" (UniqueName: \"kubernetes.io/projected/e9e19e97-1725-4846-93ef-b00bf092908b-kube-api-access-2njr6\") pod \"nova-scheduler-0\" (UID: \"e9e19e97-1725-4846-93ef-b00bf092908b\") " pod="openstack/nova-scheduler-0" Sep 30 12:42:33 crc kubenswrapper[4672]: I0930 12:42:33.125966 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9e19e97-1725-4846-93ef-b00bf092908b-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"e9e19e97-1725-4846-93ef-b00bf092908b\") " pod="openstack/nova-scheduler-0" Sep 30 12:42:33 crc kubenswrapper[4672]: I0930 12:42:33.129554 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9e19e97-1725-4846-93ef-b00bf092908b-config-data\") pod \"nova-scheduler-0\" (UID: \"e9e19e97-1725-4846-93ef-b00bf092908b\") " pod="openstack/nova-scheduler-0" Sep 30 12:42:33 crc kubenswrapper[4672]: I0930 12:42:33.129746 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9e19e97-1725-4846-93ef-b00bf092908b-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"e9e19e97-1725-4846-93ef-b00bf092908b\") " pod="openstack/nova-scheduler-0" Sep 30 12:42:33 crc kubenswrapper[4672]: I0930 12:42:33.158330 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2njr6\" (UniqueName: 
\"kubernetes.io/projected/e9e19e97-1725-4846-93ef-b00bf092908b-kube-api-access-2njr6\") pod \"nova-scheduler-0\" (UID: \"e9e19e97-1725-4846-93ef-b00bf092908b\") " pod="openstack/nova-scheduler-0" Sep 30 12:42:33 crc kubenswrapper[4672]: I0930 12:42:33.246817 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Sep 30 12:42:33 crc kubenswrapper[4672]: I0930 12:42:33.433988 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a7d72ea6-c355-4e07-99da-88f9ff5cd342" path="/var/lib/kubelet/pods/a7d72ea6-c355-4e07-99da-88f9ff5cd342/volumes" Sep 30 12:42:33 crc kubenswrapper[4672]: I0930 12:42:33.434938 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fbd3d6af-41ae-43fe-8d4b-10b98894fd99" path="/var/lib/kubelet/pods/fbd3d6af-41ae-43fe-8d4b-10b98894fd99/volumes" Sep 30 12:42:33 crc kubenswrapper[4672]: I0930 12:42:33.531596 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Sep 30 12:42:33 crc kubenswrapper[4672]: I0930 12:42:33.639045 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/555aaf1f-ce58-4bf6-bade-9424a52fda72-combined-ca-bundle\") pod \"555aaf1f-ce58-4bf6-bade-9424a52fda72\" (UID: \"555aaf1f-ce58-4bf6-bade-9424a52fda72\") " Sep 30 12:42:33 crc kubenswrapper[4672]: I0930 12:42:33.639141 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/555aaf1f-ce58-4bf6-bade-9424a52fda72-internal-tls-certs\") pod \"555aaf1f-ce58-4bf6-bade-9424a52fda72\" (UID: \"555aaf1f-ce58-4bf6-bade-9424a52fda72\") " Sep 30 12:42:33 crc kubenswrapper[4672]: I0930 12:42:33.639328 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/555aaf1f-ce58-4bf6-bade-9424a52fda72-config-data\") pod \"555aaf1f-ce58-4bf6-bade-9424a52fda72\" (UID: \"555aaf1f-ce58-4bf6-bade-9424a52fda72\") " Sep 30 12:42:33 crc kubenswrapper[4672]: I0930 12:42:33.639381 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/555aaf1f-ce58-4bf6-bade-9424a52fda72-public-tls-certs\") pod \"555aaf1f-ce58-4bf6-bade-9424a52fda72\" (UID: \"555aaf1f-ce58-4bf6-bade-9424a52fda72\") " Sep 30 12:42:33 crc kubenswrapper[4672]: I0930 12:42:33.639441 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2lndz\" (UniqueName: \"kubernetes.io/projected/555aaf1f-ce58-4bf6-bade-9424a52fda72-kube-api-access-2lndz\") pod \"555aaf1f-ce58-4bf6-bade-9424a52fda72\" (UID: \"555aaf1f-ce58-4bf6-bade-9424a52fda72\") " Sep 30 12:42:33 crc kubenswrapper[4672]: I0930 12:42:33.639980 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/555aaf1f-ce58-4bf6-bade-9424a52fda72-logs\") pod \"555aaf1f-ce58-4bf6-bade-9424a52fda72\" (UID: \"555aaf1f-ce58-4bf6-bade-9424a52fda72\") " Sep 30 12:42:33 crc kubenswrapper[4672]: I0930 12:42:33.640518 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/555aaf1f-ce58-4bf6-bade-9424a52fda72-logs" (OuterVolumeSpecName: "logs") pod "555aaf1f-ce58-4bf6-bade-9424a52fda72" (UID: "555aaf1f-ce58-4bf6-bade-9424a52fda72"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 12:42:33 crc kubenswrapper[4672]: I0930 12:42:33.641473 4672 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/555aaf1f-ce58-4bf6-bade-9424a52fda72-logs\") on node \"crc\" DevicePath \"\"" Sep 30 12:42:33 crc kubenswrapper[4672]: I0930 12:42:33.644416 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/555aaf1f-ce58-4bf6-bade-9424a52fda72-kube-api-access-2lndz" (OuterVolumeSpecName: "kube-api-access-2lndz") pod "555aaf1f-ce58-4bf6-bade-9424a52fda72" (UID: "555aaf1f-ce58-4bf6-bade-9424a52fda72"). InnerVolumeSpecName "kube-api-access-2lndz". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 12:42:33 crc kubenswrapper[4672]: I0930 12:42:33.666230 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/555aaf1f-ce58-4bf6-bade-9424a52fda72-config-data" (OuterVolumeSpecName: "config-data") pod "555aaf1f-ce58-4bf6-bade-9424a52fda72" (UID: "555aaf1f-ce58-4bf6-bade-9424a52fda72"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:42:33 crc kubenswrapper[4672]: I0930 12:42:33.666357 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/555aaf1f-ce58-4bf6-bade-9424a52fda72-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "555aaf1f-ce58-4bf6-bade-9424a52fda72" (UID: "555aaf1f-ce58-4bf6-bade-9424a52fda72"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:42:33 crc kubenswrapper[4672]: I0930 12:42:33.695990 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/555aaf1f-ce58-4bf6-bade-9424a52fda72-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "555aaf1f-ce58-4bf6-bade-9424a52fda72" (UID: "555aaf1f-ce58-4bf6-bade-9424a52fda72"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:42:33 crc kubenswrapper[4672]: I0930 12:42:33.697839 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/555aaf1f-ce58-4bf6-bade-9424a52fda72-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "555aaf1f-ce58-4bf6-bade-9424a52fda72" (UID: "555aaf1f-ce58-4bf6-bade-9424a52fda72"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:42:33 crc kubenswrapper[4672]: I0930 12:42:33.747220 4672 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/555aaf1f-ce58-4bf6-bade-9424a52fda72-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 12:42:33 crc kubenswrapper[4672]: I0930 12:42:33.747294 4672 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/555aaf1f-ce58-4bf6-bade-9424a52fda72-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Sep 30 12:42:33 crc kubenswrapper[4672]: I0930 12:42:33.747313 4672 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/555aaf1f-ce58-4bf6-bade-9424a52fda72-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 12:42:33 crc kubenswrapper[4672]: I0930 12:42:33.747328 4672 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/555aaf1f-ce58-4bf6-bade-9424a52fda72-public-tls-certs\") on node \"crc\" DevicePath \"\"" Sep 30 12:42:33 crc kubenswrapper[4672]: I0930 12:42:33.747346 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2lndz\" (UniqueName: \"kubernetes.io/projected/555aaf1f-ce58-4bf6-bade-9424a52fda72-kube-api-access-2lndz\") on node \"crc\" DevicePath \"\"" Sep 30 12:42:33 crc kubenswrapper[4672]: I0930 12:42:33.786805 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Sep 30 12:42:33 crc kubenswrapper[4672]: W0930 12:42:33.786805 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode9e19e97_1725_4846_93ef_b00bf092908b.slice/crio-d390c5513c53362b0bbb561fb769723bda7d8cb6b56b9a5115b0ac746024b521 WatchSource:0}: Error finding container d390c5513c53362b0bbb561fb769723bda7d8cb6b56b9a5115b0ac746024b521: Status 404 returned error can't find the container with id d390c5513c53362b0bbb561fb769723bda7d8cb6b56b9a5115b0ac746024b521 Sep 30 12:42:33 crc kubenswrapper[4672]: I0930 12:42:33.814051 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"db8c9818-e7bc-471f-b7d6-b097f3657451","Type":"ContainerStarted","Data":"cf2e6f003368e3d016b82af89f082abac6b00f20a5e8b5f0b79040b63111fb1c"} Sep 30 12:42:33 crc kubenswrapper[4672]: I0930 12:42:33.814085 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"db8c9818-e7bc-471f-b7d6-b097f3657451","Type":"ContainerStarted","Data":"24660a65ebf0d1fac82a6ad70d912c357d99f8511600b54528178361872ee534"} Sep 30 12:42:33 crc kubenswrapper[4672]: I0930 12:42:33.817722 4672 generic.go:334] "Generic (PLEG): container finished" podID="555aaf1f-ce58-4bf6-bade-9424a52fda72" containerID="301195ab898db5d99ed204951efcb863e1d71c7471682a73aa1af10a6a2eb25b" exitCode=0 Sep 30 12:42:33 crc kubenswrapper[4672]: I0930 12:42:33.817753 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Sep 30 12:42:33 crc kubenswrapper[4672]: I0930 12:42:33.817789 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"555aaf1f-ce58-4bf6-bade-9424a52fda72","Type":"ContainerDied","Data":"301195ab898db5d99ed204951efcb863e1d71c7471682a73aa1af10a6a2eb25b"} Sep 30 12:42:33 crc kubenswrapper[4672]: I0930 12:42:33.817822 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"555aaf1f-ce58-4bf6-bade-9424a52fda72","Type":"ContainerDied","Data":"8a0df1249ffb21d85aa536501281b9cd6915e73a2e4dced43a5f54007a1b882f"} Sep 30 12:42:33 crc kubenswrapper[4672]: I0930 12:42:33.817841 4672 scope.go:117] "RemoveContainer" containerID="301195ab898db5d99ed204951efcb863e1d71c7471682a73aa1af10a6a2eb25b" Sep 30 12:42:33 crc kubenswrapper[4672]: I0930 12:42:33.820748 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"e9e19e97-1725-4846-93ef-b00bf092908b","Type":"ContainerStarted","Data":"d390c5513c53362b0bbb561fb769723bda7d8cb6b56b9a5115b0ac746024b521"} Sep 30 12:42:33 crc kubenswrapper[4672]: I0930 12:42:33.838064 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.838049807 podStartE2EDuration="2.838049807s" podCreationTimestamp="2025-09-30 12:42:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 12:42:33.829299324 +0000 UTC m=+1245.098536960" watchObservedRunningTime="2025-09-30 12:42:33.838049807 +0000 UTC m=+1245.107287453" Sep 30 12:42:33 crc kubenswrapper[4672]: I0930 12:42:33.849590 4672 scope.go:117] "RemoveContainer" containerID="312b39ae3736a98c292a0ddd0552e6d41d45ac1f475f54518554962adf2ab8e6" Sep 30 12:42:33 crc kubenswrapper[4672]: I0930 12:42:33.871389 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Sep 30 12:42:33 crc kubenswrapper[4672]: I0930 12:42:33.883075 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Sep 30 12:42:33 crc kubenswrapper[4672]: I0930 12:42:33.883219 4672 scope.go:117] "RemoveContainer" containerID="301195ab898db5d99ed204951efcb863e1d71c7471682a73aa1af10a6a2eb25b" Sep 30 12:42:33 crc kubenswrapper[4672]: E0930 12:42:33.883627 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"301195ab898db5d99ed204951efcb863e1d71c7471682a73aa1af10a6a2eb25b\": container with ID starting with 301195ab898db5d99ed204951efcb863e1d71c7471682a73aa1af10a6a2eb25b not found: ID does not exist" containerID="301195ab898db5d99ed204951efcb863e1d71c7471682a73aa1af10a6a2eb25b" Sep 30 12:42:33 crc kubenswrapper[4672]: I0930 12:42:33.883656 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"301195ab898db5d99ed204951efcb863e1d71c7471682a73aa1af10a6a2eb25b"} err="failed to get container status \"301195ab898db5d99ed204951efcb863e1d71c7471682a73aa1af10a6a2eb25b\": rpc error: code = NotFound desc = could not find container \"301195ab898db5d99ed204951efcb863e1d71c7471682a73aa1af10a6a2eb25b\": container with ID starting with 301195ab898db5d99ed204951efcb863e1d71c7471682a73aa1af10a6a2eb25b not found: ID does not exist" Sep 30 12:42:33 crc kubenswrapper[4672]: I0930 12:42:33.883675 4672 scope.go:117] "RemoveContainer" 
containerID="312b39ae3736a98c292a0ddd0552e6d41d45ac1f475f54518554962adf2ab8e6" Sep 30 12:42:33 crc kubenswrapper[4672]: E0930 12:42:33.883869 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"312b39ae3736a98c292a0ddd0552e6d41d45ac1f475f54518554962adf2ab8e6\": container with ID starting with 312b39ae3736a98c292a0ddd0552e6d41d45ac1f475f54518554962adf2ab8e6 not found: ID does not exist" containerID="312b39ae3736a98c292a0ddd0552e6d41d45ac1f475f54518554962adf2ab8e6" Sep 30 12:42:33 crc kubenswrapper[4672]: I0930 12:42:33.883972 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"312b39ae3736a98c292a0ddd0552e6d41d45ac1f475f54518554962adf2ab8e6"} err="failed to get container status \"312b39ae3736a98c292a0ddd0552e6d41d45ac1f475f54518554962adf2ab8e6\": rpc error: code = NotFound desc = could not find container \"312b39ae3736a98c292a0ddd0552e6d41d45ac1f475f54518554962adf2ab8e6\": container with ID starting with 312b39ae3736a98c292a0ddd0552e6d41d45ac1f475f54518554962adf2ab8e6 not found: ID does not exist" Sep 30 12:42:33 crc kubenswrapper[4672]: I0930 12:42:33.891293 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Sep 30 12:42:33 crc kubenswrapper[4672]: E0930 12:42:33.891701 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="555aaf1f-ce58-4bf6-bade-9424a52fda72" containerName="nova-api-log" Sep 30 12:42:33 crc kubenswrapper[4672]: I0930 12:42:33.891717 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="555aaf1f-ce58-4bf6-bade-9424a52fda72" containerName="nova-api-log" Sep 30 12:42:33 crc kubenswrapper[4672]: E0930 12:42:33.891761 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="555aaf1f-ce58-4bf6-bade-9424a52fda72" containerName="nova-api-api" Sep 30 12:42:33 crc kubenswrapper[4672]: I0930 12:42:33.891767 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="555aaf1f-ce58-4bf6-bade-9424a52fda72" containerName="nova-api-api" Sep 30 12:42:33 crc kubenswrapper[4672]: I0930 12:42:33.891965 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="555aaf1f-ce58-4bf6-bade-9424a52fda72" containerName="nova-api-api" Sep 30 12:42:33 crc kubenswrapper[4672]: I0930 12:42:33.892003 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="555aaf1f-ce58-4bf6-bade-9424a52fda72" containerName="nova-api-log" Sep 30 12:42:33 crc kubenswrapper[4672]: I0930 12:42:33.893186 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Sep 30 12:42:33 crc kubenswrapper[4672]: I0930 12:42:33.895099 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Sep 30 12:42:33 crc kubenswrapper[4672]: I0930 12:42:33.895479 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Sep 30 12:42:33 crc kubenswrapper[4672]: I0930 12:42:33.895734 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Sep 30 12:42:33 crc kubenswrapper[4672]: I0930 12:42:33.905859 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Sep 30 12:42:33 crc kubenswrapper[4672]: I0930 12:42:33.950861 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/77091ef9-bf9b-4b0b-aacd-c46a576974a8-public-tls-certs\") pod \"nova-api-0\" (UID: \"77091ef9-bf9b-4b0b-aacd-c46a576974a8\") " pod="openstack/nova-api-0" Sep 30 12:42:33 crc kubenswrapper[4672]: I0930 12:42:33.951191 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77091ef9-bf9b-4b0b-aacd-c46a576974a8-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"77091ef9-bf9b-4b0b-aacd-c46a576974a8\") " pod="openstack/nova-api-0" Sep 30 12:42:33 crc kubenswrapper[4672]: I0930 12:42:33.951317 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/77091ef9-bf9b-4b0b-aacd-c46a576974a8-internal-tls-certs\") pod \"nova-api-0\" (UID: \"77091ef9-bf9b-4b0b-aacd-c46a576974a8\") " pod="openstack/nova-api-0" Sep 30 12:42:33 crc kubenswrapper[4672]: I0930 12:42:33.952093 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/77091ef9-bf9b-4b0b-aacd-c46a576974a8-logs\") pod \"nova-api-0\" (UID: \"77091ef9-bf9b-4b0b-aacd-c46a576974a8\") " pod="openstack/nova-api-0" Sep 30 12:42:33 crc kubenswrapper[4672]: I0930 12:42:33.952282 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/77091ef9-bf9b-4b0b-aacd-c46a576974a8-config-data\") pod \"nova-api-0\" (UID: \"77091ef9-bf9b-4b0b-aacd-c46a576974a8\") " pod="openstack/nova-api-0" Sep 30 12:42:33 crc kubenswrapper[4672]: I0930 12:42:33.952421 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fqbgq\" (UniqueName: \"kubernetes.io/projected/77091ef9-bf9b-4b0b-aacd-c46a576974a8-kube-api-access-fqbgq\") pod \"nova-api-0\" (UID: \"77091ef9-bf9b-4b0b-aacd-c46a576974a8\") " pod="openstack/nova-api-0" Sep 30 12:42:34 crc kubenswrapper[4672]: I0930 12:42:34.054332 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77091ef9-bf9b-4b0b-aacd-c46a576974a8-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"77091ef9-bf9b-4b0b-aacd-c46a576974a8\") " pod="openstack/nova-api-0" Sep 30 12:42:34 crc kubenswrapper[4672]: I0930 12:42:34.054414 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/77091ef9-bf9b-4b0b-aacd-c46a576974a8-internal-tls-certs\") pod 
\"nova-api-0\" (UID: \"77091ef9-bf9b-4b0b-aacd-c46a576974a8\") " pod="openstack/nova-api-0" Sep 30 12:42:34 crc kubenswrapper[4672]: I0930 12:42:34.054494 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/77091ef9-bf9b-4b0b-aacd-c46a576974a8-logs\") pod \"nova-api-0\" (UID: \"77091ef9-bf9b-4b0b-aacd-c46a576974a8\") " pod="openstack/nova-api-0" Sep 30 12:42:34 crc kubenswrapper[4672]: I0930 12:42:34.054550 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/77091ef9-bf9b-4b0b-aacd-c46a576974a8-config-data\") pod \"nova-api-0\" (UID: \"77091ef9-bf9b-4b0b-aacd-c46a576974a8\") " pod="openstack/nova-api-0" Sep 30 12:42:34 crc kubenswrapper[4672]: I0930 12:42:34.054624 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fqbgq\" (UniqueName: \"kubernetes.io/projected/77091ef9-bf9b-4b0b-aacd-c46a576974a8-kube-api-access-fqbgq\") pod \"nova-api-0\" (UID: \"77091ef9-bf9b-4b0b-aacd-c46a576974a8\") " pod="openstack/nova-api-0" Sep 30 12:42:34 crc kubenswrapper[4672]: I0930 12:42:34.055403 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/77091ef9-bf9b-4b0b-aacd-c46a576974a8-logs\") pod \"nova-api-0\" (UID: \"77091ef9-bf9b-4b0b-aacd-c46a576974a8\") " pod="openstack/nova-api-0" Sep 30 12:42:34 crc kubenswrapper[4672]: I0930 12:42:34.055616 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/77091ef9-bf9b-4b0b-aacd-c46a576974a8-public-tls-certs\") pod \"nova-api-0\" (UID: \"77091ef9-bf9b-4b0b-aacd-c46a576974a8\") " pod="openstack/nova-api-0" Sep 30 12:42:34 crc kubenswrapper[4672]: I0930 12:42:34.059806 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/77091ef9-bf9b-4b0b-aacd-c46a576974a8-config-data\") pod \"nova-api-0\" (UID: \"77091ef9-bf9b-4b0b-aacd-c46a576974a8\") " pod="openstack/nova-api-0" Sep 30 12:42:34 crc kubenswrapper[4672]: I0930 12:42:34.059883 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77091ef9-bf9b-4b0b-aacd-c46a576974a8-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"77091ef9-bf9b-4b0b-aacd-c46a576974a8\") " pod="openstack/nova-api-0" Sep 30 12:42:34 crc kubenswrapper[4672]: I0930 12:42:34.061946 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/77091ef9-bf9b-4b0b-aacd-c46a576974a8-public-tls-certs\") pod \"nova-api-0\" (UID: \"77091ef9-bf9b-4b0b-aacd-c46a576974a8\") " pod="openstack/nova-api-0" Sep 30 12:42:34 crc kubenswrapper[4672]: I0930 12:42:34.064173 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/77091ef9-bf9b-4b0b-aacd-c46a576974a8-internal-tls-certs\") pod \"nova-api-0\" (UID: \"77091ef9-bf9b-4b0b-aacd-c46a576974a8\") " pod="openstack/nova-api-0" Sep 30 12:42:34 crc kubenswrapper[4672]: I0930 12:42:34.071146 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fqbgq\" (UniqueName: \"kubernetes.io/projected/77091ef9-bf9b-4b0b-aacd-c46a576974a8-kube-api-access-fqbgq\") pod \"nova-api-0\" (UID: \"77091ef9-bf9b-4b0b-aacd-c46a576974a8\") " 
pod="openstack/nova-api-0" Sep 30 12:42:34 crc kubenswrapper[4672]: I0930 12:42:34.219050 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Sep 30 12:42:34 crc kubenswrapper[4672]: I0930 12:42:34.685629 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Sep 30 12:42:34 crc kubenswrapper[4672]: I0930 12:42:34.834679 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"e9e19e97-1725-4846-93ef-b00bf092908b","Type":"ContainerStarted","Data":"58ab9b9c3533088ad5c3259f9f6c1e3b788132ddce51c4d191f166db7007cd77"} Sep 30 12:42:34 crc kubenswrapper[4672]: I0930 12:42:34.838067 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"77091ef9-bf9b-4b0b-aacd-c46a576974a8","Type":"ContainerStarted","Data":"9a239f62b12779fd5e67b2a469fcc61d60cc83dad7c3f9991b59a5fbf682230a"} Sep 30 12:42:34 crc kubenswrapper[4672]: I0930 12:42:34.856698 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.856678873 podStartE2EDuration="2.856678873s" podCreationTimestamp="2025-09-30 12:42:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 12:42:34.852415854 +0000 UTC m=+1246.121653540" watchObservedRunningTime="2025-09-30 12:42:34.856678873 +0000 UTC m=+1246.125916529" Sep 30 12:42:35 crc kubenswrapper[4672]: I0930 12:42:35.428072 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="555aaf1f-ce58-4bf6-bade-9424a52fda72" path="/var/lib/kubelet/pods/555aaf1f-ce58-4bf6-bade-9424a52fda72/volumes" Sep 30 12:42:35 crc kubenswrapper[4672]: I0930 12:42:35.852409 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"77091ef9-bf9b-4b0b-aacd-c46a576974a8","Type":"ContainerStarted","Data":"bb34bdacae36eea9c6b159b89fda734ec27ce9a539e78112946c5af863b6d831"} Sep 30 12:42:35 crc kubenswrapper[4672]: I0930 12:42:35.852488 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"77091ef9-bf9b-4b0b-aacd-c46a576974a8","Type":"ContainerStarted","Data":"55451a4c109cd44ec4b3fa0d7a8f70b5dbca069e68ac429f72c19ea6bee79325"} Sep 30 12:42:35 crc kubenswrapper[4672]: I0930 12:42:35.880347 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.880326968 podStartE2EDuration="2.880326968s" podCreationTimestamp="2025-09-30 12:42:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 12:42:35.867215813 +0000 UTC m=+1247.136453499" watchObservedRunningTime="2025-09-30 12:42:35.880326968 +0000 UTC m=+1247.149564624" Sep 30 12:42:37 crc kubenswrapper[4672]: I0930 12:42:37.285741 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Sep 30 12:42:37 crc kubenswrapper[4672]: I0930 12:42:37.286067 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Sep 30 12:42:38 crc kubenswrapper[4672]: I0930 12:42:38.248010 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Sep 30 12:42:42 crc kubenswrapper[4672]: I0930 12:42:42.286338 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openstack/nova-metadata-0" Sep 30 12:42:42 crc kubenswrapper[4672]: I0930 12:42:42.287152 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Sep 30 12:42:43 crc kubenswrapper[4672]: I0930 12:42:43.247440 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Sep 30 12:42:43 crc kubenswrapper[4672]: I0930 12:42:43.281117 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Sep 30 12:42:43 crc kubenswrapper[4672]: I0930 12:42:43.303429 4672 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="db8c9818-e7bc-471f-b7d6-b097f3657451" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.220:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Sep 30 12:42:43 crc kubenswrapper[4672]: I0930 12:42:43.303486 4672 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="db8c9818-e7bc-471f-b7d6-b097f3657451" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.220:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Sep 30 12:42:43 crc kubenswrapper[4672]: I0930 12:42:43.965458 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Sep 30 12:42:44 crc kubenswrapper[4672]: I0930 12:42:44.219525 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Sep 30 12:42:44 crc kubenswrapper[4672]: I0930 12:42:44.219591 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Sep 30 12:42:45 crc kubenswrapper[4672]: I0930 12:42:45.232492 4672 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="77091ef9-bf9b-4b0b-aacd-c46a576974a8" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.222:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Sep 30 12:42:45 crc kubenswrapper[4672]: I0930 12:42:45.232492 4672 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="77091ef9-bf9b-4b0b-aacd-c46a576974a8" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.222:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Sep 30 12:42:47 crc kubenswrapper[4672]: I0930 12:42:47.170750 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Sep 30 12:42:50 crc kubenswrapper[4672]: I0930 12:42:50.796883 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Sep 30 12:42:50 crc kubenswrapper[4672]: I0930 12:42:50.797398 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="8227da12-ad04-4956-bc6e-8bc6b49475a4" containerName="kube-state-metrics" containerID="cri-o://095606d69855d83f14140aae3133c63d23ffb62d9965c96fd951803a6a876cb9" gracePeriod=30 Sep 30 12:42:51 crc kubenswrapper[4672]: I0930 12:42:51.020813 4672 generic.go:334] "Generic (PLEG): container finished" podID="8227da12-ad04-4956-bc6e-8bc6b49475a4" containerID="095606d69855d83f14140aae3133c63d23ffb62d9965c96fd951803a6a876cb9" exitCode=2 Sep 30 12:42:51 crc kubenswrapper[4672]: I0930 12:42:51.020863 4672 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"8227da12-ad04-4956-bc6e-8bc6b49475a4","Type":"ContainerDied","Data":"095606d69855d83f14140aae3133c63d23ffb62d9965c96fd951803a6a876cb9"} Sep 30 12:42:51 crc kubenswrapper[4672]: I0930 12:42:51.295556 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Sep 30 12:42:51 crc kubenswrapper[4672]: I0930 12:42:51.440675 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lt6cg\" (UniqueName: \"kubernetes.io/projected/8227da12-ad04-4956-bc6e-8bc6b49475a4-kube-api-access-lt6cg\") pod \"8227da12-ad04-4956-bc6e-8bc6b49475a4\" (UID: \"8227da12-ad04-4956-bc6e-8bc6b49475a4\") " Sep 30 12:42:51 crc kubenswrapper[4672]: I0930 12:42:51.447980 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8227da12-ad04-4956-bc6e-8bc6b49475a4-kube-api-access-lt6cg" (OuterVolumeSpecName: "kube-api-access-lt6cg") pod "8227da12-ad04-4956-bc6e-8bc6b49475a4" (UID: "8227da12-ad04-4956-bc6e-8bc6b49475a4"). InnerVolumeSpecName "kube-api-access-lt6cg". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 12:42:51 crc kubenswrapper[4672]: I0930 12:42:51.542918 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lt6cg\" (UniqueName: \"kubernetes.io/projected/8227da12-ad04-4956-bc6e-8bc6b49475a4-kube-api-access-lt6cg\") on node \"crc\" DevicePath \"\"" Sep 30 12:42:52 crc kubenswrapper[4672]: I0930 12:42:52.031588 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"8227da12-ad04-4956-bc6e-8bc6b49475a4","Type":"ContainerDied","Data":"f7e81a5f7deebbf74d2f25fc4719169d6b25af9e4b802e956181f9d95885dfb4"} Sep 30 12:42:52 crc kubenswrapper[4672]: I0930 12:42:52.031635 4672 scope.go:117] "RemoveContainer" containerID="095606d69855d83f14140aae3133c63d23ffb62d9965c96fd951803a6a876cb9" Sep 30 12:42:52 crc kubenswrapper[4672]: I0930 12:42:52.033986 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Sep 30 12:42:52 crc kubenswrapper[4672]: I0930 12:42:52.087680 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Sep 30 12:42:52 crc kubenswrapper[4672]: I0930 12:42:52.096762 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Sep 30 12:42:52 crc kubenswrapper[4672]: I0930 12:42:52.104248 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Sep 30 12:42:52 crc kubenswrapper[4672]: E0930 12:42:52.104828 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8227da12-ad04-4956-bc6e-8bc6b49475a4" containerName="kube-state-metrics" Sep 30 12:42:52 crc kubenswrapper[4672]: I0930 12:42:52.104845 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="8227da12-ad04-4956-bc6e-8bc6b49475a4" containerName="kube-state-metrics" Sep 30 12:42:52 crc kubenswrapper[4672]: I0930 12:42:52.105193 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="8227da12-ad04-4956-bc6e-8bc6b49475a4" containerName="kube-state-metrics" Sep 30 12:42:52 crc kubenswrapper[4672]: I0930 12:42:52.106133 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Sep 30 12:42:52 crc kubenswrapper[4672]: I0930 12:42:52.110018 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Sep 30 12:42:52 crc kubenswrapper[4672]: I0930 12:42:52.110297 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Sep 30 12:42:52 crc kubenswrapper[4672]: I0930 12:42:52.113375 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Sep 30 12:42:52 crc kubenswrapper[4672]: I0930 12:42:52.260355 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/32acef5a-c440-4574-9a53-18754f15acc6-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"32acef5a-c440-4574-9a53-18754f15acc6\") " pod="openstack/kube-state-metrics-0" Sep 30 12:42:52 crc kubenswrapper[4672]: I0930 12:42:52.260684 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d9hqm\" (UniqueName: \"kubernetes.io/projected/32acef5a-c440-4574-9a53-18754f15acc6-kube-api-access-d9hqm\") pod \"kube-state-metrics-0\" (UID: \"32acef5a-c440-4574-9a53-18754f15acc6\") " pod="openstack/kube-state-metrics-0" Sep 30 12:42:52 crc kubenswrapper[4672]: I0930 12:42:52.260762 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32acef5a-c440-4574-9a53-18754f15acc6-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"32acef5a-c440-4574-9a53-18754f15acc6\") " pod="openstack/kube-state-metrics-0" Sep 30 12:42:52 crc kubenswrapper[4672]: I0930 12:42:52.260818 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/32acef5a-c440-4574-9a53-18754f15acc6-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"32acef5a-c440-4574-9a53-18754f15acc6\") " pod="openstack/kube-state-metrics-0" Sep 30 12:42:52 crc kubenswrapper[4672]: I0930 12:42:52.292670 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Sep 30 12:42:52 crc kubenswrapper[4672]: I0930 12:42:52.293572 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Sep 30 12:42:52 crc kubenswrapper[4672]: I0930 12:42:52.299906 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Sep 30 12:42:52 crc kubenswrapper[4672]: I0930 12:42:52.363015 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/32acef5a-c440-4574-9a53-18754f15acc6-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"32acef5a-c440-4574-9a53-18754f15acc6\") " pod="openstack/kube-state-metrics-0" Sep 30 12:42:52 crc kubenswrapper[4672]: I0930 12:42:52.363161 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/32acef5a-c440-4574-9a53-18754f15acc6-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"32acef5a-c440-4574-9a53-18754f15acc6\") " pod="openstack/kube-state-metrics-0" Sep 30 12:42:52 crc kubenswrapper[4672]: I0930 
12:42:52.363212 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d9hqm\" (UniqueName: \"kubernetes.io/projected/32acef5a-c440-4574-9a53-18754f15acc6-kube-api-access-d9hqm\") pod \"kube-state-metrics-0\" (UID: \"32acef5a-c440-4574-9a53-18754f15acc6\") " pod="openstack/kube-state-metrics-0" Sep 30 12:42:52 crc kubenswrapper[4672]: I0930 12:42:52.363283 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32acef5a-c440-4574-9a53-18754f15acc6-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"32acef5a-c440-4574-9a53-18754f15acc6\") " pod="openstack/kube-state-metrics-0" Sep 30 12:42:52 crc kubenswrapper[4672]: I0930 12:42:52.369134 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/32acef5a-c440-4574-9a53-18754f15acc6-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"32acef5a-c440-4574-9a53-18754f15acc6\") " pod="openstack/kube-state-metrics-0" Sep 30 12:42:52 crc kubenswrapper[4672]: I0930 12:42:52.369499 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/32acef5a-c440-4574-9a53-18754f15acc6-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"32acef5a-c440-4574-9a53-18754f15acc6\") " pod="openstack/kube-state-metrics-0" Sep 30 12:42:52 crc kubenswrapper[4672]: I0930 12:42:52.369920 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32acef5a-c440-4574-9a53-18754f15acc6-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"32acef5a-c440-4574-9a53-18754f15acc6\") " pod="openstack/kube-state-metrics-0" Sep 30 12:42:52 crc kubenswrapper[4672]: I0930 12:42:52.387042 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d9hqm\" (UniqueName: \"kubernetes.io/projected/32acef5a-c440-4574-9a53-18754f15acc6-kube-api-access-d9hqm\") pod \"kube-state-metrics-0\" (UID: \"32acef5a-c440-4574-9a53-18754f15acc6\") " pod="openstack/kube-state-metrics-0" Sep 30 12:42:52 crc kubenswrapper[4672]: I0930 12:42:52.435881 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Sep 30 12:42:52 crc kubenswrapper[4672]: I0930 12:42:52.782643 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Sep 30 12:42:52 crc kubenswrapper[4672]: I0930 12:42:52.783414 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4301988f-bce9-466c-aff1-88d33d37a9cd" containerName="ceilometer-central-agent" containerID="cri-o://4f71f37b27aa15dc7ac9c333178f5322ed9af961d05739baea2738149c5d62f6" gracePeriod=30 Sep 30 12:42:52 crc kubenswrapper[4672]: I0930 12:42:52.783487 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4301988f-bce9-466c-aff1-88d33d37a9cd" containerName="proxy-httpd" containerID="cri-o://ea3e9e574077158ada36b43bf98401faf2e3d0690fb9256caf14981e0fcf2dbb" gracePeriod=30 Sep 30 12:42:52 crc kubenswrapper[4672]: I0930 12:42:52.783414 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4301988f-bce9-466c-aff1-88d33d37a9cd" containerName="sg-core" containerID="cri-o://a147b49ca00be20407da7f1d8104ae66a59b7b7ec4762b75d66a6ae096807c4d" gracePeriod=30 Sep 30 12:42:52 crc kubenswrapper[4672]: I0930 12:42:52.783487 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4301988f-bce9-466c-aff1-88d33d37a9cd" containerName="ceilometer-notification-agent" containerID="cri-o://f5005a50b2dd647d69fedc86c99a8aeb6afa84fb8dfb0fb273380dad6efec93d" gracePeriod=30 Sep 30 12:42:52 crc kubenswrapper[4672]: I0930 12:42:52.909095 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Sep 30 12:42:52 crc kubenswrapper[4672]: W0930 12:42:52.918615 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod32acef5a_c440_4574_9a53_18754f15acc6.slice/crio-95c187e9a57c6237f0d19ca1965146be53e9c5e120722e7bf7c824b1168eb684 WatchSource:0}: Error finding container 95c187e9a57c6237f0d19ca1965146be53e9c5e120722e7bf7c824b1168eb684: Status 404 returned error can't find the container with id 95c187e9a57c6237f0d19ca1965146be53e9c5e120722e7bf7c824b1168eb684 Sep 30 12:42:53 crc kubenswrapper[4672]: I0930 12:42:53.043034 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"32acef5a-c440-4574-9a53-18754f15acc6","Type":"ContainerStarted","Data":"95c187e9a57c6237f0d19ca1965146be53e9c5e120722e7bf7c824b1168eb684"} Sep 30 12:42:53 crc kubenswrapper[4672]: I0930 12:42:53.045875 4672 generic.go:334] "Generic (PLEG): container finished" podID="4301988f-bce9-466c-aff1-88d33d37a9cd" containerID="ea3e9e574077158ada36b43bf98401faf2e3d0690fb9256caf14981e0fcf2dbb" exitCode=0 Sep 30 12:42:53 crc kubenswrapper[4672]: I0930 12:42:53.045906 4672 generic.go:334] "Generic (PLEG): container finished" podID="4301988f-bce9-466c-aff1-88d33d37a9cd" containerID="a147b49ca00be20407da7f1d8104ae66a59b7b7ec4762b75d66a6ae096807c4d" exitCode=2 Sep 30 12:42:53 crc kubenswrapper[4672]: I0930 12:42:53.045937 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4301988f-bce9-466c-aff1-88d33d37a9cd","Type":"ContainerDied","Data":"ea3e9e574077158ada36b43bf98401faf2e3d0690fb9256caf14981e0fcf2dbb"} Sep 30 12:42:53 crc kubenswrapper[4672]: I0930 12:42:53.045999 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceilometer-0" event={"ID":"4301988f-bce9-466c-aff1-88d33d37a9cd","Type":"ContainerDied","Data":"a147b49ca00be20407da7f1d8104ae66a59b7b7ec4762b75d66a6ae096807c4d"} Sep 30 12:42:53 crc kubenswrapper[4672]: I0930 12:42:53.052256 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Sep 30 12:42:53 crc kubenswrapper[4672]: I0930 12:42:53.428889 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8227da12-ad04-4956-bc6e-8bc6b49475a4" path="/var/lib/kubelet/pods/8227da12-ad04-4956-bc6e-8bc6b49475a4/volumes" Sep 30 12:42:54 crc kubenswrapper[4672]: I0930 12:42:54.056565 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"32acef5a-c440-4574-9a53-18754f15acc6","Type":"ContainerStarted","Data":"50b7c48ac97337e367746c8a693521426bbe589f2f1990e667035e259f8afb77"} Sep 30 12:42:54 crc kubenswrapper[4672]: I0930 12:42:54.057222 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Sep 30 12:42:54 crc kubenswrapper[4672]: I0930 12:42:54.060298 4672 generic.go:334] "Generic (PLEG): container finished" podID="4301988f-bce9-466c-aff1-88d33d37a9cd" containerID="4f71f37b27aa15dc7ac9c333178f5322ed9af961d05739baea2738149c5d62f6" exitCode=0 Sep 30 12:42:54 crc kubenswrapper[4672]: I0930 12:42:54.060799 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4301988f-bce9-466c-aff1-88d33d37a9cd","Type":"ContainerDied","Data":"4f71f37b27aa15dc7ac9c333178f5322ed9af961d05739baea2738149c5d62f6"} Sep 30 12:42:54 crc kubenswrapper[4672]: I0930 12:42:54.238644 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Sep 30 12:42:54 crc kubenswrapper[4672]: I0930 12:42:54.240049 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Sep 30 12:42:54 crc kubenswrapper[4672]: I0930 12:42:54.248191 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Sep 30 12:42:54 crc kubenswrapper[4672]: I0930 12:42:54.254918 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Sep 30 12:42:54 crc kubenswrapper[4672]: I0930 12:42:54.265497 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=1.8909607720000001 podStartE2EDuration="2.265475974s" podCreationTimestamp="2025-09-30 12:42:52 +0000 UTC" firstStartedPulling="2025-09-30 12:42:52.924760473 +0000 UTC m=+1264.193998129" lastFinishedPulling="2025-09-30 12:42:53.299275685 +0000 UTC m=+1264.568513331" observedRunningTime="2025-09-30 12:42:54.074843566 +0000 UTC m=+1265.344081252" watchObservedRunningTime="2025-09-30 12:42:54.265475974 +0000 UTC m=+1265.534713620" Sep 30 12:42:54 crc kubenswrapper[4672]: I0930 12:42:54.575723 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Sep 30 12:42:54 crc kubenswrapper[4672]: I0930 12:42:54.708223 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4301988f-bce9-466c-aff1-88d33d37a9cd-sg-core-conf-yaml\") pod \"4301988f-bce9-466c-aff1-88d33d37a9cd\" (UID: \"4301988f-bce9-466c-aff1-88d33d37a9cd\") " Sep 30 12:42:54 crc kubenswrapper[4672]: I0930 12:42:54.708316 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4301988f-bce9-466c-aff1-88d33d37a9cd-log-httpd\") pod \"4301988f-bce9-466c-aff1-88d33d37a9cd\" (UID: \"4301988f-bce9-466c-aff1-88d33d37a9cd\") " Sep 30 12:42:54 crc kubenswrapper[4672]: I0930 12:42:54.708333 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4301988f-bce9-466c-aff1-88d33d37a9cd-config-data\") pod \"4301988f-bce9-466c-aff1-88d33d37a9cd\" (UID: \"4301988f-bce9-466c-aff1-88d33d37a9cd\") " Sep 30 12:42:54 crc kubenswrapper[4672]: I0930 12:42:54.708411 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4301988f-bce9-466c-aff1-88d33d37a9cd-run-httpd\") pod \"4301988f-bce9-466c-aff1-88d33d37a9cd\" (UID: \"4301988f-bce9-466c-aff1-88d33d37a9cd\") " Sep 30 12:42:54 crc kubenswrapper[4672]: I0930 12:42:54.708484 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4301988f-bce9-466c-aff1-88d33d37a9cd-combined-ca-bundle\") pod \"4301988f-bce9-466c-aff1-88d33d37a9cd\" (UID: \"4301988f-bce9-466c-aff1-88d33d37a9cd\") " Sep 30 12:42:54 crc kubenswrapper[4672]: I0930 12:42:54.708561 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tgm7c\" (UniqueName: \"kubernetes.io/projected/4301988f-bce9-466c-aff1-88d33d37a9cd-kube-api-access-tgm7c\") pod \"4301988f-bce9-466c-aff1-88d33d37a9cd\" (UID: \"4301988f-bce9-466c-aff1-88d33d37a9cd\") " Sep 30 12:42:54 crc kubenswrapper[4672]: I0930 12:42:54.708677 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4301988f-bce9-466c-aff1-88d33d37a9cd-scripts\") pod \"4301988f-bce9-466c-aff1-88d33d37a9cd\" (UID: \"4301988f-bce9-466c-aff1-88d33d37a9cd\") " Sep 30 12:42:54 crc kubenswrapper[4672]: I0930 12:42:54.708827 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4301988f-bce9-466c-aff1-88d33d37a9cd-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "4301988f-bce9-466c-aff1-88d33d37a9cd" (UID: "4301988f-bce9-466c-aff1-88d33d37a9cd"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 12:42:54 crc kubenswrapper[4672]: I0930 12:42:54.709410 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4301988f-bce9-466c-aff1-88d33d37a9cd-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "4301988f-bce9-466c-aff1-88d33d37a9cd" (UID: "4301988f-bce9-466c-aff1-88d33d37a9cd"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 12:42:54 crc kubenswrapper[4672]: I0930 12:42:54.709576 4672 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4301988f-bce9-466c-aff1-88d33d37a9cd-log-httpd\") on node \"crc\" DevicePath \"\"" Sep 30 12:42:54 crc kubenswrapper[4672]: I0930 12:42:54.709596 4672 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4301988f-bce9-466c-aff1-88d33d37a9cd-run-httpd\") on node \"crc\" DevicePath \"\"" Sep 30 12:42:54 crc kubenswrapper[4672]: I0930 12:42:54.714749 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4301988f-bce9-466c-aff1-88d33d37a9cd-kube-api-access-tgm7c" (OuterVolumeSpecName: "kube-api-access-tgm7c") pod "4301988f-bce9-466c-aff1-88d33d37a9cd" (UID: "4301988f-bce9-466c-aff1-88d33d37a9cd"). InnerVolumeSpecName "kube-api-access-tgm7c". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 12:42:54 crc kubenswrapper[4672]: I0930 12:42:54.730599 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4301988f-bce9-466c-aff1-88d33d37a9cd-scripts" (OuterVolumeSpecName: "scripts") pod "4301988f-bce9-466c-aff1-88d33d37a9cd" (UID: "4301988f-bce9-466c-aff1-88d33d37a9cd"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:42:54 crc kubenswrapper[4672]: I0930 12:42:54.737565 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4301988f-bce9-466c-aff1-88d33d37a9cd-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "4301988f-bce9-466c-aff1-88d33d37a9cd" (UID: "4301988f-bce9-466c-aff1-88d33d37a9cd"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:42:54 crc kubenswrapper[4672]: I0930 12:42:54.789651 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4301988f-bce9-466c-aff1-88d33d37a9cd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4301988f-bce9-466c-aff1-88d33d37a9cd" (UID: "4301988f-bce9-466c-aff1-88d33d37a9cd"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:42:54 crc kubenswrapper[4672]: I0930 12:42:54.810997 4672 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4301988f-bce9-466c-aff1-88d33d37a9cd-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 12:42:54 crc kubenswrapper[4672]: I0930 12:42:54.811037 4672 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4301988f-bce9-466c-aff1-88d33d37a9cd-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Sep 30 12:42:54 crc kubenswrapper[4672]: I0930 12:42:54.811051 4672 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4301988f-bce9-466c-aff1-88d33d37a9cd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 12:42:54 crc kubenswrapper[4672]: I0930 12:42:54.811063 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tgm7c\" (UniqueName: \"kubernetes.io/projected/4301988f-bce9-466c-aff1-88d33d37a9cd-kube-api-access-tgm7c\") on node \"crc\" DevicePath \"\"" Sep 30 12:42:54 crc kubenswrapper[4672]: I0930 12:42:54.823684 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4301988f-bce9-466c-aff1-88d33d37a9cd-config-data" (OuterVolumeSpecName: "config-data") pod "4301988f-bce9-466c-aff1-88d33d37a9cd" (UID: "4301988f-bce9-466c-aff1-88d33d37a9cd"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:42:54 crc kubenswrapper[4672]: I0930 12:42:54.913330 4672 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4301988f-bce9-466c-aff1-88d33d37a9cd-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 12:42:55 crc kubenswrapper[4672]: I0930 12:42:55.075364 4672 generic.go:334] "Generic (PLEG): container finished" podID="4301988f-bce9-466c-aff1-88d33d37a9cd" containerID="f5005a50b2dd647d69fedc86c99a8aeb6afa84fb8dfb0fb273380dad6efec93d" exitCode=0 Sep 30 12:42:55 crc kubenswrapper[4672]: I0930 12:42:55.075455 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4301988f-bce9-466c-aff1-88d33d37a9cd","Type":"ContainerDied","Data":"f5005a50b2dd647d69fedc86c99a8aeb6afa84fb8dfb0fb273380dad6efec93d"} Sep 30 12:42:55 crc kubenswrapper[4672]: I0930 12:42:55.075502 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Sep 30 12:42:55 crc kubenswrapper[4672]: I0930 12:42:55.075536 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4301988f-bce9-466c-aff1-88d33d37a9cd","Type":"ContainerDied","Data":"a284d04037fc82b693d6cf7811dc2baa7ee7acf23c85bc38fc4e0521c7872e59"} Sep 30 12:42:55 crc kubenswrapper[4672]: I0930 12:42:55.075579 4672 scope.go:117] "RemoveContainer" containerID="ea3e9e574077158ada36b43bf98401faf2e3d0690fb9256caf14981e0fcf2dbb" Sep 30 12:42:55 crc kubenswrapper[4672]: I0930 12:42:55.075932 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Sep 30 12:42:55 crc kubenswrapper[4672]: I0930 12:42:55.092899 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Sep 30 12:42:55 crc kubenswrapper[4672]: I0930 12:42:55.134169 4672 scope.go:117] "RemoveContainer" containerID="a147b49ca00be20407da7f1d8104ae66a59b7b7ec4762b75d66a6ae096807c4d" Sep 30 12:42:55 crc kubenswrapper[4672]: I0930 12:42:55.180902 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Sep 30 12:42:55 crc kubenswrapper[4672]: I0930 12:42:55.186987 4672 scope.go:117] "RemoveContainer" containerID="f5005a50b2dd647d69fedc86c99a8aeb6afa84fb8dfb0fb273380dad6efec93d" Sep 30 12:42:55 crc kubenswrapper[4672]: I0930 12:42:55.205415 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Sep 30 12:42:55 crc kubenswrapper[4672]: I0930 12:42:55.224769 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Sep 30 12:42:55 crc kubenswrapper[4672]: E0930 12:42:55.226624 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4301988f-bce9-466c-aff1-88d33d37a9cd" containerName="ceilometer-notification-agent" Sep 30 12:42:55 crc kubenswrapper[4672]: I0930 12:42:55.226702 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="4301988f-bce9-466c-aff1-88d33d37a9cd" containerName="ceilometer-notification-agent" Sep 30 12:42:55 crc kubenswrapper[4672]: E0930 12:42:55.226805 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4301988f-bce9-466c-aff1-88d33d37a9cd" containerName="sg-core" Sep 30 12:42:55 crc kubenswrapper[4672]: I0930 12:42:55.226861 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="4301988f-bce9-466c-aff1-88d33d37a9cd" containerName="sg-core" Sep 30 12:42:55 crc kubenswrapper[4672]: E0930 12:42:55.226920 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4301988f-bce9-466c-aff1-88d33d37a9cd" containerName="proxy-httpd" Sep 30 12:42:55 crc kubenswrapper[4672]: I0930 12:42:55.226972 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="4301988f-bce9-466c-aff1-88d33d37a9cd" containerName="proxy-httpd" Sep 30 12:42:55 crc kubenswrapper[4672]: E0930 12:42:55.227042 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4301988f-bce9-466c-aff1-88d33d37a9cd" containerName="ceilometer-central-agent" Sep 30 12:42:55 crc kubenswrapper[4672]: I0930 12:42:55.227107 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="4301988f-bce9-466c-aff1-88d33d37a9cd" containerName="ceilometer-central-agent" Sep 30 12:42:55 crc kubenswrapper[4672]: I0930 12:42:55.227366 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="4301988f-bce9-466c-aff1-88d33d37a9cd" containerName="sg-core" Sep 30 12:42:55 crc kubenswrapper[4672]: I0930 12:42:55.227426 4672 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="4301988f-bce9-466c-aff1-88d33d37a9cd" containerName="ceilometer-notification-agent" Sep 30 12:42:55 crc kubenswrapper[4672]: I0930 12:42:55.227481 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="4301988f-bce9-466c-aff1-88d33d37a9cd" containerName="ceilometer-central-agent" Sep 30 12:42:55 crc kubenswrapper[4672]: I0930 12:42:55.227548 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="4301988f-bce9-466c-aff1-88d33d37a9cd" containerName="proxy-httpd" Sep 30 12:42:55 crc kubenswrapper[4672]: I0930 12:42:55.233795 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Sep 30 12:42:55 crc kubenswrapper[4672]: I0930 12:42:55.239113 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Sep 30 12:42:55 crc kubenswrapper[4672]: I0930 12:42:55.240836 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Sep 30 12:42:55 crc kubenswrapper[4672]: I0930 12:42:55.241848 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Sep 30 12:42:55 crc kubenswrapper[4672]: I0930 12:42:55.241920 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Sep 30 12:42:55 crc kubenswrapper[4672]: I0930 12:42:55.242840 4672 scope.go:117] "RemoveContainer" containerID="4f71f37b27aa15dc7ac9c333178f5322ed9af961d05739baea2738149c5d62f6" Sep 30 12:42:55 crc kubenswrapper[4672]: I0930 12:42:55.280903 4672 scope.go:117] "RemoveContainer" containerID="ea3e9e574077158ada36b43bf98401faf2e3d0690fb9256caf14981e0fcf2dbb" Sep 30 12:42:55 crc kubenswrapper[4672]: E0930 12:42:55.282257 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ea3e9e574077158ada36b43bf98401faf2e3d0690fb9256caf14981e0fcf2dbb\": container with ID starting with ea3e9e574077158ada36b43bf98401faf2e3d0690fb9256caf14981e0fcf2dbb not found: ID does not exist" containerID="ea3e9e574077158ada36b43bf98401faf2e3d0690fb9256caf14981e0fcf2dbb" Sep 30 12:42:55 crc kubenswrapper[4672]: I0930 12:42:55.282316 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea3e9e574077158ada36b43bf98401faf2e3d0690fb9256caf14981e0fcf2dbb"} err="failed to get container status \"ea3e9e574077158ada36b43bf98401faf2e3d0690fb9256caf14981e0fcf2dbb\": rpc error: code = NotFound desc = could not find container \"ea3e9e574077158ada36b43bf98401faf2e3d0690fb9256caf14981e0fcf2dbb\": container with ID starting with ea3e9e574077158ada36b43bf98401faf2e3d0690fb9256caf14981e0fcf2dbb not found: ID does not exist" Sep 30 12:42:55 crc kubenswrapper[4672]: I0930 12:42:55.282343 4672 scope.go:117] "RemoveContainer" containerID="a147b49ca00be20407da7f1d8104ae66a59b7b7ec4762b75d66a6ae096807c4d" Sep 30 12:42:55 crc kubenswrapper[4672]: E0930 12:42:55.282867 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a147b49ca00be20407da7f1d8104ae66a59b7b7ec4762b75d66a6ae096807c4d\": container with ID starting with a147b49ca00be20407da7f1d8104ae66a59b7b7ec4762b75d66a6ae096807c4d not found: ID does not exist" containerID="a147b49ca00be20407da7f1d8104ae66a59b7b7ec4762b75d66a6ae096807c4d" Sep 30 12:42:55 crc kubenswrapper[4672]: I0930 12:42:55.282918 4672 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"a147b49ca00be20407da7f1d8104ae66a59b7b7ec4762b75d66a6ae096807c4d"} err="failed to get container status \"a147b49ca00be20407da7f1d8104ae66a59b7b7ec4762b75d66a6ae096807c4d\": rpc error: code = NotFound desc = could not find container \"a147b49ca00be20407da7f1d8104ae66a59b7b7ec4762b75d66a6ae096807c4d\": container with ID starting with a147b49ca00be20407da7f1d8104ae66a59b7b7ec4762b75d66a6ae096807c4d not found: ID does not exist" Sep 30 12:42:55 crc kubenswrapper[4672]: I0930 12:42:55.282936 4672 scope.go:117] "RemoveContainer" containerID="f5005a50b2dd647d69fedc86c99a8aeb6afa84fb8dfb0fb273380dad6efec93d" Sep 30 12:42:55 crc kubenswrapper[4672]: E0930 12:42:55.283425 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f5005a50b2dd647d69fedc86c99a8aeb6afa84fb8dfb0fb273380dad6efec93d\": container with ID starting with f5005a50b2dd647d69fedc86c99a8aeb6afa84fb8dfb0fb273380dad6efec93d not found: ID does not exist" containerID="f5005a50b2dd647d69fedc86c99a8aeb6afa84fb8dfb0fb273380dad6efec93d" Sep 30 12:42:55 crc kubenswrapper[4672]: I0930 12:42:55.283477 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f5005a50b2dd647d69fedc86c99a8aeb6afa84fb8dfb0fb273380dad6efec93d"} err="failed to get container status \"f5005a50b2dd647d69fedc86c99a8aeb6afa84fb8dfb0fb273380dad6efec93d\": rpc error: code = NotFound desc = could not find container \"f5005a50b2dd647d69fedc86c99a8aeb6afa84fb8dfb0fb273380dad6efec93d\": container with ID starting with f5005a50b2dd647d69fedc86c99a8aeb6afa84fb8dfb0fb273380dad6efec93d not found: ID does not exist" Sep 30 12:42:55 crc kubenswrapper[4672]: I0930 12:42:55.283507 4672 scope.go:117] "RemoveContainer" containerID="4f71f37b27aa15dc7ac9c333178f5322ed9af961d05739baea2738149c5d62f6" Sep 30 12:42:55 crc kubenswrapper[4672]: E0930 12:42:55.284257 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4f71f37b27aa15dc7ac9c333178f5322ed9af961d05739baea2738149c5d62f6\": container with ID starting with 4f71f37b27aa15dc7ac9c333178f5322ed9af961d05739baea2738149c5d62f6 not found: ID does not exist" containerID="4f71f37b27aa15dc7ac9c333178f5322ed9af961d05739baea2738149c5d62f6" Sep 30 12:42:55 crc kubenswrapper[4672]: I0930 12:42:55.284356 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f71f37b27aa15dc7ac9c333178f5322ed9af961d05739baea2738149c5d62f6"} err="failed to get container status \"4f71f37b27aa15dc7ac9c333178f5322ed9af961d05739baea2738149c5d62f6\": rpc error: code = NotFound desc = could not find container \"4f71f37b27aa15dc7ac9c333178f5322ed9af961d05739baea2738149c5d62f6\": container with ID starting with 4f71f37b27aa15dc7ac9c333178f5322ed9af961d05739baea2738149c5d62f6 not found: ID does not exist" Sep 30 12:42:55 crc kubenswrapper[4672]: I0930 12:42:55.324381 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/daa9e3a4-cb9a-428e-8a81-2ed6d9e1c461-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"daa9e3a4-cb9a-428e-8a81-2ed6d9e1c461\") " pod="openstack/ceilometer-0" Sep 30 12:42:55 crc kubenswrapper[4672]: I0930 12:42:55.324438 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/daa9e3a4-cb9a-428e-8a81-2ed6d9e1c461-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"daa9e3a4-cb9a-428e-8a81-2ed6d9e1c461\") " pod="openstack/ceilometer-0" Sep 30 12:42:55 crc kubenswrapper[4672]: I0930 12:42:55.324514 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/daa9e3a4-cb9a-428e-8a81-2ed6d9e1c461-log-httpd\") pod \"ceilometer-0\" (UID: \"daa9e3a4-cb9a-428e-8a81-2ed6d9e1c461\") " pod="openstack/ceilometer-0" Sep 30 12:42:55 crc kubenswrapper[4672]: I0930 12:42:55.324540 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/daa9e3a4-cb9a-428e-8a81-2ed6d9e1c461-config-data\") pod \"ceilometer-0\" (UID: \"daa9e3a4-cb9a-428e-8a81-2ed6d9e1c461\") " pod="openstack/ceilometer-0" Sep 30 12:42:55 crc kubenswrapper[4672]: I0930 12:42:55.324579 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/daa9e3a4-cb9a-428e-8a81-2ed6d9e1c461-scripts\") pod \"ceilometer-0\" (UID: \"daa9e3a4-cb9a-428e-8a81-2ed6d9e1c461\") " pod="openstack/ceilometer-0" Sep 30 12:42:55 crc kubenswrapper[4672]: I0930 12:42:55.324604 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ms9bl\" (UniqueName: \"kubernetes.io/projected/daa9e3a4-cb9a-428e-8a81-2ed6d9e1c461-kube-api-access-ms9bl\") pod \"ceilometer-0\" (UID: \"daa9e3a4-cb9a-428e-8a81-2ed6d9e1c461\") " pod="openstack/ceilometer-0" Sep 30 12:42:55 crc kubenswrapper[4672]: I0930 12:42:55.324665 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/daa9e3a4-cb9a-428e-8a81-2ed6d9e1c461-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"daa9e3a4-cb9a-428e-8a81-2ed6d9e1c461\") " pod="openstack/ceilometer-0" Sep 30 12:42:55 crc kubenswrapper[4672]: I0930 12:42:55.324721 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/daa9e3a4-cb9a-428e-8a81-2ed6d9e1c461-run-httpd\") pod \"ceilometer-0\" (UID: \"daa9e3a4-cb9a-428e-8a81-2ed6d9e1c461\") " pod="openstack/ceilometer-0" Sep 30 12:42:55 crc kubenswrapper[4672]: I0930 12:42:55.426094 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/daa9e3a4-cb9a-428e-8a81-2ed6d9e1c461-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"daa9e3a4-cb9a-428e-8a81-2ed6d9e1c461\") " pod="openstack/ceilometer-0" Sep 30 12:42:55 crc kubenswrapper[4672]: I0930 12:42:55.426150 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/daa9e3a4-cb9a-428e-8a81-2ed6d9e1c461-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"daa9e3a4-cb9a-428e-8a81-2ed6d9e1c461\") " pod="openstack/ceilometer-0" Sep 30 12:42:55 crc kubenswrapper[4672]: I0930 12:42:55.426210 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/daa9e3a4-cb9a-428e-8a81-2ed6d9e1c461-log-httpd\") pod \"ceilometer-0\" (UID: \"daa9e3a4-cb9a-428e-8a81-2ed6d9e1c461\") " pod="openstack/ceilometer-0" Sep 30 12:42:55 crc kubenswrapper[4672]: I0930 12:42:55.426233 4672 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/daa9e3a4-cb9a-428e-8a81-2ed6d9e1c461-config-data\") pod \"ceilometer-0\" (UID: \"daa9e3a4-cb9a-428e-8a81-2ed6d9e1c461\") " pod="openstack/ceilometer-0" Sep 30 12:42:55 crc kubenswrapper[4672]: I0930 12:42:55.426284 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/daa9e3a4-cb9a-428e-8a81-2ed6d9e1c461-scripts\") pod \"ceilometer-0\" (UID: \"daa9e3a4-cb9a-428e-8a81-2ed6d9e1c461\") " pod="openstack/ceilometer-0" Sep 30 12:42:55 crc kubenswrapper[4672]: I0930 12:42:55.426315 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ms9bl\" (UniqueName: \"kubernetes.io/projected/daa9e3a4-cb9a-428e-8a81-2ed6d9e1c461-kube-api-access-ms9bl\") pod \"ceilometer-0\" (UID: \"daa9e3a4-cb9a-428e-8a81-2ed6d9e1c461\") " pod="openstack/ceilometer-0" Sep 30 12:42:55 crc kubenswrapper[4672]: I0930 12:42:55.426358 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/daa9e3a4-cb9a-428e-8a81-2ed6d9e1c461-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"daa9e3a4-cb9a-428e-8a81-2ed6d9e1c461\") " pod="openstack/ceilometer-0" Sep 30 12:42:55 crc kubenswrapper[4672]: I0930 12:42:55.426385 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/daa9e3a4-cb9a-428e-8a81-2ed6d9e1c461-run-httpd\") pod \"ceilometer-0\" (UID: \"daa9e3a4-cb9a-428e-8a81-2ed6d9e1c461\") " pod="openstack/ceilometer-0" Sep 30 12:42:55 crc kubenswrapper[4672]: I0930 12:42:55.426881 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/daa9e3a4-cb9a-428e-8a81-2ed6d9e1c461-log-httpd\") pod \"ceilometer-0\" (UID: \"daa9e3a4-cb9a-428e-8a81-2ed6d9e1c461\") " pod="openstack/ceilometer-0" Sep 30 12:42:55 crc kubenswrapper[4672]: I0930 12:42:55.426983 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/daa9e3a4-cb9a-428e-8a81-2ed6d9e1c461-run-httpd\") pod \"ceilometer-0\" (UID: \"daa9e3a4-cb9a-428e-8a81-2ed6d9e1c461\") " pod="openstack/ceilometer-0" Sep 30 12:42:55 crc kubenswrapper[4672]: I0930 12:42:55.431080 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/daa9e3a4-cb9a-428e-8a81-2ed6d9e1c461-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"daa9e3a4-cb9a-428e-8a81-2ed6d9e1c461\") " pod="openstack/ceilometer-0" Sep 30 12:42:55 crc kubenswrapper[4672]: I0930 12:42:55.431542 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/daa9e3a4-cb9a-428e-8a81-2ed6d9e1c461-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"daa9e3a4-cb9a-428e-8a81-2ed6d9e1c461\") " pod="openstack/ceilometer-0" Sep 30 12:42:55 crc kubenswrapper[4672]: I0930 12:42:55.432052 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/daa9e3a4-cb9a-428e-8a81-2ed6d9e1c461-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"daa9e3a4-cb9a-428e-8a81-2ed6d9e1c461\") " pod="openstack/ceilometer-0" Sep 30 12:42:55 crc kubenswrapper[4672]: I0930 12:42:55.432177 4672 kubelet_volumes.go:163] "Cleaned up orphaned 
pod volumes dir" podUID="4301988f-bce9-466c-aff1-88d33d37a9cd" path="/var/lib/kubelet/pods/4301988f-bce9-466c-aff1-88d33d37a9cd/volumes" Sep 30 12:42:55 crc kubenswrapper[4672]: I0930 12:42:55.432195 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/daa9e3a4-cb9a-428e-8a81-2ed6d9e1c461-scripts\") pod \"ceilometer-0\" (UID: \"daa9e3a4-cb9a-428e-8a81-2ed6d9e1c461\") " pod="openstack/ceilometer-0" Sep 30 12:42:55 crc kubenswrapper[4672]: I0930 12:42:55.433776 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/daa9e3a4-cb9a-428e-8a81-2ed6d9e1c461-config-data\") pod \"ceilometer-0\" (UID: \"daa9e3a4-cb9a-428e-8a81-2ed6d9e1c461\") " pod="openstack/ceilometer-0" Sep 30 12:42:55 crc kubenswrapper[4672]: I0930 12:42:55.448306 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ms9bl\" (UniqueName: \"kubernetes.io/projected/daa9e3a4-cb9a-428e-8a81-2ed6d9e1c461-kube-api-access-ms9bl\") pod \"ceilometer-0\" (UID: \"daa9e3a4-cb9a-428e-8a81-2ed6d9e1c461\") " pod="openstack/ceilometer-0" Sep 30 12:42:55 crc kubenswrapper[4672]: I0930 12:42:55.566879 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Sep 30 12:42:56 crc kubenswrapper[4672]: I0930 12:42:56.034526 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Sep 30 12:42:56 crc kubenswrapper[4672]: W0930 12:42:56.037798 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddaa9e3a4_cb9a_428e_8a81_2ed6d9e1c461.slice/crio-a83a10ea6398d0b929120f71aa35d8fbf7e2ecd04b73f1cb6867565347d605d6 WatchSource:0}: Error finding container a83a10ea6398d0b929120f71aa35d8fbf7e2ecd04b73f1cb6867565347d605d6: Status 404 returned error can't find the container with id a83a10ea6398d0b929120f71aa35d8fbf7e2ecd04b73f1cb6867565347d605d6 Sep 30 12:42:56 crc kubenswrapper[4672]: I0930 12:42:56.090638 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"daa9e3a4-cb9a-428e-8a81-2ed6d9e1c461","Type":"ContainerStarted","Data":"a83a10ea6398d0b929120f71aa35d8fbf7e2ecd04b73f1cb6867565347d605d6"} Sep 30 12:42:57 crc kubenswrapper[4672]: I0930 12:42:57.104019 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"daa9e3a4-cb9a-428e-8a81-2ed6d9e1c461","Type":"ContainerStarted","Data":"dd943e47b5580a6d3bde8310faa3b2f7c49f5104435db75a7a73e1a0b8b3735e"} Sep 30 12:42:57 crc kubenswrapper[4672]: I0930 12:42:57.104577 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"daa9e3a4-cb9a-428e-8a81-2ed6d9e1c461","Type":"ContainerStarted","Data":"e5cf91ec5f36a507c012df16e2abd329d7fd95eb233b60f754cbfb9269eeba6c"} Sep 30 12:42:58 crc kubenswrapper[4672]: I0930 12:42:58.116885 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"daa9e3a4-cb9a-428e-8a81-2ed6d9e1c461","Type":"ContainerStarted","Data":"ce95f9cea0c92baa9a3305828f0b5810206f1ff9905dac3455be771b1390b560"} Sep 30 12:43:00 crc kubenswrapper[4672]: I0930 12:43:00.151115 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"daa9e3a4-cb9a-428e-8a81-2ed6d9e1c461","Type":"ContainerStarted","Data":"f4f731a61c8a20bff9cd10526d6ffb8973ce7b70e0e98d815c5d649385e63013"} Sep 30 12:43:00 crc 
Sep 30 12:43:00 crc kubenswrapper[4672]: I0930 12:43:00.151701 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Sep 30 12:43:00 crc kubenswrapper[4672]: I0930 12:43:00.191202 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.271802797 podStartE2EDuration="5.191180863s" podCreationTimestamp="2025-09-30 12:42:55 +0000 UTC" firstStartedPulling="2025-09-30 12:42:56.039438744 +0000 UTC m=+1267.308676430" lastFinishedPulling="2025-09-30 12:42:58.95881686 +0000 UTC m=+1270.228054496" observedRunningTime="2025-09-30 12:43:00.172377933 +0000 UTC m=+1271.441615589" watchObservedRunningTime="2025-09-30 12:43:00.191180863 +0000 UTC m=+1271.460418509"
Sep 30 12:43:02 crc kubenswrapper[4672]: I0930 12:43:02.450754 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0"
Sep 30 12:43:25 crc kubenswrapper[4672]: I0930 12:43:25.581856 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0"
Sep 30 12:43:35 crc kubenswrapper[4672]: I0930 12:43:35.690236 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"]
Sep 30 12:43:37 crc kubenswrapper[4672]: I0930 12:43:37.447632 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Sep 30 12:43:39 crc kubenswrapper[4672]: I0930 12:43:39.011550 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="9aea18e8-190e-470a-9330-a30621c96afd" containerName="rabbitmq" containerID="cri-o://eae7ec0f02b75b0ebecb5e8cbca220342abcfd8ad46ef71cd2d79e1275c25be3" gracePeriod=604797
Sep 30 12:43:40 crc kubenswrapper[4672]: I0930 12:43:40.565002 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="2d795c24-8697-461f-9322-2c23bf7cb49b" containerName="rabbitmq" containerID="cri-o://408d9bed949a4f49230469f002255b0b6c6e6b7751ecc4e90e3987e5ecc0a7d3" gracePeriod=604797
Sep 30 12:43:40 crc kubenswrapper[4672]: I0930 12:43:40.605603 4672 generic.go:334] "Generic (PLEG): container finished" podID="9aea18e8-190e-470a-9330-a30621c96afd" containerID="eae7ec0f02b75b0ebecb5e8cbca220342abcfd8ad46ef71cd2d79e1275c25be3" exitCode=0
Sep 30 12:43:40 crc kubenswrapper[4672]: I0930 12:43:40.605644 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"9aea18e8-190e-470a-9330-a30621c96afd","Type":"ContainerDied","Data":"eae7ec0f02b75b0ebecb5e8cbca220342abcfd8ad46ef71cd2d79e1275c25be3"}
Sep 30 12:43:40 crc kubenswrapper[4672]: I0930 12:43:40.605668 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"9aea18e8-190e-470a-9330-a30621c96afd","Type":"ContainerDied","Data":"a9b919759cfc32816cae2bd00d0496ee044489c2d56cad837cf6b3c3546e1e8f"}
Sep 30 12:43:40 crc kubenswrapper[4672]: I0930 12:43:40.605679 4672 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a9b919759cfc32816cae2bd00d0496ee044489c2d56cad837cf6b3c3546e1e8f"
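The startup-latency entry above is internally consistent: podStartE2EDuration is watchObservedRunningTime minus podCreationTimestamp, and podStartSLOduration appears to be that figure minus the image-pull window (lastFinishedPulling minus firstStartedPulling, taken on the monotonic m=+... readings). A back-of-the-envelope check in Go, with the timestamps copied verbatim from the log:

```go
package main

import (
	"fmt"
	"time"
)

func mustParse(s string) time.Time {
	// Layout matches the log's timestamp format; fractional seconds optional.
	t, err := time.Parse("2006-01-02 15:04:05.999999999 -0700 MST", s)
	if err != nil {
		panic(err)
	}
	return t
}

func main() {
	created := mustParse("2025-09-30 12:42:55 +0000 UTC")
	watchRunning := mustParse("2025-09-30 12:43:00.191180863 +0000 UTC")

	// Image-pull window from the monotonic (m=+...) readings in the log,
	// so the subtraction matches the kubelet's own arithmetic exactly.
	pull := 1270.228054496 - 1267.308676430 // ≈ 2.919378066 s

	e2e := watchRunning.Sub(created).Seconds()
	fmt.Printf("podStartE2EDuration ≈ %.9fs\n", e2e)     // 5.191180863s
	fmt.Printf("podStartSLOduration ≈ %.9f\n", e2e-pull) // 2.271802797
}
```

Similarly, gracePeriod=604797 on the two "Killing container" lines is consistent with a 604800-second (7-day) terminationGracePeriodSeconds, which the RabbitMQ cluster operator commonly sets, minus the roughly three seconds that elapsed between each pod's API DELETE and the kill; treat that reading as an inference from the timestamps, not something the log states outright.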
Need to start a new one" pod="openstack/rabbitmq-server-0" Sep 30 12:43:40 crc kubenswrapper[4672]: I0930 12:43:40.771657 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/9aea18e8-190e-470a-9330-a30621c96afd-rabbitmq-tls\") pod \"9aea18e8-190e-470a-9330-a30621c96afd\" (UID: \"9aea18e8-190e-470a-9330-a30621c96afd\") " Sep 30 12:43:40 crc kubenswrapper[4672]: I0930 12:43:40.771754 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/9aea18e8-190e-470a-9330-a30621c96afd-server-conf\") pod \"9aea18e8-190e-470a-9330-a30621c96afd\" (UID: \"9aea18e8-190e-470a-9330-a30621c96afd\") " Sep 30 12:43:40 crc kubenswrapper[4672]: I0930 12:43:40.771774 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/9aea18e8-190e-470a-9330-a30621c96afd-rabbitmq-confd\") pod \"9aea18e8-190e-470a-9330-a30621c96afd\" (UID: \"9aea18e8-190e-470a-9330-a30621c96afd\") " Sep 30 12:43:40 crc kubenswrapper[4672]: I0930 12:43:40.771798 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/9aea18e8-190e-470a-9330-a30621c96afd-pod-info\") pod \"9aea18e8-190e-470a-9330-a30621c96afd\" (UID: \"9aea18e8-190e-470a-9330-a30621c96afd\") " Sep 30 12:43:40 crc kubenswrapper[4672]: I0930 12:43:40.771819 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/9aea18e8-190e-470a-9330-a30621c96afd-erlang-cookie-secret\") pod \"9aea18e8-190e-470a-9330-a30621c96afd\" (UID: \"9aea18e8-190e-470a-9330-a30621c96afd\") " Sep 30 12:43:40 crc kubenswrapper[4672]: I0930 12:43:40.771857 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/9aea18e8-190e-470a-9330-a30621c96afd-plugins-conf\") pod \"9aea18e8-190e-470a-9330-a30621c96afd\" (UID: \"9aea18e8-190e-470a-9330-a30621c96afd\") " Sep 30 12:43:40 crc kubenswrapper[4672]: I0930 12:43:40.771918 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9aea18e8-190e-470a-9330-a30621c96afd-config-data\") pod \"9aea18e8-190e-470a-9330-a30621c96afd\" (UID: \"9aea18e8-190e-470a-9330-a30621c96afd\") " Sep 30 12:43:40 crc kubenswrapper[4672]: I0930 12:43:40.771949 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/9aea18e8-190e-470a-9330-a30621c96afd-rabbitmq-plugins\") pod \"9aea18e8-190e-470a-9330-a30621c96afd\" (UID: \"9aea18e8-190e-470a-9330-a30621c96afd\") " Sep 30 12:43:40 crc kubenswrapper[4672]: I0930 12:43:40.772049 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/9aea18e8-190e-470a-9330-a30621c96afd-rabbitmq-erlang-cookie\") pod \"9aea18e8-190e-470a-9330-a30621c96afd\" (UID: \"9aea18e8-190e-470a-9330-a30621c96afd\") " Sep 30 12:43:40 crc kubenswrapper[4672]: I0930 12:43:40.772075 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"9aea18e8-190e-470a-9330-a30621c96afd\" (UID: 
\"9aea18e8-190e-470a-9330-a30621c96afd\") " Sep 30 12:43:40 crc kubenswrapper[4672]: I0930 12:43:40.772113 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jzb8p\" (UniqueName: \"kubernetes.io/projected/9aea18e8-190e-470a-9330-a30621c96afd-kube-api-access-jzb8p\") pod \"9aea18e8-190e-470a-9330-a30621c96afd\" (UID: \"9aea18e8-190e-470a-9330-a30621c96afd\") " Sep 30 12:43:40 crc kubenswrapper[4672]: I0930 12:43:40.773933 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9aea18e8-190e-470a-9330-a30621c96afd-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "9aea18e8-190e-470a-9330-a30621c96afd" (UID: "9aea18e8-190e-470a-9330-a30621c96afd"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 12:43:40 crc kubenswrapper[4672]: I0930 12:43:40.775324 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9aea18e8-190e-470a-9330-a30621c96afd-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "9aea18e8-190e-470a-9330-a30621c96afd" (UID: "9aea18e8-190e-470a-9330-a30621c96afd"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 12:43:40 crc kubenswrapper[4672]: I0930 12:43:40.775645 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9aea18e8-190e-470a-9330-a30621c96afd-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "9aea18e8-190e-470a-9330-a30621c96afd" (UID: "9aea18e8-190e-470a-9330-a30621c96afd"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 12:43:40 crc kubenswrapper[4672]: I0930 12:43:40.788513 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9aea18e8-190e-470a-9330-a30621c96afd-kube-api-access-jzb8p" (OuterVolumeSpecName: "kube-api-access-jzb8p") pod "9aea18e8-190e-470a-9330-a30621c96afd" (UID: "9aea18e8-190e-470a-9330-a30621c96afd"). InnerVolumeSpecName "kube-api-access-jzb8p". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 12:43:40 crc kubenswrapper[4672]: I0930 12:43:40.790681 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "persistence") pod "9aea18e8-190e-470a-9330-a30621c96afd" (UID: "9aea18e8-190e-470a-9330-a30621c96afd"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Sep 30 12:43:40 crc kubenswrapper[4672]: I0930 12:43:40.792593 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9aea18e8-190e-470a-9330-a30621c96afd-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "9aea18e8-190e-470a-9330-a30621c96afd" (UID: "9aea18e8-190e-470a-9330-a30621c96afd"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 12:43:40 crc kubenswrapper[4672]: I0930 12:43:40.793718 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9aea18e8-190e-470a-9330-a30621c96afd-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "9aea18e8-190e-470a-9330-a30621c96afd" (UID: "9aea18e8-190e-470a-9330-a30621c96afd"). InnerVolumeSpecName "erlang-cookie-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:43:40 crc kubenswrapper[4672]: I0930 12:43:40.801095 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/9aea18e8-190e-470a-9330-a30621c96afd-pod-info" (OuterVolumeSpecName: "pod-info") pod "9aea18e8-190e-470a-9330-a30621c96afd" (UID: "9aea18e8-190e-470a-9330-a30621c96afd"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Sep 30 12:43:40 crc kubenswrapper[4672]: I0930 12:43:40.824343 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9aea18e8-190e-470a-9330-a30621c96afd-config-data" (OuterVolumeSpecName: "config-data") pod "9aea18e8-190e-470a-9330-a30621c96afd" (UID: "9aea18e8-190e-470a-9330-a30621c96afd"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 12:43:40 crc kubenswrapper[4672]: I0930 12:43:40.869682 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9aea18e8-190e-470a-9330-a30621c96afd-server-conf" (OuterVolumeSpecName: "server-conf") pod "9aea18e8-190e-470a-9330-a30621c96afd" (UID: "9aea18e8-190e-470a-9330-a30621c96afd"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 12:43:40 crc kubenswrapper[4672]: I0930 12:43:40.875347 4672 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/9aea18e8-190e-470a-9330-a30621c96afd-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Sep 30 12:43:40 crc kubenswrapper[4672]: I0930 12:43:40.875438 4672 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/9aea18e8-190e-470a-9330-a30621c96afd-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Sep 30 12:43:40 crc kubenswrapper[4672]: I0930 12:43:40.875469 4672 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Sep 30 12:43:40 crc kubenswrapper[4672]: I0930 12:43:40.875479 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jzb8p\" (UniqueName: \"kubernetes.io/projected/9aea18e8-190e-470a-9330-a30621c96afd-kube-api-access-jzb8p\") on node \"crc\" DevicePath \"\"" Sep 30 12:43:40 crc kubenswrapper[4672]: I0930 12:43:40.875490 4672 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/9aea18e8-190e-470a-9330-a30621c96afd-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Sep 30 12:43:40 crc kubenswrapper[4672]: I0930 12:43:40.875499 4672 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/9aea18e8-190e-470a-9330-a30621c96afd-server-conf\") on node \"crc\" DevicePath \"\"" Sep 30 12:43:40 crc kubenswrapper[4672]: I0930 12:43:40.875507 4672 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/9aea18e8-190e-470a-9330-a30621c96afd-pod-info\") on node \"crc\" DevicePath \"\"" Sep 30 12:43:40 crc kubenswrapper[4672]: I0930 12:43:40.875516 4672 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/9aea18e8-190e-470a-9330-a30621c96afd-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Sep 30 12:43:40 crc kubenswrapper[4672]: I0930 
12:43:40.875524 4672 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/9aea18e8-190e-470a-9330-a30621c96afd-plugins-conf\") on node \"crc\" DevicePath \"\"" Sep 30 12:43:40 crc kubenswrapper[4672]: I0930 12:43:40.875532 4672 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9aea18e8-190e-470a-9330-a30621c96afd-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 12:43:40 crc kubenswrapper[4672]: I0930 12:43:40.908963 4672 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Sep 30 12:43:40 crc kubenswrapper[4672]: I0930 12:43:40.953456 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9aea18e8-190e-470a-9330-a30621c96afd-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "9aea18e8-190e-470a-9330-a30621c96afd" (UID: "9aea18e8-190e-470a-9330-a30621c96afd"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 12:43:40 crc kubenswrapper[4672]: I0930 12:43:40.977101 4672 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Sep 30 12:43:40 crc kubenswrapper[4672]: I0930 12:43:40.977149 4672 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/9aea18e8-190e-470a-9330-a30621c96afd-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Sep 30 12:43:41 crc kubenswrapper[4672]: I0930 12:43:41.614004 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Sep 30 12:43:41 crc kubenswrapper[4672]: I0930 12:43:41.640830 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Sep 30 12:43:41 crc kubenswrapper[4672]: I0930 12:43:41.650159 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Sep 30 12:43:41 crc kubenswrapper[4672]: I0930 12:43:41.686217 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Sep 30 12:43:41 crc kubenswrapper[4672]: E0930 12:43:41.687073 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9aea18e8-190e-470a-9330-a30621c96afd" containerName="setup-container" Sep 30 12:43:41 crc kubenswrapper[4672]: I0930 12:43:41.687096 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="9aea18e8-190e-470a-9330-a30621c96afd" containerName="setup-container" Sep 30 12:43:41 crc kubenswrapper[4672]: E0930 12:43:41.687134 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9aea18e8-190e-470a-9330-a30621c96afd" containerName="rabbitmq" Sep 30 12:43:41 crc kubenswrapper[4672]: I0930 12:43:41.687142 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="9aea18e8-190e-470a-9330-a30621c96afd" containerName="rabbitmq" Sep 30 12:43:41 crc kubenswrapper[4672]: I0930 12:43:41.687323 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="9aea18e8-190e-470a-9330-a30621c96afd" containerName="rabbitmq" Sep 30 12:43:41 crc kubenswrapper[4672]: I0930 12:43:41.688533 4672 util.go:30] "No sandbox for pod can be found. 
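The teardown above follows a strict dependency order: per-pod UnmountVolume.TearDown for each volume, then a node-level UnmountDevice for the local PV once no pod mounts remain, and only then is the volume reported detached. A condensed sketch of that dependency; the types and method names here are illustrative, not kubelet internals:

```go
package main

import "fmt"

// device models a node-level volume (like local-storage02-crc) that may be
// bind-mounted into several pods. The global unmount must wait until every
// pod-level mount has been torn down, matching the ordering in the log.
type device struct {
	name   string
	mounts map[string]bool // podUID -> still mounted
}

func (d *device) tearDownPodMount(podUID string) {
	delete(d.mounts, podUID)
	fmt.Printf("UnmountVolume.TearDown succeeded for pod %s\n", podUID)
}

func (d *device) unmountDeviceIfUnused() {
	if len(d.mounts) > 0 {
		fmt.Printf("%s still mounted by %d pod(s); skipping UnmountDevice\n",
			d.name, len(d.mounts))
		return
	}
	fmt.Printf("UnmountDevice succeeded for %s; volume can be reported detached\n", d.name)
}

func main() {
	d := &device{
		name:   "local-storage02-crc",
		mounts: map[string]bool{"9aea18e8-190e-470a-9330-a30621c96afd": true},
	}
	d.unmountDeviceIfUnused() // skipped: a pod mount is still present
	d.tearDownPodMount("9aea18e8-190e-470a-9330-a30621c96afd")
	d.unmountDeviceIfUnused() // succeeds, mirroring the 12:43:40.908963 entry
}
```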
Need to start a new one" pod="openstack/rabbitmq-server-0" Sep 30 12:43:41 crc kubenswrapper[4672]: I0930 12:43:41.693083 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Sep 30 12:43:41 crc kubenswrapper[4672]: I0930 12:43:41.693276 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Sep 30 12:43:41 crc kubenswrapper[4672]: I0930 12:43:41.693378 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Sep 30 12:43:41 crc kubenswrapper[4672]: I0930 12:43:41.693476 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Sep 30 12:43:41 crc kubenswrapper[4672]: I0930 12:43:41.693597 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Sep 30 12:43:41 crc kubenswrapper[4672]: I0930 12:43:41.693664 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Sep 30 12:43:41 crc kubenswrapper[4672]: I0930 12:43:41.693758 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-6fkml" Sep 30 12:43:41 crc kubenswrapper[4672]: I0930 12:43:41.697688 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Sep 30 12:43:41 crc kubenswrapper[4672]: I0930 12:43:41.792503 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6cfe6bf3-4d65-49c0-a45b-484e53a12f80-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"6cfe6bf3-4d65-49c0-a45b-484e53a12f80\") " pod="openstack/rabbitmq-server-0" Sep 30 12:43:41 crc kubenswrapper[4672]: I0930 12:43:41.792547 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"6cfe6bf3-4d65-49c0-a45b-484e53a12f80\") " pod="openstack/rabbitmq-server-0" Sep 30 12:43:41 crc kubenswrapper[4672]: I0930 12:43:41.792577 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6cfe6bf3-4d65-49c0-a45b-484e53a12f80-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"6cfe6bf3-4d65-49c0-a45b-484e53a12f80\") " pod="openstack/rabbitmq-server-0" Sep 30 12:43:41 crc kubenswrapper[4672]: I0930 12:43:41.792813 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6cfe6bf3-4d65-49c0-a45b-484e53a12f80-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"6cfe6bf3-4d65-49c0-a45b-484e53a12f80\") " pod="openstack/rabbitmq-server-0" Sep 30 12:43:41 crc kubenswrapper[4672]: I0930 12:43:41.792963 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6cfe6bf3-4d65-49c0-a45b-484e53a12f80-pod-info\") pod \"rabbitmq-server-0\" (UID: \"6cfe6bf3-4d65-49c0-a45b-484e53a12f80\") " pod="openstack/rabbitmq-server-0" Sep 30 12:43:41 crc kubenswrapper[4672]: I0930 12:43:41.793006 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/6cfe6bf3-4d65-49c0-a45b-484e53a12f80-config-data\") pod \"rabbitmq-server-0\" (UID: \"6cfe6bf3-4d65-49c0-a45b-484e53a12f80\") " pod="openstack/rabbitmq-server-0" Sep 30 12:43:41 crc kubenswrapper[4672]: I0930 12:43:41.793065 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6cfe6bf3-4d65-49c0-a45b-484e53a12f80-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"6cfe6bf3-4d65-49c0-a45b-484e53a12f80\") " pod="openstack/rabbitmq-server-0" Sep 30 12:43:41 crc kubenswrapper[4672]: I0930 12:43:41.793271 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/6cfe6bf3-4d65-49c0-a45b-484e53a12f80-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"6cfe6bf3-4d65-49c0-a45b-484e53a12f80\") " pod="openstack/rabbitmq-server-0" Sep 30 12:43:41 crc kubenswrapper[4672]: I0930 12:43:41.793319 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nmkz9\" (UniqueName: \"kubernetes.io/projected/6cfe6bf3-4d65-49c0-a45b-484e53a12f80-kube-api-access-nmkz9\") pod \"rabbitmq-server-0\" (UID: \"6cfe6bf3-4d65-49c0-a45b-484e53a12f80\") " pod="openstack/rabbitmq-server-0" Sep 30 12:43:41 crc kubenswrapper[4672]: I0930 12:43:41.793360 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6cfe6bf3-4d65-49c0-a45b-484e53a12f80-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"6cfe6bf3-4d65-49c0-a45b-484e53a12f80\") " pod="openstack/rabbitmq-server-0" Sep 30 12:43:41 crc kubenswrapper[4672]: I0930 12:43:41.793400 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6cfe6bf3-4d65-49c0-a45b-484e53a12f80-server-conf\") pod \"rabbitmq-server-0\" (UID: \"6cfe6bf3-4d65-49c0-a45b-484e53a12f80\") " pod="openstack/rabbitmq-server-0" Sep 30 12:43:41 crc kubenswrapper[4672]: I0930 12:43:41.894825 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6cfe6bf3-4d65-49c0-a45b-484e53a12f80-pod-info\") pod \"rabbitmq-server-0\" (UID: \"6cfe6bf3-4d65-49c0-a45b-484e53a12f80\") " pod="openstack/rabbitmq-server-0" Sep 30 12:43:41 crc kubenswrapper[4672]: I0930 12:43:41.894869 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6cfe6bf3-4d65-49c0-a45b-484e53a12f80-config-data\") pod \"rabbitmq-server-0\" (UID: \"6cfe6bf3-4d65-49c0-a45b-484e53a12f80\") " pod="openstack/rabbitmq-server-0" Sep 30 12:43:41 crc kubenswrapper[4672]: I0930 12:43:41.894889 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6cfe6bf3-4d65-49c0-a45b-484e53a12f80-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"6cfe6bf3-4d65-49c0-a45b-484e53a12f80\") " pod="openstack/rabbitmq-server-0" Sep 30 12:43:41 crc kubenswrapper[4672]: I0930 12:43:41.894964 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/6cfe6bf3-4d65-49c0-a45b-484e53a12f80-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"6cfe6bf3-4d65-49c0-a45b-484e53a12f80\") " 
pod="openstack/rabbitmq-server-0" Sep 30 12:43:41 crc kubenswrapper[4672]: I0930 12:43:41.894992 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nmkz9\" (UniqueName: \"kubernetes.io/projected/6cfe6bf3-4d65-49c0-a45b-484e53a12f80-kube-api-access-nmkz9\") pod \"rabbitmq-server-0\" (UID: \"6cfe6bf3-4d65-49c0-a45b-484e53a12f80\") " pod="openstack/rabbitmq-server-0" Sep 30 12:43:41 crc kubenswrapper[4672]: I0930 12:43:41.895018 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6cfe6bf3-4d65-49c0-a45b-484e53a12f80-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"6cfe6bf3-4d65-49c0-a45b-484e53a12f80\") " pod="openstack/rabbitmq-server-0" Sep 30 12:43:41 crc kubenswrapper[4672]: I0930 12:43:41.895038 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6cfe6bf3-4d65-49c0-a45b-484e53a12f80-server-conf\") pod \"rabbitmq-server-0\" (UID: \"6cfe6bf3-4d65-49c0-a45b-484e53a12f80\") " pod="openstack/rabbitmq-server-0" Sep 30 12:43:41 crc kubenswrapper[4672]: I0930 12:43:41.895099 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6cfe6bf3-4d65-49c0-a45b-484e53a12f80-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"6cfe6bf3-4d65-49c0-a45b-484e53a12f80\") " pod="openstack/rabbitmq-server-0" Sep 30 12:43:41 crc kubenswrapper[4672]: I0930 12:43:41.895139 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"6cfe6bf3-4d65-49c0-a45b-484e53a12f80\") " pod="openstack/rabbitmq-server-0" Sep 30 12:43:41 crc kubenswrapper[4672]: I0930 12:43:41.895181 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6cfe6bf3-4d65-49c0-a45b-484e53a12f80-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"6cfe6bf3-4d65-49c0-a45b-484e53a12f80\") " pod="openstack/rabbitmq-server-0" Sep 30 12:43:41 crc kubenswrapper[4672]: I0930 12:43:41.895305 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6cfe6bf3-4d65-49c0-a45b-484e53a12f80-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"6cfe6bf3-4d65-49c0-a45b-484e53a12f80\") " pod="openstack/rabbitmq-server-0" Sep 30 12:43:41 crc kubenswrapper[4672]: I0930 12:43:41.895505 4672 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"6cfe6bf3-4d65-49c0-a45b-484e53a12f80\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/rabbitmq-server-0" Sep 30 12:43:41 crc kubenswrapper[4672]: I0930 12:43:41.896156 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6cfe6bf3-4d65-49c0-a45b-484e53a12f80-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"6cfe6bf3-4d65-49c0-a45b-484e53a12f80\") " pod="openstack/rabbitmq-server-0" Sep 30 12:43:41 crc kubenswrapper[4672]: I0930 12:43:41.896281 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: 
\"kubernetes.io/empty-dir/6cfe6bf3-4d65-49c0-a45b-484e53a12f80-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"6cfe6bf3-4d65-49c0-a45b-484e53a12f80\") " pod="openstack/rabbitmq-server-0" Sep 30 12:43:41 crc kubenswrapper[4672]: I0930 12:43:41.897092 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6cfe6bf3-4d65-49c0-a45b-484e53a12f80-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"6cfe6bf3-4d65-49c0-a45b-484e53a12f80\") " pod="openstack/rabbitmq-server-0" Sep 30 12:43:41 crc kubenswrapper[4672]: I0930 12:43:41.897120 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6cfe6bf3-4d65-49c0-a45b-484e53a12f80-server-conf\") pod \"rabbitmq-server-0\" (UID: \"6cfe6bf3-4d65-49c0-a45b-484e53a12f80\") " pod="openstack/rabbitmq-server-0" Sep 30 12:43:41 crc kubenswrapper[4672]: I0930 12:43:41.899961 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6cfe6bf3-4d65-49c0-a45b-484e53a12f80-config-data\") pod \"rabbitmq-server-0\" (UID: \"6cfe6bf3-4d65-49c0-a45b-484e53a12f80\") " pod="openstack/rabbitmq-server-0" Sep 30 12:43:41 crc kubenswrapper[4672]: I0930 12:43:41.901215 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/6cfe6bf3-4d65-49c0-a45b-484e53a12f80-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"6cfe6bf3-4d65-49c0-a45b-484e53a12f80\") " pod="openstack/rabbitmq-server-0" Sep 30 12:43:41 crc kubenswrapper[4672]: I0930 12:43:41.905928 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6cfe6bf3-4d65-49c0-a45b-484e53a12f80-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"6cfe6bf3-4d65-49c0-a45b-484e53a12f80\") " pod="openstack/rabbitmq-server-0" Sep 30 12:43:41 crc kubenswrapper[4672]: I0930 12:43:41.914282 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6cfe6bf3-4d65-49c0-a45b-484e53a12f80-pod-info\") pod \"rabbitmq-server-0\" (UID: \"6cfe6bf3-4d65-49c0-a45b-484e53a12f80\") " pod="openstack/rabbitmq-server-0" Sep 30 12:43:41 crc kubenswrapper[4672]: I0930 12:43:41.915170 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6cfe6bf3-4d65-49c0-a45b-484e53a12f80-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"6cfe6bf3-4d65-49c0-a45b-484e53a12f80\") " pod="openstack/rabbitmq-server-0" Sep 30 12:43:41 crc kubenswrapper[4672]: I0930 12:43:41.926659 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nmkz9\" (UniqueName: \"kubernetes.io/projected/6cfe6bf3-4d65-49c0-a45b-484e53a12f80-kube-api-access-nmkz9\") pod \"rabbitmq-server-0\" (UID: \"6cfe6bf3-4d65-49c0-a45b-484e53a12f80\") " pod="openstack/rabbitmq-server-0" Sep 30 12:43:41 crc kubenswrapper[4672]: I0930 12:43:41.947285 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"6cfe6bf3-4d65-49c0-a45b-484e53a12f80\") " pod="openstack/rabbitmq-server-0" Sep 30 12:43:42 crc kubenswrapper[4672]: I0930 12:43:42.117840 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Sep 30 12:43:42 crc kubenswrapper[4672]: I0930 12:43:42.119656 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Sep 30 12:43:42 crc kubenswrapper[4672]: I0930 12:43:42.205905 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/2d795c24-8697-461f-9322-2c23bf7cb49b-erlang-cookie-secret\") pod \"2d795c24-8697-461f-9322-2c23bf7cb49b\" (UID: \"2d795c24-8697-461f-9322-2c23bf7cb49b\") " Sep 30 12:43:42 crc kubenswrapper[4672]: I0930 12:43:42.205976 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2d795c24-8697-461f-9322-2c23bf7cb49b-config-data\") pod \"2d795c24-8697-461f-9322-2c23bf7cb49b\" (UID: \"2d795c24-8697-461f-9322-2c23bf7cb49b\") " Sep 30 12:43:42 crc kubenswrapper[4672]: I0930 12:43:42.206087 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-scd4z\" (UniqueName: \"kubernetes.io/projected/2d795c24-8697-461f-9322-2c23bf7cb49b-kube-api-access-scd4z\") pod \"2d795c24-8697-461f-9322-2c23bf7cb49b\" (UID: \"2d795c24-8697-461f-9322-2c23bf7cb49b\") " Sep 30 12:43:42 crc kubenswrapper[4672]: I0930 12:43:42.206123 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/2d795c24-8697-461f-9322-2c23bf7cb49b-pod-info\") pod \"2d795c24-8697-461f-9322-2c23bf7cb49b\" (UID: \"2d795c24-8697-461f-9322-2c23bf7cb49b\") " Sep 30 12:43:42 crc kubenswrapper[4672]: I0930 12:43:42.206192 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/2d795c24-8697-461f-9322-2c23bf7cb49b-rabbitmq-tls\") pod \"2d795c24-8697-461f-9322-2c23bf7cb49b\" (UID: \"2d795c24-8697-461f-9322-2c23bf7cb49b\") " Sep 30 12:43:42 crc kubenswrapper[4672]: I0930 12:43:42.206220 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/2d795c24-8697-461f-9322-2c23bf7cb49b-rabbitmq-plugins\") pod \"2d795c24-8697-461f-9322-2c23bf7cb49b\" (UID: \"2d795c24-8697-461f-9322-2c23bf7cb49b\") " Sep 30 12:43:42 crc kubenswrapper[4672]: I0930 12:43:42.206306 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/2d795c24-8697-461f-9322-2c23bf7cb49b-server-conf\") pod \"2d795c24-8697-461f-9322-2c23bf7cb49b\" (UID: \"2d795c24-8697-461f-9322-2c23bf7cb49b\") " Sep 30 12:43:42 crc kubenswrapper[4672]: I0930 12:43:42.206340 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/2d795c24-8697-461f-9322-2c23bf7cb49b-plugins-conf\") pod \"2d795c24-8697-461f-9322-2c23bf7cb49b\" (UID: \"2d795c24-8697-461f-9322-2c23bf7cb49b\") " Sep 30 12:43:42 crc kubenswrapper[4672]: I0930 12:43:42.206402 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/2d795c24-8697-461f-9322-2c23bf7cb49b-rabbitmq-confd\") pod \"2d795c24-8697-461f-9322-2c23bf7cb49b\" (UID: \"2d795c24-8697-461f-9322-2c23bf7cb49b\") " Sep 30 12:43:42 crc kubenswrapper[4672]: I0930 12:43:42.206455 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/2d795c24-8697-461f-9322-2c23bf7cb49b-rabbitmq-erlang-cookie\") pod \"2d795c24-8697-461f-9322-2c23bf7cb49b\" (UID: \"2d795c24-8697-461f-9322-2c23bf7cb49b\") " Sep 30 12:43:42 crc kubenswrapper[4672]: I0930 12:43:42.206474 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"2d795c24-8697-461f-9322-2c23bf7cb49b\" (UID: \"2d795c24-8697-461f-9322-2c23bf7cb49b\") " Sep 30 12:43:42 crc kubenswrapper[4672]: I0930 12:43:42.208423 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2d795c24-8697-461f-9322-2c23bf7cb49b-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "2d795c24-8697-461f-9322-2c23bf7cb49b" (UID: "2d795c24-8697-461f-9322-2c23bf7cb49b"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 12:43:42 crc kubenswrapper[4672]: I0930 12:43:42.210611 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2d795c24-8697-461f-9322-2c23bf7cb49b-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "2d795c24-8697-461f-9322-2c23bf7cb49b" (UID: "2d795c24-8697-461f-9322-2c23bf7cb49b"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 12:43:42 crc kubenswrapper[4672]: I0930 12:43:42.210998 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2d795c24-8697-461f-9322-2c23bf7cb49b-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "2d795c24-8697-461f-9322-2c23bf7cb49b" (UID: "2d795c24-8697-461f-9322-2c23bf7cb49b"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 12:43:42 crc kubenswrapper[4672]: I0930 12:43:42.222918 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d795c24-8697-461f-9322-2c23bf7cb49b-kube-api-access-scd4z" (OuterVolumeSpecName: "kube-api-access-scd4z") pod "2d795c24-8697-461f-9322-2c23bf7cb49b" (UID: "2d795c24-8697-461f-9322-2c23bf7cb49b"). InnerVolumeSpecName "kube-api-access-scd4z". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 12:43:42 crc kubenswrapper[4672]: I0930 12:43:42.224884 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/2d795c24-8697-461f-9322-2c23bf7cb49b-pod-info" (OuterVolumeSpecName: "pod-info") pod "2d795c24-8697-461f-9322-2c23bf7cb49b" (UID: "2d795c24-8697-461f-9322-2c23bf7cb49b"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Sep 30 12:43:42 crc kubenswrapper[4672]: I0930 12:43:42.225016 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d795c24-8697-461f-9322-2c23bf7cb49b-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "2d795c24-8697-461f-9322-2c23bf7cb49b" (UID: "2d795c24-8697-461f-9322-2c23bf7cb49b"). InnerVolumeSpecName "erlang-cookie-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:43:42 crc kubenswrapper[4672]: I0930 12:43:42.244118 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "persistence") pod "2d795c24-8697-461f-9322-2c23bf7cb49b" (UID: "2d795c24-8697-461f-9322-2c23bf7cb49b"). InnerVolumeSpecName "local-storage11-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Sep 30 12:43:42 crc kubenswrapper[4672]: I0930 12:43:42.245461 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2d795c24-8697-461f-9322-2c23bf7cb49b-config-data" (OuterVolumeSpecName: "config-data") pod "2d795c24-8697-461f-9322-2c23bf7cb49b" (UID: "2d795c24-8697-461f-9322-2c23bf7cb49b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 12:43:42 crc kubenswrapper[4672]: I0930 12:43:42.249985 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d795c24-8697-461f-9322-2c23bf7cb49b-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "2d795c24-8697-461f-9322-2c23bf7cb49b" (UID: "2d795c24-8697-461f-9322-2c23bf7cb49b"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 12:43:42 crc kubenswrapper[4672]: I0930 12:43:42.298608 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2d795c24-8697-461f-9322-2c23bf7cb49b-server-conf" (OuterVolumeSpecName: "server-conf") pod "2d795c24-8697-461f-9322-2c23bf7cb49b" (UID: "2d795c24-8697-461f-9322-2c23bf7cb49b"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 12:43:42 crc kubenswrapper[4672]: I0930 12:43:42.309525 4672 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/2d795c24-8697-461f-9322-2c23bf7cb49b-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Sep 30 12:43:42 crc kubenswrapper[4672]: I0930 12:43:42.309656 4672 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/2d795c24-8697-461f-9322-2c23bf7cb49b-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Sep 30 12:43:42 crc kubenswrapper[4672]: I0930 12:43:42.309671 4672 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/2d795c24-8697-461f-9322-2c23bf7cb49b-server-conf\") on node \"crc\" DevicePath \"\"" Sep 30 12:43:42 crc kubenswrapper[4672]: I0930 12:43:42.309679 4672 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/2d795c24-8697-461f-9322-2c23bf7cb49b-plugins-conf\") on node \"crc\" DevicePath \"\"" Sep 30 12:43:42 crc kubenswrapper[4672]: I0930 12:43:42.309688 4672 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/2d795c24-8697-461f-9322-2c23bf7cb49b-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Sep 30 12:43:42 crc kubenswrapper[4672]: I0930 12:43:42.309737 4672 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" " Sep 30 12:43:42 crc kubenswrapper[4672]: I0930 12:43:42.309749 4672 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: 
\"kubernetes.io/secret/2d795c24-8697-461f-9322-2c23bf7cb49b-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Sep 30 12:43:42 crc kubenswrapper[4672]: I0930 12:43:42.309757 4672 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2d795c24-8697-461f-9322-2c23bf7cb49b-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 12:43:42 crc kubenswrapper[4672]: I0930 12:43:42.309766 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-scd4z\" (UniqueName: \"kubernetes.io/projected/2d795c24-8697-461f-9322-2c23bf7cb49b-kube-api-access-scd4z\") on node \"crc\" DevicePath \"\"" Sep 30 12:43:42 crc kubenswrapper[4672]: I0930 12:43:42.309774 4672 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/2d795c24-8697-461f-9322-2c23bf7cb49b-pod-info\") on node \"crc\" DevicePath \"\"" Sep 30 12:43:42 crc kubenswrapper[4672]: I0930 12:43:42.347062 4672 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc" Sep 30 12:43:42 crc kubenswrapper[4672]: I0930 12:43:42.363987 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d795c24-8697-461f-9322-2c23bf7cb49b-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "2d795c24-8697-461f-9322-2c23bf7cb49b" (UID: "2d795c24-8697-461f-9322-2c23bf7cb49b"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 12:43:42 crc kubenswrapper[4672]: I0930 12:43:42.411827 4672 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/2d795c24-8697-461f-9322-2c23bf7cb49b-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Sep 30 12:43:42 crc kubenswrapper[4672]: I0930 12:43:42.411868 4672 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\"" Sep 30 12:43:42 crc kubenswrapper[4672]: I0930 12:43:42.663512 4672 generic.go:334] "Generic (PLEG): container finished" podID="2d795c24-8697-461f-9322-2c23bf7cb49b" containerID="408d9bed949a4f49230469f002255b0b6c6e6b7751ecc4e90e3987e5ecc0a7d3" exitCode=0 Sep 30 12:43:42 crc kubenswrapper[4672]: I0930 12:43:42.663603 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"2d795c24-8697-461f-9322-2c23bf7cb49b","Type":"ContainerDied","Data":"408d9bed949a4f49230469f002255b0b6c6e6b7751ecc4e90e3987e5ecc0a7d3"} Sep 30 12:43:42 crc kubenswrapper[4672]: I0930 12:43:42.663795 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"2d795c24-8697-461f-9322-2c23bf7cb49b","Type":"ContainerDied","Data":"b07965e30279b8eb5ec39eb632aba6618fb1387e02d40da9c8b1f2760d852e29"} Sep 30 12:43:42 crc kubenswrapper[4672]: I0930 12:43:42.663818 4672 scope.go:117] "RemoveContainer" containerID="408d9bed949a4f49230469f002255b0b6c6e6b7751ecc4e90e3987e5ecc0a7d3" Sep 30 12:43:42 crc kubenswrapper[4672]: I0930 12:43:42.663653 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Sep 30 12:43:42 crc kubenswrapper[4672]: I0930 12:43:42.709108 4672 scope.go:117] "RemoveContainer" containerID="55f2020c2c1834b9ac6a092b6aa62719e415fbc93f0663463397655e3eff7cc9" Sep 30 12:43:42 crc kubenswrapper[4672]: I0930 12:43:42.709962 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Sep 30 12:43:42 crc kubenswrapper[4672]: I0930 12:43:42.732878 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Sep 30 12:43:42 crc kubenswrapper[4672]: I0930 12:43:42.747458 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Sep 30 12:43:42 crc kubenswrapper[4672]: I0930 12:43:42.758126 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Sep 30 12:43:42 crc kubenswrapper[4672]: E0930 12:43:42.758603 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d795c24-8697-461f-9322-2c23bf7cb49b" containerName="setup-container" Sep 30 12:43:42 crc kubenswrapper[4672]: I0930 12:43:42.758624 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d795c24-8697-461f-9322-2c23bf7cb49b" containerName="setup-container" Sep 30 12:43:42 crc kubenswrapper[4672]: E0930 12:43:42.758633 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d795c24-8697-461f-9322-2c23bf7cb49b" containerName="rabbitmq" Sep 30 12:43:42 crc kubenswrapper[4672]: I0930 12:43:42.758639 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d795c24-8697-461f-9322-2c23bf7cb49b" containerName="rabbitmq" Sep 30 12:43:42 crc kubenswrapper[4672]: I0930 12:43:42.758862 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d795c24-8697-461f-9322-2c23bf7cb49b" containerName="rabbitmq" Sep 30 12:43:42 crc kubenswrapper[4672]: I0930 12:43:42.759906 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Sep 30 12:43:42 crc kubenswrapper[4672]: I0930 12:43:42.766951 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-wqzrm" Sep 30 12:43:42 crc kubenswrapper[4672]: I0930 12:43:42.767203 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Sep 30 12:43:42 crc kubenswrapper[4672]: I0930 12:43:42.767397 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Sep 30 12:43:42 crc kubenswrapper[4672]: I0930 12:43:42.767462 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Sep 30 12:43:42 crc kubenswrapper[4672]: I0930 12:43:42.767691 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Sep 30 12:43:42 crc kubenswrapper[4672]: I0930 12:43:42.769799 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Sep 30 12:43:42 crc kubenswrapper[4672]: I0930 12:43:42.769980 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Sep 30 12:43:42 crc kubenswrapper[4672]: I0930 12:43:42.781023 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Sep 30 12:43:42 crc kubenswrapper[4672]: I0930 12:43:42.781087 4672 scope.go:117] "RemoveContainer" containerID="408d9bed949a4f49230469f002255b0b6c6e6b7751ecc4e90e3987e5ecc0a7d3" Sep 30 12:43:42 crc kubenswrapper[4672]: E0930 12:43:42.781626 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"408d9bed949a4f49230469f002255b0b6c6e6b7751ecc4e90e3987e5ecc0a7d3\": container with ID starting with 408d9bed949a4f49230469f002255b0b6c6e6b7751ecc4e90e3987e5ecc0a7d3 not found: ID does not exist" containerID="408d9bed949a4f49230469f002255b0b6c6e6b7751ecc4e90e3987e5ecc0a7d3" Sep 30 12:43:42 crc kubenswrapper[4672]: I0930 12:43:42.781654 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"408d9bed949a4f49230469f002255b0b6c6e6b7751ecc4e90e3987e5ecc0a7d3"} err="failed to get container status \"408d9bed949a4f49230469f002255b0b6c6e6b7751ecc4e90e3987e5ecc0a7d3\": rpc error: code = NotFound desc = could not find container \"408d9bed949a4f49230469f002255b0b6c6e6b7751ecc4e90e3987e5ecc0a7d3\": container with ID starting with 408d9bed949a4f49230469f002255b0b6c6e6b7751ecc4e90e3987e5ecc0a7d3 not found: ID does not exist" Sep 30 12:43:42 crc kubenswrapper[4672]: I0930 12:43:42.781678 4672 scope.go:117] "RemoveContainer" containerID="55f2020c2c1834b9ac6a092b6aa62719e415fbc93f0663463397655e3eff7cc9" Sep 30 12:43:42 crc kubenswrapper[4672]: E0930 12:43:42.786873 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"55f2020c2c1834b9ac6a092b6aa62719e415fbc93f0663463397655e3eff7cc9\": container with ID starting with 55f2020c2c1834b9ac6a092b6aa62719e415fbc93f0663463397655e3eff7cc9 not found: ID does not exist" containerID="55f2020c2c1834b9ac6a092b6aa62719e415fbc93f0663463397655e3eff7cc9" Sep 30 12:43:42 crc kubenswrapper[4672]: I0930 12:43:42.786917 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"55f2020c2c1834b9ac6a092b6aa62719e415fbc93f0663463397655e3eff7cc9"} 
err="failed to get container status \"55f2020c2c1834b9ac6a092b6aa62719e415fbc93f0663463397655e3eff7cc9\": rpc error: code = NotFound desc = could not find container \"55f2020c2c1834b9ac6a092b6aa62719e415fbc93f0663463397655e3eff7cc9\": container with ID starting with 55f2020c2c1834b9ac6a092b6aa62719e415fbc93f0663463397655e3eff7cc9 not found: ID does not exist" Sep 30 12:43:42 crc kubenswrapper[4672]: I0930 12:43:42.819146 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/00b691a7-21bd-4661-9b19-cae31a79f18e-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"00b691a7-21bd-4661-9b19-cae31a79f18e\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 12:43:42 crc kubenswrapper[4672]: I0930 12:43:42.819359 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"00b691a7-21bd-4661-9b19-cae31a79f18e\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 12:43:42 crc kubenswrapper[4672]: I0930 12:43:42.819421 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/00b691a7-21bd-4661-9b19-cae31a79f18e-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"00b691a7-21bd-4661-9b19-cae31a79f18e\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 12:43:42 crc kubenswrapper[4672]: I0930 12:43:42.819479 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/00b691a7-21bd-4661-9b19-cae31a79f18e-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"00b691a7-21bd-4661-9b19-cae31a79f18e\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 12:43:42 crc kubenswrapper[4672]: I0930 12:43:42.819539 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/00b691a7-21bd-4661-9b19-cae31a79f18e-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"00b691a7-21bd-4661-9b19-cae31a79f18e\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 12:43:42 crc kubenswrapper[4672]: I0930 12:43:42.819568 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/00b691a7-21bd-4661-9b19-cae31a79f18e-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"00b691a7-21bd-4661-9b19-cae31a79f18e\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 12:43:42 crc kubenswrapper[4672]: I0930 12:43:42.819766 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/00b691a7-21bd-4661-9b19-cae31a79f18e-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"00b691a7-21bd-4661-9b19-cae31a79f18e\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 12:43:42 crc kubenswrapper[4672]: I0930 12:43:42.819816 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/00b691a7-21bd-4661-9b19-cae31a79f18e-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"00b691a7-21bd-4661-9b19-cae31a79f18e\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 12:43:42 crc kubenswrapper[4672]: 
I0930 12:43:42.820756 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/00b691a7-21bd-4661-9b19-cae31a79f18e-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"00b691a7-21bd-4661-9b19-cae31a79f18e\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 12:43:42 crc kubenswrapper[4672]: I0930 12:43:42.820945 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/00b691a7-21bd-4661-9b19-cae31a79f18e-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"00b691a7-21bd-4661-9b19-cae31a79f18e\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 12:43:42 crc kubenswrapper[4672]: I0930 12:43:42.821083 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dmc6j\" (UniqueName: \"kubernetes.io/projected/00b691a7-21bd-4661-9b19-cae31a79f18e-kube-api-access-dmc6j\") pod \"rabbitmq-cell1-server-0\" (UID: \"00b691a7-21bd-4661-9b19-cae31a79f18e\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 12:43:42 crc kubenswrapper[4672]: I0930 12:43:42.923233 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/00b691a7-21bd-4661-9b19-cae31a79f18e-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"00b691a7-21bd-4661-9b19-cae31a79f18e\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 12:43:42 crc kubenswrapper[4672]: I0930 12:43:42.923584 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dmc6j\" (UniqueName: \"kubernetes.io/projected/00b691a7-21bd-4661-9b19-cae31a79f18e-kube-api-access-dmc6j\") pod \"rabbitmq-cell1-server-0\" (UID: \"00b691a7-21bd-4661-9b19-cae31a79f18e\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 12:43:42 crc kubenswrapper[4672]: I0930 12:43:42.923618 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/00b691a7-21bd-4661-9b19-cae31a79f18e-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"00b691a7-21bd-4661-9b19-cae31a79f18e\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 12:43:42 crc kubenswrapper[4672]: I0930 12:43:42.923678 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"00b691a7-21bd-4661-9b19-cae31a79f18e\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 12:43:42 crc kubenswrapper[4672]: I0930 12:43:42.923701 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/00b691a7-21bd-4661-9b19-cae31a79f18e-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"00b691a7-21bd-4661-9b19-cae31a79f18e\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 12:43:42 crc kubenswrapper[4672]: I0930 12:43:42.923722 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/00b691a7-21bd-4661-9b19-cae31a79f18e-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"00b691a7-21bd-4661-9b19-cae31a79f18e\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 12:43:42 crc kubenswrapper[4672]: I0930 12:43:42.923752 4672 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/00b691a7-21bd-4661-9b19-cae31a79f18e-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"00b691a7-21bd-4661-9b19-cae31a79f18e\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 12:43:42 crc kubenswrapper[4672]: I0930 12:43:42.924025 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/00b691a7-21bd-4661-9b19-cae31a79f18e-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"00b691a7-21bd-4661-9b19-cae31a79f18e\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 12:43:42 crc kubenswrapper[4672]: I0930 12:43:42.925727 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/00b691a7-21bd-4661-9b19-cae31a79f18e-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"00b691a7-21bd-4661-9b19-cae31a79f18e\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 12:43:42 crc kubenswrapper[4672]: I0930 12:43:42.925769 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/00b691a7-21bd-4661-9b19-cae31a79f18e-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"00b691a7-21bd-4661-9b19-cae31a79f18e\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 12:43:42 crc kubenswrapper[4672]: I0930 12:43:42.925805 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/00b691a7-21bd-4661-9b19-cae31a79f18e-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"00b691a7-21bd-4661-9b19-cae31a79f18e\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 12:43:42 crc kubenswrapper[4672]: I0930 12:43:42.926321 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/00b691a7-21bd-4661-9b19-cae31a79f18e-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"00b691a7-21bd-4661-9b19-cae31a79f18e\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 12:43:42 crc kubenswrapper[4672]: I0930 12:43:42.926378 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/00b691a7-21bd-4661-9b19-cae31a79f18e-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"00b691a7-21bd-4661-9b19-cae31a79f18e\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 12:43:42 crc kubenswrapper[4672]: I0930 12:43:42.926692 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/00b691a7-21bd-4661-9b19-cae31a79f18e-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"00b691a7-21bd-4661-9b19-cae31a79f18e\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 12:43:42 crc kubenswrapper[4672]: I0930 12:43:42.930214 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/00b691a7-21bd-4661-9b19-cae31a79f18e-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"00b691a7-21bd-4661-9b19-cae31a79f18e\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 12:43:42 crc kubenswrapper[4672]: I0930 12:43:42.931289 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/00b691a7-21bd-4661-9b19-cae31a79f18e-server-conf\") pod \"rabbitmq-cell1-server-0\" 
(UID: \"00b691a7-21bd-4661-9b19-cae31a79f18e\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 12:43:42 crc kubenswrapper[4672]: I0930 12:43:42.931738 4672 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"00b691a7-21bd-4661-9b19-cae31a79f18e\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/rabbitmq-cell1-server-0" Sep 30 12:43:42 crc kubenswrapper[4672]: I0930 12:43:42.935683 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/00b691a7-21bd-4661-9b19-cae31a79f18e-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"00b691a7-21bd-4661-9b19-cae31a79f18e\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 12:43:42 crc kubenswrapper[4672]: I0930 12:43:42.936574 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/00b691a7-21bd-4661-9b19-cae31a79f18e-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"00b691a7-21bd-4661-9b19-cae31a79f18e\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 12:43:42 crc kubenswrapper[4672]: I0930 12:43:42.968253 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/00b691a7-21bd-4661-9b19-cae31a79f18e-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"00b691a7-21bd-4661-9b19-cae31a79f18e\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 12:43:43 crc kubenswrapper[4672]: I0930 12:43:43.037695 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dmc6j\" (UniqueName: \"kubernetes.io/projected/00b691a7-21bd-4661-9b19-cae31a79f18e-kube-api-access-dmc6j\") pod \"rabbitmq-cell1-server-0\" (UID: \"00b691a7-21bd-4661-9b19-cae31a79f18e\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 12:43:43 crc kubenswrapper[4672]: I0930 12:43:43.041827 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/00b691a7-21bd-4661-9b19-cae31a79f18e-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"00b691a7-21bd-4661-9b19-cae31a79f18e\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 12:43:43 crc kubenswrapper[4672]: I0930 12:43:43.071224 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"00b691a7-21bd-4661-9b19-cae31a79f18e\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 12:43:43 crc kubenswrapper[4672]: I0930 12:43:43.084381 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Sep 30 12:43:43 crc kubenswrapper[4672]: I0930 12:43:43.428452 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2d795c24-8697-461f-9322-2c23bf7cb49b" path="/var/lib/kubelet/pods/2d795c24-8697-461f-9322-2c23bf7cb49b/volumes" Sep 30 12:43:43 crc kubenswrapper[4672]: I0930 12:43:43.429723 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9aea18e8-190e-470a-9330-a30621c96afd" path="/var/lib/kubelet/pods/9aea18e8-190e-470a-9330-a30621c96afd/volumes" Sep 30 12:43:43 crc kubenswrapper[4672]: I0930 12:43:43.552576 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Sep 30 12:43:43 crc kubenswrapper[4672]: W0930 12:43:43.557990 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod00b691a7_21bd_4661_9b19_cae31a79f18e.slice/crio-391438d6bdbfb86dbc1aa24de77e7be8e77fe0808b001df539ddd6082b4e5c62 WatchSource:0}: Error finding container 391438d6bdbfb86dbc1aa24de77e7be8e77fe0808b001df539ddd6082b4e5c62: Status 404 returned error can't find the container with id 391438d6bdbfb86dbc1aa24de77e7be8e77fe0808b001df539ddd6082b4e5c62 Sep 30 12:43:43 crc kubenswrapper[4672]: I0930 12:43:43.679311 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"00b691a7-21bd-4661-9b19-cae31a79f18e","Type":"ContainerStarted","Data":"391438d6bdbfb86dbc1aa24de77e7be8e77fe0808b001df539ddd6082b4e5c62"} Sep 30 12:43:43 crc kubenswrapper[4672]: I0930 12:43:43.681706 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"6cfe6bf3-4d65-49c0-a45b-484e53a12f80","Type":"ContainerStarted","Data":"f8bec39665929520d2b96fe6fe2160ff6a67ad8bd0ae390b99ff7e170e47f168"} Sep 30 12:43:43 crc kubenswrapper[4672]: I0930 12:43:43.681734 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"6cfe6bf3-4d65-49c0-a45b-484e53a12f80","Type":"ContainerStarted","Data":"67eca1f95e3cd2b4e4cd0a7ba34bd46a70cbc6595c484690ca94c512b52c2c3d"} Sep 30 12:43:44 crc kubenswrapper[4672]: I0930 12:43:44.692424 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"00b691a7-21bd-4661-9b19-cae31a79f18e","Type":"ContainerStarted","Data":"1bfc6e7d25dfe87f4c3e00d83051a62f9443dc9e360f6c903b87baf2f2b1e694"} Sep 30 12:43:52 crc kubenswrapper[4672]: I0930 12:43:52.615135 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5d587fb4f9-tzqjx"] Sep 30 12:43:52 crc kubenswrapper[4672]: I0930 12:43:52.617807 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5d587fb4f9-tzqjx" Sep 30 12:43:52 crc kubenswrapper[4672]: I0930 12:43:52.620166 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam" Sep 30 12:43:52 crc kubenswrapper[4672]: I0930 12:43:52.634479 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5d587fb4f9-tzqjx"] Sep 30 12:43:52 crc kubenswrapper[4672]: I0930 12:43:52.728956 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mr2c9\" (UniqueName: \"kubernetes.io/projected/cfe6fbdd-2c7d-40ec-97a8-2a83490f25c7-kube-api-access-mr2c9\") pod \"dnsmasq-dns-5d587fb4f9-tzqjx\" (UID: \"cfe6fbdd-2c7d-40ec-97a8-2a83490f25c7\") " pod="openstack/dnsmasq-dns-5d587fb4f9-tzqjx" Sep 30 12:43:52 crc kubenswrapper[4672]: I0930 12:43:52.729000 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/cfe6fbdd-2c7d-40ec-97a8-2a83490f25c7-openstack-edpm-ipam\") pod \"dnsmasq-dns-5d587fb4f9-tzqjx\" (UID: \"cfe6fbdd-2c7d-40ec-97a8-2a83490f25c7\") " pod="openstack/dnsmasq-dns-5d587fb4f9-tzqjx" Sep 30 12:43:52 crc kubenswrapper[4672]: I0930 12:43:52.729048 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cfe6fbdd-2c7d-40ec-97a8-2a83490f25c7-config\") pod \"dnsmasq-dns-5d587fb4f9-tzqjx\" (UID: \"cfe6fbdd-2c7d-40ec-97a8-2a83490f25c7\") " pod="openstack/dnsmasq-dns-5d587fb4f9-tzqjx" Sep 30 12:43:52 crc kubenswrapper[4672]: I0930 12:43:52.729142 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cfe6fbdd-2c7d-40ec-97a8-2a83490f25c7-dns-svc\") pod \"dnsmasq-dns-5d587fb4f9-tzqjx\" (UID: \"cfe6fbdd-2c7d-40ec-97a8-2a83490f25c7\") " pod="openstack/dnsmasq-dns-5d587fb4f9-tzqjx" Sep 30 12:43:52 crc kubenswrapper[4672]: I0930 12:43:52.729224 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cfe6fbdd-2c7d-40ec-97a8-2a83490f25c7-ovsdbserver-sb\") pod \"dnsmasq-dns-5d587fb4f9-tzqjx\" (UID: \"cfe6fbdd-2c7d-40ec-97a8-2a83490f25c7\") " pod="openstack/dnsmasq-dns-5d587fb4f9-tzqjx" Sep 30 12:43:52 crc kubenswrapper[4672]: I0930 12:43:52.729271 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cfe6fbdd-2c7d-40ec-97a8-2a83490f25c7-dns-swift-storage-0\") pod \"dnsmasq-dns-5d587fb4f9-tzqjx\" (UID: \"cfe6fbdd-2c7d-40ec-97a8-2a83490f25c7\") " pod="openstack/dnsmasq-dns-5d587fb4f9-tzqjx" Sep 30 12:43:52 crc kubenswrapper[4672]: I0930 12:43:52.729379 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cfe6fbdd-2c7d-40ec-97a8-2a83490f25c7-ovsdbserver-nb\") pod \"dnsmasq-dns-5d587fb4f9-tzqjx\" (UID: \"cfe6fbdd-2c7d-40ec-97a8-2a83490f25c7\") " pod="openstack/dnsmasq-dns-5d587fb4f9-tzqjx" Sep 30 12:43:52 crc kubenswrapper[4672]: I0930 12:43:52.831009 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mr2c9\" (UniqueName: \"kubernetes.io/projected/cfe6fbdd-2c7d-40ec-97a8-2a83490f25c7-kube-api-access-mr2c9\") pod 
\"dnsmasq-dns-5d587fb4f9-tzqjx\" (UID: \"cfe6fbdd-2c7d-40ec-97a8-2a83490f25c7\") " pod="openstack/dnsmasq-dns-5d587fb4f9-tzqjx" Sep 30 12:43:52 crc kubenswrapper[4672]: I0930 12:43:52.831071 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/cfe6fbdd-2c7d-40ec-97a8-2a83490f25c7-openstack-edpm-ipam\") pod \"dnsmasq-dns-5d587fb4f9-tzqjx\" (UID: \"cfe6fbdd-2c7d-40ec-97a8-2a83490f25c7\") " pod="openstack/dnsmasq-dns-5d587fb4f9-tzqjx" Sep 30 12:43:52 crc kubenswrapper[4672]: I0930 12:43:52.831120 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cfe6fbdd-2c7d-40ec-97a8-2a83490f25c7-config\") pod \"dnsmasq-dns-5d587fb4f9-tzqjx\" (UID: \"cfe6fbdd-2c7d-40ec-97a8-2a83490f25c7\") " pod="openstack/dnsmasq-dns-5d587fb4f9-tzqjx" Sep 30 12:43:52 crc kubenswrapper[4672]: I0930 12:43:52.831148 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cfe6fbdd-2c7d-40ec-97a8-2a83490f25c7-dns-svc\") pod \"dnsmasq-dns-5d587fb4f9-tzqjx\" (UID: \"cfe6fbdd-2c7d-40ec-97a8-2a83490f25c7\") " pod="openstack/dnsmasq-dns-5d587fb4f9-tzqjx" Sep 30 12:43:52 crc kubenswrapper[4672]: I0930 12:43:52.831176 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cfe6fbdd-2c7d-40ec-97a8-2a83490f25c7-ovsdbserver-sb\") pod \"dnsmasq-dns-5d587fb4f9-tzqjx\" (UID: \"cfe6fbdd-2c7d-40ec-97a8-2a83490f25c7\") " pod="openstack/dnsmasq-dns-5d587fb4f9-tzqjx" Sep 30 12:43:52 crc kubenswrapper[4672]: I0930 12:43:52.831196 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cfe6fbdd-2c7d-40ec-97a8-2a83490f25c7-dns-swift-storage-0\") pod \"dnsmasq-dns-5d587fb4f9-tzqjx\" (UID: \"cfe6fbdd-2c7d-40ec-97a8-2a83490f25c7\") " pod="openstack/dnsmasq-dns-5d587fb4f9-tzqjx" Sep 30 12:43:52 crc kubenswrapper[4672]: I0930 12:43:52.831240 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cfe6fbdd-2c7d-40ec-97a8-2a83490f25c7-ovsdbserver-nb\") pod \"dnsmasq-dns-5d587fb4f9-tzqjx\" (UID: \"cfe6fbdd-2c7d-40ec-97a8-2a83490f25c7\") " pod="openstack/dnsmasq-dns-5d587fb4f9-tzqjx" Sep 30 12:43:52 crc kubenswrapper[4672]: I0930 12:43:52.832293 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cfe6fbdd-2c7d-40ec-97a8-2a83490f25c7-dns-swift-storage-0\") pod \"dnsmasq-dns-5d587fb4f9-tzqjx\" (UID: \"cfe6fbdd-2c7d-40ec-97a8-2a83490f25c7\") " pod="openstack/dnsmasq-dns-5d587fb4f9-tzqjx" Sep 30 12:43:52 crc kubenswrapper[4672]: I0930 12:43:52.832377 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cfe6fbdd-2c7d-40ec-97a8-2a83490f25c7-ovsdbserver-sb\") pod \"dnsmasq-dns-5d587fb4f9-tzqjx\" (UID: \"cfe6fbdd-2c7d-40ec-97a8-2a83490f25c7\") " pod="openstack/dnsmasq-dns-5d587fb4f9-tzqjx" Sep 30 12:43:52 crc kubenswrapper[4672]: I0930 12:43:52.832568 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cfe6fbdd-2c7d-40ec-97a8-2a83490f25c7-dns-svc\") pod \"dnsmasq-dns-5d587fb4f9-tzqjx\" (UID: \"cfe6fbdd-2c7d-40ec-97a8-2a83490f25c7\") " 
pod="openstack/dnsmasq-dns-5d587fb4f9-tzqjx" Sep 30 12:43:52 crc kubenswrapper[4672]: I0930 12:43:52.832574 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/cfe6fbdd-2c7d-40ec-97a8-2a83490f25c7-openstack-edpm-ipam\") pod \"dnsmasq-dns-5d587fb4f9-tzqjx\" (UID: \"cfe6fbdd-2c7d-40ec-97a8-2a83490f25c7\") " pod="openstack/dnsmasq-dns-5d587fb4f9-tzqjx" Sep 30 12:43:52 crc kubenswrapper[4672]: I0930 12:43:52.832873 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cfe6fbdd-2c7d-40ec-97a8-2a83490f25c7-config\") pod \"dnsmasq-dns-5d587fb4f9-tzqjx\" (UID: \"cfe6fbdd-2c7d-40ec-97a8-2a83490f25c7\") " pod="openstack/dnsmasq-dns-5d587fb4f9-tzqjx" Sep 30 12:43:52 crc kubenswrapper[4672]: I0930 12:43:52.833233 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cfe6fbdd-2c7d-40ec-97a8-2a83490f25c7-ovsdbserver-nb\") pod \"dnsmasq-dns-5d587fb4f9-tzqjx\" (UID: \"cfe6fbdd-2c7d-40ec-97a8-2a83490f25c7\") " pod="openstack/dnsmasq-dns-5d587fb4f9-tzqjx" Sep 30 12:43:52 crc kubenswrapper[4672]: I0930 12:43:52.859073 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mr2c9\" (UniqueName: \"kubernetes.io/projected/cfe6fbdd-2c7d-40ec-97a8-2a83490f25c7-kube-api-access-mr2c9\") pod \"dnsmasq-dns-5d587fb4f9-tzqjx\" (UID: \"cfe6fbdd-2c7d-40ec-97a8-2a83490f25c7\") " pod="openstack/dnsmasq-dns-5d587fb4f9-tzqjx" Sep 30 12:43:52 crc kubenswrapper[4672]: I0930 12:43:52.968519 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5d587fb4f9-tzqjx" Sep 30 12:43:53 crc kubenswrapper[4672]: I0930 12:43:53.451018 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5d587fb4f9-tzqjx"] Sep 30 12:43:53 crc kubenswrapper[4672]: I0930 12:43:53.798370 4672 generic.go:334] "Generic (PLEG): container finished" podID="cfe6fbdd-2c7d-40ec-97a8-2a83490f25c7" containerID="9ff1e9dbf663bf7fc18cdd54c28c7f22d1801b0b9e98b01d9ef204953f4ccd08" exitCode=0 Sep 30 12:43:53 crc kubenswrapper[4672]: I0930 12:43:53.798746 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d587fb4f9-tzqjx" event={"ID":"cfe6fbdd-2c7d-40ec-97a8-2a83490f25c7","Type":"ContainerDied","Data":"9ff1e9dbf663bf7fc18cdd54c28c7f22d1801b0b9e98b01d9ef204953f4ccd08"} Sep 30 12:43:53 crc kubenswrapper[4672]: I0930 12:43:53.798773 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d587fb4f9-tzqjx" event={"ID":"cfe6fbdd-2c7d-40ec-97a8-2a83490f25c7","Type":"ContainerStarted","Data":"57b0bf5960ba3e6a778c856316278dc67aee63a8a2f8a72b06413dfb57fef2a3"} Sep 30 12:43:54 crc kubenswrapper[4672]: I0930 12:43:54.813153 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d587fb4f9-tzqjx" event={"ID":"cfe6fbdd-2c7d-40ec-97a8-2a83490f25c7","Type":"ContainerStarted","Data":"12cf7ba681e289da8c15f9bc88b4cfbf81d3c3179d112a5a87459c45af7fe442"} Sep 30 12:43:54 crc kubenswrapper[4672]: I0930 12:43:54.855840 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5d587fb4f9-tzqjx" podStartSLOduration=2.855820224 podStartE2EDuration="2.855820224s" podCreationTimestamp="2025-09-30 12:43:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-09-30 12:43:54.842168655 +0000 UTC m=+1326.111406331" watchObservedRunningTime="2025-09-30 12:43:54.855820224 +0000 UTC m=+1326.125057880" Sep 30 12:43:55 crc kubenswrapper[4672]: I0930 12:43:55.823231 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5d587fb4f9-tzqjx" Sep 30 12:43:58 crc kubenswrapper[4672]: I0930 12:43:58.040914 4672 scope.go:117] "RemoveContainer" containerID="a0c26de2aaefa436e0b67d8d59a37473150e386f619e455836997bfb3399389f" Sep 30 12:43:58 crc kubenswrapper[4672]: I0930 12:43:58.077305 4672 scope.go:117] "RemoveContainer" containerID="e01b88bbf1911514ef8474023e787f93d998277ea95c8dfcfe001656a8fbed44" Sep 30 12:43:58 crc kubenswrapper[4672]: I0930 12:43:58.134995 4672 scope.go:117] "RemoveContainer" containerID="f7da0813cbfb0aed147e8014aaddea2aa24a882e5500ee46152703904bf5950f" Sep 30 12:44:02 crc kubenswrapper[4672]: I0930 12:44:02.970649 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5d587fb4f9-tzqjx" Sep 30 12:44:03 crc kubenswrapper[4672]: I0930 12:44:03.061012 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8b9c78747-bt7cg"] Sep 30 12:44:03 crc kubenswrapper[4672]: I0930 12:44:03.061361 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-8b9c78747-bt7cg" podUID="cbc1011f-55f1-4518-9117-215b69ae7590" containerName="dnsmasq-dns" containerID="cri-o://d9ca6d0f522889de7a5d1cb71d356ae0d4954863954bbdaa86bc80571fe5dae6" gracePeriod=10 Sep 30 12:44:03 crc kubenswrapper[4672]: I0930 12:44:03.197976 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-64bcc76c55-fkj7b"] Sep 30 12:44:03 crc kubenswrapper[4672]: I0930 12:44:03.200055 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-64bcc76c55-fkj7b" Sep 30 12:44:03 crc kubenswrapper[4672]: I0930 12:44:03.208319 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-64bcc76c55-fkj7b"] Sep 30 12:44:03 crc kubenswrapper[4672]: I0930 12:44:03.370534 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cmhkl\" (UniqueName: \"kubernetes.io/projected/cdfacf2c-b616-4c40-b16e-ec39de0d0e21-kube-api-access-cmhkl\") pod \"dnsmasq-dns-64bcc76c55-fkj7b\" (UID: \"cdfacf2c-b616-4c40-b16e-ec39de0d0e21\") " pod="openstack/dnsmasq-dns-64bcc76c55-fkj7b" Sep 30 12:44:03 crc kubenswrapper[4672]: I0930 12:44:03.370844 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cdfacf2c-b616-4c40-b16e-ec39de0d0e21-ovsdbserver-nb\") pod \"dnsmasq-dns-64bcc76c55-fkj7b\" (UID: \"cdfacf2c-b616-4c40-b16e-ec39de0d0e21\") " pod="openstack/dnsmasq-dns-64bcc76c55-fkj7b" Sep 30 12:44:03 crc kubenswrapper[4672]: I0930 12:44:03.370921 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cdfacf2c-b616-4c40-b16e-ec39de0d0e21-config\") pod \"dnsmasq-dns-64bcc76c55-fkj7b\" (UID: \"cdfacf2c-b616-4c40-b16e-ec39de0d0e21\") " pod="openstack/dnsmasq-dns-64bcc76c55-fkj7b" Sep 30 12:44:03 crc kubenswrapper[4672]: I0930 12:44:03.370977 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cdfacf2c-b616-4c40-b16e-ec39de0d0e21-ovsdbserver-sb\") pod \"dnsmasq-dns-64bcc76c55-fkj7b\" (UID: \"cdfacf2c-b616-4c40-b16e-ec39de0d0e21\") " pod="openstack/dnsmasq-dns-64bcc76c55-fkj7b" Sep 30 12:44:03 crc kubenswrapper[4672]: I0930 12:44:03.371009 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/cdfacf2c-b616-4c40-b16e-ec39de0d0e21-openstack-edpm-ipam\") pod \"dnsmasq-dns-64bcc76c55-fkj7b\" (UID: \"cdfacf2c-b616-4c40-b16e-ec39de0d0e21\") " pod="openstack/dnsmasq-dns-64bcc76c55-fkj7b" Sep 30 12:44:03 crc kubenswrapper[4672]: I0930 12:44:03.371069 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cdfacf2c-b616-4c40-b16e-ec39de0d0e21-dns-swift-storage-0\") pod \"dnsmasq-dns-64bcc76c55-fkj7b\" (UID: \"cdfacf2c-b616-4c40-b16e-ec39de0d0e21\") " pod="openstack/dnsmasq-dns-64bcc76c55-fkj7b" Sep 30 12:44:03 crc kubenswrapper[4672]: I0930 12:44:03.371092 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cdfacf2c-b616-4c40-b16e-ec39de0d0e21-dns-svc\") pod \"dnsmasq-dns-64bcc76c55-fkj7b\" (UID: \"cdfacf2c-b616-4c40-b16e-ec39de0d0e21\") " pod="openstack/dnsmasq-dns-64bcc76c55-fkj7b" Sep 30 12:44:03 crc kubenswrapper[4672]: I0930 12:44:03.472567 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cdfacf2c-b616-4c40-b16e-ec39de0d0e21-dns-swift-storage-0\") pod \"dnsmasq-dns-64bcc76c55-fkj7b\" (UID: \"cdfacf2c-b616-4c40-b16e-ec39de0d0e21\") " pod="openstack/dnsmasq-dns-64bcc76c55-fkj7b" Sep 30 12:44:03 crc kubenswrapper[4672]: I0930 
12:44:03.473456 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cdfacf2c-b616-4c40-b16e-ec39de0d0e21-dns-svc\") pod \"dnsmasq-dns-64bcc76c55-fkj7b\" (UID: \"cdfacf2c-b616-4c40-b16e-ec39de0d0e21\") " pod="openstack/dnsmasq-dns-64bcc76c55-fkj7b" Sep 30 12:44:03 crc kubenswrapper[4672]: I0930 12:44:03.473406 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cdfacf2c-b616-4c40-b16e-ec39de0d0e21-dns-swift-storage-0\") pod \"dnsmasq-dns-64bcc76c55-fkj7b\" (UID: \"cdfacf2c-b616-4c40-b16e-ec39de0d0e21\") " pod="openstack/dnsmasq-dns-64bcc76c55-fkj7b" Sep 30 12:44:03 crc kubenswrapper[4672]: I0930 12:44:03.473550 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cmhkl\" (UniqueName: \"kubernetes.io/projected/cdfacf2c-b616-4c40-b16e-ec39de0d0e21-kube-api-access-cmhkl\") pod \"dnsmasq-dns-64bcc76c55-fkj7b\" (UID: \"cdfacf2c-b616-4c40-b16e-ec39de0d0e21\") " pod="openstack/dnsmasq-dns-64bcc76c55-fkj7b" Sep 30 12:44:03 crc kubenswrapper[4672]: I0930 12:44:03.473606 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cdfacf2c-b616-4c40-b16e-ec39de0d0e21-ovsdbserver-nb\") pod \"dnsmasq-dns-64bcc76c55-fkj7b\" (UID: \"cdfacf2c-b616-4c40-b16e-ec39de0d0e21\") " pod="openstack/dnsmasq-dns-64bcc76c55-fkj7b" Sep 30 12:44:03 crc kubenswrapper[4672]: I0930 12:44:03.474056 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cdfacf2c-b616-4c40-b16e-ec39de0d0e21-config\") pod \"dnsmasq-dns-64bcc76c55-fkj7b\" (UID: \"cdfacf2c-b616-4c40-b16e-ec39de0d0e21\") " pod="openstack/dnsmasq-dns-64bcc76c55-fkj7b" Sep 30 12:44:03 crc kubenswrapper[4672]: I0930 12:44:03.474484 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cdfacf2c-b616-4c40-b16e-ec39de0d0e21-dns-svc\") pod \"dnsmasq-dns-64bcc76c55-fkj7b\" (UID: \"cdfacf2c-b616-4c40-b16e-ec39de0d0e21\") " pod="openstack/dnsmasq-dns-64bcc76c55-fkj7b" Sep 30 12:44:03 crc kubenswrapper[4672]: I0930 12:44:03.474761 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cdfacf2c-b616-4c40-b16e-ec39de0d0e21-config\") pod \"dnsmasq-dns-64bcc76c55-fkj7b\" (UID: \"cdfacf2c-b616-4c40-b16e-ec39de0d0e21\") " pod="openstack/dnsmasq-dns-64bcc76c55-fkj7b" Sep 30 12:44:03 crc kubenswrapper[4672]: I0930 12:44:03.474814 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cdfacf2c-b616-4c40-b16e-ec39de0d0e21-ovsdbserver-sb\") pod \"dnsmasq-dns-64bcc76c55-fkj7b\" (UID: \"cdfacf2c-b616-4c40-b16e-ec39de0d0e21\") " pod="openstack/dnsmasq-dns-64bcc76c55-fkj7b" Sep 30 12:44:03 crc kubenswrapper[4672]: I0930 12:44:03.474840 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/cdfacf2c-b616-4c40-b16e-ec39de0d0e21-openstack-edpm-ipam\") pod \"dnsmasq-dns-64bcc76c55-fkj7b\" (UID: \"cdfacf2c-b616-4c40-b16e-ec39de0d0e21\") " pod="openstack/dnsmasq-dns-64bcc76c55-fkj7b" Sep 30 12:44:03 crc kubenswrapper[4672]: I0930 12:44:03.474844 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cdfacf2c-b616-4c40-b16e-ec39de0d0e21-ovsdbserver-nb\") pod \"dnsmasq-dns-64bcc76c55-fkj7b\" (UID: \"cdfacf2c-b616-4c40-b16e-ec39de0d0e21\") " pod="openstack/dnsmasq-dns-64bcc76c55-fkj7b" Sep 30 12:44:03 crc kubenswrapper[4672]: I0930 12:44:03.475459 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cdfacf2c-b616-4c40-b16e-ec39de0d0e21-ovsdbserver-sb\") pod \"dnsmasq-dns-64bcc76c55-fkj7b\" (UID: \"cdfacf2c-b616-4c40-b16e-ec39de0d0e21\") " pod="openstack/dnsmasq-dns-64bcc76c55-fkj7b" Sep 30 12:44:03 crc kubenswrapper[4672]: I0930 12:44:03.476074 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/cdfacf2c-b616-4c40-b16e-ec39de0d0e21-openstack-edpm-ipam\") pod \"dnsmasq-dns-64bcc76c55-fkj7b\" (UID: \"cdfacf2c-b616-4c40-b16e-ec39de0d0e21\") " pod="openstack/dnsmasq-dns-64bcc76c55-fkj7b" Sep 30 12:44:03 crc kubenswrapper[4672]: I0930 12:44:03.499036 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cmhkl\" (UniqueName: \"kubernetes.io/projected/cdfacf2c-b616-4c40-b16e-ec39de0d0e21-kube-api-access-cmhkl\") pod \"dnsmasq-dns-64bcc76c55-fkj7b\" (UID: \"cdfacf2c-b616-4c40-b16e-ec39de0d0e21\") " pod="openstack/dnsmasq-dns-64bcc76c55-fkj7b" Sep 30 12:44:03 crc kubenswrapper[4672]: I0930 12:44:03.587528 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-64bcc76c55-fkj7b" Sep 30 12:44:03 crc kubenswrapper[4672]: I0930 12:44:03.695413 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8b9c78747-bt7cg" Sep 30 12:44:03 crc kubenswrapper[4672]: I0930 12:44:03.790842 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cbc1011f-55f1-4518-9117-215b69ae7590-ovsdbserver-sb\") pod \"cbc1011f-55f1-4518-9117-215b69ae7590\" (UID: \"cbc1011f-55f1-4518-9117-215b69ae7590\") " Sep 30 12:44:03 crc kubenswrapper[4672]: I0930 12:44:03.790896 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cbc1011f-55f1-4518-9117-215b69ae7590-ovsdbserver-nb\") pod \"cbc1011f-55f1-4518-9117-215b69ae7590\" (UID: \"cbc1011f-55f1-4518-9117-215b69ae7590\") " Sep 30 12:44:03 crc kubenswrapper[4672]: I0930 12:44:03.791047 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cbc1011f-55f1-4518-9117-215b69ae7590-config\") pod \"cbc1011f-55f1-4518-9117-215b69ae7590\" (UID: \"cbc1011f-55f1-4518-9117-215b69ae7590\") " Sep 30 12:44:03 crc kubenswrapper[4672]: I0930 12:44:03.791128 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cbc1011f-55f1-4518-9117-215b69ae7590-dns-swift-storage-0\") pod \"cbc1011f-55f1-4518-9117-215b69ae7590\" (UID: \"cbc1011f-55f1-4518-9117-215b69ae7590\") " Sep 30 12:44:03 crc kubenswrapper[4672]: I0930 12:44:03.791204 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x42qk\" (UniqueName: \"kubernetes.io/projected/cbc1011f-55f1-4518-9117-215b69ae7590-kube-api-access-x42qk\") pod \"cbc1011f-55f1-4518-9117-215b69ae7590\" (UID: 
\"cbc1011f-55f1-4518-9117-215b69ae7590\") " Sep 30 12:44:03 crc kubenswrapper[4672]: I0930 12:44:03.791339 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cbc1011f-55f1-4518-9117-215b69ae7590-dns-svc\") pod \"cbc1011f-55f1-4518-9117-215b69ae7590\" (UID: \"cbc1011f-55f1-4518-9117-215b69ae7590\") " Sep 30 12:44:03 crc kubenswrapper[4672]: I0930 12:44:03.802546 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cbc1011f-55f1-4518-9117-215b69ae7590-kube-api-access-x42qk" (OuterVolumeSpecName: "kube-api-access-x42qk") pod "cbc1011f-55f1-4518-9117-215b69ae7590" (UID: "cbc1011f-55f1-4518-9117-215b69ae7590"). InnerVolumeSpecName "kube-api-access-x42qk". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 12:44:03 crc kubenswrapper[4672]: I0930 12:44:03.881510 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cbc1011f-55f1-4518-9117-215b69ae7590-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "cbc1011f-55f1-4518-9117-215b69ae7590" (UID: "cbc1011f-55f1-4518-9117-215b69ae7590"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 12:44:03 crc kubenswrapper[4672]: I0930 12:44:03.887688 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cbc1011f-55f1-4518-9117-215b69ae7590-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "cbc1011f-55f1-4518-9117-215b69ae7590" (UID: "cbc1011f-55f1-4518-9117-215b69ae7590"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 12:44:03 crc kubenswrapper[4672]: I0930 12:44:03.896337 4672 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cbc1011f-55f1-4518-9117-215b69ae7590-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Sep 30 12:44:03 crc kubenswrapper[4672]: I0930 12:44:03.896376 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x42qk\" (UniqueName: \"kubernetes.io/projected/cbc1011f-55f1-4518-9117-215b69ae7590-kube-api-access-x42qk\") on node \"crc\" DevicePath \"\"" Sep 30 12:44:03 crc kubenswrapper[4672]: I0930 12:44:03.896391 4672 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cbc1011f-55f1-4518-9117-215b69ae7590-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Sep 30 12:44:03 crc kubenswrapper[4672]: I0930 12:44:03.907713 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cbc1011f-55f1-4518-9117-215b69ae7590-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "cbc1011f-55f1-4518-9117-215b69ae7590" (UID: "cbc1011f-55f1-4518-9117-215b69ae7590"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 12:44:03 crc kubenswrapper[4672]: I0930 12:44:03.914970 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cbc1011f-55f1-4518-9117-215b69ae7590-config" (OuterVolumeSpecName: "config") pod "cbc1011f-55f1-4518-9117-215b69ae7590" (UID: "cbc1011f-55f1-4518-9117-215b69ae7590"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 12:44:03 crc kubenswrapper[4672]: I0930 12:44:03.937760 4672 generic.go:334] "Generic (PLEG): container finished" podID="cbc1011f-55f1-4518-9117-215b69ae7590" containerID="d9ca6d0f522889de7a5d1cb71d356ae0d4954863954bbdaa86bc80571fe5dae6" exitCode=0 Sep 30 12:44:03 crc kubenswrapper[4672]: I0930 12:44:03.937800 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8b9c78747-bt7cg" event={"ID":"cbc1011f-55f1-4518-9117-215b69ae7590","Type":"ContainerDied","Data":"d9ca6d0f522889de7a5d1cb71d356ae0d4954863954bbdaa86bc80571fe5dae6"} Sep 30 12:44:03 crc kubenswrapper[4672]: I0930 12:44:03.937832 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8b9c78747-bt7cg" event={"ID":"cbc1011f-55f1-4518-9117-215b69ae7590","Type":"ContainerDied","Data":"ea5341280da5e524d64656d359a2700f361c6819b79d45b2e0e7aa41468d6b4f"} Sep 30 12:44:03 crc kubenswrapper[4672]: I0930 12:44:03.937852 4672 scope.go:117] "RemoveContainer" containerID="d9ca6d0f522889de7a5d1cb71d356ae0d4954863954bbdaa86bc80571fe5dae6" Sep 30 12:44:03 crc kubenswrapper[4672]: I0930 12:44:03.937973 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8b9c78747-bt7cg" Sep 30 12:44:03 crc kubenswrapper[4672]: I0930 12:44:03.957990 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cbc1011f-55f1-4518-9117-215b69ae7590-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "cbc1011f-55f1-4518-9117-215b69ae7590" (UID: "cbc1011f-55f1-4518-9117-215b69ae7590"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 12:44:03 crc kubenswrapper[4672]: I0930 12:44:03.971409 4672 scope.go:117] "RemoveContainer" containerID="af367e360c60e6b5545eac7e4808d4c62a5ed7f8e784811667a897709cae0d4c" Sep 30 12:44:04 crc kubenswrapper[4672]: I0930 12:44:04.001190 4672 scope.go:117] "RemoveContainer" containerID="d9ca6d0f522889de7a5d1cb71d356ae0d4954863954bbdaa86bc80571fe5dae6" Sep 30 12:44:04 crc kubenswrapper[4672]: I0930 12:44:04.001201 4672 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cbc1011f-55f1-4518-9117-215b69ae7590-config\") on node \"crc\" DevicePath \"\"" Sep 30 12:44:04 crc kubenswrapper[4672]: I0930 12:44:04.001308 4672 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cbc1011f-55f1-4518-9117-215b69ae7590-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 30 12:44:04 crc kubenswrapper[4672]: I0930 12:44:04.001321 4672 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cbc1011f-55f1-4518-9117-215b69ae7590-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Sep 30 12:44:04 crc kubenswrapper[4672]: E0930 12:44:04.001733 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d9ca6d0f522889de7a5d1cb71d356ae0d4954863954bbdaa86bc80571fe5dae6\": container with ID starting with d9ca6d0f522889de7a5d1cb71d356ae0d4954863954bbdaa86bc80571fe5dae6 not found: ID does not exist" containerID="d9ca6d0f522889de7a5d1cb71d356ae0d4954863954bbdaa86bc80571fe5dae6" Sep 30 12:44:04 crc kubenswrapper[4672]: I0930 12:44:04.001758 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d9ca6d0f522889de7a5d1cb71d356ae0d4954863954bbdaa86bc80571fe5dae6"} 
err="failed to get container status \"d9ca6d0f522889de7a5d1cb71d356ae0d4954863954bbdaa86bc80571fe5dae6\": rpc error: code = NotFound desc = could not find container \"d9ca6d0f522889de7a5d1cb71d356ae0d4954863954bbdaa86bc80571fe5dae6\": container with ID starting with d9ca6d0f522889de7a5d1cb71d356ae0d4954863954bbdaa86bc80571fe5dae6 not found: ID does not exist" Sep 30 12:44:04 crc kubenswrapper[4672]: I0930 12:44:04.001777 4672 scope.go:117] "RemoveContainer" containerID="af367e360c60e6b5545eac7e4808d4c62a5ed7f8e784811667a897709cae0d4c" Sep 30 12:44:04 crc kubenswrapper[4672]: E0930 12:44:04.002114 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"af367e360c60e6b5545eac7e4808d4c62a5ed7f8e784811667a897709cae0d4c\": container with ID starting with af367e360c60e6b5545eac7e4808d4c62a5ed7f8e784811667a897709cae0d4c not found: ID does not exist" containerID="af367e360c60e6b5545eac7e4808d4c62a5ed7f8e784811667a897709cae0d4c" Sep 30 12:44:04 crc kubenswrapper[4672]: I0930 12:44:04.002132 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"af367e360c60e6b5545eac7e4808d4c62a5ed7f8e784811667a897709cae0d4c"} err="failed to get container status \"af367e360c60e6b5545eac7e4808d4c62a5ed7f8e784811667a897709cae0d4c\": rpc error: code = NotFound desc = could not find container \"af367e360c60e6b5545eac7e4808d4c62a5ed7f8e784811667a897709cae0d4c\": container with ID starting with af367e360c60e6b5545eac7e4808d4c62a5ed7f8e784811667a897709cae0d4c not found: ID does not exist" Sep 30 12:44:04 crc kubenswrapper[4672]: I0930 12:44:04.071172 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-64bcc76c55-fkj7b"] Sep 30 12:44:04 crc kubenswrapper[4672]: W0930 12:44:04.076904 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcdfacf2c_b616_4c40_b16e_ec39de0d0e21.slice/crio-bdec227e895b09a97cb2976b2b07cca9f5ec17df41c4037a93db875c587ccc46 WatchSource:0}: Error finding container bdec227e895b09a97cb2976b2b07cca9f5ec17df41c4037a93db875c587ccc46: Status 404 returned error can't find the container with id bdec227e895b09a97cb2976b2b07cca9f5ec17df41c4037a93db875c587ccc46 Sep 30 12:44:04 crc kubenswrapper[4672]: I0930 12:44:04.366209 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8b9c78747-bt7cg"] Sep 30 12:44:04 crc kubenswrapper[4672]: I0930 12:44:04.376349 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-8b9c78747-bt7cg"] Sep 30 12:44:04 crc kubenswrapper[4672]: I0930 12:44:04.951119 4672 generic.go:334] "Generic (PLEG): container finished" podID="cdfacf2c-b616-4c40-b16e-ec39de0d0e21" containerID="dbef28df0f5e23e74a91c21ebd09fc16b3cc280d1ff3538a4739907307d65960" exitCode=0 Sep 30 12:44:04 crc kubenswrapper[4672]: I0930 12:44:04.951205 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-64bcc76c55-fkj7b" event={"ID":"cdfacf2c-b616-4c40-b16e-ec39de0d0e21","Type":"ContainerDied","Data":"dbef28df0f5e23e74a91c21ebd09fc16b3cc280d1ff3538a4739907307d65960"} Sep 30 12:44:04 crc kubenswrapper[4672]: I0930 12:44:04.951640 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-64bcc76c55-fkj7b" event={"ID":"cdfacf2c-b616-4c40-b16e-ec39de0d0e21","Type":"ContainerStarted","Data":"bdec227e895b09a97cb2976b2b07cca9f5ec17df41c4037a93db875c587ccc46"} Sep 30 12:44:05 crc kubenswrapper[4672]: 
I0930 12:44:05.427409 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cbc1011f-55f1-4518-9117-215b69ae7590" path="/var/lib/kubelet/pods/cbc1011f-55f1-4518-9117-215b69ae7590/volumes" Sep 30 12:44:05 crc kubenswrapper[4672]: I0930 12:44:05.967160 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-64bcc76c55-fkj7b" event={"ID":"cdfacf2c-b616-4c40-b16e-ec39de0d0e21","Type":"ContainerStarted","Data":"34afdb63a01ace7539d46ca3d1ac509ed17e9cb5f375056428ee7428ee2777c7"} Sep 30 12:44:05 crc kubenswrapper[4672]: I0930 12:44:05.967300 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-64bcc76c55-fkj7b" Sep 30 12:44:06 crc kubenswrapper[4672]: I0930 12:44:06.000069 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-64bcc76c55-fkj7b" podStartSLOduration=3.000042339 podStartE2EDuration="3.000042339s" podCreationTimestamp="2025-09-30 12:44:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 12:44:05.985059827 +0000 UTC m=+1337.254297513" watchObservedRunningTime="2025-09-30 12:44:06.000042339 +0000 UTC m=+1337.269280045" Sep 30 12:44:13 crc kubenswrapper[4672]: I0930 12:44:13.590910 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-64bcc76c55-fkj7b" Sep 30 12:44:13 crc kubenswrapper[4672]: I0930 12:44:13.682936 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5d587fb4f9-tzqjx"] Sep 30 12:44:13 crc kubenswrapper[4672]: I0930 12:44:13.683389 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5d587fb4f9-tzqjx" podUID="cfe6fbdd-2c7d-40ec-97a8-2a83490f25c7" containerName="dnsmasq-dns" containerID="cri-o://12cf7ba681e289da8c15f9bc88b4cfbf81d3c3179d112a5a87459c45af7fe442" gracePeriod=10 Sep 30 12:44:14 crc kubenswrapper[4672]: I0930 12:44:14.082223 4672 generic.go:334] "Generic (PLEG): container finished" podID="cfe6fbdd-2c7d-40ec-97a8-2a83490f25c7" containerID="12cf7ba681e289da8c15f9bc88b4cfbf81d3c3179d112a5a87459c45af7fe442" exitCode=0 Sep 30 12:44:14 crc kubenswrapper[4672]: I0930 12:44:14.082302 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d587fb4f9-tzqjx" event={"ID":"cfe6fbdd-2c7d-40ec-97a8-2a83490f25c7","Type":"ContainerDied","Data":"12cf7ba681e289da8c15f9bc88b4cfbf81d3c3179d112a5a87459c45af7fe442"} Sep 30 12:44:14 crc kubenswrapper[4672]: I0930 12:44:14.084709 4672 generic.go:334] "Generic (PLEG): container finished" podID="6cfe6bf3-4d65-49c0-a45b-484e53a12f80" containerID="f8bec39665929520d2b96fe6fe2160ff6a67ad8bd0ae390b99ff7e170e47f168" exitCode=0 Sep 30 12:44:14 crc kubenswrapper[4672]: I0930 12:44:14.084753 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"6cfe6bf3-4d65-49c0-a45b-484e53a12f80","Type":"ContainerDied","Data":"f8bec39665929520d2b96fe6fe2160ff6a67ad8bd0ae390b99ff7e170e47f168"} Sep 30 12:44:14 crc kubenswrapper[4672]: I0930 12:44:14.346843 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5d587fb4f9-tzqjx" Sep 30 12:44:14 crc kubenswrapper[4672]: I0930 12:44:14.423049 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cfe6fbdd-2c7d-40ec-97a8-2a83490f25c7-ovsdbserver-sb\") pod \"cfe6fbdd-2c7d-40ec-97a8-2a83490f25c7\" (UID: \"cfe6fbdd-2c7d-40ec-97a8-2a83490f25c7\") " Sep 30 12:44:14 crc kubenswrapper[4672]: I0930 12:44:14.423095 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cfe6fbdd-2c7d-40ec-97a8-2a83490f25c7-ovsdbserver-nb\") pod \"cfe6fbdd-2c7d-40ec-97a8-2a83490f25c7\" (UID: \"cfe6fbdd-2c7d-40ec-97a8-2a83490f25c7\") " Sep 30 12:44:14 crc kubenswrapper[4672]: I0930 12:44:14.423146 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cfe6fbdd-2c7d-40ec-97a8-2a83490f25c7-config\") pod \"cfe6fbdd-2c7d-40ec-97a8-2a83490f25c7\" (UID: \"cfe6fbdd-2c7d-40ec-97a8-2a83490f25c7\") " Sep 30 12:44:14 crc kubenswrapper[4672]: I0930 12:44:14.423286 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cfe6fbdd-2c7d-40ec-97a8-2a83490f25c7-dns-swift-storage-0\") pod \"cfe6fbdd-2c7d-40ec-97a8-2a83490f25c7\" (UID: \"cfe6fbdd-2c7d-40ec-97a8-2a83490f25c7\") " Sep 30 12:44:14 crc kubenswrapper[4672]: I0930 12:44:14.423357 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mr2c9\" (UniqueName: \"kubernetes.io/projected/cfe6fbdd-2c7d-40ec-97a8-2a83490f25c7-kube-api-access-mr2c9\") pod \"cfe6fbdd-2c7d-40ec-97a8-2a83490f25c7\" (UID: \"cfe6fbdd-2c7d-40ec-97a8-2a83490f25c7\") " Sep 30 12:44:14 crc kubenswrapper[4672]: I0930 12:44:14.423398 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/cfe6fbdd-2c7d-40ec-97a8-2a83490f25c7-openstack-edpm-ipam\") pod \"cfe6fbdd-2c7d-40ec-97a8-2a83490f25c7\" (UID: \"cfe6fbdd-2c7d-40ec-97a8-2a83490f25c7\") " Sep 30 12:44:14 crc kubenswrapper[4672]: I0930 12:44:14.423449 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cfe6fbdd-2c7d-40ec-97a8-2a83490f25c7-dns-svc\") pod \"cfe6fbdd-2c7d-40ec-97a8-2a83490f25c7\" (UID: \"cfe6fbdd-2c7d-40ec-97a8-2a83490f25c7\") " Sep 30 12:44:14 crc kubenswrapper[4672]: I0930 12:44:14.429995 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cfe6fbdd-2c7d-40ec-97a8-2a83490f25c7-kube-api-access-mr2c9" (OuterVolumeSpecName: "kube-api-access-mr2c9") pod "cfe6fbdd-2c7d-40ec-97a8-2a83490f25c7" (UID: "cfe6fbdd-2c7d-40ec-97a8-2a83490f25c7"). InnerVolumeSpecName "kube-api-access-mr2c9". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 12:44:14 crc kubenswrapper[4672]: I0930 12:44:14.489786 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cfe6fbdd-2c7d-40ec-97a8-2a83490f25c7-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "cfe6fbdd-2c7d-40ec-97a8-2a83490f25c7" (UID: "cfe6fbdd-2c7d-40ec-97a8-2a83490f25c7"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 12:44:14 crc kubenswrapper[4672]: I0930 12:44:14.494694 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cfe6fbdd-2c7d-40ec-97a8-2a83490f25c7-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "cfe6fbdd-2c7d-40ec-97a8-2a83490f25c7" (UID: "cfe6fbdd-2c7d-40ec-97a8-2a83490f25c7"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 12:44:14 crc kubenswrapper[4672]: I0930 12:44:14.496722 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cfe6fbdd-2c7d-40ec-97a8-2a83490f25c7-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "cfe6fbdd-2c7d-40ec-97a8-2a83490f25c7" (UID: "cfe6fbdd-2c7d-40ec-97a8-2a83490f25c7"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 12:44:14 crc kubenswrapper[4672]: I0930 12:44:14.499255 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cfe6fbdd-2c7d-40ec-97a8-2a83490f25c7-config" (OuterVolumeSpecName: "config") pod "cfe6fbdd-2c7d-40ec-97a8-2a83490f25c7" (UID: "cfe6fbdd-2c7d-40ec-97a8-2a83490f25c7"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 12:44:14 crc kubenswrapper[4672]: I0930 12:44:14.501972 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cfe6fbdd-2c7d-40ec-97a8-2a83490f25c7-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "cfe6fbdd-2c7d-40ec-97a8-2a83490f25c7" (UID: "cfe6fbdd-2c7d-40ec-97a8-2a83490f25c7"). InnerVolumeSpecName "openstack-edpm-ipam". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 12:44:14 crc kubenswrapper[4672]: I0930 12:44:14.504870 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cfe6fbdd-2c7d-40ec-97a8-2a83490f25c7-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "cfe6fbdd-2c7d-40ec-97a8-2a83490f25c7" (UID: "cfe6fbdd-2c7d-40ec-97a8-2a83490f25c7"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 12:44:14 crc kubenswrapper[4672]: E0930 12:44:14.516902 4672 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod00b691a7_21bd_4661_9b19_cae31a79f18e.slice/crio-1bfc6e7d25dfe87f4c3e00d83051a62f9443dc9e360f6c903b87baf2f2b1e694.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod00b691a7_21bd_4661_9b19_cae31a79f18e.slice/crio-conmon-1bfc6e7d25dfe87f4c3e00d83051a62f9443dc9e360f6c903b87baf2f2b1e694.scope\": RecentStats: unable to find data in memory cache]" Sep 30 12:44:14 crc kubenswrapper[4672]: I0930 12:44:14.525520 4672 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cfe6fbdd-2c7d-40ec-97a8-2a83490f25c7-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 30 12:44:14 crc kubenswrapper[4672]: I0930 12:44:14.526341 4672 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cfe6fbdd-2c7d-40ec-97a8-2a83490f25c7-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Sep 30 12:44:14 crc kubenswrapper[4672]: I0930 12:44:14.526360 4672 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cfe6fbdd-2c7d-40ec-97a8-2a83490f25c7-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Sep 30 12:44:14 crc kubenswrapper[4672]: I0930 12:44:14.526369 4672 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cfe6fbdd-2c7d-40ec-97a8-2a83490f25c7-config\") on node \"crc\" DevicePath \"\"" Sep 30 12:44:14 crc kubenswrapper[4672]: I0930 12:44:14.526380 4672 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cfe6fbdd-2c7d-40ec-97a8-2a83490f25c7-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Sep 30 12:44:14 crc kubenswrapper[4672]: I0930 12:44:14.526390 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mr2c9\" (UniqueName: \"kubernetes.io/projected/cfe6fbdd-2c7d-40ec-97a8-2a83490f25c7-kube-api-access-mr2c9\") on node \"crc\" DevicePath \"\"" Sep 30 12:44:14 crc kubenswrapper[4672]: I0930 12:44:14.526401 4672 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/cfe6fbdd-2c7d-40ec-97a8-2a83490f25c7-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Sep 30 12:44:15 crc kubenswrapper[4672]: I0930 12:44:15.099477 4672 generic.go:334] "Generic (PLEG): container finished" podID="00b691a7-21bd-4661-9b19-cae31a79f18e" containerID="1bfc6e7d25dfe87f4c3e00d83051a62f9443dc9e360f6c903b87baf2f2b1e694" exitCode=0 Sep 30 12:44:15 crc kubenswrapper[4672]: I0930 12:44:15.099584 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"00b691a7-21bd-4661-9b19-cae31a79f18e","Type":"ContainerDied","Data":"1bfc6e7d25dfe87f4c3e00d83051a62f9443dc9e360f6c903b87baf2f2b1e694"} Sep 30 12:44:15 crc kubenswrapper[4672]: I0930 12:44:15.103563 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"6cfe6bf3-4d65-49c0-a45b-484e53a12f80","Type":"ContainerStarted","Data":"13a212bb3123300625305c12b0b6557515a4e1133978a20339d042f23fb116aa"} Sep 30 12:44:15 crc kubenswrapper[4672]: I0930 12:44:15.103821 4672 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Sep 30 12:44:15 crc kubenswrapper[4672]: I0930 12:44:15.106152 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d587fb4f9-tzqjx" event={"ID":"cfe6fbdd-2c7d-40ec-97a8-2a83490f25c7","Type":"ContainerDied","Data":"57b0bf5960ba3e6a778c856316278dc67aee63a8a2f8a72b06413dfb57fef2a3"} Sep 30 12:44:15 crc kubenswrapper[4672]: I0930 12:44:15.106227 4672 scope.go:117] "RemoveContainer" containerID="12cf7ba681e289da8c15f9bc88b4cfbf81d3c3179d112a5a87459c45af7fe442" Sep 30 12:44:15 crc kubenswrapper[4672]: I0930 12:44:15.106297 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5d587fb4f9-tzqjx" Sep 30 12:44:15 crc kubenswrapper[4672]: I0930 12:44:15.162372 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=34.162345342 podStartE2EDuration="34.162345342s" podCreationTimestamp="2025-09-30 12:43:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 12:44:15.157374875 +0000 UTC m=+1346.426612521" watchObservedRunningTime="2025-09-30 12:44:15.162345342 +0000 UTC m=+1346.431583008" Sep 30 12:44:15 crc kubenswrapper[4672]: I0930 12:44:15.341221 4672 scope.go:117] "RemoveContainer" containerID="9ff1e9dbf663bf7fc18cdd54c28c7f22d1801b0b9e98b01d9ef204953f4ccd08" Sep 30 12:44:15 crc kubenswrapper[4672]: I0930 12:44:15.359182 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5d587fb4f9-tzqjx"] Sep 30 12:44:15 crc kubenswrapper[4672]: I0930 12:44:15.369673 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5d587fb4f9-tzqjx"] Sep 30 12:44:15 crc kubenswrapper[4672]: I0930 12:44:15.429153 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cfe6fbdd-2c7d-40ec-97a8-2a83490f25c7" path="/var/lib/kubelet/pods/cfe6fbdd-2c7d-40ec-97a8-2a83490f25c7/volumes" Sep 30 12:44:16 crc kubenswrapper[4672]: I0930 12:44:16.120152 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"00b691a7-21bd-4661-9b19-cae31a79f18e","Type":"ContainerStarted","Data":"17142122bab5e630c81c7d4bb66a4c8ee0b102ad213af4659f38bccd8fe443c4"} Sep 30 12:44:16 crc kubenswrapper[4672]: I0930 12:44:16.151184 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=34.151164448 podStartE2EDuration="34.151164448s" podCreationTimestamp="2025-09-30 12:43:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 12:44:16.144079807 +0000 UTC m=+1347.413317483" watchObservedRunningTime="2025-09-30 12:44:16.151164448 +0000 UTC m=+1347.420402104" Sep 30 12:44:23 crc kubenswrapper[4672]: I0930 12:44:23.085150 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Sep 30 12:44:31 crc kubenswrapper[4672]: I0930 12:44:31.678307 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-bht9w"] Sep 30 12:44:31 crc kubenswrapper[4672]: E0930 12:44:31.680116 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cbc1011f-55f1-4518-9117-215b69ae7590" containerName="dnsmasq-dns" Sep 30 12:44:31 crc 
kubenswrapper[4672]: I0930 12:44:31.680142 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbc1011f-55f1-4518-9117-215b69ae7590" containerName="dnsmasq-dns" Sep 30 12:44:31 crc kubenswrapper[4672]: E0930 12:44:31.680201 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cfe6fbdd-2c7d-40ec-97a8-2a83490f25c7" containerName="dnsmasq-dns" Sep 30 12:44:31 crc kubenswrapper[4672]: I0930 12:44:31.680213 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfe6fbdd-2c7d-40ec-97a8-2a83490f25c7" containerName="dnsmasq-dns" Sep 30 12:44:31 crc kubenswrapper[4672]: E0930 12:44:31.680281 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cfe6fbdd-2c7d-40ec-97a8-2a83490f25c7" containerName="init" Sep 30 12:44:31 crc kubenswrapper[4672]: I0930 12:44:31.680290 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfe6fbdd-2c7d-40ec-97a8-2a83490f25c7" containerName="init" Sep 30 12:44:31 crc kubenswrapper[4672]: E0930 12:44:31.680316 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cbc1011f-55f1-4518-9117-215b69ae7590" containerName="init" Sep 30 12:44:31 crc kubenswrapper[4672]: I0930 12:44:31.680324 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbc1011f-55f1-4518-9117-215b69ae7590" containerName="init" Sep 30 12:44:31 crc kubenswrapper[4672]: I0930 12:44:31.680646 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="cbc1011f-55f1-4518-9117-215b69ae7590" containerName="dnsmasq-dns" Sep 30 12:44:31 crc kubenswrapper[4672]: I0930 12:44:31.680671 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="cfe6fbdd-2c7d-40ec-97a8-2a83490f25c7" containerName="dnsmasq-dns" Sep 30 12:44:31 crc kubenswrapper[4672]: I0930 12:44:31.682030 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-bht9w" Sep 30 12:44:31 crc kubenswrapper[4672]: I0930 12:44:31.689787 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Sep 30 12:44:31 crc kubenswrapper[4672]: I0930 12:44:31.691824 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-b6dbr" Sep 30 12:44:31 crc kubenswrapper[4672]: I0930 12:44:31.692059 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Sep 30 12:44:31 crc kubenswrapper[4672]: I0930 12:44:31.692282 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Sep 30 12:44:31 crc kubenswrapper[4672]: I0930 12:44:31.718296 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-bht9w"] Sep 30 12:44:31 crc kubenswrapper[4672]: I0930 12:44:31.797909 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cv7tk\" (UniqueName: \"kubernetes.io/projected/9ac7c321-e380-4baa-8233-0ec24fa6496f-kube-api-access-cv7tk\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-bht9w\" (UID: \"9ac7c321-e380-4baa-8233-0ec24fa6496f\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-bht9w" Sep 30 12:44:31 crc kubenswrapper[4672]: I0930 12:44:31.797967 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9ac7c321-e380-4baa-8233-0ec24fa6496f-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-bht9w\" (UID: \"9ac7c321-e380-4baa-8233-0ec24fa6496f\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-bht9w" Sep 30 12:44:31 crc kubenswrapper[4672]: I0930 12:44:31.798010 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9ac7c321-e380-4baa-8233-0ec24fa6496f-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-bht9w\" (UID: \"9ac7c321-e380-4baa-8233-0ec24fa6496f\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-bht9w" Sep 30 12:44:31 crc kubenswrapper[4672]: I0930 12:44:31.798048 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ac7c321-e380-4baa-8233-0ec24fa6496f-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-bht9w\" (UID: \"9ac7c321-e380-4baa-8233-0ec24fa6496f\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-bht9w" Sep 30 12:44:31 crc kubenswrapper[4672]: I0930 12:44:31.900471 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cv7tk\" (UniqueName: \"kubernetes.io/projected/9ac7c321-e380-4baa-8233-0ec24fa6496f-kube-api-access-cv7tk\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-bht9w\" (UID: \"9ac7c321-e380-4baa-8233-0ec24fa6496f\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-bht9w" Sep 30 12:44:31 crc kubenswrapper[4672]: I0930 12:44:31.900717 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9ac7c321-e380-4baa-8233-0ec24fa6496f-ssh-key\") pod 
\"repo-setup-edpm-deployment-openstack-edpm-ipam-bht9w\" (UID: \"9ac7c321-e380-4baa-8233-0ec24fa6496f\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-bht9w" Sep 30 12:44:31 crc kubenswrapper[4672]: I0930 12:44:31.900809 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9ac7c321-e380-4baa-8233-0ec24fa6496f-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-bht9w\" (UID: \"9ac7c321-e380-4baa-8233-0ec24fa6496f\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-bht9w" Sep 30 12:44:31 crc kubenswrapper[4672]: I0930 12:44:31.900929 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ac7c321-e380-4baa-8233-0ec24fa6496f-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-bht9w\" (UID: \"9ac7c321-e380-4baa-8233-0ec24fa6496f\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-bht9w" Sep 30 12:44:31 crc kubenswrapper[4672]: I0930 12:44:31.916646 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9ac7c321-e380-4baa-8233-0ec24fa6496f-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-bht9w\" (UID: \"9ac7c321-e380-4baa-8233-0ec24fa6496f\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-bht9w" Sep 30 12:44:31 crc kubenswrapper[4672]: I0930 12:44:31.916809 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9ac7c321-e380-4baa-8233-0ec24fa6496f-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-bht9w\" (UID: \"9ac7c321-e380-4baa-8233-0ec24fa6496f\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-bht9w" Sep 30 12:44:31 crc kubenswrapper[4672]: I0930 12:44:31.929964 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ac7c321-e380-4baa-8233-0ec24fa6496f-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-bht9w\" (UID: \"9ac7c321-e380-4baa-8233-0ec24fa6496f\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-bht9w" Sep 30 12:44:31 crc kubenswrapper[4672]: I0930 12:44:31.934445 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cv7tk\" (UniqueName: \"kubernetes.io/projected/9ac7c321-e380-4baa-8233-0ec24fa6496f-kube-api-access-cv7tk\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-bht9w\" (UID: \"9ac7c321-e380-4baa-8233-0ec24fa6496f\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-bht9w" Sep 30 12:44:32 crc kubenswrapper[4672]: I0930 12:44:32.003852 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-bht9w" Sep 30 12:44:32 crc kubenswrapper[4672]: I0930 12:44:32.165479 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Sep 30 12:44:32 crc kubenswrapper[4672]: I0930 12:44:32.666007 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-bht9w"] Sep 30 12:44:32 crc kubenswrapper[4672]: W0930 12:44:32.675326 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9ac7c321_e380_4baa_8233_0ec24fa6496f.slice/crio-e32c58b6acfd03c7f40436c17197d31522fb4eca36b877fc551916964a2130ea WatchSource:0}: Error finding container e32c58b6acfd03c7f40436c17197d31522fb4eca36b877fc551916964a2130ea: Status 404 returned error can't find the container with id e32c58b6acfd03c7f40436c17197d31522fb4eca36b877fc551916964a2130ea Sep 30 12:44:33 crc kubenswrapper[4672]: I0930 12:44:33.088476 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Sep 30 12:44:33 crc kubenswrapper[4672]: I0930 12:44:33.313288 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-bht9w" event={"ID":"9ac7c321-e380-4baa-8233-0ec24fa6496f","Type":"ContainerStarted","Data":"e32c58b6acfd03c7f40436c17197d31522fb4eca36b877fc551916964a2130ea"} Sep 30 12:44:42 crc kubenswrapper[4672]: I0930 12:44:42.411211 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-bht9w" event={"ID":"9ac7c321-e380-4baa-8233-0ec24fa6496f","Type":"ContainerStarted","Data":"e0c71a2f89a0eaf1b5227154dee38d17ceff8cca3c7629b73c3057d9d5cc7683"} Sep 30 12:44:42 crc kubenswrapper[4672]: I0930 12:44:42.438236 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-bht9w" podStartSLOduration=2.794846237 podStartE2EDuration="11.438217024s" podCreationTimestamp="2025-09-30 12:44:31 +0000 UTC" firstStartedPulling="2025-09-30 12:44:32.678984122 +0000 UTC m=+1363.948221768" lastFinishedPulling="2025-09-30 12:44:41.322354899 +0000 UTC m=+1372.591592555" observedRunningTime="2025-09-30 12:44:42.427410309 +0000 UTC m=+1373.696647975" watchObservedRunningTime="2025-09-30 12:44:42.438217024 +0000 UTC m=+1373.707454680" Sep 30 12:44:53 crc kubenswrapper[4672]: I0930 12:44:53.536734 4672 generic.go:334] "Generic (PLEG): container finished" podID="9ac7c321-e380-4baa-8233-0ec24fa6496f" containerID="e0c71a2f89a0eaf1b5227154dee38d17ceff8cca3c7629b73c3057d9d5cc7683" exitCode=0 Sep 30 12:44:53 crc kubenswrapper[4672]: I0930 12:44:53.537120 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-bht9w" event={"ID":"9ac7c321-e380-4baa-8233-0ec24fa6496f","Type":"ContainerDied","Data":"e0c71a2f89a0eaf1b5227154dee38d17ceff8cca3c7629b73c3057d9d5cc7683"} Sep 30 12:44:54 crc kubenswrapper[4672]: I0930 12:44:54.739851 4672 patch_prober.go:28] interesting pod/machine-config-daemon-dpqrd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 12:44:54 crc kubenswrapper[4672]: I0930 12:44:54.740225 4672 prober.go:107] "Probe failed" 
probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 12:44:55 crc kubenswrapper[4672]: I0930 12:44:55.023984 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-bht9w" Sep 30 12:44:55 crc kubenswrapper[4672]: I0930 12:44:55.072233 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9ac7c321-e380-4baa-8233-0ec24fa6496f-ssh-key\") pod \"9ac7c321-e380-4baa-8233-0ec24fa6496f\" (UID: \"9ac7c321-e380-4baa-8233-0ec24fa6496f\") " Sep 30 12:44:55 crc kubenswrapper[4672]: I0930 12:44:55.072435 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cv7tk\" (UniqueName: \"kubernetes.io/projected/9ac7c321-e380-4baa-8233-0ec24fa6496f-kube-api-access-cv7tk\") pod \"9ac7c321-e380-4baa-8233-0ec24fa6496f\" (UID: \"9ac7c321-e380-4baa-8233-0ec24fa6496f\") " Sep 30 12:44:55 crc kubenswrapper[4672]: I0930 12:44:55.072661 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ac7c321-e380-4baa-8233-0ec24fa6496f-repo-setup-combined-ca-bundle\") pod \"9ac7c321-e380-4baa-8233-0ec24fa6496f\" (UID: \"9ac7c321-e380-4baa-8233-0ec24fa6496f\") " Sep 30 12:44:55 crc kubenswrapper[4672]: I0930 12:44:55.073691 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9ac7c321-e380-4baa-8233-0ec24fa6496f-inventory\") pod \"9ac7c321-e380-4baa-8233-0ec24fa6496f\" (UID: \"9ac7c321-e380-4baa-8233-0ec24fa6496f\") " Sep 30 12:44:55 crc kubenswrapper[4672]: I0930 12:44:55.079296 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ac7c321-e380-4baa-8233-0ec24fa6496f-kube-api-access-cv7tk" (OuterVolumeSpecName: "kube-api-access-cv7tk") pod "9ac7c321-e380-4baa-8233-0ec24fa6496f" (UID: "9ac7c321-e380-4baa-8233-0ec24fa6496f"). InnerVolumeSpecName "kube-api-access-cv7tk". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 12:44:55 crc kubenswrapper[4672]: I0930 12:44:55.083578 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ac7c321-e380-4baa-8233-0ec24fa6496f-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "9ac7c321-e380-4baa-8233-0ec24fa6496f" (UID: "9ac7c321-e380-4baa-8233-0ec24fa6496f"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:44:55 crc kubenswrapper[4672]: I0930 12:44:55.104571 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ac7c321-e380-4baa-8233-0ec24fa6496f-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "9ac7c321-e380-4baa-8233-0ec24fa6496f" (UID: "9ac7c321-e380-4baa-8233-0ec24fa6496f"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:44:55 crc kubenswrapper[4672]: I0930 12:44:55.134709 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ac7c321-e380-4baa-8233-0ec24fa6496f-inventory" (OuterVolumeSpecName: "inventory") pod "9ac7c321-e380-4baa-8233-0ec24fa6496f" (UID: "9ac7c321-e380-4baa-8233-0ec24fa6496f"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:44:55 crc kubenswrapper[4672]: I0930 12:44:55.176554 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cv7tk\" (UniqueName: \"kubernetes.io/projected/9ac7c321-e380-4baa-8233-0ec24fa6496f-kube-api-access-cv7tk\") on node \"crc\" DevicePath \"\"" Sep 30 12:44:55 crc kubenswrapper[4672]: I0930 12:44:55.176596 4672 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ac7c321-e380-4baa-8233-0ec24fa6496f-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 12:44:55 crc kubenswrapper[4672]: I0930 12:44:55.176624 4672 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9ac7c321-e380-4baa-8233-0ec24fa6496f-inventory\") on node \"crc\" DevicePath \"\"" Sep 30 12:44:55 crc kubenswrapper[4672]: I0930 12:44:55.176640 4672 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9ac7c321-e380-4baa-8233-0ec24fa6496f-ssh-key\") on node \"crc\" DevicePath \"\"" Sep 30 12:44:55 crc kubenswrapper[4672]: I0930 12:44:55.556594 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-bht9w" event={"ID":"9ac7c321-e380-4baa-8233-0ec24fa6496f","Type":"ContainerDied","Data":"e32c58b6acfd03c7f40436c17197d31522fb4eca36b877fc551916964a2130ea"} Sep 30 12:44:55 crc kubenswrapper[4672]: I0930 12:44:55.556970 4672 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e32c58b6acfd03c7f40436c17197d31522fb4eca36b877fc551916964a2130ea" Sep 30 12:44:55 crc kubenswrapper[4672]: I0930 12:44:55.556658 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-bht9w" Sep 30 12:44:55 crc kubenswrapper[4672]: I0930 12:44:55.646577 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-kz9kk"] Sep 30 12:44:55 crc kubenswrapper[4672]: E0930 12:44:55.647029 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ac7c321-e380-4baa-8233-0ec24fa6496f" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Sep 30 12:44:55 crc kubenswrapper[4672]: I0930 12:44:55.647043 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ac7c321-e380-4baa-8233-0ec24fa6496f" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Sep 30 12:44:55 crc kubenswrapper[4672]: I0930 12:44:55.647291 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ac7c321-e380-4baa-8233-0ec24fa6496f" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Sep 30 12:44:55 crc kubenswrapper[4672]: I0930 12:44:55.647980 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-kz9kk" Sep 30 12:44:55 crc kubenswrapper[4672]: I0930 12:44:55.651431 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Sep 30 12:44:55 crc kubenswrapper[4672]: I0930 12:44:55.651583 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Sep 30 12:44:55 crc kubenswrapper[4672]: I0930 12:44:55.651737 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Sep 30 12:44:55 crc kubenswrapper[4672]: I0930 12:44:55.652122 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-b6dbr" Sep 30 12:44:55 crc kubenswrapper[4672]: I0930 12:44:55.673933 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-kz9kk"] Sep 30 12:44:55 crc kubenswrapper[4672]: I0930 12:44:55.689353 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c67e6191-fd96-4caf-a6fd-6d5a7013f069-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-kz9kk\" (UID: \"c67e6191-fd96-4caf-a6fd-6d5a7013f069\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-kz9kk" Sep 30 12:44:55 crc kubenswrapper[4672]: I0930 12:44:55.689405 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jl6gk\" (UniqueName: \"kubernetes.io/projected/c67e6191-fd96-4caf-a6fd-6d5a7013f069-kube-api-access-jl6gk\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-kz9kk\" (UID: \"c67e6191-fd96-4caf-a6fd-6d5a7013f069\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-kz9kk" Sep 30 12:44:55 crc kubenswrapper[4672]: I0930 12:44:55.689490 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c67e6191-fd96-4caf-a6fd-6d5a7013f069-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-kz9kk\" (UID: \"c67e6191-fd96-4caf-a6fd-6d5a7013f069\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-kz9kk" Sep 30 12:44:55 crc kubenswrapper[4672]: I0930 12:44:55.790622 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c67e6191-fd96-4caf-a6fd-6d5a7013f069-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-kz9kk\" (UID: \"c67e6191-fd96-4caf-a6fd-6d5a7013f069\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-kz9kk" Sep 30 12:44:55 crc kubenswrapper[4672]: I0930 12:44:55.790733 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c67e6191-fd96-4caf-a6fd-6d5a7013f069-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-kz9kk\" (UID: \"c67e6191-fd96-4caf-a6fd-6d5a7013f069\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-kz9kk" Sep 30 12:44:55 crc kubenswrapper[4672]: I0930 12:44:55.790789 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jl6gk\" (UniqueName: \"kubernetes.io/projected/c67e6191-fd96-4caf-a6fd-6d5a7013f069-kube-api-access-jl6gk\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-kz9kk\" (UID: \"c67e6191-fd96-4caf-a6fd-6d5a7013f069\") " 
pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-kz9kk" Sep 30 12:44:55 crc kubenswrapper[4672]: I0930 12:44:55.795006 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c67e6191-fd96-4caf-a6fd-6d5a7013f069-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-kz9kk\" (UID: \"c67e6191-fd96-4caf-a6fd-6d5a7013f069\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-kz9kk" Sep 30 12:44:55 crc kubenswrapper[4672]: I0930 12:44:55.804840 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c67e6191-fd96-4caf-a6fd-6d5a7013f069-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-kz9kk\" (UID: \"c67e6191-fd96-4caf-a6fd-6d5a7013f069\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-kz9kk" Sep 30 12:44:55 crc kubenswrapper[4672]: I0930 12:44:55.808951 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jl6gk\" (UniqueName: \"kubernetes.io/projected/c67e6191-fd96-4caf-a6fd-6d5a7013f069-kube-api-access-jl6gk\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-kz9kk\" (UID: \"c67e6191-fd96-4caf-a6fd-6d5a7013f069\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-kz9kk" Sep 30 12:44:55 crc kubenswrapper[4672]: I0930 12:44:55.978703 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-kz9kk" Sep 30 12:44:56 crc kubenswrapper[4672]: I0930 12:44:56.520841 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-kz9kk"] Sep 30 12:44:56 crc kubenswrapper[4672]: I0930 12:44:56.571368 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-kz9kk" event={"ID":"c67e6191-fd96-4caf-a6fd-6d5a7013f069","Type":"ContainerStarted","Data":"8f5fa8f6ead78d4da388ac172a58772bc1242e1c62f92fa5df768eccd103e11a"} Sep 30 12:44:57 crc kubenswrapper[4672]: I0930 12:44:57.581585 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-kz9kk" event={"ID":"c67e6191-fd96-4caf-a6fd-6d5a7013f069","Type":"ContainerStarted","Data":"289aa5eed8f9f9f0571c0f51aa5cd2376f86f535956f2b8934e73adb68499257"} Sep 30 12:44:58 crc kubenswrapper[4672]: I0930 12:44:58.235181 4672 scope.go:117] "RemoveContainer" containerID="ea25e628b0692f4a89ec401553909cda0a636e4beb6528d5202d826964a1948d" Sep 30 12:44:58 crc kubenswrapper[4672]: I0930 12:44:58.262498 4672 scope.go:117] "RemoveContainer" containerID="f0a4d1c0737f0f60f37572beb59a93b16e9d23c9eca3776125c61425326c05f1" Sep 30 12:44:58 crc kubenswrapper[4672]: I0930 12:44:58.315088 4672 scope.go:117] "RemoveContainer" containerID="48aa12cd375c302528a88918d0cb09ef65b3c49af123b8758dbbb074de3bae10" Sep 30 12:44:58 crc kubenswrapper[4672]: I0930 12:44:58.359159 4672 scope.go:117] "RemoveContainer" containerID="3a6ec9c5597a94e6f7c8b9a5c79734eaf39c98e33f98fbae6974f40305f31127" Sep 30 12:44:58 crc kubenswrapper[4672]: I0930 12:44:58.402484 4672 scope.go:117] "RemoveContainer" containerID="c23062b57a4250ac477935173cb6801b75a51ef6ea9a95a8732a9b350236a917" Sep 30 12:44:58 crc kubenswrapper[4672]: I0930 12:44:58.436691 4672 scope.go:117] "RemoveContainer" containerID="eae7ec0f02b75b0ebecb5e8cbca220342abcfd8ad46ef71cd2d79e1275c25be3" Sep 30 12:44:58 crc kubenswrapper[4672]: I0930 12:44:58.487147 4672 scope.go:117] "RemoveContainer" 
containerID="ee790b61045bb54e632c53cbc9891b79507a6f4accac1b1ff2d861ef334c97b5" Sep 30 12:45:00 crc kubenswrapper[4672]: I0930 12:45:00.133885 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-kz9kk" podStartSLOduration=4.591757276 podStartE2EDuration="5.1338626s" podCreationTimestamp="2025-09-30 12:44:55 +0000 UTC" firstStartedPulling="2025-09-30 12:44:56.538801745 +0000 UTC m=+1387.808039391" lastFinishedPulling="2025-09-30 12:44:57.080907069 +0000 UTC m=+1388.350144715" observedRunningTime="2025-09-30 12:44:57.610528336 +0000 UTC m=+1388.879765982" watchObservedRunningTime="2025-09-30 12:45:00.1338626 +0000 UTC m=+1391.403100246" Sep 30 12:45:00 crc kubenswrapper[4672]: I0930 12:45:00.141594 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320605-4j4hd"] Sep 30 12:45:00 crc kubenswrapper[4672]: I0930 12:45:00.143028 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320605-4j4hd" Sep 30 12:45:00 crc kubenswrapper[4672]: I0930 12:45:00.145380 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Sep 30 12:45:00 crc kubenswrapper[4672]: I0930 12:45:00.145517 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Sep 30 12:45:00 crc kubenswrapper[4672]: I0930 12:45:00.151740 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320605-4j4hd"] Sep 30 12:45:00 crc kubenswrapper[4672]: I0930 12:45:00.203020 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vbx8d\" (UniqueName: \"kubernetes.io/projected/854490c3-8e67-4668-a14f-8af3d1b0a8f5-kube-api-access-vbx8d\") pod \"collect-profiles-29320605-4j4hd\" (UID: \"854490c3-8e67-4668-a14f-8af3d1b0a8f5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320605-4j4hd" Sep 30 12:45:00 crc kubenswrapper[4672]: I0930 12:45:00.203106 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/854490c3-8e67-4668-a14f-8af3d1b0a8f5-secret-volume\") pod \"collect-profiles-29320605-4j4hd\" (UID: \"854490c3-8e67-4668-a14f-8af3d1b0a8f5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320605-4j4hd" Sep 30 12:45:00 crc kubenswrapper[4672]: I0930 12:45:00.203208 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/854490c3-8e67-4668-a14f-8af3d1b0a8f5-config-volume\") pod \"collect-profiles-29320605-4j4hd\" (UID: \"854490c3-8e67-4668-a14f-8af3d1b0a8f5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320605-4j4hd" Sep 30 12:45:00 crc kubenswrapper[4672]: I0930 12:45:00.305468 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vbx8d\" (UniqueName: \"kubernetes.io/projected/854490c3-8e67-4668-a14f-8af3d1b0a8f5-kube-api-access-vbx8d\") pod \"collect-profiles-29320605-4j4hd\" (UID: \"854490c3-8e67-4668-a14f-8af3d1b0a8f5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320605-4j4hd" Sep 30 12:45:00 crc kubenswrapper[4672]: I0930 12:45:00.305536 
4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/854490c3-8e67-4668-a14f-8af3d1b0a8f5-secret-volume\") pod \"collect-profiles-29320605-4j4hd\" (UID: \"854490c3-8e67-4668-a14f-8af3d1b0a8f5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320605-4j4hd" Sep 30 12:45:00 crc kubenswrapper[4672]: I0930 12:45:00.305641 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/854490c3-8e67-4668-a14f-8af3d1b0a8f5-config-volume\") pod \"collect-profiles-29320605-4j4hd\" (UID: \"854490c3-8e67-4668-a14f-8af3d1b0a8f5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320605-4j4hd" Sep 30 12:45:00 crc kubenswrapper[4672]: I0930 12:45:00.306941 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/854490c3-8e67-4668-a14f-8af3d1b0a8f5-config-volume\") pod \"collect-profiles-29320605-4j4hd\" (UID: \"854490c3-8e67-4668-a14f-8af3d1b0a8f5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320605-4j4hd" Sep 30 12:45:00 crc kubenswrapper[4672]: I0930 12:45:00.313873 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/854490c3-8e67-4668-a14f-8af3d1b0a8f5-secret-volume\") pod \"collect-profiles-29320605-4j4hd\" (UID: \"854490c3-8e67-4668-a14f-8af3d1b0a8f5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320605-4j4hd" Sep 30 12:45:00 crc kubenswrapper[4672]: I0930 12:45:00.323289 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vbx8d\" (UniqueName: \"kubernetes.io/projected/854490c3-8e67-4668-a14f-8af3d1b0a8f5-kube-api-access-vbx8d\") pod \"collect-profiles-29320605-4j4hd\" (UID: \"854490c3-8e67-4668-a14f-8af3d1b0a8f5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320605-4j4hd" Sep 30 12:45:00 crc kubenswrapper[4672]: I0930 12:45:00.479361 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320605-4j4hd" Sep 30 12:45:00 crc kubenswrapper[4672]: I0930 12:45:00.630303 4672 generic.go:334] "Generic (PLEG): container finished" podID="c67e6191-fd96-4caf-a6fd-6d5a7013f069" containerID="289aa5eed8f9f9f0571c0f51aa5cd2376f86f535956f2b8934e73adb68499257" exitCode=0 Sep 30 12:45:00 crc kubenswrapper[4672]: I0930 12:45:00.630689 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-kz9kk" event={"ID":"c67e6191-fd96-4caf-a6fd-6d5a7013f069","Type":"ContainerDied","Data":"289aa5eed8f9f9f0571c0f51aa5cd2376f86f535956f2b8934e73adb68499257"} Sep 30 12:45:00 crc kubenswrapper[4672]: I0930 12:45:00.941940 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320605-4j4hd"] Sep 30 12:45:00 crc kubenswrapper[4672]: W0930 12:45:00.944136 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod854490c3_8e67_4668_a14f_8af3d1b0a8f5.slice/crio-578b41a094871d53714d59a00b5f1a1cc4f165e03a64b9a7a9689750e745c093 WatchSource:0}: Error finding container 578b41a094871d53714d59a00b5f1a1cc4f165e03a64b9a7a9689750e745c093: Status 404 returned error can't find the container with id 578b41a094871d53714d59a00b5f1a1cc4f165e03a64b9a7a9689750e745c093 Sep 30 12:45:01 crc kubenswrapper[4672]: I0930 12:45:01.641360 4672 generic.go:334] "Generic (PLEG): container finished" podID="854490c3-8e67-4668-a14f-8af3d1b0a8f5" containerID="bb76c886b5e0e0e89d1deed8a5bcbbdcf270e0c6c575c973f3aa57489e481abc" exitCode=0 Sep 30 12:45:01 crc kubenswrapper[4672]: I0930 12:45:01.641504 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320605-4j4hd" event={"ID":"854490c3-8e67-4668-a14f-8af3d1b0a8f5","Type":"ContainerDied","Data":"bb76c886b5e0e0e89d1deed8a5bcbbdcf270e0c6c575c973f3aa57489e481abc"} Sep 30 12:45:01 crc kubenswrapper[4672]: I0930 12:45:01.641730 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320605-4j4hd" event={"ID":"854490c3-8e67-4668-a14f-8af3d1b0a8f5","Type":"ContainerStarted","Data":"578b41a094871d53714d59a00b5f1a1cc4f165e03a64b9a7a9689750e745c093"} Sep 30 12:45:02 crc kubenswrapper[4672]: I0930 12:45:02.142272 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-kz9kk" Sep 30 12:45:02 crc kubenswrapper[4672]: I0930 12:45:02.248046 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c67e6191-fd96-4caf-a6fd-6d5a7013f069-ssh-key\") pod \"c67e6191-fd96-4caf-a6fd-6d5a7013f069\" (UID: \"c67e6191-fd96-4caf-a6fd-6d5a7013f069\") " Sep 30 12:45:02 crc kubenswrapper[4672]: I0930 12:45:02.248151 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jl6gk\" (UniqueName: \"kubernetes.io/projected/c67e6191-fd96-4caf-a6fd-6d5a7013f069-kube-api-access-jl6gk\") pod \"c67e6191-fd96-4caf-a6fd-6d5a7013f069\" (UID: \"c67e6191-fd96-4caf-a6fd-6d5a7013f069\") " Sep 30 12:45:02 crc kubenswrapper[4672]: I0930 12:45:02.248256 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c67e6191-fd96-4caf-a6fd-6d5a7013f069-inventory\") pod \"c67e6191-fd96-4caf-a6fd-6d5a7013f069\" (UID: \"c67e6191-fd96-4caf-a6fd-6d5a7013f069\") " Sep 30 12:45:02 crc kubenswrapper[4672]: I0930 12:45:02.253836 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c67e6191-fd96-4caf-a6fd-6d5a7013f069-kube-api-access-jl6gk" (OuterVolumeSpecName: "kube-api-access-jl6gk") pod "c67e6191-fd96-4caf-a6fd-6d5a7013f069" (UID: "c67e6191-fd96-4caf-a6fd-6d5a7013f069"). InnerVolumeSpecName "kube-api-access-jl6gk". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 12:45:02 crc kubenswrapper[4672]: I0930 12:45:02.277434 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c67e6191-fd96-4caf-a6fd-6d5a7013f069-inventory" (OuterVolumeSpecName: "inventory") pod "c67e6191-fd96-4caf-a6fd-6d5a7013f069" (UID: "c67e6191-fd96-4caf-a6fd-6d5a7013f069"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:45:02 crc kubenswrapper[4672]: I0930 12:45:02.291381 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c67e6191-fd96-4caf-a6fd-6d5a7013f069-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "c67e6191-fd96-4caf-a6fd-6d5a7013f069" (UID: "c67e6191-fd96-4caf-a6fd-6d5a7013f069"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:45:02 crc kubenswrapper[4672]: I0930 12:45:02.350370 4672 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c67e6191-fd96-4caf-a6fd-6d5a7013f069-ssh-key\") on node \"crc\" DevicePath \"\"" Sep 30 12:45:02 crc kubenswrapper[4672]: I0930 12:45:02.350405 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jl6gk\" (UniqueName: \"kubernetes.io/projected/c67e6191-fd96-4caf-a6fd-6d5a7013f069-kube-api-access-jl6gk\") on node \"crc\" DevicePath \"\"" Sep 30 12:45:02 crc kubenswrapper[4672]: I0930 12:45:02.350419 4672 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c67e6191-fd96-4caf-a6fd-6d5a7013f069-inventory\") on node \"crc\" DevicePath \"\"" Sep 30 12:45:02 crc kubenswrapper[4672]: I0930 12:45:02.651702 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-kz9kk" event={"ID":"c67e6191-fd96-4caf-a6fd-6d5a7013f069","Type":"ContainerDied","Data":"8f5fa8f6ead78d4da388ac172a58772bc1242e1c62f92fa5df768eccd103e11a"} Sep 30 12:45:02 crc kubenswrapper[4672]: I0930 12:45:02.651757 4672 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8f5fa8f6ead78d4da388ac172a58772bc1242e1c62f92fa5df768eccd103e11a" Sep 30 12:45:02 crc kubenswrapper[4672]: I0930 12:45:02.651724 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-kz9kk" Sep 30 12:45:02 crc kubenswrapper[4672]: I0930 12:45:02.734439 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-cksj8"] Sep 30 12:45:02 crc kubenswrapper[4672]: E0930 12:45:02.735147 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c67e6191-fd96-4caf-a6fd-6d5a7013f069" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Sep 30 12:45:02 crc kubenswrapper[4672]: I0930 12:45:02.735164 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="c67e6191-fd96-4caf-a6fd-6d5a7013f069" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Sep 30 12:45:02 crc kubenswrapper[4672]: I0930 12:45:02.735357 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="c67e6191-fd96-4caf-a6fd-6d5a7013f069" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Sep 30 12:45:02 crc kubenswrapper[4672]: I0930 12:45:02.738514 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-cksj8" Sep 30 12:45:02 crc kubenswrapper[4672]: I0930 12:45:02.742935 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Sep 30 12:45:02 crc kubenswrapper[4672]: I0930 12:45:02.743329 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Sep 30 12:45:02 crc kubenswrapper[4672]: I0930 12:45:02.743553 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Sep 30 12:45:02 crc kubenswrapper[4672]: I0930 12:45:02.743748 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-b6dbr" Sep 30 12:45:02 crc kubenswrapper[4672]: I0930 12:45:02.747309 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-cksj8"] Sep 30 12:45:02 crc kubenswrapper[4672]: I0930 12:45:02.757533 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/91de1b76-2b84-4d21-9683-d7aee98fb876-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-cksj8\" (UID: \"91de1b76-2b84-4d21-9683-d7aee98fb876\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-cksj8" Sep 30 12:45:02 crc kubenswrapper[4672]: I0930 12:45:02.757672 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/91de1b76-2b84-4d21-9683-d7aee98fb876-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-cksj8\" (UID: \"91de1b76-2b84-4d21-9683-d7aee98fb876\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-cksj8" Sep 30 12:45:02 crc kubenswrapper[4672]: I0930 12:45:02.757757 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8vpt5\" (UniqueName: \"kubernetes.io/projected/91de1b76-2b84-4d21-9683-d7aee98fb876-kube-api-access-8vpt5\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-cksj8\" (UID: \"91de1b76-2b84-4d21-9683-d7aee98fb876\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-cksj8" Sep 30 12:45:02 crc kubenswrapper[4672]: I0930 12:45:02.757953 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91de1b76-2b84-4d21-9683-d7aee98fb876-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-cksj8\" (UID: \"91de1b76-2b84-4d21-9683-d7aee98fb876\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-cksj8" Sep 30 12:45:02 crc kubenswrapper[4672]: I0930 12:45:02.860092 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/91de1b76-2b84-4d21-9683-d7aee98fb876-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-cksj8\" (UID: \"91de1b76-2b84-4d21-9683-d7aee98fb876\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-cksj8" Sep 30 12:45:02 crc kubenswrapper[4672]: I0930 12:45:02.860178 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/91de1b76-2b84-4d21-9683-d7aee98fb876-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-cksj8\" (UID: 
\"91de1b76-2b84-4d21-9683-d7aee98fb876\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-cksj8" Sep 30 12:45:02 crc kubenswrapper[4672]: I0930 12:45:02.860228 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8vpt5\" (UniqueName: \"kubernetes.io/projected/91de1b76-2b84-4d21-9683-d7aee98fb876-kube-api-access-8vpt5\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-cksj8\" (UID: \"91de1b76-2b84-4d21-9683-d7aee98fb876\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-cksj8" Sep 30 12:45:02 crc kubenswrapper[4672]: I0930 12:45:02.860309 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91de1b76-2b84-4d21-9683-d7aee98fb876-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-cksj8\" (UID: \"91de1b76-2b84-4d21-9683-d7aee98fb876\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-cksj8" Sep 30 12:45:02 crc kubenswrapper[4672]: I0930 12:45:02.866691 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91de1b76-2b84-4d21-9683-d7aee98fb876-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-cksj8\" (UID: \"91de1b76-2b84-4d21-9683-d7aee98fb876\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-cksj8" Sep 30 12:45:02 crc kubenswrapper[4672]: I0930 12:45:02.866704 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/91de1b76-2b84-4d21-9683-d7aee98fb876-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-cksj8\" (UID: \"91de1b76-2b84-4d21-9683-d7aee98fb876\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-cksj8" Sep 30 12:45:02 crc kubenswrapper[4672]: I0930 12:45:02.882833 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/91de1b76-2b84-4d21-9683-d7aee98fb876-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-cksj8\" (UID: \"91de1b76-2b84-4d21-9683-d7aee98fb876\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-cksj8" Sep 30 12:45:02 crc kubenswrapper[4672]: I0930 12:45:02.885979 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8vpt5\" (UniqueName: \"kubernetes.io/projected/91de1b76-2b84-4d21-9683-d7aee98fb876-kube-api-access-8vpt5\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-cksj8\" (UID: \"91de1b76-2b84-4d21-9683-d7aee98fb876\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-cksj8" Sep 30 12:45:02 crc kubenswrapper[4672]: I0930 12:45:02.956300 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320605-4j4hd"
Sep 30 12:45:03 crc kubenswrapper[4672]: I0930 12:45:03.063730 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vbx8d\" (UniqueName: \"kubernetes.io/projected/854490c3-8e67-4668-a14f-8af3d1b0a8f5-kube-api-access-vbx8d\") pod \"854490c3-8e67-4668-a14f-8af3d1b0a8f5\" (UID: \"854490c3-8e67-4668-a14f-8af3d1b0a8f5\") "
Sep 30 12:45:03 crc kubenswrapper[4672]: I0930 12:45:03.063869 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/854490c3-8e67-4668-a14f-8af3d1b0a8f5-config-volume\") pod \"854490c3-8e67-4668-a14f-8af3d1b0a8f5\" (UID: \"854490c3-8e67-4668-a14f-8af3d1b0a8f5\") "
Sep 30 12:45:03 crc kubenswrapper[4672]: I0930 12:45:03.063943 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/854490c3-8e67-4668-a14f-8af3d1b0a8f5-secret-volume\") pod \"854490c3-8e67-4668-a14f-8af3d1b0a8f5\" (UID: \"854490c3-8e67-4668-a14f-8af3d1b0a8f5\") "
Sep 30 12:45:03 crc kubenswrapper[4672]: I0930 12:45:03.064614 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/854490c3-8e67-4668-a14f-8af3d1b0a8f5-config-volume" (OuterVolumeSpecName: "config-volume") pod "854490c3-8e67-4668-a14f-8af3d1b0a8f5" (UID: "854490c3-8e67-4668-a14f-8af3d1b0a8f5"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 12:45:03 crc kubenswrapper[4672]: I0930 12:45:03.067305 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/854490c3-8e67-4668-a14f-8af3d1b0a8f5-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "854490c3-8e67-4668-a14f-8af3d1b0a8f5" (UID: "854490c3-8e67-4668-a14f-8af3d1b0a8f5"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 12:45:03 crc kubenswrapper[4672]: I0930 12:45:03.067365 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/854490c3-8e67-4668-a14f-8af3d1b0a8f5-kube-api-access-vbx8d" (OuterVolumeSpecName: "kube-api-access-vbx8d") pod "854490c3-8e67-4668-a14f-8af3d1b0a8f5" (UID: "854490c3-8e67-4668-a14f-8af3d1b0a8f5"). InnerVolumeSpecName "kube-api-access-vbx8d". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 12:45:03 crc kubenswrapper[4672]: I0930 12:45:03.069844 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-cksj8"
Sep 30 12:45:03 crc kubenswrapper[4672]: I0930 12:45:03.174810 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vbx8d\" (UniqueName: \"kubernetes.io/projected/854490c3-8e67-4668-a14f-8af3d1b0a8f5-kube-api-access-vbx8d\") on node \"crc\" DevicePath \"\""
Sep 30 12:45:03 crc kubenswrapper[4672]: I0930 12:45:03.175173 4672 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/854490c3-8e67-4668-a14f-8af3d1b0a8f5-config-volume\") on node \"crc\" DevicePath \"\""
Sep 30 12:45:03 crc kubenswrapper[4672]: I0930 12:45:03.175188 4672 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/854490c3-8e67-4668-a14f-8af3d1b0a8f5-secret-volume\") on node \"crc\" DevicePath \"\""
Sep 30 12:45:03 crc kubenswrapper[4672]: I0930 12:45:03.577692 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-cksj8"]
Sep 30 12:45:03 crc kubenswrapper[4672]: W0930 12:45:03.581415 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod91de1b76_2b84_4d21_9683_d7aee98fb876.slice/crio-7d29f5825fd5ac6d5168ffca2383e3630ab378f911b2d144c517619e41c1fed8 WatchSource:0}: Error finding container 7d29f5825fd5ac6d5168ffca2383e3630ab378f911b2d144c517619e41c1fed8: Status 404 returned error can't find the container with id 7d29f5825fd5ac6d5168ffca2383e3630ab378f911b2d144c517619e41c1fed8
Sep 30 12:45:03 crc kubenswrapper[4672]: I0930 12:45:03.661219 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-cksj8" event={"ID":"91de1b76-2b84-4d21-9683-d7aee98fb876","Type":"ContainerStarted","Data":"7d29f5825fd5ac6d5168ffca2383e3630ab378f911b2d144c517619e41c1fed8"}
Sep 30 12:45:03 crc kubenswrapper[4672]: I0930 12:45:03.662708 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320605-4j4hd" event={"ID":"854490c3-8e67-4668-a14f-8af3d1b0a8f5","Type":"ContainerDied","Data":"578b41a094871d53714d59a00b5f1a1cc4f165e03a64b9a7a9689750e745c093"}
Sep 30 12:45:03 crc kubenswrapper[4672]: I0930 12:45:03.662733 4672 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="578b41a094871d53714d59a00b5f1a1cc4f165e03a64b9a7a9689750e745c093"
Sep 30 12:45:03 crc kubenswrapper[4672]: I0930 12:45:03.662778 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320605-4j4hd"
Sep 30 12:45:04 crc kubenswrapper[4672]: I0930 12:45:04.674654 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-cksj8" event={"ID":"91de1b76-2b84-4d21-9683-d7aee98fb876","Type":"ContainerStarted","Data":"03d5fc4b8e4688e768fb7b8785d883df0c4b533b6dc8fd37542cbe6c9c82789f"}
Sep 30 12:45:04 crc kubenswrapper[4672]: I0930 12:45:04.700675 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-cksj8" podStartSLOduration=2.240572175 podStartE2EDuration="2.700657181s" podCreationTimestamp="2025-09-30 12:45:02 +0000 UTC" firstStartedPulling="2025-09-30 12:45:03.58375295 +0000 UTC m=+1394.852990596" lastFinishedPulling="2025-09-30 12:45:04.043837956 +0000 UTC m=+1395.313075602" observedRunningTime="2025-09-30 12:45:04.697885201 +0000 UTC m=+1395.967122857" watchObservedRunningTime="2025-09-30 12:45:04.700657181 +0000 UTC m=+1395.969894837"
Sep 30 12:45:24 crc kubenswrapper[4672]: I0930 12:45:24.739750 4672 patch_prober.go:28] interesting pod/machine-config-daemon-dpqrd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Sep 30 12:45:24 crc kubenswrapper[4672]: I0930 12:45:24.740460 4672 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Sep 30 12:45:26 crc kubenswrapper[4672]: I0930 12:45:26.505821 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-5p7gl"]
Sep 30 12:45:26 crc kubenswrapper[4672]: E0930 12:45:26.507000 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="854490c3-8e67-4668-a14f-8af3d1b0a8f5" containerName="collect-profiles"
Sep 30 12:45:26 crc kubenswrapper[4672]: I0930 12:45:26.507021 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="854490c3-8e67-4668-a14f-8af3d1b0a8f5" containerName="collect-profiles"
Sep 30 12:45:26 crc kubenswrapper[4672]: I0930 12:45:26.507445 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="854490c3-8e67-4668-a14f-8af3d1b0a8f5" containerName="collect-profiles"
Sep 30 12:45:26 crc kubenswrapper[4672]: I0930 12:45:26.509753 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5p7gl"
Sep 30 12:45:26 crc kubenswrapper[4672]: I0930 12:45:26.516967 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5p7gl"]
Sep 30 12:45:26 crc kubenswrapper[4672]: I0930 12:45:26.652241 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x5lp9\" (UniqueName: \"kubernetes.io/projected/52c5051e-4029-463d-82ef-62faf0ae3ec9-kube-api-access-x5lp9\") pod \"redhat-operators-5p7gl\" (UID: \"52c5051e-4029-463d-82ef-62faf0ae3ec9\") " pod="openshift-marketplace/redhat-operators-5p7gl"
Sep 30 12:45:26 crc kubenswrapper[4672]: I0930 12:45:26.652724 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/52c5051e-4029-463d-82ef-62faf0ae3ec9-utilities\") pod \"redhat-operators-5p7gl\" (UID: \"52c5051e-4029-463d-82ef-62faf0ae3ec9\") " pod="openshift-marketplace/redhat-operators-5p7gl"
Sep 30 12:45:26 crc kubenswrapper[4672]: I0930 12:45:26.652781 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/52c5051e-4029-463d-82ef-62faf0ae3ec9-catalog-content\") pod \"redhat-operators-5p7gl\" (UID: \"52c5051e-4029-463d-82ef-62faf0ae3ec9\") " pod="openshift-marketplace/redhat-operators-5p7gl"
Sep 30 12:45:26 crc kubenswrapper[4672]: I0930 12:45:26.754606 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/52c5051e-4029-463d-82ef-62faf0ae3ec9-utilities\") pod \"redhat-operators-5p7gl\" (UID: \"52c5051e-4029-463d-82ef-62faf0ae3ec9\") " pod="openshift-marketplace/redhat-operators-5p7gl"
Sep 30 12:45:26 crc kubenswrapper[4672]: I0930 12:45:26.754679 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/52c5051e-4029-463d-82ef-62faf0ae3ec9-catalog-content\") pod \"redhat-operators-5p7gl\" (UID: \"52c5051e-4029-463d-82ef-62faf0ae3ec9\") " pod="openshift-marketplace/redhat-operators-5p7gl"
Sep 30 12:45:26 crc kubenswrapper[4672]: I0930 12:45:26.754826 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x5lp9\" (UniqueName: \"kubernetes.io/projected/52c5051e-4029-463d-82ef-62faf0ae3ec9-kube-api-access-x5lp9\") pod \"redhat-operators-5p7gl\" (UID: \"52c5051e-4029-463d-82ef-62faf0ae3ec9\") " pod="openshift-marketplace/redhat-operators-5p7gl"
Sep 30 12:45:26 crc kubenswrapper[4672]: I0930 12:45:26.755213 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/52c5051e-4029-463d-82ef-62faf0ae3ec9-utilities\") pod \"redhat-operators-5p7gl\" (UID: \"52c5051e-4029-463d-82ef-62faf0ae3ec9\") " pod="openshift-marketplace/redhat-operators-5p7gl"
Sep 30 12:45:26 crc kubenswrapper[4672]: I0930 12:45:26.755232 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/52c5051e-4029-463d-82ef-62faf0ae3ec9-catalog-content\") pod \"redhat-operators-5p7gl\" (UID: \"52c5051e-4029-463d-82ef-62faf0ae3ec9\") " pod="openshift-marketplace/redhat-operators-5p7gl"
Sep 30 12:45:26 crc kubenswrapper[4672]: I0930 12:45:26.776142 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x5lp9\" (UniqueName: \"kubernetes.io/projected/52c5051e-4029-463d-82ef-62faf0ae3ec9-kube-api-access-x5lp9\") pod \"redhat-operators-5p7gl\" (UID: \"52c5051e-4029-463d-82ef-62faf0ae3ec9\") " pod="openshift-marketplace/redhat-operators-5p7gl"
Sep 30 12:45:26 crc kubenswrapper[4672]: I0930 12:45:26.838679 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5p7gl"
Sep 30 12:45:27 crc kubenswrapper[4672]: I0930 12:45:27.321998 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5p7gl"]
Sep 30 12:45:27 crc kubenswrapper[4672]: I0930 12:45:27.915982 4672 generic.go:334] "Generic (PLEG): container finished" podID="52c5051e-4029-463d-82ef-62faf0ae3ec9" containerID="380cb43321d5d09180983e532298bf80f7d96792720e1df8de95030147127adf" exitCode=0
Sep 30 12:45:27 crc kubenswrapper[4672]: I0930 12:45:27.916095 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5p7gl" event={"ID":"52c5051e-4029-463d-82ef-62faf0ae3ec9","Type":"ContainerDied","Data":"380cb43321d5d09180983e532298bf80f7d96792720e1df8de95030147127adf"}
Sep 30 12:45:27 crc kubenswrapper[4672]: I0930 12:45:27.916448 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5p7gl" event={"ID":"52c5051e-4029-463d-82ef-62faf0ae3ec9","Type":"ContainerStarted","Data":"f247abe89d90b837c30d0c1b14df13e5a9670a47ce1fd2eb5033ed3c9354447e"}
Sep 30 12:45:28 crc kubenswrapper[4672]: I0930 12:45:28.931766 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5p7gl" event={"ID":"52c5051e-4029-463d-82ef-62faf0ae3ec9","Type":"ContainerStarted","Data":"f40a150f510ac33f6fa0a71122a370bce1279ecea0b0267b39750ceb0c3e02b5"}
Sep 30 12:45:30 crc kubenswrapper[4672]: I0930 12:45:30.953539 4672 generic.go:334] "Generic (PLEG): container finished" podID="52c5051e-4029-463d-82ef-62faf0ae3ec9" containerID="f40a150f510ac33f6fa0a71122a370bce1279ecea0b0267b39750ceb0c3e02b5" exitCode=0
Sep 30 12:45:30 crc kubenswrapper[4672]: I0930 12:45:30.953605 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5p7gl" event={"ID":"52c5051e-4029-463d-82ef-62faf0ae3ec9","Type":"ContainerDied","Data":"f40a150f510ac33f6fa0a71122a370bce1279ecea0b0267b39750ceb0c3e02b5"}
Sep 30 12:45:31 crc kubenswrapper[4672]: I0930 12:45:31.965292 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5p7gl" event={"ID":"52c5051e-4029-463d-82ef-62faf0ae3ec9","Type":"ContainerStarted","Data":"228261eeaf7c17287b16b538bdc367d60220cf2f411c4ebd07b34efe6e2d5cd7"}
Sep 30 12:45:31 crc kubenswrapper[4672]: I0930 12:45:31.995028 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-5p7gl" podStartSLOduration=2.377918045 podStartE2EDuration="5.995006042s" podCreationTimestamp="2025-09-30 12:45:26 +0000 UTC" firstStartedPulling="2025-09-30 12:45:27.917675155 +0000 UTC m=+1419.186912811" lastFinishedPulling="2025-09-30 12:45:31.534763162 +0000 UTC m=+1422.804000808" observedRunningTime="2025-09-30 12:45:31.985088269 +0000 UTC m=+1423.254325935" watchObservedRunningTime="2025-09-30 12:45:31.995006042 +0000 UTC m=+1423.264243688"
Sep 30 12:45:36 crc kubenswrapper[4672]: I0930 12:45:36.838796 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-5p7gl"
Sep 30 12:45:36 crc kubenswrapper[4672]: I0930 12:45:36.839500 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-5p7gl"
Sep 30 12:45:36 crc kubenswrapper[4672]: I0930 12:45:36.900079 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-5p7gl"
Sep 30 12:45:37 crc kubenswrapper[4672]: I0930 12:45:37.060129 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-5p7gl"
Sep 30 12:45:37 crc kubenswrapper[4672]: I0930 12:45:37.133891 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5p7gl"]
Sep 30 12:45:39 crc kubenswrapper[4672]: I0930 12:45:39.032428 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-5p7gl" podUID="52c5051e-4029-463d-82ef-62faf0ae3ec9" containerName="registry-server" containerID="cri-o://228261eeaf7c17287b16b538bdc367d60220cf2f411c4ebd07b34efe6e2d5cd7" gracePeriod=2
Sep 30 12:45:39 crc kubenswrapper[4672]: I0930 12:45:39.541251 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5p7gl"
Sep 30 12:45:39 crc kubenswrapper[4672]: I0930 12:45:39.746428 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/52c5051e-4029-463d-82ef-62faf0ae3ec9-catalog-content\") pod \"52c5051e-4029-463d-82ef-62faf0ae3ec9\" (UID: \"52c5051e-4029-463d-82ef-62faf0ae3ec9\") "
Sep 30 12:45:39 crc kubenswrapper[4672]: I0930 12:45:39.746785 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x5lp9\" (UniqueName: \"kubernetes.io/projected/52c5051e-4029-463d-82ef-62faf0ae3ec9-kube-api-access-x5lp9\") pod \"52c5051e-4029-463d-82ef-62faf0ae3ec9\" (UID: \"52c5051e-4029-463d-82ef-62faf0ae3ec9\") "
Sep 30 12:45:39 crc kubenswrapper[4672]: I0930 12:45:39.746925 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/52c5051e-4029-463d-82ef-62faf0ae3ec9-utilities\") pod \"52c5051e-4029-463d-82ef-62faf0ae3ec9\" (UID: \"52c5051e-4029-463d-82ef-62faf0ae3ec9\") "
Sep 30 12:45:39 crc kubenswrapper[4672]: I0930 12:45:39.747831 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/52c5051e-4029-463d-82ef-62faf0ae3ec9-utilities" (OuterVolumeSpecName: "utilities") pod "52c5051e-4029-463d-82ef-62faf0ae3ec9" (UID: "52c5051e-4029-463d-82ef-62faf0ae3ec9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 12:45:39 crc kubenswrapper[4672]: I0930 12:45:39.753415 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/52c5051e-4029-463d-82ef-62faf0ae3ec9-kube-api-access-x5lp9" (OuterVolumeSpecName: "kube-api-access-x5lp9") pod "52c5051e-4029-463d-82ef-62faf0ae3ec9" (UID: "52c5051e-4029-463d-82ef-62faf0ae3ec9"). InnerVolumeSpecName "kube-api-access-x5lp9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 12:45:39 crc kubenswrapper[4672]: I0930 12:45:39.832026 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/52c5051e-4029-463d-82ef-62faf0ae3ec9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "52c5051e-4029-463d-82ef-62faf0ae3ec9" (UID: "52c5051e-4029-463d-82ef-62faf0ae3ec9"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 12:45:39 crc kubenswrapper[4672]: I0930 12:45:39.849022 4672 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/52c5051e-4029-463d-82ef-62faf0ae3ec9-utilities\") on node \"crc\" DevicePath \"\""
Sep 30 12:45:39 crc kubenswrapper[4672]: I0930 12:45:39.849232 4672 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/52c5051e-4029-463d-82ef-62faf0ae3ec9-catalog-content\") on node \"crc\" DevicePath \"\""
Sep 30 12:45:39 crc kubenswrapper[4672]: I0930 12:45:39.849336 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x5lp9\" (UniqueName: \"kubernetes.io/projected/52c5051e-4029-463d-82ef-62faf0ae3ec9-kube-api-access-x5lp9\") on node \"crc\" DevicePath \"\""
Sep 30 12:45:40 crc kubenswrapper[4672]: I0930 12:45:40.043646 4672 generic.go:334] "Generic (PLEG): container finished" podID="52c5051e-4029-463d-82ef-62faf0ae3ec9" containerID="228261eeaf7c17287b16b538bdc367d60220cf2f411c4ebd07b34efe6e2d5cd7" exitCode=0
Sep 30 12:45:40 crc kubenswrapper[4672]: I0930 12:45:40.043694 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5p7gl" event={"ID":"52c5051e-4029-463d-82ef-62faf0ae3ec9","Type":"ContainerDied","Data":"228261eeaf7c17287b16b538bdc367d60220cf2f411c4ebd07b34efe6e2d5cd7"}
Sep 30 12:45:40 crc kubenswrapper[4672]: I0930 12:45:40.043722 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5p7gl" event={"ID":"52c5051e-4029-463d-82ef-62faf0ae3ec9","Type":"ContainerDied","Data":"f247abe89d90b837c30d0c1b14df13e5a9670a47ce1fd2eb5033ed3c9354447e"}
Sep 30 12:45:40 crc kubenswrapper[4672]: I0930 12:45:40.043721 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5p7gl"
Sep 30 12:45:40 crc kubenswrapper[4672]: I0930 12:45:40.043737 4672 scope.go:117] "RemoveContainer" containerID="228261eeaf7c17287b16b538bdc367d60220cf2f411c4ebd07b34efe6e2d5cd7"
Sep 30 12:45:40 crc kubenswrapper[4672]: I0930 12:45:40.083774 4672 scope.go:117] "RemoveContainer" containerID="f40a150f510ac33f6fa0a71122a370bce1279ecea0b0267b39750ceb0c3e02b5"
Sep 30 12:45:40 crc kubenswrapper[4672]: I0930 12:45:40.089809 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5p7gl"]
Sep 30 12:45:40 crc kubenswrapper[4672]: I0930 12:45:40.116964 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-5p7gl"]
Sep 30 12:45:40 crc kubenswrapper[4672]: I0930 12:45:40.118417 4672 scope.go:117] "RemoveContainer" containerID="380cb43321d5d09180983e532298bf80f7d96792720e1df8de95030147127adf"
Sep 30 12:45:40 crc kubenswrapper[4672]: I0930 12:45:40.160936 4672 scope.go:117] "RemoveContainer" containerID="228261eeaf7c17287b16b538bdc367d60220cf2f411c4ebd07b34efe6e2d5cd7"
Sep 30 12:45:40 crc kubenswrapper[4672]: E0930 12:45:40.161412 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"228261eeaf7c17287b16b538bdc367d60220cf2f411c4ebd07b34efe6e2d5cd7\": container with ID starting with 228261eeaf7c17287b16b538bdc367d60220cf2f411c4ebd07b34efe6e2d5cd7 not found: ID does not exist" containerID="228261eeaf7c17287b16b538bdc367d60220cf2f411c4ebd07b34efe6e2d5cd7"
Sep 30 12:45:40 crc kubenswrapper[4672]: I0930 12:45:40.161468 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"228261eeaf7c17287b16b538bdc367d60220cf2f411c4ebd07b34efe6e2d5cd7"} err="failed to get container status \"228261eeaf7c17287b16b538bdc367d60220cf2f411c4ebd07b34efe6e2d5cd7\": rpc error: code = NotFound desc = could not find container \"228261eeaf7c17287b16b538bdc367d60220cf2f411c4ebd07b34efe6e2d5cd7\": container with ID starting with 228261eeaf7c17287b16b538bdc367d60220cf2f411c4ebd07b34efe6e2d5cd7 not found: ID does not exist"
Sep 30 12:45:40 crc kubenswrapper[4672]: I0930 12:45:40.161500 4672 scope.go:117] "RemoveContainer" containerID="f40a150f510ac33f6fa0a71122a370bce1279ecea0b0267b39750ceb0c3e02b5"
Sep 30 12:45:40 crc kubenswrapper[4672]: E0930 12:45:40.161856 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f40a150f510ac33f6fa0a71122a370bce1279ecea0b0267b39750ceb0c3e02b5\": container with ID starting with f40a150f510ac33f6fa0a71122a370bce1279ecea0b0267b39750ceb0c3e02b5 not found: ID does not exist" containerID="f40a150f510ac33f6fa0a71122a370bce1279ecea0b0267b39750ceb0c3e02b5"
Sep 30 12:45:40 crc kubenswrapper[4672]: I0930 12:45:40.161883 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f40a150f510ac33f6fa0a71122a370bce1279ecea0b0267b39750ceb0c3e02b5"} err="failed to get container status \"f40a150f510ac33f6fa0a71122a370bce1279ecea0b0267b39750ceb0c3e02b5\": rpc error: code = NotFound desc = could not find container \"f40a150f510ac33f6fa0a71122a370bce1279ecea0b0267b39750ceb0c3e02b5\": container with ID starting with f40a150f510ac33f6fa0a71122a370bce1279ecea0b0267b39750ceb0c3e02b5 not found: ID does not exist"
Sep 30 12:45:40 crc kubenswrapper[4672]: I0930 12:45:40.161899 4672 scope.go:117] "RemoveContainer" containerID="380cb43321d5d09180983e532298bf80f7d96792720e1df8de95030147127adf"
Sep 30 12:45:40 crc kubenswrapper[4672]: E0930 12:45:40.162101 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"380cb43321d5d09180983e532298bf80f7d96792720e1df8de95030147127adf\": container with ID starting with 380cb43321d5d09180983e532298bf80f7d96792720e1df8de95030147127adf not found: ID does not exist" containerID="380cb43321d5d09180983e532298bf80f7d96792720e1df8de95030147127adf"
Sep 30 12:45:40 crc kubenswrapper[4672]: I0930 12:45:40.162130 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"380cb43321d5d09180983e532298bf80f7d96792720e1df8de95030147127adf"} err="failed to get container status \"380cb43321d5d09180983e532298bf80f7d96792720e1df8de95030147127adf\": rpc error: code = NotFound desc = could not find container \"380cb43321d5d09180983e532298bf80f7d96792720e1df8de95030147127adf\": container with ID starting with 380cb43321d5d09180983e532298bf80f7d96792720e1df8de95030147127adf not found: ID does not exist"
Sep 30 12:45:41 crc kubenswrapper[4672]: I0930 12:45:41.428174 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="52c5051e-4029-463d-82ef-62faf0ae3ec9" path="/var/lib/kubelet/pods/52c5051e-4029-463d-82ef-62faf0ae3ec9/volumes"
Sep 30 12:45:43 crc kubenswrapper[4672]: I0930 12:45:43.559336 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-9c7dk"]
Sep 30 12:45:43 crc kubenswrapper[4672]: E0930 12:45:43.560311 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52c5051e-4029-463d-82ef-62faf0ae3ec9" containerName="extract-content"
Sep 30 12:45:43 crc kubenswrapper[4672]: I0930 12:45:43.560334 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="52c5051e-4029-463d-82ef-62faf0ae3ec9" containerName="extract-content"
Sep 30 12:45:43 crc kubenswrapper[4672]: E0930 12:45:43.560364 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52c5051e-4029-463d-82ef-62faf0ae3ec9" containerName="registry-server"
Sep 30 12:45:43 crc kubenswrapper[4672]: I0930 12:45:43.560378 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="52c5051e-4029-463d-82ef-62faf0ae3ec9" containerName="registry-server"
Sep 30 12:45:43 crc kubenswrapper[4672]: E0930 12:45:43.560407 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52c5051e-4029-463d-82ef-62faf0ae3ec9" containerName="extract-utilities"
Sep 30 12:45:43 crc kubenswrapper[4672]: I0930 12:45:43.560420 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="52c5051e-4029-463d-82ef-62faf0ae3ec9" containerName="extract-utilities"
Sep 30 12:45:43 crc kubenswrapper[4672]: I0930 12:45:43.560810 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="52c5051e-4029-463d-82ef-62faf0ae3ec9" containerName="registry-server"
Sep 30 12:45:43 crc kubenswrapper[4672]: I0930 12:45:43.565187 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9c7dk"
Sep 30 12:45:43 crc kubenswrapper[4672]: I0930 12:45:43.584401 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9c7dk"]
Sep 30 12:45:43 crc kubenswrapper[4672]: I0930 12:45:43.727634 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/65be141c-c319-4770-ad98-eac19d852700-utilities\") pod \"certified-operators-9c7dk\" (UID: \"65be141c-c319-4770-ad98-eac19d852700\") " pod="openshift-marketplace/certified-operators-9c7dk"
Sep 30 12:45:43 crc kubenswrapper[4672]: I0930 12:45:43.727732 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/65be141c-c319-4770-ad98-eac19d852700-catalog-content\") pod \"certified-operators-9c7dk\" (UID: \"65be141c-c319-4770-ad98-eac19d852700\") " pod="openshift-marketplace/certified-operators-9c7dk"
Sep 30 12:45:43 crc kubenswrapper[4672]: I0930 12:45:43.727799 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ntmss\" (UniqueName: \"kubernetes.io/projected/65be141c-c319-4770-ad98-eac19d852700-kube-api-access-ntmss\") pod \"certified-operators-9c7dk\" (UID: \"65be141c-c319-4770-ad98-eac19d852700\") " pod="openshift-marketplace/certified-operators-9c7dk"
Sep 30 12:45:43 crc kubenswrapper[4672]: I0930 12:45:43.829764 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/65be141c-c319-4770-ad98-eac19d852700-utilities\") pod \"certified-operators-9c7dk\" (UID: \"65be141c-c319-4770-ad98-eac19d852700\") " pod="openshift-marketplace/certified-operators-9c7dk"
Sep 30 12:45:43 crc kubenswrapper[4672]: I0930 12:45:43.830071 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/65be141c-c319-4770-ad98-eac19d852700-catalog-content\") pod \"certified-operators-9c7dk\" (UID: \"65be141c-c319-4770-ad98-eac19d852700\") " pod="openshift-marketplace/certified-operators-9c7dk"
Sep 30 12:45:43 crc kubenswrapper[4672]: I0930 12:45:43.830177 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ntmss\" (UniqueName: \"kubernetes.io/projected/65be141c-c319-4770-ad98-eac19d852700-kube-api-access-ntmss\") pod \"certified-operators-9c7dk\" (UID: \"65be141c-c319-4770-ad98-eac19d852700\") " pod="openshift-marketplace/certified-operators-9c7dk"
Sep 30 12:45:43 crc kubenswrapper[4672]: I0930 12:45:43.830427 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/65be141c-c319-4770-ad98-eac19d852700-utilities\") pod \"certified-operators-9c7dk\" (UID: \"65be141c-c319-4770-ad98-eac19d852700\") " pod="openshift-marketplace/certified-operators-9c7dk"
Sep 30 12:45:43 crc kubenswrapper[4672]: I0930 12:45:43.830597 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/65be141c-c319-4770-ad98-eac19d852700-catalog-content\") pod \"certified-operators-9c7dk\" (UID: \"65be141c-c319-4770-ad98-eac19d852700\") " pod="openshift-marketplace/certified-operators-9c7dk"
Sep 30 12:45:43 crc kubenswrapper[4672]: I0930 12:45:43.869208 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ntmss\" (UniqueName: \"kubernetes.io/projected/65be141c-c319-4770-ad98-eac19d852700-kube-api-access-ntmss\") pod \"certified-operators-9c7dk\" (UID: \"65be141c-c319-4770-ad98-eac19d852700\") " pod="openshift-marketplace/certified-operators-9c7dk"
Sep 30 12:45:43 crc kubenswrapper[4672]: I0930 12:45:43.907199 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9c7dk"
Sep 30 12:45:44 crc kubenswrapper[4672]: I0930 12:45:44.482080 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9c7dk"]
Sep 30 12:45:45 crc kubenswrapper[4672]: I0930 12:45:45.107071 4672 generic.go:334] "Generic (PLEG): container finished" podID="65be141c-c319-4770-ad98-eac19d852700" containerID="65d1cf74c6f58b595e34a19a8dc0f59b61541420c2f70586b6bd04033c325809" exitCode=0
Sep 30 12:45:45 crc kubenswrapper[4672]: I0930 12:45:45.107166 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9c7dk" event={"ID":"65be141c-c319-4770-ad98-eac19d852700","Type":"ContainerDied","Data":"65d1cf74c6f58b595e34a19a8dc0f59b61541420c2f70586b6bd04033c325809"}
Sep 30 12:45:45 crc kubenswrapper[4672]: I0930 12:45:45.108489 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9c7dk" event={"ID":"65be141c-c319-4770-ad98-eac19d852700","Type":"ContainerStarted","Data":"b4d2c43241246cded5e568f2930e22f7a2ca5284394436d123e778f1dbe02640"}
Sep 30 12:45:46 crc kubenswrapper[4672]: I0930 12:45:46.120683 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9c7dk" event={"ID":"65be141c-c319-4770-ad98-eac19d852700","Type":"ContainerStarted","Data":"12d110b142f14724a953b43f930682326952fab114a36af2b7fb983dfe18f424"}
Sep 30 12:45:47 crc kubenswrapper[4672]: I0930 12:45:47.133679 4672 generic.go:334] "Generic (PLEG): container finished" podID="65be141c-c319-4770-ad98-eac19d852700" containerID="12d110b142f14724a953b43f930682326952fab114a36af2b7fb983dfe18f424" exitCode=0
Sep 30 12:45:47 crc kubenswrapper[4672]: I0930 12:45:47.133761 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9c7dk" event={"ID":"65be141c-c319-4770-ad98-eac19d852700","Type":"ContainerDied","Data":"12d110b142f14724a953b43f930682326952fab114a36af2b7fb983dfe18f424"}
Sep 30 12:45:48 crc kubenswrapper[4672]: I0930 12:45:48.146535 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9c7dk" event={"ID":"65be141c-c319-4770-ad98-eac19d852700","Type":"ContainerStarted","Data":"6869884d69009daf7017a01f05e416a356db2b350f33258070d254066cdaf6bd"}
Sep 30 12:45:48 crc kubenswrapper[4672]: I0930 12:45:48.167967 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-9c7dk" podStartSLOduration=2.60972162 podStartE2EDuration="5.167947414s" podCreationTimestamp="2025-09-30 12:45:43 +0000 UTC" firstStartedPulling="2025-09-30 12:45:45.109431891 +0000 UTC m=+1436.378669547" lastFinishedPulling="2025-09-30 12:45:47.667657695 +0000 UTC m=+1438.936895341" observedRunningTime="2025-09-30 12:45:48.162855874 +0000 UTC m=+1439.432093520" watchObservedRunningTime="2025-09-30 12:45:48.167947414 +0000 UTC m=+1439.437185060"
Sep 30 12:45:53 crc kubenswrapper[4672]: I0930 12:45:53.908338 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-9c7dk"
Sep 30 12:45:53 crc kubenswrapper[4672]: I0930 12:45:53.909229 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-9c7dk"
Sep 30 12:45:53 crc kubenswrapper[4672]: I0930 12:45:53.988197 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-9c7dk"
Sep 30 12:45:54 crc kubenswrapper[4672]: I0930 12:45:54.251696 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-9c7dk"
Sep 30 12:45:54 crc kubenswrapper[4672]: I0930 12:45:54.299833 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-9c7dk"]
Sep 30 12:45:54 crc kubenswrapper[4672]: I0930 12:45:54.740108 4672 patch_prober.go:28] interesting pod/machine-config-daemon-dpqrd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Sep 30 12:45:54 crc kubenswrapper[4672]: I0930 12:45:54.740466 4672 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Sep 30 12:45:54 crc kubenswrapper[4672]: I0930 12:45:54.740706 4672 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd"
Sep 30 12:45:54 crc kubenswrapper[4672]: I0930 12:45:54.741446 4672 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"dae9c6261457b92ffd63bbf3a8283d8d78ab2aabf36d75dffc0e7ed0851a4fe4"} pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Sep 30 12:45:54 crc kubenswrapper[4672]: I0930 12:45:54.741605 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" containerName="machine-config-daemon" containerID="cri-o://dae9c6261457b92ffd63bbf3a8283d8d78ab2aabf36d75dffc0e7ed0851a4fe4" gracePeriod=600
Sep 30 12:45:55 crc kubenswrapper[4672]: I0930 12:45:55.223172 4672 generic.go:334] "Generic (PLEG): container finished" podID="95794952-d817-48f2-8956-f7a310f8d1d9" containerID="dae9c6261457b92ffd63bbf3a8283d8d78ab2aabf36d75dffc0e7ed0851a4fe4" exitCode=0
Sep 30 12:45:55 crc kubenswrapper[4672]: I0930 12:45:55.223362 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" event={"ID":"95794952-d817-48f2-8956-f7a310f8d1d9","Type":"ContainerDied","Data":"dae9c6261457b92ffd63bbf3a8283d8d78ab2aabf36d75dffc0e7ed0851a4fe4"}
Sep 30 12:45:55 crc kubenswrapper[4672]: I0930 12:45:55.223650 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" event={"ID":"95794952-d817-48f2-8956-f7a310f8d1d9","Type":"ContainerStarted","Data":"221a226e9b221f859d2addc5ed9e63e60339b2ca4ab8eb0848430a16de2a187e"}
Sep 30 12:45:55 crc kubenswrapper[4672]: I0930 12:45:55.223682 4672 scope.go:117] "RemoveContainer" containerID="c2e0fa5817adc74311f8929edf2f7fe8a5d38b2926c430c80278916d7abc9d3a"
Sep 30 12:45:56 crc kubenswrapper[4672]: I0930 12:45:56.235126 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-9c7dk" podUID="65be141c-c319-4770-ad98-eac19d852700" containerName="registry-server" containerID="cri-o://6869884d69009daf7017a01f05e416a356db2b350f33258070d254066cdaf6bd" gracePeriod=2
Sep 30 12:45:56 crc kubenswrapper[4672]: I0930 12:45:56.655922 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-fd2nc"]
Sep 30 12:45:56 crc kubenswrapper[4672]: I0930 12:45:56.659983 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fd2nc"
Sep 30 12:45:56 crc kubenswrapper[4672]: I0930 12:45:56.674031 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-fd2nc"]
Sep 30 12:45:56 crc kubenswrapper[4672]: I0930 12:45:56.808807 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9c7dk"
Sep 30 12:45:56 crc kubenswrapper[4672]: I0930 12:45:56.823690 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mbxbf\" (UniqueName: \"kubernetes.io/projected/c1411591-c321-494b-b3d6-514addfe9b2f-kube-api-access-mbxbf\") pod \"redhat-marketplace-fd2nc\" (UID: \"c1411591-c321-494b-b3d6-514addfe9b2f\") " pod="openshift-marketplace/redhat-marketplace-fd2nc"
Sep 30 12:45:56 crc kubenswrapper[4672]: I0930 12:45:56.823900 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c1411591-c321-494b-b3d6-514addfe9b2f-utilities\") pod \"redhat-marketplace-fd2nc\" (UID: \"c1411591-c321-494b-b3d6-514addfe9b2f\") " pod="openshift-marketplace/redhat-marketplace-fd2nc"
Sep 30 12:45:56 crc kubenswrapper[4672]: I0930 12:45:56.823936 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c1411591-c321-494b-b3d6-514addfe9b2f-catalog-content\") pod \"redhat-marketplace-fd2nc\" (UID: \"c1411591-c321-494b-b3d6-514addfe9b2f\") " pod="openshift-marketplace/redhat-marketplace-fd2nc"
Sep 30 12:45:56 crc kubenswrapper[4672]: I0930 12:45:56.924913 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/65be141c-c319-4770-ad98-eac19d852700-catalog-content\") pod \"65be141c-c319-4770-ad98-eac19d852700\" (UID: \"65be141c-c319-4770-ad98-eac19d852700\") "
Sep 30 12:45:56 crc kubenswrapper[4672]: I0930 12:45:56.925099 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/65be141c-c319-4770-ad98-eac19d852700-utilities\") pod \"65be141c-c319-4770-ad98-eac19d852700\" (UID: \"65be141c-c319-4770-ad98-eac19d852700\") "
Sep 30 12:45:56 crc kubenswrapper[4672]: I0930 12:45:56.925128 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ntmss\" (UniqueName: \"kubernetes.io/projected/65be141c-c319-4770-ad98-eac19d852700-kube-api-access-ntmss\") pod \"65be141c-c319-4770-ad98-eac19d852700\" (UID: \"65be141c-c319-4770-ad98-eac19d852700\") "
Sep 30 12:45:56 crc kubenswrapper[4672]: I0930 12:45:56.925461 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c1411591-c321-494b-b3d6-514addfe9b2f-utilities\") pod \"redhat-marketplace-fd2nc\" (UID: \"c1411591-c321-494b-b3d6-514addfe9b2f\") " pod="openshift-marketplace/redhat-marketplace-fd2nc"
Sep 30 12:45:56 crc kubenswrapper[4672]: I0930 12:45:56.925498 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c1411591-c321-494b-b3d6-514addfe9b2f-catalog-content\") pod \"redhat-marketplace-fd2nc\" (UID: \"c1411591-c321-494b-b3d6-514addfe9b2f\") " pod="openshift-marketplace/redhat-marketplace-fd2nc"
Sep 30 12:45:56 crc kubenswrapper[4672]: I0930 12:45:56.925592 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mbxbf\" (UniqueName: \"kubernetes.io/projected/c1411591-c321-494b-b3d6-514addfe9b2f-kube-api-access-mbxbf\") pod \"redhat-marketplace-fd2nc\" (UID: \"c1411591-c321-494b-b3d6-514addfe9b2f\") " pod="openshift-marketplace/redhat-marketplace-fd2nc"
Sep 30 12:45:56 crc kubenswrapper[4672]: I0930 12:45:56.925953 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/65be141c-c319-4770-ad98-eac19d852700-utilities" (OuterVolumeSpecName: "utilities") pod "65be141c-c319-4770-ad98-eac19d852700" (UID: "65be141c-c319-4770-ad98-eac19d852700"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 12:45:56 crc kubenswrapper[4672]: I0930 12:45:56.926055 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c1411591-c321-494b-b3d6-514addfe9b2f-utilities\") pod \"redhat-marketplace-fd2nc\" (UID: \"c1411591-c321-494b-b3d6-514addfe9b2f\") " pod="openshift-marketplace/redhat-marketplace-fd2nc"
Sep 30 12:45:56 crc kubenswrapper[4672]: I0930 12:45:56.926377 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c1411591-c321-494b-b3d6-514addfe9b2f-catalog-content\") pod \"redhat-marketplace-fd2nc\" (UID: \"c1411591-c321-494b-b3d6-514addfe9b2f\") " pod="openshift-marketplace/redhat-marketplace-fd2nc"
Sep 30 12:45:56 crc kubenswrapper[4672]: I0930 12:45:56.947994 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65be141c-c319-4770-ad98-eac19d852700-kube-api-access-ntmss" (OuterVolumeSpecName: "kube-api-access-ntmss") pod "65be141c-c319-4770-ad98-eac19d852700" (UID: "65be141c-c319-4770-ad98-eac19d852700"). InnerVolumeSpecName "kube-api-access-ntmss". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 12:45:56 crc kubenswrapper[4672]: I0930 12:45:56.952616 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mbxbf\" (UniqueName: \"kubernetes.io/projected/c1411591-c321-494b-b3d6-514addfe9b2f-kube-api-access-mbxbf\") pod \"redhat-marketplace-fd2nc\" (UID: \"c1411591-c321-494b-b3d6-514addfe9b2f\") " pod="openshift-marketplace/redhat-marketplace-fd2nc"
Sep 30 12:45:56 crc kubenswrapper[4672]: I0930 12:45:56.972354 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/65be141c-c319-4770-ad98-eac19d852700-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "65be141c-c319-4770-ad98-eac19d852700" (UID: "65be141c-c319-4770-ad98-eac19d852700"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 12:45:56 crc kubenswrapper[4672]: I0930 12:45:56.992682 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fd2nc"
Sep 30 12:45:57 crc kubenswrapper[4672]: I0930 12:45:57.027432 4672 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/65be141c-c319-4770-ad98-eac19d852700-catalog-content\") on node \"crc\" DevicePath \"\""
Sep 30 12:45:57 crc kubenswrapper[4672]: I0930 12:45:57.027462 4672 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/65be141c-c319-4770-ad98-eac19d852700-utilities\") on node \"crc\" DevicePath \"\""
Sep 30 12:45:57 crc kubenswrapper[4672]: I0930 12:45:57.027472 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ntmss\" (UniqueName: \"kubernetes.io/projected/65be141c-c319-4770-ad98-eac19d852700-kube-api-access-ntmss\") on node \"crc\" DevicePath \"\""
Sep 30 12:45:57 crc kubenswrapper[4672]: I0930 12:45:57.246133 4672 generic.go:334] "Generic (PLEG): container finished" podID="65be141c-c319-4770-ad98-eac19d852700" containerID="6869884d69009daf7017a01f05e416a356db2b350f33258070d254066cdaf6bd" exitCode=0
Sep 30 12:45:57 crc kubenswrapper[4672]: I0930 12:45:57.246196 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9c7dk"
Sep 30 12:45:57 crc kubenswrapper[4672]: I0930 12:45:57.246215 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9c7dk" event={"ID":"65be141c-c319-4770-ad98-eac19d852700","Type":"ContainerDied","Data":"6869884d69009daf7017a01f05e416a356db2b350f33258070d254066cdaf6bd"}
Sep 30 12:45:57 crc kubenswrapper[4672]: I0930 12:45:57.246517 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9c7dk" event={"ID":"65be141c-c319-4770-ad98-eac19d852700","Type":"ContainerDied","Data":"b4d2c43241246cded5e568f2930e22f7a2ca5284394436d123e778f1dbe02640"}
Sep 30 12:45:57 crc kubenswrapper[4672]: I0930 12:45:57.246539 4672 scope.go:117] "RemoveContainer" containerID="6869884d69009daf7017a01f05e416a356db2b350f33258070d254066cdaf6bd"
Sep 30 12:45:57 crc kubenswrapper[4672]: I0930 12:45:57.279588 4672 scope.go:117] "RemoveContainer" containerID="12d110b142f14724a953b43f930682326952fab114a36af2b7fb983dfe18f424"
Sep 30 12:45:57 crc kubenswrapper[4672]: I0930 12:45:57.286536 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-9c7dk"]
Sep 30 12:45:57 crc kubenswrapper[4672]: I0930 12:45:57.297391 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-9c7dk"]
Sep 30 12:45:57 crc kubenswrapper[4672]: I0930 12:45:57.310548 4672 scope.go:117] "RemoveContainer" containerID="65d1cf74c6f58b595e34a19a8dc0f59b61541420c2f70586b6bd04033c325809"
Sep 30 12:45:57 crc kubenswrapper[4672]: I0930 12:45:57.340500 4672 scope.go:117] "RemoveContainer" containerID="6869884d69009daf7017a01f05e416a356db2b350f33258070d254066cdaf6bd"
Sep 30 12:45:57 crc kubenswrapper[4672]: E0930 12:45:57.340921 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6869884d69009daf7017a01f05e416a356db2b350f33258070d254066cdaf6bd\": container with ID starting with 6869884d69009daf7017a01f05e416a356db2b350f33258070d254066cdaf6bd not found: ID does not exist" containerID="6869884d69009daf7017a01f05e416a356db2b350f33258070d254066cdaf6bd"
Sep 30 12:45:57 crc kubenswrapper[4672]: I0930 12:45:57.340977 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6869884d69009daf7017a01f05e416a356db2b350f33258070d254066cdaf6bd"} err="failed to get container status \"6869884d69009daf7017a01f05e416a356db2b350f33258070d254066cdaf6bd\": rpc error: code = NotFound desc = could not find container \"6869884d69009daf7017a01f05e416a356db2b350f33258070d254066cdaf6bd\": container with ID starting with 6869884d69009daf7017a01f05e416a356db2b350f33258070d254066cdaf6bd not found: ID does not exist"
Sep 30 12:45:57 crc kubenswrapper[4672]: I0930 12:45:57.341003 4672 scope.go:117] "RemoveContainer" containerID="12d110b142f14724a953b43f930682326952fab114a36af2b7fb983dfe18f424"
Sep 30 12:45:57 crc kubenswrapper[4672]: E0930 12:45:57.341483 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"12d110b142f14724a953b43f930682326952fab114a36af2b7fb983dfe18f424\": container with ID starting with 12d110b142f14724a953b43f930682326952fab114a36af2b7fb983dfe18f424 not found: ID does not exist" containerID="12d110b142f14724a953b43f930682326952fab114a36af2b7fb983dfe18f424"
Sep 30 12:45:57 crc kubenswrapper[4672]: I0930 12:45:57.341527 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"12d110b142f14724a953b43f930682326952fab114a36af2b7fb983dfe18f424"} err="failed to get container status \"12d110b142f14724a953b43f930682326952fab114a36af2b7fb983dfe18f424\": rpc error: code = NotFound desc = could not find container \"12d110b142f14724a953b43f930682326952fab114a36af2b7fb983dfe18f424\": container with ID starting with 12d110b142f14724a953b43f930682326952fab114a36af2b7fb983dfe18f424 not found: ID does not exist"
Sep 30 12:45:57 crc kubenswrapper[4672]: I0930 12:45:57.341557 4672 scope.go:117] "RemoveContainer" containerID="65d1cf74c6f58b595e34a19a8dc0f59b61541420c2f70586b6bd04033c325809"
Sep 30 12:45:57 crc kubenswrapper[4672]: E0930 12:45:57.342017 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"65d1cf74c6f58b595e34a19a8dc0f59b61541420c2f70586b6bd04033c325809\": container with ID starting with 65d1cf74c6f58b595e34a19a8dc0f59b61541420c2f70586b6bd04033c325809 not found: ID does not exist" containerID="65d1cf74c6f58b595e34a19a8dc0f59b61541420c2f70586b6bd04033c325809"
Sep 30 12:45:57 crc kubenswrapper[4672]: I0930 12:45:57.342046 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"65d1cf74c6f58b595e34a19a8dc0f59b61541420c2f70586b6bd04033c325809"} err="failed to get container status \"65d1cf74c6f58b595e34a19a8dc0f59b61541420c2f70586b6bd04033c325809\": rpc error: code = NotFound desc = could not find container \"65d1cf74c6f58b595e34a19a8dc0f59b61541420c2f70586b6bd04033c325809\": container with ID starting with 65d1cf74c6f58b595e34a19a8dc0f59b61541420c2f70586b6bd04033c325809 not found: ID does not exist"
Sep 30 12:45:57 crc kubenswrapper[4672]: E0930 12:45:57.404374 4672 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod65be141c_c319_4770_ad98_eac19d852700.slice\": RecentStats: unable to find data in memory cache]"
Sep 30 12:45:57 crc kubenswrapper[4672]: I0930 12:45:57.444879 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="65be141c-c319-4770-ad98-eac19d852700" path="/var/lib/kubelet/pods/65be141c-c319-4770-ad98-eac19d852700/volumes"
Sep 30 12:45:57 crc kubenswrapper[4672]: I0930 12:45:57.445767 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-fd2nc"]
Sep 30 12:45:58 crc kubenswrapper[4672]: I0930 12:45:58.270212 4672 generic.go:334] "Generic (PLEG): container finished" podID="c1411591-c321-494b-b3d6-514addfe9b2f" containerID="7fb7678582e2d24974b3d61651e9c31f8b77747dd2c4749462927caf6fd37c86" exitCode=0
Sep 30 12:45:58 crc kubenswrapper[4672]: I0930 12:45:58.270382 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fd2nc" event={"ID":"c1411591-c321-494b-b3d6-514addfe9b2f","Type":"ContainerDied","Data":"7fb7678582e2d24974b3d61651e9c31f8b77747dd2c4749462927caf6fd37c86"}
Sep 30 12:45:58 crc kubenswrapper[4672]: I0930 12:45:58.270623 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fd2nc" event={"ID":"c1411591-c321-494b-b3d6-514addfe9b2f","Type":"ContainerStarted","Data":"2b43383524c8325d72adc1a192e3cdd7d46d2ed3d96c706e23ae6cf41970d289"}
Sep 30 12:45:58 crc kubenswrapper[4672]: I0930 12:45:58.656166 4672 scope.go:117] "RemoveContainer" containerID="7f1ae11fb0ca93ce8fe36b1259620ff1daf25dc16d12766832eb4aed17a8bd37"
Sep 30 12:45:58 crc kubenswrapper[4672]: I0930 12:45:58.681705 4672 scope.go:117] "RemoveContainer" containerID="727f47daecc45501a4a6d01cae978e9dbf2ec76d6ccf8d08f200c2bc625c6c12"
Sep 30 12:45:58 crc kubenswrapper[4672]: I0930 12:45:58.704247 4672 scope.go:117] "RemoveContainer" containerID="50186f10fd640297f77522112529fd398a4bee106ed4a1f07bb8fbe619850b5c"
Sep 30 12:45:58 crc kubenswrapper[4672]: I0930 12:45:58.729225 4672 scope.go:117] "RemoveContainer" containerID="ffa109781cc7820232abe1a36d8cace242c9308a6cb74b5bcd957de3302677bd"
Sep 30 12:46:00 crc kubenswrapper[4672]: I0930 12:46:00.296008 4672 generic.go:334] "Generic (PLEG): container finished" podID="c1411591-c321-494b-b3d6-514addfe9b2f" containerID="20ce06b7102c6a639d9f79bdad3b6a3999331aa99abf4e17db50a0f8320f0782" exitCode=0
Sep 30 12:46:00 crc kubenswrapper[4672]: I0930 12:46:00.296092 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fd2nc" event={"ID":"c1411591-c321-494b-b3d6-514addfe9b2f","Type":"ContainerDied","Data":"20ce06b7102c6a639d9f79bdad3b6a3999331aa99abf4e17db50a0f8320f0782"}
Sep 30 12:46:01 crc kubenswrapper[4672]: I0930 12:46:01.309735 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fd2nc" event={"ID":"c1411591-c321-494b-b3d6-514addfe9b2f","Type":"ContainerStarted","Data":"a6bee954b1741d6bb8d7dab05292bb1b3a06ba34be88a61cc1cfa1ed1c6fb629"}
Sep 30 12:46:01 crc kubenswrapper[4672]: I0930 12:46:01.335123 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-fd2nc" podStartSLOduration=2.819541871 podStartE2EDuration="5.335107257s" podCreationTimestamp="2025-09-30 12:45:56 +0000 UTC" firstStartedPulling="2025-09-30 12:45:58.272761207 +0000 UTC m=+1449.541998863" lastFinishedPulling="2025-09-30 12:46:00.788326583 +0000 UTC m=+1452.057564249" observedRunningTime="2025-09-30 12:46:01.326660812 +0000 UTC m=+1452.595898498" watchObservedRunningTime="2025-09-30 12:46:01.335107257 +0000 UTC m=+1452.604344903"
Sep 30 12:46:06 crc kubenswrapper[4672]: I0930 12:46:06.993429 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-fd2nc"
Sep 30 12:46:06 crc kubenswrapper[4672]: I0930 12:46:06.994009 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-fd2nc"
Sep 30 12:46:07 crc kubenswrapper[4672]: I0930 12:46:07.054435 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-fd2nc"
Sep 30 12:46:07 crc kubenswrapper[4672]: I0930 12:46:07.428494 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-fd2nc"
Sep 30 12:46:07 crc kubenswrapper[4672]: I0930 12:46:07.484105 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-fd2nc"]
Sep 30 12:46:09 crc kubenswrapper[4672]: I0930 12:46:09.390645 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-fd2nc" podUID="c1411591-c321-494b-b3d6-514addfe9b2f" containerName="registry-server" containerID="cri-o://a6bee954b1741d6bb8d7dab05292bb1b3a06ba34be88a61cc1cfa1ed1c6fb629" gracePeriod=2
Sep 30 12:46:09 crc kubenswrapper[4672]: I0930 12:46:09.845814 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fd2nc"
Sep 30 12:46:10 crc kubenswrapper[4672]: I0930 12:46:10.017289 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mbxbf\" (UniqueName: \"kubernetes.io/projected/c1411591-c321-494b-b3d6-514addfe9b2f-kube-api-access-mbxbf\") pod \"c1411591-c321-494b-b3d6-514addfe9b2f\" (UID: \"c1411591-c321-494b-b3d6-514addfe9b2f\") "
Sep 30 12:46:10 crc kubenswrapper[4672]: I0930 12:46:10.017372 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c1411591-c321-494b-b3d6-514addfe9b2f-utilities\") pod \"c1411591-c321-494b-b3d6-514addfe9b2f\" (UID: \"c1411591-c321-494b-b3d6-514addfe9b2f\") "
Sep 30 12:46:10 crc kubenswrapper[4672]: I0930 12:46:10.017448 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c1411591-c321-494b-b3d6-514addfe9b2f-catalog-content\") pod \"c1411591-c321-494b-b3d6-514addfe9b2f\" (UID: \"c1411591-c321-494b-b3d6-514addfe9b2f\") "
Sep 30 12:46:10 crc kubenswrapper[4672]: I0930 12:46:10.018671 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c1411591-c321-494b-b3d6-514addfe9b2f-utilities" (OuterVolumeSpecName: "utilities") pod "c1411591-c321-494b-b3d6-514addfe9b2f" (UID: "c1411591-c321-494b-b3d6-514addfe9b2f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 12:46:10 crc kubenswrapper[4672]: I0930 12:46:10.024524 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c1411591-c321-494b-b3d6-514addfe9b2f-kube-api-access-mbxbf" (OuterVolumeSpecName: "kube-api-access-mbxbf") pod "c1411591-c321-494b-b3d6-514addfe9b2f" (UID: "c1411591-c321-494b-b3d6-514addfe9b2f"). InnerVolumeSpecName "kube-api-access-mbxbf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 12:46:10 crc kubenswrapper[4672]: I0930 12:46:10.031052 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c1411591-c321-494b-b3d6-514addfe9b2f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c1411591-c321-494b-b3d6-514addfe9b2f" (UID: "c1411591-c321-494b-b3d6-514addfe9b2f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 12:46:10 crc kubenswrapper[4672]: I0930 12:46:10.120413 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mbxbf\" (UniqueName: \"kubernetes.io/projected/c1411591-c321-494b-b3d6-514addfe9b2f-kube-api-access-mbxbf\") on node \"crc\" DevicePath \"\""
Sep 30 12:46:10 crc kubenswrapper[4672]: I0930 12:46:10.120442 4672 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c1411591-c321-494b-b3d6-514addfe9b2f-utilities\") on node \"crc\" DevicePath \"\""
Sep 30 12:46:10 crc kubenswrapper[4672]: I0930 12:46:10.120452 4672 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c1411591-c321-494b-b3d6-514addfe9b2f-catalog-content\") on node \"crc\" DevicePath \"\""
Sep 30 12:46:10 crc kubenswrapper[4672]: I0930 12:46:10.404732 4672 generic.go:334] "Generic (PLEG): container finished" podID="c1411591-c321-494b-b3d6-514addfe9b2f" containerID="a6bee954b1741d6bb8d7dab05292bb1b3a06ba34be88a61cc1cfa1ed1c6fb629" exitCode=0
Sep 30 12:46:10 crc kubenswrapper[4672]: I0930 12:46:10.404770 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fd2nc" event={"ID":"c1411591-c321-494b-b3d6-514addfe9b2f","Type":"ContainerDied","Data":"a6bee954b1741d6bb8d7dab05292bb1b3a06ba34be88a61cc1cfa1ed1c6fb629"}
Sep 30 12:46:10 crc kubenswrapper[4672]: I0930 12:46:10.404799 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fd2nc" event={"ID":"c1411591-c321-494b-b3d6-514addfe9b2f","Type":"ContainerDied","Data":"2b43383524c8325d72adc1a192e3cdd7d46d2ed3d96c706e23ae6cf41970d289"}
Sep 30 12:46:10 crc kubenswrapper[4672]: I0930 12:46:10.404818 4672 scope.go:117] "RemoveContainer" containerID="a6bee954b1741d6bb8d7dab05292bb1b3a06ba34be88a61cc1cfa1ed1c6fb629"
Sep 30 12:46:10 crc kubenswrapper[4672]: I0930 12:46:10.404822 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fd2nc"
Sep 30 12:46:10 crc kubenswrapper[4672]: I0930 12:46:10.437163 4672 scope.go:117] "RemoveContainer" containerID="20ce06b7102c6a639d9f79bdad3b6a3999331aa99abf4e17db50a0f8320f0782"
Sep 30 12:46:10 crc kubenswrapper[4672]: I0930 12:46:10.470608 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-fd2nc"]
Sep 30 12:46:10 crc kubenswrapper[4672]: I0930 12:46:10.475157 4672 scope.go:117] "RemoveContainer" containerID="7fb7678582e2d24974b3d61651e9c31f8b77747dd2c4749462927caf6fd37c86"
Sep 30 12:46:10 crc kubenswrapper[4672]: I0930 12:46:10.484604 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-fd2nc"]
Sep 30 12:46:10 crc kubenswrapper[4672]: I0930 12:46:10.517548 4672 scope.go:117] "RemoveContainer" containerID="a6bee954b1741d6bb8d7dab05292bb1b3a06ba34be88a61cc1cfa1ed1c6fb629"
Sep 30 12:46:10 crc kubenswrapper[4672]: E0930 12:46:10.518077 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a6bee954b1741d6bb8d7dab05292bb1b3a06ba34be88a61cc1cfa1ed1c6fb629\": container with ID starting with a6bee954b1741d6bb8d7dab05292bb1b3a06ba34be88a61cc1cfa1ed1c6fb629 not found: ID does not exist" containerID="a6bee954b1741d6bb8d7dab05292bb1b3a06ba34be88a61cc1cfa1ed1c6fb629"
Sep 30 12:46:10 crc kubenswrapper[4672]: I0930 12:46:10.518115 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a6bee954b1741d6bb8d7dab05292bb1b3a06ba34be88a61cc1cfa1ed1c6fb629"} err="failed to get container status \"a6bee954b1741d6bb8d7dab05292bb1b3a06ba34be88a61cc1cfa1ed1c6fb629\": rpc error: code = NotFound desc = could not find container \"a6bee954b1741d6bb8d7dab05292bb1b3a06ba34be88a61cc1cfa1ed1c6fb629\": container with ID starting with a6bee954b1741d6bb8d7dab05292bb1b3a06ba34be88a61cc1cfa1ed1c6fb629 not found: ID does not exist"
Sep 30 12:46:10 crc kubenswrapper[4672]: I0930 12:46:10.518220 4672 scope.go:117] "RemoveContainer" containerID="20ce06b7102c6a639d9f79bdad3b6a3999331aa99abf4e17db50a0f8320f0782"
Sep 30 12:46:10 crc kubenswrapper[4672]: E0930 12:46:10.518610 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"20ce06b7102c6a639d9f79bdad3b6a3999331aa99abf4e17db50a0f8320f0782\": container with ID starting with 20ce06b7102c6a639d9f79bdad3b6a3999331aa99abf4e17db50a0f8320f0782 not found: ID does not exist" containerID="20ce06b7102c6a639d9f79bdad3b6a3999331aa99abf4e17db50a0f8320f0782"
Sep 30 12:46:10 crc kubenswrapper[4672]: I0930 12:46:10.518653 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"20ce06b7102c6a639d9f79bdad3b6a3999331aa99abf4e17db50a0f8320f0782"} err="failed to get container status \"20ce06b7102c6a639d9f79bdad3b6a3999331aa99abf4e17db50a0f8320f0782\": rpc error: code = NotFound desc = could not find container \"20ce06b7102c6a639d9f79bdad3b6a3999331aa99abf4e17db50a0f8320f0782\": container with ID starting with 20ce06b7102c6a639d9f79bdad3b6a3999331aa99abf4e17db50a0f8320f0782 not found: ID does not exist"
Sep 30 12:46:10 crc kubenswrapper[4672]: I0930 12:46:10.518675 4672 scope.go:117] "RemoveContainer" containerID="7fb7678582e2d24974b3d61651e9c31f8b77747dd2c4749462927caf6fd37c86"
Sep 30 12:46:10 crc kubenswrapper[4672]: E0930 12:46:10.519122 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7fb7678582e2d24974b3d61651e9c31f8b77747dd2c4749462927caf6fd37c86\": container with ID starting with 7fb7678582e2d24974b3d61651e9c31f8b77747dd2c4749462927caf6fd37c86 not found: ID does not exist" containerID="7fb7678582e2d24974b3d61651e9c31f8b77747dd2c4749462927caf6fd37c86"
Sep 30 12:46:10 crc kubenswrapper[4672]: I0930 12:46:10.519145 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7fb7678582e2d24974b3d61651e9c31f8b77747dd2c4749462927caf6fd37c86"} err="failed to get container status \"7fb7678582e2d24974b3d61651e9c31f8b77747dd2c4749462927caf6fd37c86\": rpc error: code = NotFound desc = could not find container \"7fb7678582e2d24974b3d61651e9c31f8b77747dd2c4749462927caf6fd37c86\": container with ID starting with 7fb7678582e2d24974b3d61651e9c31f8b77747dd2c4749462927caf6fd37c86 not found: ID does not exist"
Sep 30 12:46:11 crc kubenswrapper[4672]: I0930 12:46:11.458412 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c1411591-c321-494b-b3d6-514addfe9b2f" path="/var/lib/kubelet/pods/c1411591-c321-494b-b3d6-514addfe9b2f/volumes"
Sep 30 12:47:58 crc kubenswrapper[4672]: I0930 12:47:58.866737 4672 scope.go:117] "RemoveContainer" containerID="34fdc5f5c46ce3eb65e67516212592b95974b63479f736efa193bb7b8a302e5f"
Sep 30 12:47:58 crc kubenswrapper[4672]: I0930 12:47:58.903384 4672 scope.go:117] "RemoveContainer" containerID="8104e13c73285dd412b983216ceeb1293cdb470a134e07a7f30a5dcba8888c27"
Sep 30 12:47:58 crc kubenswrapper[4672]: I0930 12:47:58.932678 4672 scope.go:117] "RemoveContainer" containerID="1e4b1d648a001af42c4fc3edcfff3ce6c47e4a4719990f91e36c987ed2aae05e"
Sep 30 12:47:58 crc kubenswrapper[4672]: I0930 12:47:58.973507 4672 scope.go:117] "RemoveContainer" containerID="d65d62c2763af66b3ade4ee5ff31c4d209f29c59eaa3edf361c312e723e283cb"
Sep 30 12:47:59 crc kubenswrapper[4672]: I0930 12:47:59.010883 4672 scope.go:117] "RemoveContainer" containerID="c09265842872061a899b0c52b341019d9302b7f667a9a226f5d83645e4109631"
Sep 30 12:47:59 crc kubenswrapper[4672]: I0930 12:47:59.030876 4672 scope.go:117] "RemoveContainer" containerID="0a804eb630446c62f6f3a6aa97da1e1c639984140d374d30fc951165ee3ed56f"
Sep 30 12:47:59 crc kubenswrapper[4672]: I0930 12:47:59.051216 4672 scope.go:117] "RemoveContainer" containerID="727b7ddf43148d767a5094fce03761f89c3e9425add9803090efbd4bf7f2eed1"
Sep 30 12:47:59 crc kubenswrapper[4672]: I0930 12:47:59.070041 4672 scope.go:117] "RemoveContainer" containerID="01d4b897510dd6446effc0c6904cf10d6f870592e795df321b3201576f0dde37"
Sep 30 12:48:04 crc kubenswrapper[4672]: I0930 12:48:04.051026 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-tqggx"]
Sep 30 12:48:04 crc kubenswrapper[4672]: I0930 12:48:04.067147 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-gpr5b"]
Sep 30 12:48:04 crc kubenswrapper[4672]: I0930 12:48:04.077712 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-tqggx"]
Sep 30 12:48:04 crc kubenswrapper[4672]: I0930 12:48:04.086678 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-9sjc4"]
Sep 30 12:48:04 crc kubenswrapper[4672]: I0930 12:48:04.094923 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-gpr5b"]
Sep 30 12:48:04 crc kubenswrapper[4672]: I0930 12:48:04.102459 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-9sjc4"]
Sep 30 12:48:05 crc kubenswrapper[4672]: I0930 12:48:05.431909 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a5a3488a-d32c-4723-b27b-7ec68566fba6" path="/var/lib/kubelet/pods/a5a3488a-d32c-4723-b27b-7ec68566fba6/volumes"
Sep 30 12:48:05 crc kubenswrapper[4672]: I0930 12:48:05.434406 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b7f9649f-28e4-4025-b17b-c21ad09eedee" path="/var/lib/kubelet/pods/b7f9649f-28e4-4025-b17b-c21ad09eedee/volumes"
Sep 30 12:48:05 crc kubenswrapper[4672]: I0930 12:48:05.437554 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d6da0670-d4cb-4b8c-9a9a-2f82e32a767f" path="/var/lib/kubelet/pods/d6da0670-d4cb-4b8c-9a9a-2f82e32a767f/volumes"
Sep 30 12:48:06 crc kubenswrapper[4672]: I0930 12:48:06.028560 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-db-create-j56fv"]
Sep 30 12:48:06 crc kubenswrapper[4672]: I0930 12:48:06.037873 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-db-create-j56fv"]
Sep 30 12:48:07 crc kubenswrapper[4672]: I0930 12:48:07.439826 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2c0f6826-1d24-44eb-9bc2-5f666ce0e0a2" path="/var/lib/kubelet/pods/2c0f6826-1d24-44eb-9bc2-5f666ce0e0a2/volumes"
Sep 30 12:48:10 crc kubenswrapper[4672]: I0930 12:48:10.046844 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-bdd5-account-create-vw97t"]
Sep 30 12:48:10 crc kubenswrapper[4672]: I0930 12:48:10.056631 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-bdd5-account-create-vw97t"]
Sep 30 12:48:11 crc kubenswrapper[4672]: I0930 12:48:11.434518 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="98d43e73-0852-419a-88c0-de23410cf8b5" path="/var/lib/kubelet/pods/98d43e73-0852-419a-88c0-de23410cf8b5/volumes"
Sep 30 12:48:19 crc kubenswrapper[4672]: I0930 12:48:19.794086 4672 generic.go:334] "Generic (PLEG): container finished" podID="91de1b76-2b84-4d21-9683-d7aee98fb876" containerID="03d5fc4b8e4688e768fb7b8785d883df0c4b533b6dc8fd37542cbe6c9c82789f" exitCode=0
Sep 30 12:48:19 crc kubenswrapper[4672]: I0930 12:48:19.794176 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-cksj8" event={"ID":"91de1b76-2b84-4d21-9683-d7aee98fb876","Type":"ContainerDied","Data":"03d5fc4b8e4688e768fb7b8785d883df0c4b533b6dc8fd37542cbe6c9c82789f"}
Sep 30 12:48:20 crc kubenswrapper[4672]: I0930 12:48:20.033845 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-5afe-account-create-w6kdx"]
Sep 30 12:48:20 crc kubenswrapper[4672]: I0930 12:48:20.045236 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-5afe-account-create-w6kdx"]
Sep 30 12:48:21 crc kubenswrapper[4672]: I0930 12:48:21.023752 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-d431-account-create-v9pw9"]
Sep 30 12:48:21 crc kubenswrapper[4672]: I0930 12:48:21.031400 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-d431-account-create-v9pw9"]
Sep 30 12:48:21 crc kubenswrapper[4672]: I0930 12:48:21.368916 4672 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-cksj8" Sep 30 12:48:21 crc kubenswrapper[4672]: I0930 12:48:21.434489 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d49ba1b9-421a-41ad-9c95-49bd94444de0" path="/var/lib/kubelet/pods/d49ba1b9-421a-41ad-9c95-49bd94444de0/volumes" Sep 30 12:48:21 crc kubenswrapper[4672]: I0930 12:48:21.436013 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f5f7c9ac-7d3f-4477-819c-d5547e1e641d" path="/var/lib/kubelet/pods/f5f7c9ac-7d3f-4477-819c-d5547e1e641d/volumes" Sep 30 12:48:21 crc kubenswrapper[4672]: I0930 12:48:21.488198 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/91de1b76-2b84-4d21-9683-d7aee98fb876-ssh-key\") pod \"91de1b76-2b84-4d21-9683-d7aee98fb876\" (UID: \"91de1b76-2b84-4d21-9683-d7aee98fb876\") " Sep 30 12:48:21 crc kubenswrapper[4672]: I0930 12:48:21.488348 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91de1b76-2b84-4d21-9683-d7aee98fb876-bootstrap-combined-ca-bundle\") pod \"91de1b76-2b84-4d21-9683-d7aee98fb876\" (UID: \"91de1b76-2b84-4d21-9683-d7aee98fb876\") " Sep 30 12:48:21 crc kubenswrapper[4672]: I0930 12:48:21.488529 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/91de1b76-2b84-4d21-9683-d7aee98fb876-inventory\") pod \"91de1b76-2b84-4d21-9683-d7aee98fb876\" (UID: \"91de1b76-2b84-4d21-9683-d7aee98fb876\") " Sep 30 12:48:21 crc kubenswrapper[4672]: I0930 12:48:21.488648 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8vpt5\" (UniqueName: \"kubernetes.io/projected/91de1b76-2b84-4d21-9683-d7aee98fb876-kube-api-access-8vpt5\") pod \"91de1b76-2b84-4d21-9683-d7aee98fb876\" (UID: \"91de1b76-2b84-4d21-9683-d7aee98fb876\") " Sep 30 12:48:21 crc kubenswrapper[4672]: I0930 12:48:21.493967 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91de1b76-2b84-4d21-9683-d7aee98fb876-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "91de1b76-2b84-4d21-9683-d7aee98fb876" (UID: "91de1b76-2b84-4d21-9683-d7aee98fb876"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:48:21 crc kubenswrapper[4672]: I0930 12:48:21.500324 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/91de1b76-2b84-4d21-9683-d7aee98fb876-kube-api-access-8vpt5" (OuterVolumeSpecName: "kube-api-access-8vpt5") pod "91de1b76-2b84-4d21-9683-d7aee98fb876" (UID: "91de1b76-2b84-4d21-9683-d7aee98fb876"). InnerVolumeSpecName "kube-api-access-8vpt5". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 12:48:21 crc kubenswrapper[4672]: I0930 12:48:21.519484 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91de1b76-2b84-4d21-9683-d7aee98fb876-inventory" (OuterVolumeSpecName: "inventory") pod "91de1b76-2b84-4d21-9683-d7aee98fb876" (UID: "91de1b76-2b84-4d21-9683-d7aee98fb876"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:48:21 crc kubenswrapper[4672]: I0930 12:48:21.532975 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91de1b76-2b84-4d21-9683-d7aee98fb876-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "91de1b76-2b84-4d21-9683-d7aee98fb876" (UID: "91de1b76-2b84-4d21-9683-d7aee98fb876"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:48:21 crc kubenswrapper[4672]: I0930 12:48:21.591439 4672 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/91de1b76-2b84-4d21-9683-d7aee98fb876-ssh-key\") on node \"crc\" DevicePath \"\"" Sep 30 12:48:21 crc kubenswrapper[4672]: I0930 12:48:21.591477 4672 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91de1b76-2b84-4d21-9683-d7aee98fb876-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 12:48:21 crc kubenswrapper[4672]: I0930 12:48:21.591491 4672 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/91de1b76-2b84-4d21-9683-d7aee98fb876-inventory\") on node \"crc\" DevicePath \"\"" Sep 30 12:48:21 crc kubenswrapper[4672]: I0930 12:48:21.591503 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8vpt5\" (UniqueName: \"kubernetes.io/projected/91de1b76-2b84-4d21-9683-d7aee98fb876-kube-api-access-8vpt5\") on node \"crc\" DevicePath \"\"" Sep 30 12:48:21 crc kubenswrapper[4672]: I0930 12:48:21.817287 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-cksj8" event={"ID":"91de1b76-2b84-4d21-9683-d7aee98fb876","Type":"ContainerDied","Data":"7d29f5825fd5ac6d5168ffca2383e3630ab378f911b2d144c517619e41c1fed8"} Sep 30 12:48:21 crc kubenswrapper[4672]: I0930 12:48:21.817659 4672 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7d29f5825fd5ac6d5168ffca2383e3630ab378f911b2d144c517619e41c1fed8" Sep 30 12:48:21 crc kubenswrapper[4672]: I0930 12:48:21.817393 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-cksj8" Sep 30 12:48:21 crc kubenswrapper[4672]: I0930 12:48:21.893586 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-vl4f8"] Sep 30 12:48:21 crc kubenswrapper[4672]: E0930 12:48:21.893998 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91de1b76-2b84-4d21-9683-d7aee98fb876" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Sep 30 12:48:21 crc kubenswrapper[4672]: I0930 12:48:21.894016 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="91de1b76-2b84-4d21-9683-d7aee98fb876" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Sep 30 12:48:21 crc kubenswrapper[4672]: E0930 12:48:21.894040 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65be141c-c319-4770-ad98-eac19d852700" containerName="extract-content" Sep 30 12:48:21 crc kubenswrapper[4672]: I0930 12:48:21.894047 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="65be141c-c319-4770-ad98-eac19d852700" containerName="extract-content" Sep 30 12:48:21 crc kubenswrapper[4672]: E0930 12:48:21.894073 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65be141c-c319-4770-ad98-eac19d852700" containerName="extract-utilities" Sep 30 12:48:21 crc kubenswrapper[4672]: I0930 12:48:21.894081 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="65be141c-c319-4770-ad98-eac19d852700" containerName="extract-utilities" Sep 30 12:48:21 crc kubenswrapper[4672]: E0930 12:48:21.894096 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1411591-c321-494b-b3d6-514addfe9b2f" containerName="extract-utilities" Sep 30 12:48:21 crc kubenswrapper[4672]: I0930 12:48:21.894103 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1411591-c321-494b-b3d6-514addfe9b2f" containerName="extract-utilities" Sep 30 12:48:21 crc kubenswrapper[4672]: E0930 12:48:21.894117 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65be141c-c319-4770-ad98-eac19d852700" containerName="registry-server" Sep 30 12:48:21 crc kubenswrapper[4672]: I0930 12:48:21.894124 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="65be141c-c319-4770-ad98-eac19d852700" containerName="registry-server" Sep 30 12:48:21 crc kubenswrapper[4672]: E0930 12:48:21.894144 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1411591-c321-494b-b3d6-514addfe9b2f" containerName="extract-content" Sep 30 12:48:21 crc kubenswrapper[4672]: I0930 12:48:21.894150 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1411591-c321-494b-b3d6-514addfe9b2f" containerName="extract-content" Sep 30 12:48:21 crc kubenswrapper[4672]: E0930 12:48:21.894172 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1411591-c321-494b-b3d6-514addfe9b2f" containerName="registry-server" Sep 30 12:48:21 crc kubenswrapper[4672]: I0930 12:48:21.894179 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1411591-c321-494b-b3d6-514addfe9b2f" containerName="registry-server" Sep 30 12:48:21 crc kubenswrapper[4672]: I0930 12:48:21.894398 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="91de1b76-2b84-4d21-9683-d7aee98fb876" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Sep 30 12:48:21 crc kubenswrapper[4672]: I0930 12:48:21.894415 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="c1411591-c321-494b-b3d6-514addfe9b2f" 
containerName="registry-server" Sep 30 12:48:21 crc kubenswrapper[4672]: I0930 12:48:21.894425 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="65be141c-c319-4770-ad98-eac19d852700" containerName="registry-server" Sep 30 12:48:21 crc kubenswrapper[4672]: I0930 12:48:21.895128 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-vl4f8" Sep 30 12:48:21 crc kubenswrapper[4672]: I0930 12:48:21.898462 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-b6dbr" Sep 30 12:48:21 crc kubenswrapper[4672]: I0930 12:48:21.898784 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Sep 30 12:48:21 crc kubenswrapper[4672]: I0930 12:48:21.899184 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Sep 30 12:48:21 crc kubenswrapper[4672]: I0930 12:48:21.899447 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Sep 30 12:48:21 crc kubenswrapper[4672]: I0930 12:48:21.904439 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-vl4f8"] Sep 30 12:48:22 crc kubenswrapper[4672]: I0930 12:48:22.003694 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zljj7\" (UniqueName: \"kubernetes.io/projected/a88c5cde-cba5-457b-8044-77ed9db4c080-kube-api-access-zljj7\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-vl4f8\" (UID: \"a88c5cde-cba5-457b-8044-77ed9db4c080\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-vl4f8" Sep 30 12:48:22 crc kubenswrapper[4672]: I0930 12:48:22.003820 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a88c5cde-cba5-457b-8044-77ed9db4c080-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-vl4f8\" (UID: \"a88c5cde-cba5-457b-8044-77ed9db4c080\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-vl4f8" Sep 30 12:48:22 crc kubenswrapper[4672]: I0930 12:48:22.003905 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a88c5cde-cba5-457b-8044-77ed9db4c080-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-vl4f8\" (UID: \"a88c5cde-cba5-457b-8044-77ed9db4c080\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-vl4f8" Sep 30 12:48:22 crc kubenswrapper[4672]: I0930 12:48:22.105678 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a88c5cde-cba5-457b-8044-77ed9db4c080-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-vl4f8\" (UID: \"a88c5cde-cba5-457b-8044-77ed9db4c080\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-vl4f8" Sep 30 12:48:22 crc kubenswrapper[4672]: I0930 12:48:22.105833 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a88c5cde-cba5-457b-8044-77ed9db4c080-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-vl4f8\" (UID: \"a88c5cde-cba5-457b-8044-77ed9db4c080\") " 
pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-vl4f8" Sep 30 12:48:22 crc kubenswrapper[4672]: I0930 12:48:22.105964 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zljj7\" (UniqueName: \"kubernetes.io/projected/a88c5cde-cba5-457b-8044-77ed9db4c080-kube-api-access-zljj7\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-vl4f8\" (UID: \"a88c5cde-cba5-457b-8044-77ed9db4c080\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-vl4f8" Sep 30 12:48:22 crc kubenswrapper[4672]: I0930 12:48:22.110502 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a88c5cde-cba5-457b-8044-77ed9db4c080-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-vl4f8\" (UID: \"a88c5cde-cba5-457b-8044-77ed9db4c080\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-vl4f8" Sep 30 12:48:22 crc kubenswrapper[4672]: I0930 12:48:22.113006 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a88c5cde-cba5-457b-8044-77ed9db4c080-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-vl4f8\" (UID: \"a88c5cde-cba5-457b-8044-77ed9db4c080\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-vl4f8" Sep 30 12:48:22 crc kubenswrapper[4672]: I0930 12:48:22.123376 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zljj7\" (UniqueName: \"kubernetes.io/projected/a88c5cde-cba5-457b-8044-77ed9db4c080-kube-api-access-zljj7\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-vl4f8\" (UID: \"a88c5cde-cba5-457b-8044-77ed9db4c080\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-vl4f8" Sep 30 12:48:22 crc kubenswrapper[4672]: I0930 12:48:22.256648 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-vl4f8" Sep 30 12:48:22 crc kubenswrapper[4672]: I0930 12:48:22.795631 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-vl4f8"] Sep 30 12:48:22 crc kubenswrapper[4672]: W0930 12:48:22.801731 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda88c5cde_cba5_457b_8044_77ed9db4c080.slice/crio-8b69b08c1af69abdcd0a1ad288d29d53d41461f3072df030ad8f0ec662a9df37 WatchSource:0}: Error finding container 8b69b08c1af69abdcd0a1ad288d29d53d41461f3072df030ad8f0ec662a9df37: Status 404 returned error can't find the container with id 8b69b08c1af69abdcd0a1ad288d29d53d41461f3072df030ad8f0ec662a9df37 Sep 30 12:48:22 crc kubenswrapper[4672]: I0930 12:48:22.806108 4672 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Sep 30 12:48:22 crc kubenswrapper[4672]: I0930 12:48:22.827978 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-vl4f8" event={"ID":"a88c5cde-cba5-457b-8044-77ed9db4c080","Type":"ContainerStarted","Data":"8b69b08c1af69abdcd0a1ad288d29d53d41461f3072df030ad8f0ec662a9df37"} Sep 30 12:48:23 crc kubenswrapper[4672]: I0930 12:48:23.030649 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-8d32-account-create-bprd4"] Sep 30 12:48:23 crc kubenswrapper[4672]: I0930 12:48:23.043881 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-8d32-account-create-bprd4"] Sep 30 12:48:23 crc kubenswrapper[4672]: I0930 12:48:23.435058 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09b8a4b9-3b36-4c93-b4fa-694539eb0e2e" path="/var/lib/kubelet/pods/09b8a4b9-3b36-4c93-b4fa-694539eb0e2e/volumes" Sep 30 12:48:23 crc kubenswrapper[4672]: I0930 12:48:23.838680 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-vl4f8" event={"ID":"a88c5cde-cba5-457b-8044-77ed9db4c080","Type":"ContainerStarted","Data":"b66dc3e502d39556da511d0f2a5fa44cebd37c01f119fe18c388d0a55f02c8fa"} Sep 30 12:48:23 crc kubenswrapper[4672]: I0930 12:48:23.861768 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-vl4f8" podStartSLOduration=2.184620679 podStartE2EDuration="2.861754312s" podCreationTimestamp="2025-09-30 12:48:21 +0000 UTC" firstStartedPulling="2025-09-30 12:48:22.805879614 +0000 UTC m=+1594.075117260" lastFinishedPulling="2025-09-30 12:48:23.483013247 +0000 UTC m=+1594.752250893" observedRunningTime="2025-09-30 12:48:23.857336179 +0000 UTC m=+1595.126573825" watchObservedRunningTime="2025-09-30 12:48:23.861754312 +0000 UTC m=+1595.130991958" Sep 30 12:48:24 crc kubenswrapper[4672]: I0930 12:48:24.739791 4672 patch_prober.go:28] interesting pod/machine-config-daemon-dpqrd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 12:48:24 crc kubenswrapper[4672]: I0930 12:48:24.739843 4672 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" containerName="machine-config-daemon" 
probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 12:48:43 crc kubenswrapper[4672]: I0930 12:48:43.814950 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-gp5c4"] Sep 30 12:48:43 crc kubenswrapper[4672]: I0930 12:48:43.817604 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-gp5c4" Sep 30 12:48:43 crc kubenswrapper[4672]: I0930 12:48:43.826841 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-gp5c4"] Sep 30 12:48:43 crc kubenswrapper[4672]: I0930 12:48:43.862180 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/46194c71-c2f1-419a-a0fc-3628aff3ca92-catalog-content\") pod \"community-operators-gp5c4\" (UID: \"46194c71-c2f1-419a-a0fc-3628aff3ca92\") " pod="openshift-marketplace/community-operators-gp5c4" Sep 30 12:48:43 crc kubenswrapper[4672]: I0930 12:48:43.862236 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/46194c71-c2f1-419a-a0fc-3628aff3ca92-utilities\") pod \"community-operators-gp5c4\" (UID: \"46194c71-c2f1-419a-a0fc-3628aff3ca92\") " pod="openshift-marketplace/community-operators-gp5c4" Sep 30 12:48:43 crc kubenswrapper[4672]: I0930 12:48:43.862364 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8cwh2\" (UniqueName: \"kubernetes.io/projected/46194c71-c2f1-419a-a0fc-3628aff3ca92-kube-api-access-8cwh2\") pod \"community-operators-gp5c4\" (UID: \"46194c71-c2f1-419a-a0fc-3628aff3ca92\") " pod="openshift-marketplace/community-operators-gp5c4" Sep 30 12:48:43 crc kubenswrapper[4672]: I0930 12:48:43.963886 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/46194c71-c2f1-419a-a0fc-3628aff3ca92-catalog-content\") pod \"community-operators-gp5c4\" (UID: \"46194c71-c2f1-419a-a0fc-3628aff3ca92\") " pod="openshift-marketplace/community-operators-gp5c4" Sep 30 12:48:43 crc kubenswrapper[4672]: I0930 12:48:43.964288 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/46194c71-c2f1-419a-a0fc-3628aff3ca92-utilities\") pod \"community-operators-gp5c4\" (UID: \"46194c71-c2f1-419a-a0fc-3628aff3ca92\") " pod="openshift-marketplace/community-operators-gp5c4" Sep 30 12:48:43 crc kubenswrapper[4672]: I0930 12:48:43.964451 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/46194c71-c2f1-419a-a0fc-3628aff3ca92-catalog-content\") pod \"community-operators-gp5c4\" (UID: \"46194c71-c2f1-419a-a0fc-3628aff3ca92\") " pod="openshift-marketplace/community-operators-gp5c4" Sep 30 12:48:43 crc kubenswrapper[4672]: I0930 12:48:43.964551 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8cwh2\" (UniqueName: \"kubernetes.io/projected/46194c71-c2f1-419a-a0fc-3628aff3ca92-kube-api-access-8cwh2\") pod \"community-operators-gp5c4\" (UID: \"46194c71-c2f1-419a-a0fc-3628aff3ca92\") " pod="openshift-marketplace/community-operators-gp5c4" Sep 30 12:48:43 crc kubenswrapper[4672]: I0930 12:48:43.964989 4672 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/46194c71-c2f1-419a-a0fc-3628aff3ca92-utilities\") pod \"community-operators-gp5c4\" (UID: \"46194c71-c2f1-419a-a0fc-3628aff3ca92\") " pod="openshift-marketplace/community-operators-gp5c4" Sep 30 12:48:43 crc kubenswrapper[4672]: I0930 12:48:43.991063 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8cwh2\" (UniqueName: \"kubernetes.io/projected/46194c71-c2f1-419a-a0fc-3628aff3ca92-kube-api-access-8cwh2\") pod \"community-operators-gp5c4\" (UID: \"46194c71-c2f1-419a-a0fc-3628aff3ca92\") " pod="openshift-marketplace/community-operators-gp5c4" Sep 30 12:48:44 crc kubenswrapper[4672]: I0930 12:48:44.165330 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-gp5c4" Sep 30 12:48:44 crc kubenswrapper[4672]: I0930 12:48:44.699094 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-gp5c4"] Sep 30 12:48:45 crc kubenswrapper[4672]: I0930 12:48:45.061356 4672 generic.go:334] "Generic (PLEG): container finished" podID="46194c71-c2f1-419a-a0fc-3628aff3ca92" containerID="9cd70db56331c6d5d9f48bfaac1afe5293eb354debdbd09fc4d74509d8239ac1" exitCode=0 Sep 30 12:48:45 crc kubenswrapper[4672]: I0930 12:48:45.061572 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gp5c4" event={"ID":"46194c71-c2f1-419a-a0fc-3628aff3ca92","Type":"ContainerDied","Data":"9cd70db56331c6d5d9f48bfaac1afe5293eb354debdbd09fc4d74509d8239ac1"} Sep 30 12:48:45 crc kubenswrapper[4672]: I0930 12:48:45.061659 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gp5c4" event={"ID":"46194c71-c2f1-419a-a0fc-3628aff3ca92","Type":"ContainerStarted","Data":"8b4bb0c7d98d25234fd40463348795d861b109455bc067598cb9538725df8c1b"} Sep 30 12:48:47 crc kubenswrapper[4672]: I0930 12:48:47.036047 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-sfwqd"] Sep 30 12:48:47 crc kubenswrapper[4672]: I0930 12:48:47.047080 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-hv754"] Sep 30 12:48:47 crc kubenswrapper[4672]: I0930 12:48:47.055140 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-f2ltm"] Sep 30 12:48:47 crc kubenswrapper[4672]: I0930 12:48:47.062645 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-f2ltm"] Sep 30 12:48:47 crc kubenswrapper[4672]: I0930 12:48:47.069868 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-sfwqd"] Sep 30 12:48:47 crc kubenswrapper[4672]: I0930 12:48:47.077509 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-hv754"] Sep 30 12:48:47 crc kubenswrapper[4672]: I0930 12:48:47.080776 4672 generic.go:334] "Generic (PLEG): container finished" podID="46194c71-c2f1-419a-a0fc-3628aff3ca92" containerID="b756dc15c5a09ddd41eaa80dce2c0ff663aa653161c991b353b6a5fd352400d5" exitCode=0 Sep 30 12:48:47 crc kubenswrapper[4672]: I0930 12:48:47.080815 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gp5c4" event={"ID":"46194c71-c2f1-419a-a0fc-3628aff3ca92","Type":"ContainerDied","Data":"b756dc15c5a09ddd41eaa80dce2c0ff663aa653161c991b353b6a5fd352400d5"} Sep 30 12:48:47 crc kubenswrapper[4672]: I0930 
12:48:47.432447 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6a2caae2-f3d5-4aa5-8b69-cc14b1cb88fc" path="/var/lib/kubelet/pods/6a2caae2-f3d5-4aa5-8b69-cc14b1cb88fc/volumes" Sep 30 12:48:47 crc kubenswrapper[4672]: I0930 12:48:47.433705 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="777dfdaa-1eff-4e53-85bb-03c7912d1b86" path="/var/lib/kubelet/pods/777dfdaa-1eff-4e53-85bb-03c7912d1b86/volumes" Sep 30 12:48:47 crc kubenswrapper[4672]: I0930 12:48:47.434465 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f636d1ae-c979-4ed4-b0f2-0a0000504e56" path="/var/lib/kubelet/pods/f636d1ae-c979-4ed4-b0f2-0a0000504e56/volumes" Sep 30 12:48:48 crc kubenswrapper[4672]: I0930 12:48:48.096378 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gp5c4" event={"ID":"46194c71-c2f1-419a-a0fc-3628aff3ca92","Type":"ContainerStarted","Data":"71636fd78761a5382b4c81211cee6f38dbfcf2a9245787dc0760859cf78246b7"} Sep 30 12:48:48 crc kubenswrapper[4672]: I0930 12:48:48.115639 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-gp5c4" podStartSLOduration=2.321574491 podStartE2EDuration="5.115616469s" podCreationTimestamp="2025-09-30 12:48:43 +0000 UTC" firstStartedPulling="2025-09-30 12:48:45.063462378 +0000 UTC m=+1616.332700024" lastFinishedPulling="2025-09-30 12:48:47.857504356 +0000 UTC m=+1619.126742002" observedRunningTime="2025-09-30 12:48:48.11254589 +0000 UTC m=+1619.381783536" watchObservedRunningTime="2025-09-30 12:48:48.115616469 +0000 UTC m=+1619.384854125" Sep 30 12:48:54 crc kubenswrapper[4672]: I0930 12:48:54.038944 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-hl8kd"] Sep 30 12:48:54 crc kubenswrapper[4672]: I0930 12:48:54.052401 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-hl8kd"] Sep 30 12:48:54 crc kubenswrapper[4672]: I0930 12:48:54.165943 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-gp5c4" Sep 30 12:48:54 crc kubenswrapper[4672]: I0930 12:48:54.165990 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-gp5c4" Sep 30 12:48:54 crc kubenswrapper[4672]: I0930 12:48:54.221161 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-gp5c4" Sep 30 12:48:54 crc kubenswrapper[4672]: I0930 12:48:54.739658 4672 patch_prober.go:28] interesting pod/machine-config-daemon-dpqrd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 12:48:54 crc kubenswrapper[4672]: I0930 12:48:54.739723 4672 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 12:48:55 crc kubenswrapper[4672]: I0930 12:48:55.032318 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-4ddgl"] Sep 30 12:48:55 crc kubenswrapper[4672]: I0930 12:48:55.040622 4672 kubelet.go:2431] "SyncLoop REMOVE" 
source="api" pods=["openstack/glance-db-sync-4ddgl"] Sep 30 12:48:55 crc kubenswrapper[4672]: I0930 12:48:55.222340 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-gp5c4" Sep 30 12:48:55 crc kubenswrapper[4672]: I0930 12:48:55.427670 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb781ee-9755-4258-bd4a-165461961834" path="/var/lib/kubelet/pods/3cb781ee-9755-4258-bd4a-165461961834/volumes" Sep 30 12:48:55 crc kubenswrapper[4672]: I0930 12:48:55.428722 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ee403f37-87a0-4068-b310-f178eb87ddf4" path="/var/lib/kubelet/pods/ee403f37-87a0-4068-b310-f178eb87ddf4/volumes" Sep 30 12:48:55 crc kubenswrapper[4672]: I0930 12:48:55.805572 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-gp5c4"] Sep 30 12:48:57 crc kubenswrapper[4672]: I0930 12:48:57.188124 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-gp5c4" podUID="46194c71-c2f1-419a-a0fc-3628aff3ca92" containerName="registry-server" containerID="cri-o://71636fd78761a5382b4c81211cee6f38dbfcf2a9245787dc0760859cf78246b7" gracePeriod=2 Sep 30 12:48:58 crc kubenswrapper[4672]: I0930 12:48:58.135674 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-gp5c4" Sep 30 12:48:58 crc kubenswrapper[4672]: I0930 12:48:58.198458 4672 generic.go:334] "Generic (PLEG): container finished" podID="46194c71-c2f1-419a-a0fc-3628aff3ca92" containerID="71636fd78761a5382b4c81211cee6f38dbfcf2a9245787dc0760859cf78246b7" exitCode=0 Sep 30 12:48:58 crc kubenswrapper[4672]: I0930 12:48:58.198504 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gp5c4" event={"ID":"46194c71-c2f1-419a-a0fc-3628aff3ca92","Type":"ContainerDied","Data":"71636fd78761a5382b4c81211cee6f38dbfcf2a9245787dc0760859cf78246b7"} Sep 30 12:48:58 crc kubenswrapper[4672]: I0930 12:48:58.198543 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gp5c4" event={"ID":"46194c71-c2f1-419a-a0fc-3628aff3ca92","Type":"ContainerDied","Data":"8b4bb0c7d98d25234fd40463348795d861b109455bc067598cb9538725df8c1b"} Sep 30 12:48:58 crc kubenswrapper[4672]: I0930 12:48:58.198561 4672 scope.go:117] "RemoveContainer" containerID="71636fd78761a5382b4c81211cee6f38dbfcf2a9245787dc0760859cf78246b7" Sep 30 12:48:58 crc kubenswrapper[4672]: I0930 12:48:58.198581 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-gp5c4" Sep 30 12:48:58 crc kubenswrapper[4672]: I0930 12:48:58.225357 4672 scope.go:117] "RemoveContainer" containerID="b756dc15c5a09ddd41eaa80dce2c0ff663aa653161c991b353b6a5fd352400d5" Sep 30 12:48:58 crc kubenswrapper[4672]: I0930 12:48:58.244171 4672 scope.go:117] "RemoveContainer" containerID="9cd70db56331c6d5d9f48bfaac1afe5293eb354debdbd09fc4d74509d8239ac1" Sep 30 12:48:58 crc kubenswrapper[4672]: I0930 12:48:58.291277 4672 scope.go:117] "RemoveContainer" containerID="71636fd78761a5382b4c81211cee6f38dbfcf2a9245787dc0760859cf78246b7" Sep 30 12:48:58 crc kubenswrapper[4672]: E0930 12:48:58.291784 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"71636fd78761a5382b4c81211cee6f38dbfcf2a9245787dc0760859cf78246b7\": container with ID starting with 71636fd78761a5382b4c81211cee6f38dbfcf2a9245787dc0760859cf78246b7 not found: ID does not exist" containerID="71636fd78761a5382b4c81211cee6f38dbfcf2a9245787dc0760859cf78246b7" Sep 30 12:48:58 crc kubenswrapper[4672]: I0930 12:48:58.291814 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"71636fd78761a5382b4c81211cee6f38dbfcf2a9245787dc0760859cf78246b7"} err="failed to get container status \"71636fd78761a5382b4c81211cee6f38dbfcf2a9245787dc0760859cf78246b7\": rpc error: code = NotFound desc = could not find container \"71636fd78761a5382b4c81211cee6f38dbfcf2a9245787dc0760859cf78246b7\": container with ID starting with 71636fd78761a5382b4c81211cee6f38dbfcf2a9245787dc0760859cf78246b7 not found: ID does not exist" Sep 30 12:48:58 crc kubenswrapper[4672]: I0930 12:48:58.291835 4672 scope.go:117] "RemoveContainer" containerID="b756dc15c5a09ddd41eaa80dce2c0ff663aa653161c991b353b6a5fd352400d5" Sep 30 12:48:58 crc kubenswrapper[4672]: I0930 12:48:58.291860 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/46194c71-c2f1-419a-a0fc-3628aff3ca92-utilities\") pod \"46194c71-c2f1-419a-a0fc-3628aff3ca92\" (UID: \"46194c71-c2f1-419a-a0fc-3628aff3ca92\") " Sep 30 12:48:58 crc kubenswrapper[4672]: E0930 12:48:58.292227 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b756dc15c5a09ddd41eaa80dce2c0ff663aa653161c991b353b6a5fd352400d5\": container with ID starting with b756dc15c5a09ddd41eaa80dce2c0ff663aa653161c991b353b6a5fd352400d5 not found: ID does not exist" containerID="b756dc15c5a09ddd41eaa80dce2c0ff663aa653161c991b353b6a5fd352400d5" Sep 30 12:48:58 crc kubenswrapper[4672]: I0930 12:48:58.292301 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b756dc15c5a09ddd41eaa80dce2c0ff663aa653161c991b353b6a5fd352400d5"} err="failed to get container status \"b756dc15c5a09ddd41eaa80dce2c0ff663aa653161c991b353b6a5fd352400d5\": rpc error: code = NotFound desc = could not find container \"b756dc15c5a09ddd41eaa80dce2c0ff663aa653161c991b353b6a5fd352400d5\": container with ID starting with b756dc15c5a09ddd41eaa80dce2c0ff663aa653161c991b353b6a5fd352400d5 not found: ID does not exist" Sep 30 12:48:58 crc kubenswrapper[4672]: I0930 12:48:58.292337 4672 scope.go:117] "RemoveContainer" containerID="9cd70db56331c6d5d9f48bfaac1afe5293eb354debdbd09fc4d74509d8239ac1" Sep 30 12:48:58 crc kubenswrapper[4672]: I0930 12:48:58.292627 4672 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/46194c71-c2f1-419a-a0fc-3628aff3ca92-catalog-content\") pod \"46194c71-c2f1-419a-a0fc-3628aff3ca92\" (UID: \"46194c71-c2f1-419a-a0fc-3628aff3ca92\") " Sep 30 12:48:58 crc kubenswrapper[4672]: I0930 12:48:58.292714 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8cwh2\" (UniqueName: \"kubernetes.io/projected/46194c71-c2f1-419a-a0fc-3628aff3ca92-kube-api-access-8cwh2\") pod \"46194c71-c2f1-419a-a0fc-3628aff3ca92\" (UID: \"46194c71-c2f1-419a-a0fc-3628aff3ca92\") " Sep 30 12:48:58 crc kubenswrapper[4672]: E0930 12:48:58.292798 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9cd70db56331c6d5d9f48bfaac1afe5293eb354debdbd09fc4d74509d8239ac1\": container with ID starting with 9cd70db56331c6d5d9f48bfaac1afe5293eb354debdbd09fc4d74509d8239ac1 not found: ID does not exist" containerID="9cd70db56331c6d5d9f48bfaac1afe5293eb354debdbd09fc4d74509d8239ac1" Sep 30 12:48:58 crc kubenswrapper[4672]: I0930 12:48:58.292837 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9cd70db56331c6d5d9f48bfaac1afe5293eb354debdbd09fc4d74509d8239ac1"} err="failed to get container status \"9cd70db56331c6d5d9f48bfaac1afe5293eb354debdbd09fc4d74509d8239ac1\": rpc error: code = NotFound desc = could not find container \"9cd70db56331c6d5d9f48bfaac1afe5293eb354debdbd09fc4d74509d8239ac1\": container with ID starting with 9cd70db56331c6d5d9f48bfaac1afe5293eb354debdbd09fc4d74509d8239ac1 not found: ID does not exist" Sep 30 12:48:58 crc kubenswrapper[4672]: I0930 12:48:58.293533 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/46194c71-c2f1-419a-a0fc-3628aff3ca92-utilities" (OuterVolumeSpecName: "utilities") pod "46194c71-c2f1-419a-a0fc-3628aff3ca92" (UID: "46194c71-c2f1-419a-a0fc-3628aff3ca92"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 12:48:58 crc kubenswrapper[4672]: I0930 12:48:58.299132 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/46194c71-c2f1-419a-a0fc-3628aff3ca92-kube-api-access-8cwh2" (OuterVolumeSpecName: "kube-api-access-8cwh2") pod "46194c71-c2f1-419a-a0fc-3628aff3ca92" (UID: "46194c71-c2f1-419a-a0fc-3628aff3ca92"). InnerVolumeSpecName "kube-api-access-8cwh2". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 12:48:58 crc kubenswrapper[4672]: I0930 12:48:58.351843 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/46194c71-c2f1-419a-a0fc-3628aff3ca92-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "46194c71-c2f1-419a-a0fc-3628aff3ca92" (UID: "46194c71-c2f1-419a-a0fc-3628aff3ca92"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 12:48:58 crc kubenswrapper[4672]: I0930 12:48:58.395275 4672 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/46194c71-c2f1-419a-a0fc-3628aff3ca92-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 12:48:58 crc kubenswrapper[4672]: I0930 12:48:58.395318 4672 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/46194c71-c2f1-419a-a0fc-3628aff3ca92-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 12:48:58 crc kubenswrapper[4672]: I0930 12:48:58.395331 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8cwh2\" (UniqueName: \"kubernetes.io/projected/46194c71-c2f1-419a-a0fc-3628aff3ca92-kube-api-access-8cwh2\") on node \"crc\" DevicePath \"\"" Sep 30 12:48:58 crc kubenswrapper[4672]: I0930 12:48:58.540292 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-gp5c4"] Sep 30 12:48:58 crc kubenswrapper[4672]: I0930 12:48:58.548776 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-gp5c4"] Sep 30 12:48:59 crc kubenswrapper[4672]: I0930 12:48:59.029851 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-47f4-account-create-cw9r4"] Sep 30 12:48:59 crc kubenswrapper[4672]: I0930 12:48:59.038380 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-47f4-account-create-cw9r4"] Sep 30 12:48:59 crc kubenswrapper[4672]: I0930 12:48:59.144366 4672 scope.go:117] "RemoveContainer" containerID="568dfedaa369dfbd8aa6a30e85bc44fe065bad4bb0fcf83751bc2b792cf9bca9" Sep 30 12:48:59 crc kubenswrapper[4672]: I0930 12:48:59.181379 4672 scope.go:117] "RemoveContainer" containerID="16e69782ae512be3a27f3c323a122b936e1ffd08f6941ff938944265a3ed4d98" Sep 30 12:48:59 crc kubenswrapper[4672]: I0930 12:48:59.230834 4672 scope.go:117] "RemoveContainer" containerID="a2f1002056dc934bfcb723781956fdb01175b6fa04dcbce2958649a062ab332d" Sep 30 12:48:59 crc kubenswrapper[4672]: I0930 12:48:59.280492 4672 scope.go:117] "RemoveContainer" containerID="3dcabd74158f182a797f4abe9811f1ebc96b6d0a03bd659a6b40003f62f28103" Sep 30 12:48:59 crc kubenswrapper[4672]: I0930 12:48:59.317446 4672 scope.go:117] "RemoveContainer" containerID="2223dd6fc8ffbac6d026daedd107cb8834ea7fcabbd224601c17a1aeb1a95410" Sep 30 12:48:59 crc kubenswrapper[4672]: I0930 12:48:59.358766 4672 scope.go:117] "RemoveContainer" containerID="51aa37842efa35e5eed9529311a059dc19562e79b0d9db70ef66720087a3d59c" Sep 30 12:48:59 crc kubenswrapper[4672]: I0930 12:48:59.409694 4672 scope.go:117] "RemoveContainer" containerID="7b4d8e172b6acb6370418c6e94dc1b4ab74e6ab4143913675935e49352dd87c8" Sep 30 12:48:59 crc kubenswrapper[4672]: I0930 12:48:59.439473 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="46194c71-c2f1-419a-a0fc-3628aff3ca92" path="/var/lib/kubelet/pods/46194c71-c2f1-419a-a0fc-3628aff3ca92/volumes" Sep 30 12:48:59 crc kubenswrapper[4672]: I0930 12:48:59.440095 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d372bb6e-78aa-4329-b0f7-f2685c342bd3" path="/var/lib/kubelet/pods/d372bb6e-78aa-4329-b0f7-f2685c342bd3/volumes" Sep 30 12:48:59 crc kubenswrapper[4672]: I0930 12:48:59.454406 4672 scope.go:117] "RemoveContainer" containerID="6b63f5c1c5efc931527e000897c4386c666368c95cda2db4a5840e3c5b48f8a4" Sep 30 12:48:59 crc kubenswrapper[4672]: I0930 12:48:59.492234 4672 
scope.go:117] "RemoveContainer" containerID="343d7a17c97fa9b0d84265f7cd3a3f7d50155c98e64836dcbcf872159384b2e2" Sep 30 12:48:59 crc kubenswrapper[4672]: I0930 12:48:59.535786 4672 scope.go:117] "RemoveContainer" containerID="d96474d4b2160b6b5c841b736ae86bf936d0e70552b42921a134a127d6ae1e47" Sep 30 12:48:59 crc kubenswrapper[4672]: I0930 12:48:59.585018 4672 scope.go:117] "RemoveContainer" containerID="b187d17de7439ed5f2364a3ddffb81c66e37d36f2d37f2c83dc70016bfee73ca" Sep 30 12:48:59 crc kubenswrapper[4672]: I0930 12:48:59.612880 4672 scope.go:117] "RemoveContainer" containerID="af2978cc2b1acc0e874ae9b3657f79fdacb65e72f70dffff3bbf43f95a861057" Sep 30 12:48:59 crc kubenswrapper[4672]: I0930 12:48:59.644493 4672 scope.go:117] "RemoveContainer" containerID="42b0421f9782d814de1249138402274f5cdd8afe3f2dd52f8bd5b0e49b42ce23" Sep 30 12:48:59 crc kubenswrapper[4672]: I0930 12:48:59.667467 4672 scope.go:117] "RemoveContainer" containerID="587c158adfe5ea1324c63d3ca415b396faa4cb8e2ca7fa5c7ac02497e329a7aa" Sep 30 12:49:00 crc kubenswrapper[4672]: I0930 12:49:00.032512 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-55f8-account-create-j8z8l"] Sep 30 12:49:00 crc kubenswrapper[4672]: I0930 12:49:00.041400 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-fe85-account-create-vdvbq"] Sep 30 12:49:00 crc kubenswrapper[4672]: I0930 12:49:00.051607 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-55f8-account-create-j8z8l"] Sep 30 12:49:00 crc kubenswrapper[4672]: I0930 12:49:00.059349 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-fe85-account-create-vdvbq"] Sep 30 12:49:01 crc kubenswrapper[4672]: I0930 12:49:01.429156 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43cef9cd-4246-4707-93c8-2d1e0b49a99c" path="/var/lib/kubelet/pods/43cef9cd-4246-4707-93c8-2d1e0b49a99c/volumes" Sep 30 12:49:01 crc kubenswrapper[4672]: I0930 12:49:01.430173 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7b77dbe-0a6b-4bef-a55d-3de78b2c25f8" path="/var/lib/kubelet/pods/e7b77dbe-0a6b-4bef-a55d-3de78b2c25f8/volumes" Sep 30 12:49:04 crc kubenswrapper[4672]: I0930 12:49:04.036934 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-db-sync-2w94k"] Sep 30 12:49:04 crc kubenswrapper[4672]: I0930 12:49:04.049549 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-db-sync-2w94k"] Sep 30 12:49:05 crc kubenswrapper[4672]: I0930 12:49:05.427827 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="728a5c57-a6b8-4201-bbff-34f2eee54b1a" path="/var/lib/kubelet/pods/728a5c57-a6b8-4201-bbff-34f2eee54b1a/volumes" Sep 30 12:49:24 crc kubenswrapper[4672]: I0930 12:49:24.739706 4672 patch_prober.go:28] interesting pod/machine-config-daemon-dpqrd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 12:49:24 crc kubenswrapper[4672]: I0930 12:49:24.740649 4672 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 12:49:24 crc kubenswrapper[4672]: I0930 
12:49:24.740720 4672 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" Sep 30 12:49:24 crc kubenswrapper[4672]: I0930 12:49:24.742018 4672 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"221a226e9b221f859d2addc5ed9e63e60339b2ca4ab8eb0848430a16de2a187e"} pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 30 12:49:24 crc kubenswrapper[4672]: I0930 12:49:24.742139 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" containerName="machine-config-daemon" containerID="cri-o://221a226e9b221f859d2addc5ed9e63e60339b2ca4ab8eb0848430a16de2a187e" gracePeriod=600 Sep 30 12:49:24 crc kubenswrapper[4672]: E0930 12:49:24.873226 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dpqrd_openshift-machine-config-operator(95794952-d817-48f2-8956-f7a310f8d1d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" Sep 30 12:49:25 crc kubenswrapper[4672]: I0930 12:49:25.547816 4672 generic.go:334] "Generic (PLEG): container finished" podID="95794952-d817-48f2-8956-f7a310f8d1d9" containerID="221a226e9b221f859d2addc5ed9e63e60339b2ca4ab8eb0848430a16de2a187e" exitCode=0 Sep 30 12:49:25 crc kubenswrapper[4672]: I0930 12:49:25.547930 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" event={"ID":"95794952-d817-48f2-8956-f7a310f8d1d9","Type":"ContainerDied","Data":"221a226e9b221f859d2addc5ed9e63e60339b2ca4ab8eb0848430a16de2a187e"} Sep 30 12:49:25 crc kubenswrapper[4672]: I0930 12:49:25.548301 4672 scope.go:117] "RemoveContainer" containerID="dae9c6261457b92ffd63bbf3a8283d8d78ab2aabf36d75dffc0e7ed0851a4fe4" Sep 30 12:49:25 crc kubenswrapper[4672]: I0930 12:49:25.549319 4672 scope.go:117] "RemoveContainer" containerID="221a226e9b221f859d2addc5ed9e63e60339b2ca4ab8eb0848430a16de2a187e" Sep 30 12:49:25 crc kubenswrapper[4672]: E0930 12:49:25.549771 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dpqrd_openshift-machine-config-operator(95794952-d817-48f2-8956-f7a310f8d1d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" Sep 30 12:49:26 crc kubenswrapper[4672]: I0930 12:49:26.051783 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-stw4v"] Sep 30 12:49:26 crc kubenswrapper[4672]: I0930 12:49:26.064418 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-stw4v"] Sep 30 12:49:27 crc kubenswrapper[4672]: I0930 12:49:27.428203 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8c431961-0987-401e-8486-c2dd0887721b" path="/var/lib/kubelet/pods/8c431961-0987-401e-8486-c2dd0887721b/volumes" Sep 30 12:49:40 crc kubenswrapper[4672]: I0930 12:49:40.416710 
4672 scope.go:117] "RemoveContainer" containerID="221a226e9b221f859d2addc5ed9e63e60339b2ca4ab8eb0848430a16de2a187e" Sep 30 12:49:40 crc kubenswrapper[4672]: E0930 12:49:40.417447 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dpqrd_openshift-machine-config-operator(95794952-d817-48f2-8956-f7a310f8d1d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" Sep 30 12:49:45 crc kubenswrapper[4672]: I0930 12:49:45.058379 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-9kp7h"] Sep 30 12:49:45 crc kubenswrapper[4672]: I0930 12:49:45.069381 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-9kp7h"] Sep 30 12:49:45 crc kubenswrapper[4672]: I0930 12:49:45.435790 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c87bb11c-a04e-4018-ba41-d628795a926e" path="/var/lib/kubelet/pods/c87bb11c-a04e-4018-ba41-d628795a926e/volumes" Sep 30 12:49:47 crc kubenswrapper[4672]: I0930 12:49:47.029019 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-9cnkc"] Sep 30 12:49:47 crc kubenswrapper[4672]: I0930 12:49:47.041681 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-9cnkc"] Sep 30 12:49:47 crc kubenswrapper[4672]: I0930 12:49:47.432061 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d637de0b-aed1-45ef-9d86-c6f7c2f188e1" path="/var/lib/kubelet/pods/d637de0b-aed1-45ef-9d86-c6f7c2f188e1/volumes" Sep 30 12:49:52 crc kubenswrapper[4672]: I0930 12:49:52.416999 4672 scope.go:117] "RemoveContainer" containerID="221a226e9b221f859d2addc5ed9e63e60339b2ca4ab8eb0848430a16de2a187e" Sep 30 12:49:52 crc kubenswrapper[4672]: E0930 12:49:52.417820 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dpqrd_openshift-machine-config-operator(95794952-d817-48f2-8956-f7a310f8d1d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" Sep 30 12:49:57 crc kubenswrapper[4672]: I0930 12:49:57.031983 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-rd9t8"] Sep 30 12:49:57 crc kubenswrapper[4672]: I0930 12:49:57.040856 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-rd9t8"] Sep 30 12:49:57 crc kubenswrapper[4672]: I0930 12:49:57.429071 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="db83994c-a577-4d20-a544-3950abb7273b" path="/var/lib/kubelet/pods/db83994c-a577-4d20-a544-3950abb7273b/volumes" Sep 30 12:49:59 crc kubenswrapper[4672]: I0930 12:49:59.931971 4672 scope.go:117] "RemoveContainer" containerID="32df13751c16a9753fbac6226c8a88e584bc29b566655afd5f987b5175e2b5bd" Sep 30 12:49:59 crc kubenswrapper[4672]: I0930 12:49:59.967956 4672 scope.go:117] "RemoveContainer" containerID="c6671192b403ba0381d0c338083170ddad87278a332092c99bbfe68b29def9df" Sep 30 12:50:00 crc kubenswrapper[4672]: I0930 12:50:00.017698 4672 scope.go:117] "RemoveContainer" containerID="2ff2e0394a73d16946196cb54745bc30f70ac4238c5fc730f22b5d9ec507751f" Sep 30 12:50:00 crc 
kubenswrapper[4672]: I0930 12:50:00.087283 4672 scope.go:117] "RemoveContainer" containerID="0371a9eaaabf25ee5b460fadf31d64d04b11ff141406d3f2d9776b271f6432d1" Sep 30 12:50:00 crc kubenswrapper[4672]: I0930 12:50:00.129618 4672 scope.go:117] "RemoveContainer" containerID="1f3d33c5e0e339b002c349b5c492a03d7dbd38148a6d0a3d902d7a6298e9d6fc" Sep 30 12:50:00 crc kubenswrapper[4672]: I0930 12:50:00.171007 4672 scope.go:117] "RemoveContainer" containerID="de810467bd2c80cebb514631aa663893dcb863c7f06ea2fde6b0efb04a9da6bf" Sep 30 12:50:00 crc kubenswrapper[4672]: I0930 12:50:00.219263 4672 scope.go:117] "RemoveContainer" containerID="c67a7804189bf25b06b8879f6f5ca3c83481b53a850e69c71c8db005c920aca8" Sep 30 12:50:05 crc kubenswrapper[4672]: I0930 12:50:05.417153 4672 scope.go:117] "RemoveContainer" containerID="221a226e9b221f859d2addc5ed9e63e60339b2ca4ab8eb0848430a16de2a187e" Sep 30 12:50:05 crc kubenswrapper[4672]: E0930 12:50:05.417924 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dpqrd_openshift-machine-config-operator(95794952-d817-48f2-8956-f7a310f8d1d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" Sep 30 12:50:10 crc kubenswrapper[4672]: I0930 12:50:10.053730 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-llp7f"] Sep 30 12:50:10 crc kubenswrapper[4672]: I0930 12:50:10.067830 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-llp7f"] Sep 30 12:50:11 crc kubenswrapper[4672]: I0930 12:50:11.437972 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9b56cfc5-1c9c-41f5-91ac-b0fbd79c7b6a" path="/var/lib/kubelet/pods/9b56cfc5-1c9c-41f5-91ac-b0fbd79c7b6a/volumes" Sep 30 12:50:15 crc kubenswrapper[4672]: I0930 12:50:15.078468 4672 generic.go:334] "Generic (PLEG): container finished" podID="a88c5cde-cba5-457b-8044-77ed9db4c080" containerID="b66dc3e502d39556da511d0f2a5fa44cebd37c01f119fe18c388d0a55f02c8fa" exitCode=0 Sep 30 12:50:15 crc kubenswrapper[4672]: I0930 12:50:15.078522 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-vl4f8" event={"ID":"a88c5cde-cba5-457b-8044-77ed9db4c080","Type":"ContainerDied","Data":"b66dc3e502d39556da511d0f2a5fa44cebd37c01f119fe18c388d0a55f02c8fa"} Sep 30 12:50:16 crc kubenswrapper[4672]: I0930 12:50:16.531194 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-vl4f8" Sep 30 12:50:16 crc kubenswrapper[4672]: I0930 12:50:16.609479 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zljj7\" (UniqueName: \"kubernetes.io/projected/a88c5cde-cba5-457b-8044-77ed9db4c080-kube-api-access-zljj7\") pod \"a88c5cde-cba5-457b-8044-77ed9db4c080\" (UID: \"a88c5cde-cba5-457b-8044-77ed9db4c080\") " Sep 30 12:50:16 crc kubenswrapper[4672]: I0930 12:50:16.609538 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a88c5cde-cba5-457b-8044-77ed9db4c080-ssh-key\") pod \"a88c5cde-cba5-457b-8044-77ed9db4c080\" (UID: \"a88c5cde-cba5-457b-8044-77ed9db4c080\") " Sep 30 12:50:16 crc kubenswrapper[4672]: I0930 12:50:16.609590 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a88c5cde-cba5-457b-8044-77ed9db4c080-inventory\") pod \"a88c5cde-cba5-457b-8044-77ed9db4c080\" (UID: \"a88c5cde-cba5-457b-8044-77ed9db4c080\") " Sep 30 12:50:16 crc kubenswrapper[4672]: I0930 12:50:16.616611 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a88c5cde-cba5-457b-8044-77ed9db4c080-kube-api-access-zljj7" (OuterVolumeSpecName: "kube-api-access-zljj7") pod "a88c5cde-cba5-457b-8044-77ed9db4c080" (UID: "a88c5cde-cba5-457b-8044-77ed9db4c080"). InnerVolumeSpecName "kube-api-access-zljj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 12:50:16 crc kubenswrapper[4672]: I0930 12:50:16.655962 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a88c5cde-cba5-457b-8044-77ed9db4c080-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "a88c5cde-cba5-457b-8044-77ed9db4c080" (UID: "a88c5cde-cba5-457b-8044-77ed9db4c080"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:50:16 crc kubenswrapper[4672]: I0930 12:50:16.661184 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a88c5cde-cba5-457b-8044-77ed9db4c080-inventory" (OuterVolumeSpecName: "inventory") pod "a88c5cde-cba5-457b-8044-77ed9db4c080" (UID: "a88c5cde-cba5-457b-8044-77ed9db4c080"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:50:16 crc kubenswrapper[4672]: I0930 12:50:16.712319 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zljj7\" (UniqueName: \"kubernetes.io/projected/a88c5cde-cba5-457b-8044-77ed9db4c080-kube-api-access-zljj7\") on node \"crc\" DevicePath \"\"" Sep 30 12:50:16 crc kubenswrapper[4672]: I0930 12:50:16.712355 4672 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a88c5cde-cba5-457b-8044-77ed9db4c080-ssh-key\") on node \"crc\" DevicePath \"\"" Sep 30 12:50:16 crc kubenswrapper[4672]: I0930 12:50:16.712369 4672 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a88c5cde-cba5-457b-8044-77ed9db4c080-inventory\") on node \"crc\" DevicePath \"\"" Sep 30 12:50:17 crc kubenswrapper[4672]: I0930 12:50:17.097566 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-vl4f8" event={"ID":"a88c5cde-cba5-457b-8044-77ed9db4c080","Type":"ContainerDied","Data":"8b69b08c1af69abdcd0a1ad288d29d53d41461f3072df030ad8f0ec662a9df37"} Sep 30 12:50:17 crc kubenswrapper[4672]: I0930 12:50:17.097901 4672 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8b69b08c1af69abdcd0a1ad288d29d53d41461f3072df030ad8f0ec662a9df37" Sep 30 12:50:17 crc kubenswrapper[4672]: I0930 12:50:17.097583 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-vl4f8" Sep 30 12:50:17 crc kubenswrapper[4672]: I0930 12:50:17.175569 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-kft7n"] Sep 30 12:50:17 crc kubenswrapper[4672]: E0930 12:50:17.176070 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46194c71-c2f1-419a-a0fc-3628aff3ca92" containerName="extract-content" Sep 30 12:50:17 crc kubenswrapper[4672]: I0930 12:50:17.176097 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="46194c71-c2f1-419a-a0fc-3628aff3ca92" containerName="extract-content" Sep 30 12:50:17 crc kubenswrapper[4672]: E0930 12:50:17.176117 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46194c71-c2f1-419a-a0fc-3628aff3ca92" containerName="extract-utilities" Sep 30 12:50:17 crc kubenswrapper[4672]: I0930 12:50:17.176126 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="46194c71-c2f1-419a-a0fc-3628aff3ca92" containerName="extract-utilities" Sep 30 12:50:17 crc kubenswrapper[4672]: E0930 12:50:17.176140 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a88c5cde-cba5-457b-8044-77ed9db4c080" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Sep 30 12:50:17 crc kubenswrapper[4672]: I0930 12:50:17.176150 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="a88c5cde-cba5-457b-8044-77ed9db4c080" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Sep 30 12:50:17 crc kubenswrapper[4672]: E0930 12:50:17.176174 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46194c71-c2f1-419a-a0fc-3628aff3ca92" containerName="registry-server" Sep 30 12:50:17 crc kubenswrapper[4672]: I0930 12:50:17.176182 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="46194c71-c2f1-419a-a0fc-3628aff3ca92" containerName="registry-server" Sep 30 12:50:17 crc kubenswrapper[4672]: I0930 12:50:17.176491 4672 
memory_manager.go:354] "RemoveStaleState removing state" podUID="46194c71-c2f1-419a-a0fc-3628aff3ca92" containerName="registry-server" Sep 30 12:50:17 crc kubenswrapper[4672]: I0930 12:50:17.176524 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="a88c5cde-cba5-457b-8044-77ed9db4c080" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Sep 30 12:50:17 crc kubenswrapper[4672]: I0930 12:50:17.177347 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-kft7n" Sep 30 12:50:17 crc kubenswrapper[4672]: I0930 12:50:17.179498 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Sep 30 12:50:17 crc kubenswrapper[4672]: I0930 12:50:17.180235 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-b6dbr" Sep 30 12:50:17 crc kubenswrapper[4672]: I0930 12:50:17.180475 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Sep 30 12:50:17 crc kubenswrapper[4672]: I0930 12:50:17.180774 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Sep 30 12:50:17 crc kubenswrapper[4672]: I0930 12:50:17.183949 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-kft7n"] Sep 30 12:50:17 crc kubenswrapper[4672]: I0930 12:50:17.324377 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6354da04-65da-4562-9e78-563e1fb4f4fe-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-kft7n\" (UID: \"6354da04-65da-4562-9e78-563e1fb4f4fe\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-kft7n" Sep 30 12:50:17 crc kubenswrapper[4672]: I0930 12:50:17.324473 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bnjps\" (UniqueName: \"kubernetes.io/projected/6354da04-65da-4562-9e78-563e1fb4f4fe-kube-api-access-bnjps\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-kft7n\" (UID: \"6354da04-65da-4562-9e78-563e1fb4f4fe\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-kft7n" Sep 30 12:50:17 crc kubenswrapper[4672]: I0930 12:50:17.324625 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6354da04-65da-4562-9e78-563e1fb4f4fe-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-kft7n\" (UID: \"6354da04-65da-4562-9e78-563e1fb4f4fe\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-kft7n" Sep 30 12:50:17 crc kubenswrapper[4672]: I0930 12:50:17.426379 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bnjps\" (UniqueName: \"kubernetes.io/projected/6354da04-65da-4562-9e78-563e1fb4f4fe-kube-api-access-bnjps\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-kft7n\" (UID: \"6354da04-65da-4562-9e78-563e1fb4f4fe\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-kft7n" Sep 30 12:50:17 crc kubenswrapper[4672]: I0930 12:50:17.426552 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: 
\"kubernetes.io/secret/6354da04-65da-4562-9e78-563e1fb4f4fe-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-kft7n\" (UID: \"6354da04-65da-4562-9e78-563e1fb4f4fe\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-kft7n" Sep 30 12:50:17 crc kubenswrapper[4672]: I0930 12:50:17.426614 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6354da04-65da-4562-9e78-563e1fb4f4fe-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-kft7n\" (UID: \"6354da04-65da-4562-9e78-563e1fb4f4fe\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-kft7n" Sep 30 12:50:17 crc kubenswrapper[4672]: I0930 12:50:17.431427 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6354da04-65da-4562-9e78-563e1fb4f4fe-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-kft7n\" (UID: \"6354da04-65da-4562-9e78-563e1fb4f4fe\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-kft7n" Sep 30 12:50:17 crc kubenswrapper[4672]: I0930 12:50:17.431874 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6354da04-65da-4562-9e78-563e1fb4f4fe-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-kft7n\" (UID: \"6354da04-65da-4562-9e78-563e1fb4f4fe\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-kft7n" Sep 30 12:50:17 crc kubenswrapper[4672]: I0930 12:50:17.444813 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bnjps\" (UniqueName: \"kubernetes.io/projected/6354da04-65da-4562-9e78-563e1fb4f4fe-kube-api-access-bnjps\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-kft7n\" (UID: \"6354da04-65da-4562-9e78-563e1fb4f4fe\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-kft7n" Sep 30 12:50:17 crc kubenswrapper[4672]: I0930 12:50:17.495397 4672 util.go:30] "No sandbox for pod can be found. 
Sep 30 12:50:17 crc kubenswrapper[4672]: I0930 12:50:17.495397 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-kft7n"
Sep 30 12:50:18 crc kubenswrapper[4672]: I0930 12:50:18.027469 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-kft7n"]
Sep 30 12:50:18 crc kubenswrapper[4672]: I0930 12:50:18.107877 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-kft7n" event={"ID":"6354da04-65da-4562-9e78-563e1fb4f4fe","Type":"ContainerStarted","Data":"9ff8e26233e3acb32525a26adbea697102120ef89025c8bd90f4830a922f98dc"}
Sep 30 12:50:19 crc kubenswrapper[4672]: I0930 12:50:19.130275 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-kft7n" event={"ID":"6354da04-65da-4562-9e78-563e1fb4f4fe","Type":"ContainerStarted","Data":"c9087ec48ee386c85787f5dc0f0181247bf867db1be14aa55d628aefa98e4c22"}
Sep 30 12:50:19 crc kubenswrapper[4672]: I0930 12:50:19.151111 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-kft7n" podStartSLOduration=1.6821990439999999 podStartE2EDuration="2.151090214s" podCreationTimestamp="2025-09-30 12:50:17 +0000 UTC" firstStartedPulling="2025-09-30 12:50:18.035038115 +0000 UTC m=+1709.304275761" lastFinishedPulling="2025-09-30 12:50:18.503929285 +0000 UTC m=+1709.773166931" observedRunningTime="2025-09-30 12:50:19.148892237 +0000 UTC m=+1710.418129873" watchObservedRunningTime="2025-09-30 12:50:19.151090214 +0000 UTC m=+1710.420327870"
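[Note] The startup-latency line above is self-consistent and shows how its two durations relate: podStartSLOduration is podStartE2EDuration minus the image-pull window. From the entry's own timestamps, pull time = 12:50:18.503929285 - 12:50:18.035038115 = 0.468891170s, and 2.151090214s - 0.468891170s = 1.682199044s, exactly the reported podStartSLOduration (printed with float rounding as 1.6821990439999999). A one-line check, with the seconds values copied from the entry:

```go
// slocheck.go: recompute podStartSLOduration from the log's timestamps.
package main

import "fmt"

func main() {
	e2e := 2.151090214                  // podStartE2EDuration in seconds
	pull := 18.503929285 - 18.035038115 // lastFinishedPulling - firstStartedPulling
	fmt.Printf("podStartSLOduration = %.9fs\n", e2e-pull) // 1.682199044s
}
```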
Sep 30 12:50:20 crc kubenswrapper[4672]: I0930 12:50:20.416766 4672 scope.go:117] "RemoveContainer" containerID="221a226e9b221f859d2addc5ed9e63e60339b2ca4ab8eb0848430a16de2a187e"
Sep 30 12:50:20 crc kubenswrapper[4672]: E0930 12:50:20.417022 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dpqrd_openshift-machine-config-operator(95794952-d817-48f2-8956-f7a310f8d1d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9"
Sep 30 12:50:31 crc kubenswrapper[4672]: I0930 12:50:31.418003 4672 scope.go:117] "RemoveContainer" containerID="221a226e9b221f859d2addc5ed9e63e60339b2ca4ab8eb0848430a16de2a187e"
Sep 30 12:50:31 crc kubenswrapper[4672]: E0930 12:50:31.418899 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dpqrd_openshift-machine-config-operator(95794952-d817-48f2-8956-f7a310f8d1d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9"
Sep 30 12:50:46 crc kubenswrapper[4672]: I0930 12:50:46.050644 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-9gpdk"]
Sep 30 12:50:46 crc kubenswrapper[4672]: I0930 12:50:46.060328 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-9gpdk"]
Sep 30 12:50:46 crc kubenswrapper[4672]: I0930 12:50:46.417174 4672 scope.go:117] "RemoveContainer" containerID="221a226e9b221f859d2addc5ed9e63e60339b2ca4ab8eb0848430a16de2a187e"
Sep 30 12:50:46 crc kubenswrapper[4672]: E0930 12:50:46.417612 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dpqrd_openshift-machine-config-operator(95794952-d817-48f2-8956-f7a310f8d1d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9"
Sep 30 12:50:47 crc kubenswrapper[4672]: I0930 12:50:47.025620 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-88sxc"]
Sep 30 12:50:47 crc kubenswrapper[4672]: I0930 12:50:47.033579 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-fz28q"]
Sep 30 12:50:47 crc kubenswrapper[4672]: I0930 12:50:47.043368 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-88sxc"]
Sep 30 12:50:47 crc kubenswrapper[4672]: I0930 12:50:47.051131 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-fz28q"]
Sep 30 12:50:47 crc kubenswrapper[4672]: I0930 12:50:47.434731 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="89525b90-403a-4caf-9443-3e041be85426" path="/var/lib/kubelet/pods/89525b90-403a-4caf-9443-3e041be85426/volumes"
Sep 30 12:50:47 crc kubenswrapper[4672]: I0930 12:50:47.437075 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dc00fe07-7d49-4ca0-818d-fb129880f480" path="/var/lib/kubelet/pods/dc00fe07-7d49-4ca0-818d-fb129880f480/volumes"
Sep 30 12:50:47 crc kubenswrapper[4672]: I0930 12:50:47.438872 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4260988-bcca-456e-a745-ee5789116a84" path="/var/lib/kubelet/pods/f4260988-bcca-456e-a745-ee5789116a84/volumes"
Sep 30 12:50:56 crc kubenswrapper[4672]: I0930 12:50:56.034601 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-6afc-account-create-tbkp4"]
Sep 30 12:50:56 crc kubenswrapper[4672]: I0930 12:50:56.052580 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-21c1-account-create-bsmfm"]
Sep 30 12:50:56 crc kubenswrapper[4672]: I0930 12:50:56.062404 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-6afc-account-create-tbkp4"]
Sep 30 12:50:56 crc kubenswrapper[4672]: I0930 12:50:56.073376 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-21c1-account-create-bsmfm"]
Sep 30 12:50:57 crc kubenswrapper[4672]: I0930 12:50:57.027911 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-fe4a-account-create-hllt5"]
Sep 30 12:50:57 crc kubenswrapper[4672]: I0930 12:50:57.038707 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-fe4a-account-create-hllt5"]
Sep 30 12:50:57 crc kubenswrapper[4672]: I0930 12:50:57.431874 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="89348083-7def-4e69-9d40-01729f2c4dfb" path="/var/lib/kubelet/pods/89348083-7def-4e69-9d40-01729f2c4dfb/volumes"
Sep 30 12:50:57 crc kubenswrapper[4672]: I0930 12:50:57.432956 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9274280d-7ccf-42c3-a808-ee313827e49f" path="/var/lib/kubelet/pods/9274280d-7ccf-42c3-a808-ee313827e49f/volumes"
Sep 30 12:50:57 crc kubenswrapper[4672]: I0930 12:50:57.433659 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b0dd29d3-91bd-4225-bf8b-c1eb7889b373"
path="/var/lib/kubelet/pods/b0dd29d3-91bd-4225-bf8b-c1eb7889b373/volumes" Sep 30 12:51:00 crc kubenswrapper[4672]: I0930 12:51:00.416936 4672 scope.go:117] "RemoveContainer" containerID="221a226e9b221f859d2addc5ed9e63e60339b2ca4ab8eb0848430a16de2a187e" Sep 30 12:51:00 crc kubenswrapper[4672]: E0930 12:51:00.417563 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dpqrd_openshift-machine-config-operator(95794952-d817-48f2-8956-f7a310f8d1d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" Sep 30 12:51:00 crc kubenswrapper[4672]: I0930 12:51:00.422941 4672 scope.go:117] "RemoveContainer" containerID="42c89d97336c23da4d29a0127fb5c57aa98d8925028c7cd367a3ac75cf1c484f" Sep 30 12:51:00 crc kubenswrapper[4672]: I0930 12:51:00.448481 4672 scope.go:117] "RemoveContainer" containerID="40a4bdd26eb215a80d023f5bc8f7474f038dd5c1192cea14e18c5bddfe8626f0" Sep 30 12:51:00 crc kubenswrapper[4672]: I0930 12:51:00.511537 4672 scope.go:117] "RemoveContainer" containerID="edb9672464a24ea19910f76e267ccd6e0599d16dc31244b6113ec39e94c2f348" Sep 30 12:51:00 crc kubenswrapper[4672]: I0930 12:51:00.563188 4672 scope.go:117] "RemoveContainer" containerID="2892b2b48724eb1990bc8ac7408f7ba6b20bace5d1c2520ba2c5178d1cde4560" Sep 30 12:51:00 crc kubenswrapper[4672]: I0930 12:51:00.612829 4672 scope.go:117] "RemoveContainer" containerID="3bb34e6aadfc75f62d8e46b67fd7d4741bac832693be48a91af15fa2e6e8490e" Sep 30 12:51:00 crc kubenswrapper[4672]: I0930 12:51:00.650863 4672 scope.go:117] "RemoveContainer" containerID="5f7f4e9ac114726fc90f4ad308528ac4dbe89f619c61fc2f62de94c83c3c02de" Sep 30 12:51:00 crc kubenswrapper[4672]: I0930 12:51:00.697146 4672 scope.go:117] "RemoveContainer" containerID="c9b8ac0d2c2944aef03ffe7f96160d4edab5aa59318146e34fd05f492a680c0d" Sep 30 12:51:14 crc kubenswrapper[4672]: I0930 12:51:14.417698 4672 scope.go:117] "RemoveContainer" containerID="221a226e9b221f859d2addc5ed9e63e60339b2ca4ab8eb0848430a16de2a187e" Sep 30 12:51:14 crc kubenswrapper[4672]: E0930 12:51:14.418591 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dpqrd_openshift-machine-config-operator(95794952-d817-48f2-8956-f7a310f8d1d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" Sep 30 12:51:25 crc kubenswrapper[4672]: I0930 12:51:25.054415 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-l2tgm"] Sep 30 12:51:25 crc kubenswrapper[4672]: I0930 12:51:25.063929 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-l2tgm"] Sep 30 12:51:25 crc kubenswrapper[4672]: I0930 12:51:25.429964 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a77c51f-e03c-4804-ba8c-90507e73e279" path="/var/lib/kubelet/pods/7a77c51f-e03c-4804-ba8c-90507e73e279/volumes" Sep 30 12:51:28 crc kubenswrapper[4672]: I0930 12:51:28.418339 4672 scope.go:117] "RemoveContainer" containerID="221a226e9b221f859d2addc5ed9e63e60339b2ca4ab8eb0848430a16de2a187e" Sep 30 12:51:28 crc kubenswrapper[4672]: E0930 12:51:28.419099 4672 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dpqrd_openshift-machine-config-operator(95794952-d817-48f2-8956-f7a310f8d1d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" Sep 30 12:51:36 crc kubenswrapper[4672]: I0930 12:51:36.936569 4672 generic.go:334] "Generic (PLEG): container finished" podID="6354da04-65da-4562-9e78-563e1fb4f4fe" containerID="c9087ec48ee386c85787f5dc0f0181247bf867db1be14aa55d628aefa98e4c22" exitCode=0 Sep 30 12:51:36 crc kubenswrapper[4672]: I0930 12:51:36.936686 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-kft7n" event={"ID":"6354da04-65da-4562-9e78-563e1fb4f4fe","Type":"ContainerDied","Data":"c9087ec48ee386c85787f5dc0f0181247bf867db1be14aa55d628aefa98e4c22"} Sep 30 12:51:38 crc kubenswrapper[4672]: I0930 12:51:38.396393 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-kft7n" Sep 30 12:51:38 crc kubenswrapper[4672]: I0930 12:51:38.444761 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bnjps\" (UniqueName: \"kubernetes.io/projected/6354da04-65da-4562-9e78-563e1fb4f4fe-kube-api-access-bnjps\") pod \"6354da04-65da-4562-9e78-563e1fb4f4fe\" (UID: \"6354da04-65da-4562-9e78-563e1fb4f4fe\") " Sep 30 12:51:38 crc kubenswrapper[4672]: I0930 12:51:38.444821 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6354da04-65da-4562-9e78-563e1fb4f4fe-inventory\") pod \"6354da04-65da-4562-9e78-563e1fb4f4fe\" (UID: \"6354da04-65da-4562-9e78-563e1fb4f4fe\") " Sep 30 12:51:38 crc kubenswrapper[4672]: I0930 12:51:38.444891 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6354da04-65da-4562-9e78-563e1fb4f4fe-ssh-key\") pod \"6354da04-65da-4562-9e78-563e1fb4f4fe\" (UID: \"6354da04-65da-4562-9e78-563e1fb4f4fe\") " Sep 30 12:51:38 crc kubenswrapper[4672]: I0930 12:51:38.451057 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6354da04-65da-4562-9e78-563e1fb4f4fe-kube-api-access-bnjps" (OuterVolumeSpecName: "kube-api-access-bnjps") pod "6354da04-65da-4562-9e78-563e1fb4f4fe" (UID: "6354da04-65da-4562-9e78-563e1fb4f4fe"). InnerVolumeSpecName "kube-api-access-bnjps". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 12:51:38 crc kubenswrapper[4672]: I0930 12:51:38.485485 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6354da04-65da-4562-9e78-563e1fb4f4fe-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "6354da04-65da-4562-9e78-563e1fb4f4fe" (UID: "6354da04-65da-4562-9e78-563e1fb4f4fe"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:51:38 crc kubenswrapper[4672]: I0930 12:51:38.486669 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6354da04-65da-4562-9e78-563e1fb4f4fe-inventory" (OuterVolumeSpecName: "inventory") pod "6354da04-65da-4562-9e78-563e1fb4f4fe" (UID: "6354da04-65da-4562-9e78-563e1fb4f4fe"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 12:51:38 crc kubenswrapper[4672]: I0930 12:51:38.548470 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bnjps\" (UniqueName: \"kubernetes.io/projected/6354da04-65da-4562-9e78-563e1fb4f4fe-kube-api-access-bnjps\") on node \"crc\" DevicePath \"\""
Sep 30 12:51:38 crc kubenswrapper[4672]: I0930 12:51:38.548534 4672 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6354da04-65da-4562-9e78-563e1fb4f4fe-inventory\") on node \"crc\" DevicePath \"\""
Sep 30 12:51:38 crc kubenswrapper[4672]: I0930 12:51:38.548551 4672 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6354da04-65da-4562-9e78-563e1fb4f4fe-ssh-key\") on node \"crc\" DevicePath \"\""
Sep 30 12:51:38 crc kubenswrapper[4672]: I0930 12:51:38.959938 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-kft7n" event={"ID":"6354da04-65da-4562-9e78-563e1fb4f4fe","Type":"ContainerDied","Data":"9ff8e26233e3acb32525a26adbea697102120ef89025c8bd90f4830a922f98dc"}
Sep 30 12:51:38 crc kubenswrapper[4672]: I0930 12:51:38.960473 4672 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9ff8e26233e3acb32525a26adbea697102120ef89025c8bd90f4830a922f98dc"
Sep 30 12:51:38 crc kubenswrapper[4672]: I0930 12:51:38.959999 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-kft7n"
Sep 30 12:51:39 crc kubenswrapper[4672]: I0930 12:51:39.080822 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-sjlq2"]
Sep 30 12:51:39 crc kubenswrapper[4672]: E0930 12:51:39.082413 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6354da04-65da-4562-9e78-563e1fb4f4fe" containerName="configure-network-edpm-deployment-openstack-edpm-ipam"
Sep 30 12:51:39 crc kubenswrapper[4672]: I0930 12:51:39.082440 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="6354da04-65da-4562-9e78-563e1fb4f4fe" containerName="configure-network-edpm-deployment-openstack-edpm-ipam"
Sep 30 12:51:39 crc kubenswrapper[4672]: I0930 12:51:39.082650 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="6354da04-65da-4562-9e78-563e1fb4f4fe" containerName="configure-network-edpm-deployment-openstack-edpm-ipam"
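[Note] The cpu_manager/memory_manager lines above recur at every "SyncLoop ADD" in this log: before admitting a new pod, the resource managers drop CPU-set and memory assignments still recorded for containers of pods that no longer exist (here the just-finished configure-network job). A Go sketch of that cleanup under assumed, simplified state (the key and assignment types are illustrative only, not the kubelet's checkpoint format):

```go
// stalestate.go: sketch of RemoveStaleState-style cleanup.
package main

import "fmt"

type key struct{ podUID, container string }

// removeStaleState drops assignments whose pod is no longer active.
func removeStaleState(assignments map[key]string, active map[string]bool) {
	for k := range assignments {
		if !active[k.podUID] {
			fmt.Printf("RemoveStaleState: removing container podUID=%q containerName=%q\n",
				k.podUID, k.container)
			delete(assignments, k) // "Deleted CPUSet assignment"
		}
	}
}

func main() {
	assignments := map[key]string{
		{podUID: "6354da04-65da-4562-9e78-563e1fb4f4fe", container: "configure-network-edpm-deployment-openstack-edpm-ipam"}: "cpus 0-3",
	}
	removeStaleState(assignments, map[string]bool{}) // no pods active any more
	fmt.Println("assignments left:", len(assignments)) // 0
}
```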
Sep 30 12:51:39 crc kubenswrapper[4672]: I0930 12:51:39.083379 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-sjlq2"
Sep 30 12:51:39 crc kubenswrapper[4672]: I0930 12:51:39.085701 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-b6dbr"
Sep 30 12:51:39 crc kubenswrapper[4672]: I0930 12:51:39.086333 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Sep 30 12:51:39 crc kubenswrapper[4672]: I0930 12:51:39.086644 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Sep 30 12:51:39 crc kubenswrapper[4672]: I0930 12:51:39.086984 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Sep 30 12:51:39 crc kubenswrapper[4672]: I0930 12:51:39.094191 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-sjlq2"]
Sep 30 12:51:39 crc kubenswrapper[4672]: I0930 12:51:39.162482 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/beb806bb-fd09-449f-939b-cccb4ffe11de-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-sjlq2\" (UID: \"beb806bb-fd09-449f-939b-cccb4ffe11de\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-sjlq2"
Sep 30 12:51:39 crc kubenswrapper[4672]: I0930 12:51:39.162555 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h4zp7\" (UniqueName: \"kubernetes.io/projected/beb806bb-fd09-449f-939b-cccb4ffe11de-kube-api-access-h4zp7\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-sjlq2\" (UID: \"beb806bb-fd09-449f-939b-cccb4ffe11de\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-sjlq2"
Sep 30 12:51:39 crc kubenswrapper[4672]: I0930 12:51:39.162691 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/beb806bb-fd09-449f-939b-cccb4ffe11de-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-sjlq2\" (UID: \"beb806bb-fd09-449f-939b-cccb4ffe11de\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-sjlq2"
Sep 30 12:51:39 crc kubenswrapper[4672]: I0930 12:51:39.264493 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/beb806bb-fd09-449f-939b-cccb4ffe11de-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-sjlq2\" (UID: \"beb806bb-fd09-449f-939b-cccb4ffe11de\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-sjlq2"
Sep 30 12:51:39 crc kubenswrapper[4672]: I0930 12:51:39.264539 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h4zp7\" (UniqueName: \"kubernetes.io/projected/beb806bb-fd09-449f-939b-cccb4ffe11de-kube-api-access-h4zp7\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-sjlq2\" (UID: \"beb806bb-fd09-449f-939b-cccb4ffe11de\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-sjlq2"
Sep 30 12:51:39 crc kubenswrapper[4672]: I0930 12:51:39.264590 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/beb806bb-fd09-449f-939b-cccb4ffe11de-inventory\") pod
\"validate-network-edpm-deployment-openstack-edpm-ipam-sjlq2\" (UID: \"beb806bb-fd09-449f-939b-cccb4ffe11de\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-sjlq2" Sep 30 12:51:39 crc kubenswrapper[4672]: I0930 12:51:39.268968 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/beb806bb-fd09-449f-939b-cccb4ffe11de-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-sjlq2\" (UID: \"beb806bb-fd09-449f-939b-cccb4ffe11de\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-sjlq2" Sep 30 12:51:39 crc kubenswrapper[4672]: I0930 12:51:39.269014 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/beb806bb-fd09-449f-939b-cccb4ffe11de-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-sjlq2\" (UID: \"beb806bb-fd09-449f-939b-cccb4ffe11de\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-sjlq2" Sep 30 12:51:39 crc kubenswrapper[4672]: I0930 12:51:39.281958 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h4zp7\" (UniqueName: \"kubernetes.io/projected/beb806bb-fd09-449f-939b-cccb4ffe11de-kube-api-access-h4zp7\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-sjlq2\" (UID: \"beb806bb-fd09-449f-939b-cccb4ffe11de\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-sjlq2" Sep 30 12:51:39 crc kubenswrapper[4672]: I0930 12:51:39.415735 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-sjlq2" Sep 30 12:51:39 crc kubenswrapper[4672]: I0930 12:51:39.991077 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-sjlq2"] Sep 30 12:51:40 crc kubenswrapper[4672]: I0930 12:51:40.993086 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-sjlq2" event={"ID":"beb806bb-fd09-449f-939b-cccb4ffe11de","Type":"ContainerStarted","Data":"a840198243abb26c19c0a19cc1f85c1697c6bd92063e99478db9f125fc04533e"} Sep 30 12:51:40 crc kubenswrapper[4672]: I0930 12:51:40.994528 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-sjlq2" event={"ID":"beb806bb-fd09-449f-939b-cccb4ffe11de","Type":"ContainerStarted","Data":"189ee97e66815ae3d4d034a971c84724c982d4e078026ca2f2c20c9d50c44565"} Sep 30 12:51:41 crc kubenswrapper[4672]: I0930 12:51:41.020211 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-sjlq2" podStartSLOduration=1.530451454 podStartE2EDuration="2.020187777s" podCreationTimestamp="2025-09-30 12:51:39 +0000 UTC" firstStartedPulling="2025-09-30 12:51:39.98965892 +0000 UTC m=+1791.258896576" lastFinishedPulling="2025-09-30 12:51:40.479395253 +0000 UTC m=+1791.748632899" observedRunningTime="2025-09-30 12:51:41.013107587 +0000 UTC m=+1792.282345233" watchObservedRunningTime="2025-09-30 12:51:41.020187777 +0000 UTC m=+1792.289425423" Sep 30 12:51:42 crc kubenswrapper[4672]: I0930 12:51:42.417636 4672 scope.go:117] "RemoveContainer" containerID="221a226e9b221f859d2addc5ed9e63e60339b2ca4ab8eb0848430a16de2a187e" Sep 30 12:51:42 crc kubenswrapper[4672]: E0930 12:51:42.418620 4672 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dpqrd_openshift-machine-config-operator(95794952-d817-48f2-8956-f7a310f8d1d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" Sep 30 12:51:45 crc kubenswrapper[4672]: I0930 12:51:45.034979 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-zmhvt"] Sep 30 12:51:45 crc kubenswrapper[4672]: I0930 12:51:45.043844 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-zmhvt"] Sep 30 12:51:45 crc kubenswrapper[4672]: I0930 12:51:45.429980 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="645ec6e3-a10c-4651-9df2-a8259bcd51b9" path="/var/lib/kubelet/pods/645ec6e3-a10c-4651-9df2-a8259bcd51b9/volumes" Sep 30 12:51:46 crc kubenswrapper[4672]: I0930 12:51:46.050531 4672 generic.go:334] "Generic (PLEG): container finished" podID="beb806bb-fd09-449f-939b-cccb4ffe11de" containerID="a840198243abb26c19c0a19cc1f85c1697c6bd92063e99478db9f125fc04533e" exitCode=0 Sep 30 12:51:46 crc kubenswrapper[4672]: I0930 12:51:46.050579 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-sjlq2" event={"ID":"beb806bb-fd09-449f-939b-cccb4ffe11de","Type":"ContainerDied","Data":"a840198243abb26c19c0a19cc1f85c1697c6bd92063e99478db9f125fc04533e"} Sep 30 12:51:47 crc kubenswrapper[4672]: I0930 12:51:47.447608 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-sjlq2" Sep 30 12:51:47 crc kubenswrapper[4672]: I0930 12:51:47.643484 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/beb806bb-fd09-449f-939b-cccb4ffe11de-ssh-key\") pod \"beb806bb-fd09-449f-939b-cccb4ffe11de\" (UID: \"beb806bb-fd09-449f-939b-cccb4ffe11de\") " Sep 30 12:51:47 crc kubenswrapper[4672]: I0930 12:51:47.643846 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/beb806bb-fd09-449f-939b-cccb4ffe11de-inventory\") pod \"beb806bb-fd09-449f-939b-cccb4ffe11de\" (UID: \"beb806bb-fd09-449f-939b-cccb4ffe11de\") " Sep 30 12:51:47 crc kubenswrapper[4672]: I0930 12:51:47.644100 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h4zp7\" (UniqueName: \"kubernetes.io/projected/beb806bb-fd09-449f-939b-cccb4ffe11de-kube-api-access-h4zp7\") pod \"beb806bb-fd09-449f-939b-cccb4ffe11de\" (UID: \"beb806bb-fd09-449f-939b-cccb4ffe11de\") " Sep 30 12:51:47 crc kubenswrapper[4672]: I0930 12:51:47.649114 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/beb806bb-fd09-449f-939b-cccb4ffe11de-kube-api-access-h4zp7" (OuterVolumeSpecName: "kube-api-access-h4zp7") pod "beb806bb-fd09-449f-939b-cccb4ffe11de" (UID: "beb806bb-fd09-449f-939b-cccb4ffe11de"). InnerVolumeSpecName "kube-api-access-h4zp7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 12:51:47 crc kubenswrapper[4672]: I0930 12:51:47.672053 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/beb806bb-fd09-449f-939b-cccb4ffe11de-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "beb806bb-fd09-449f-939b-cccb4ffe11de" (UID: "beb806bb-fd09-449f-939b-cccb4ffe11de"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:51:47 crc kubenswrapper[4672]: I0930 12:51:47.689170 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/beb806bb-fd09-449f-939b-cccb4ffe11de-inventory" (OuterVolumeSpecName: "inventory") pod "beb806bb-fd09-449f-939b-cccb4ffe11de" (UID: "beb806bb-fd09-449f-939b-cccb4ffe11de"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:51:47 crc kubenswrapper[4672]: I0930 12:51:47.747159 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h4zp7\" (UniqueName: \"kubernetes.io/projected/beb806bb-fd09-449f-939b-cccb4ffe11de-kube-api-access-h4zp7\") on node \"crc\" DevicePath \"\"" Sep 30 12:51:47 crc kubenswrapper[4672]: I0930 12:51:47.747207 4672 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/beb806bb-fd09-449f-939b-cccb4ffe11de-ssh-key\") on node \"crc\" DevicePath \"\"" Sep 30 12:51:47 crc kubenswrapper[4672]: I0930 12:51:47.747228 4672 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/beb806bb-fd09-449f-939b-cccb4ffe11de-inventory\") on node \"crc\" DevicePath \"\"" Sep 30 12:51:48 crc kubenswrapper[4672]: I0930 12:51:48.073811 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-sjlq2" event={"ID":"beb806bb-fd09-449f-939b-cccb4ffe11de","Type":"ContainerDied","Data":"189ee97e66815ae3d4d034a971c84724c982d4e078026ca2f2c20c9d50c44565"} Sep 30 12:51:48 crc kubenswrapper[4672]: I0930 12:51:48.073867 4672 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="189ee97e66815ae3d4d034a971c84724c982d4e078026ca2f2c20c9d50c44565" Sep 30 12:51:48 crc kubenswrapper[4672]: I0930 12:51:48.073938 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-sjlq2" Sep 30 12:51:48 crc kubenswrapper[4672]: I0930 12:51:48.168193 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-669k5"] Sep 30 12:51:48 crc kubenswrapper[4672]: E0930 12:51:48.168667 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="beb806bb-fd09-449f-939b-cccb4ffe11de" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Sep 30 12:51:48 crc kubenswrapper[4672]: I0930 12:51:48.168688 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="beb806bb-fd09-449f-939b-cccb4ffe11de" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Sep 30 12:51:48 crc kubenswrapper[4672]: I0930 12:51:48.168904 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="beb806bb-fd09-449f-939b-cccb4ffe11de" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Sep 30 12:51:48 crc kubenswrapper[4672]: I0930 12:51:48.169780 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-669k5" Sep 30 12:51:48 crc kubenswrapper[4672]: I0930 12:51:48.171706 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Sep 30 12:51:48 crc kubenswrapper[4672]: I0930 12:51:48.173258 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Sep 30 12:51:48 crc kubenswrapper[4672]: I0930 12:51:48.173471 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-b6dbr" Sep 30 12:51:48 crc kubenswrapper[4672]: I0930 12:51:48.173787 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Sep 30 12:51:48 crc kubenswrapper[4672]: I0930 12:51:48.177474 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-669k5"] Sep 30 12:51:48 crc kubenswrapper[4672]: I0930 12:51:48.365757 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5e4bf526-356e-4b1f-a69e-7da92365808d-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-669k5\" (UID: \"5e4bf526-356e-4b1f-a69e-7da92365808d\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-669k5" Sep 30 12:51:48 crc kubenswrapper[4672]: I0930 12:51:48.365803 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x5vwv\" (UniqueName: \"kubernetes.io/projected/5e4bf526-356e-4b1f-a69e-7da92365808d-kube-api-access-x5vwv\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-669k5\" (UID: \"5e4bf526-356e-4b1f-a69e-7da92365808d\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-669k5" Sep 30 12:51:48 crc kubenswrapper[4672]: I0930 12:51:48.365851 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5e4bf526-356e-4b1f-a69e-7da92365808d-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-669k5\" (UID: \"5e4bf526-356e-4b1f-a69e-7da92365808d\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-669k5" Sep 30 12:51:48 crc kubenswrapper[4672]: I0930 12:51:48.467335 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5e4bf526-356e-4b1f-a69e-7da92365808d-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-669k5\" (UID: \"5e4bf526-356e-4b1f-a69e-7da92365808d\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-669k5" Sep 30 12:51:48 crc kubenswrapper[4672]: I0930 12:51:48.467385 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x5vwv\" (UniqueName: \"kubernetes.io/projected/5e4bf526-356e-4b1f-a69e-7da92365808d-kube-api-access-x5vwv\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-669k5\" (UID: \"5e4bf526-356e-4b1f-a69e-7da92365808d\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-669k5" Sep 30 12:51:48 crc kubenswrapper[4672]: I0930 12:51:48.467436 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5e4bf526-356e-4b1f-a69e-7da92365808d-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-669k5\" (UID: 
\"5e4bf526-356e-4b1f-a69e-7da92365808d\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-669k5" Sep 30 12:51:48 crc kubenswrapper[4672]: I0930 12:51:48.471559 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5e4bf526-356e-4b1f-a69e-7da92365808d-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-669k5\" (UID: \"5e4bf526-356e-4b1f-a69e-7da92365808d\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-669k5" Sep 30 12:51:48 crc kubenswrapper[4672]: I0930 12:51:48.482837 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5e4bf526-356e-4b1f-a69e-7da92365808d-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-669k5\" (UID: \"5e4bf526-356e-4b1f-a69e-7da92365808d\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-669k5" Sep 30 12:51:48 crc kubenswrapper[4672]: I0930 12:51:48.486582 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x5vwv\" (UniqueName: \"kubernetes.io/projected/5e4bf526-356e-4b1f-a69e-7da92365808d-kube-api-access-x5vwv\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-669k5\" (UID: \"5e4bf526-356e-4b1f-a69e-7da92365808d\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-669k5" Sep 30 12:51:48 crc kubenswrapper[4672]: I0930 12:51:48.510800 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-669k5" Sep 30 12:51:49 crc kubenswrapper[4672]: I0930 12:51:49.072585 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-669k5"] Sep 30 12:51:49 crc kubenswrapper[4672]: W0930 12:51:49.075000 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5e4bf526_356e_4b1f_a69e_7da92365808d.slice/crio-80d5bcf13f5c2ee1aea8c57d54fbc653954bc48dd0c155fef250f139eeaabf13 WatchSource:0}: Error finding container 80d5bcf13f5c2ee1aea8c57d54fbc653954bc48dd0c155fef250f139eeaabf13: Status 404 returned error can't find the container with id 80d5bcf13f5c2ee1aea8c57d54fbc653954bc48dd0c155fef250f139eeaabf13 Sep 30 12:51:49 crc kubenswrapper[4672]: I0930 12:51:49.519330 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Sep 30 12:51:50 crc kubenswrapper[4672]: I0930 12:51:50.031584 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-8hmsx"] Sep 30 12:51:50 crc kubenswrapper[4672]: I0930 12:51:50.040970 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-8hmsx"] Sep 30 12:51:50 crc kubenswrapper[4672]: I0930 12:51:50.092705 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-669k5" event={"ID":"5e4bf526-356e-4b1f-a69e-7da92365808d","Type":"ContainerStarted","Data":"4485cb3748b5a1d1c36398f06df3694bc5550273772330e958a80aae539e54c0"} Sep 30 12:51:50 crc kubenswrapper[4672]: I0930 12:51:50.092977 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-669k5" event={"ID":"5e4bf526-356e-4b1f-a69e-7da92365808d","Type":"ContainerStarted","Data":"80d5bcf13f5c2ee1aea8c57d54fbc653954bc48dd0c155fef250f139eeaabf13"} Sep 30 12:51:50 crc kubenswrapper[4672]: 
I0930 12:51:50.119677 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-669k5" podStartSLOduration=1.681894943 podStartE2EDuration="2.119654969s" podCreationTimestamp="2025-09-30 12:51:48 +0000 UTC" firstStartedPulling="2025-09-30 12:51:49.079028045 +0000 UTC m=+1800.348265691" lastFinishedPulling="2025-09-30 12:51:49.516788071 +0000 UTC m=+1800.786025717" observedRunningTime="2025-09-30 12:51:50.105542469 +0000 UTC m=+1801.374780125" watchObservedRunningTime="2025-09-30 12:51:50.119654969 +0000 UTC m=+1801.388892615" Sep 30 12:51:51 crc kubenswrapper[4672]: I0930 12:51:51.428202 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9858e9f6-3e7a-48e8-8557-eabc1ccfada4" path="/var/lib/kubelet/pods/9858e9f6-3e7a-48e8-8557-eabc1ccfada4/volumes" Sep 30 12:51:57 crc kubenswrapper[4672]: I0930 12:51:57.417937 4672 scope.go:117] "RemoveContainer" containerID="221a226e9b221f859d2addc5ed9e63e60339b2ca4ab8eb0848430a16de2a187e" Sep 30 12:51:57 crc kubenswrapper[4672]: E0930 12:51:57.418739 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dpqrd_openshift-machine-config-operator(95794952-d817-48f2-8956-f7a310f8d1d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" Sep 30 12:52:00 crc kubenswrapper[4672]: I0930 12:52:00.850585 4672 scope.go:117] "RemoveContainer" containerID="7ce3018771e881c787448cfb8aa5e4c68216ae246eeedac5ef5d4dbf1d8c4f22" Sep 30 12:52:00 crc kubenswrapper[4672]: I0930 12:52:00.925971 4672 scope.go:117] "RemoveContainer" containerID="8870f5bddbde13fa494016a4544314123d701c79bf980e4762681b821dc4225b" Sep 30 12:52:00 crc kubenswrapper[4672]: I0930 12:52:00.958931 4672 scope.go:117] "RemoveContainer" containerID="31203d1a1f3ea529a7e5cf481e28fe0ad9c2807c8193e97f31a80a9df1538278" Sep 30 12:52:09 crc kubenswrapper[4672]: I0930 12:52:09.422985 4672 scope.go:117] "RemoveContainer" containerID="221a226e9b221f859d2addc5ed9e63e60339b2ca4ab8eb0848430a16de2a187e" Sep 30 12:52:09 crc kubenswrapper[4672]: E0930 12:52:09.424124 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dpqrd_openshift-machine-config-operator(95794952-d817-48f2-8956-f7a310f8d1d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" Sep 30 12:52:20 crc kubenswrapper[4672]: I0930 12:52:20.418375 4672 scope.go:117] "RemoveContainer" containerID="221a226e9b221f859d2addc5ed9e63e60339b2ca4ab8eb0848430a16de2a187e" Sep 30 12:52:20 crc kubenswrapper[4672]: E0930 12:52:20.419202 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dpqrd_openshift-machine-config-operator(95794952-d817-48f2-8956-f7a310f8d1d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" Sep 30 12:52:29 crc kubenswrapper[4672]: I0930 12:52:29.064805 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/nova-cell1-cell-mapping-z4wwh"] Sep 30 12:52:29 crc kubenswrapper[4672]: I0930 12:52:29.079743 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-z4wwh"] Sep 30 12:52:29 crc kubenswrapper[4672]: I0930 12:52:29.428643 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="abf258f0-0d20-49a2-8a89-dd52cfdda97e" path="/var/lib/kubelet/pods/abf258f0-0d20-49a2-8a89-dd52cfdda97e/volumes" Sep 30 12:52:31 crc kubenswrapper[4672]: I0930 12:52:31.534477 4672 generic.go:334] "Generic (PLEG): container finished" podID="5e4bf526-356e-4b1f-a69e-7da92365808d" containerID="4485cb3748b5a1d1c36398f06df3694bc5550273772330e958a80aae539e54c0" exitCode=0 Sep 30 12:52:31 crc kubenswrapper[4672]: I0930 12:52:31.534593 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-669k5" event={"ID":"5e4bf526-356e-4b1f-a69e-7da92365808d","Type":"ContainerDied","Data":"4485cb3748b5a1d1c36398f06df3694bc5550273772330e958a80aae539e54c0"} Sep 30 12:52:32 crc kubenswrapper[4672]: I0930 12:52:32.956878 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-669k5" Sep 30 12:52:33 crc kubenswrapper[4672]: I0930 12:52:33.071311 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5e4bf526-356e-4b1f-a69e-7da92365808d-ssh-key\") pod \"5e4bf526-356e-4b1f-a69e-7da92365808d\" (UID: \"5e4bf526-356e-4b1f-a69e-7da92365808d\") " Sep 30 12:52:33 crc kubenswrapper[4672]: I0930 12:52:33.071602 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x5vwv\" (UniqueName: \"kubernetes.io/projected/5e4bf526-356e-4b1f-a69e-7da92365808d-kube-api-access-x5vwv\") pod \"5e4bf526-356e-4b1f-a69e-7da92365808d\" (UID: \"5e4bf526-356e-4b1f-a69e-7da92365808d\") " Sep 30 12:52:33 crc kubenswrapper[4672]: I0930 12:52:33.071700 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5e4bf526-356e-4b1f-a69e-7da92365808d-inventory\") pod \"5e4bf526-356e-4b1f-a69e-7da92365808d\" (UID: \"5e4bf526-356e-4b1f-a69e-7da92365808d\") " Sep 30 12:52:33 crc kubenswrapper[4672]: I0930 12:52:33.083650 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e4bf526-356e-4b1f-a69e-7da92365808d-kube-api-access-x5vwv" (OuterVolumeSpecName: "kube-api-access-x5vwv") pod "5e4bf526-356e-4b1f-a69e-7da92365808d" (UID: "5e4bf526-356e-4b1f-a69e-7da92365808d"). InnerVolumeSpecName "kube-api-access-x5vwv". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 12:52:33 crc kubenswrapper[4672]: I0930 12:52:33.103897 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e4bf526-356e-4b1f-a69e-7da92365808d-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "5e4bf526-356e-4b1f-a69e-7da92365808d" (UID: "5e4bf526-356e-4b1f-a69e-7da92365808d"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:52:33 crc kubenswrapper[4672]: I0930 12:52:33.112456 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e4bf526-356e-4b1f-a69e-7da92365808d-inventory" (OuterVolumeSpecName: "inventory") pod "5e4bf526-356e-4b1f-a69e-7da92365808d" (UID: "5e4bf526-356e-4b1f-a69e-7da92365808d"). 
InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:52:33 crc kubenswrapper[4672]: I0930 12:52:33.174046 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x5vwv\" (UniqueName: \"kubernetes.io/projected/5e4bf526-356e-4b1f-a69e-7da92365808d-kube-api-access-x5vwv\") on node \"crc\" DevicePath \"\"" Sep 30 12:52:33 crc kubenswrapper[4672]: I0930 12:52:33.174075 4672 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5e4bf526-356e-4b1f-a69e-7da92365808d-inventory\") on node \"crc\" DevicePath \"\"" Sep 30 12:52:33 crc kubenswrapper[4672]: I0930 12:52:33.174088 4672 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5e4bf526-356e-4b1f-a69e-7da92365808d-ssh-key\") on node \"crc\" DevicePath \"\"" Sep 30 12:52:33 crc kubenswrapper[4672]: I0930 12:52:33.562639 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-669k5" event={"ID":"5e4bf526-356e-4b1f-a69e-7da92365808d","Type":"ContainerDied","Data":"80d5bcf13f5c2ee1aea8c57d54fbc653954bc48dd0c155fef250f139eeaabf13"} Sep 30 12:52:33 crc kubenswrapper[4672]: I0930 12:52:33.562689 4672 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="80d5bcf13f5c2ee1aea8c57d54fbc653954bc48dd0c155fef250f139eeaabf13" Sep 30 12:52:33 crc kubenswrapper[4672]: I0930 12:52:33.562753 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-669k5" Sep 30 12:52:33 crc kubenswrapper[4672]: I0930 12:52:33.652196 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-29qlm"] Sep 30 12:52:33 crc kubenswrapper[4672]: E0930 12:52:33.652954 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e4bf526-356e-4b1f-a69e-7da92365808d" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Sep 30 12:52:33 crc kubenswrapper[4672]: I0930 12:52:33.652979 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e4bf526-356e-4b1f-a69e-7da92365808d" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Sep 30 12:52:33 crc kubenswrapper[4672]: I0930 12:52:33.657718 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e4bf526-356e-4b1f-a69e-7da92365808d" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Sep 30 12:52:33 crc kubenswrapper[4672]: I0930 12:52:33.660742 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-29qlm" Sep 30 12:52:33 crc kubenswrapper[4672]: I0930 12:52:33.663767 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Sep 30 12:52:33 crc kubenswrapper[4672]: I0930 12:52:33.664329 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-b6dbr" Sep 30 12:52:33 crc kubenswrapper[4672]: I0930 12:52:33.664623 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Sep 30 12:52:33 crc kubenswrapper[4672]: I0930 12:52:33.684154 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Sep 30 12:52:33 crc kubenswrapper[4672]: I0930 12:52:33.685273 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5dznp\" (UniqueName: \"kubernetes.io/projected/847b8779-d63c-4bbb-9b51-94a2c102e36d-kube-api-access-5dznp\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-29qlm\" (UID: \"847b8779-d63c-4bbb-9b51-94a2c102e36d\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-29qlm" Sep 30 12:52:33 crc kubenswrapper[4672]: I0930 12:52:33.685408 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/847b8779-d63c-4bbb-9b51-94a2c102e36d-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-29qlm\" (UID: \"847b8779-d63c-4bbb-9b51-94a2c102e36d\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-29qlm" Sep 30 12:52:33 crc kubenswrapper[4672]: I0930 12:52:33.685593 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/847b8779-d63c-4bbb-9b51-94a2c102e36d-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-29qlm\" (UID: \"847b8779-d63c-4bbb-9b51-94a2c102e36d\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-29qlm" Sep 30 12:52:33 crc kubenswrapper[4672]: I0930 12:52:33.690350 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-29qlm"] Sep 30 12:52:33 crc kubenswrapper[4672]: I0930 12:52:33.788111 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5dznp\" (UniqueName: \"kubernetes.io/projected/847b8779-d63c-4bbb-9b51-94a2c102e36d-kube-api-access-5dznp\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-29qlm\" (UID: \"847b8779-d63c-4bbb-9b51-94a2c102e36d\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-29qlm" Sep 30 12:52:33 crc kubenswrapper[4672]: I0930 12:52:33.788182 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/847b8779-d63c-4bbb-9b51-94a2c102e36d-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-29qlm\" (UID: \"847b8779-d63c-4bbb-9b51-94a2c102e36d\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-29qlm" Sep 30 12:52:33 crc kubenswrapper[4672]: I0930 12:52:33.788250 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/847b8779-d63c-4bbb-9b51-94a2c102e36d-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-29qlm\" 
(UID: \"847b8779-d63c-4bbb-9b51-94a2c102e36d\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-29qlm" Sep 30 12:52:33 crc kubenswrapper[4672]: I0930 12:52:33.793310 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/847b8779-d63c-4bbb-9b51-94a2c102e36d-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-29qlm\" (UID: \"847b8779-d63c-4bbb-9b51-94a2c102e36d\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-29qlm" Sep 30 12:52:33 crc kubenswrapper[4672]: I0930 12:52:33.793336 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/847b8779-d63c-4bbb-9b51-94a2c102e36d-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-29qlm\" (UID: \"847b8779-d63c-4bbb-9b51-94a2c102e36d\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-29qlm" Sep 30 12:52:33 crc kubenswrapper[4672]: I0930 12:52:33.804077 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5dznp\" (UniqueName: \"kubernetes.io/projected/847b8779-d63c-4bbb-9b51-94a2c102e36d-kube-api-access-5dznp\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-29qlm\" (UID: \"847b8779-d63c-4bbb-9b51-94a2c102e36d\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-29qlm" Sep 30 12:52:33 crc kubenswrapper[4672]: I0930 12:52:33.978635 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-29qlm" Sep 30 12:52:34 crc kubenswrapper[4672]: I0930 12:52:34.417878 4672 scope.go:117] "RemoveContainer" containerID="221a226e9b221f859d2addc5ed9e63e60339b2ca4ab8eb0848430a16de2a187e" Sep 30 12:52:34 crc kubenswrapper[4672]: E0930 12:52:34.418839 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dpqrd_openshift-machine-config-operator(95794952-d817-48f2-8956-f7a310f8d1d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" Sep 30 12:52:34 crc kubenswrapper[4672]: I0930 12:52:34.528365 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-29qlm"] Sep 30 12:52:34 crc kubenswrapper[4672]: I0930 12:52:34.572685 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-29qlm" event={"ID":"847b8779-d63c-4bbb-9b51-94a2c102e36d","Type":"ContainerStarted","Data":"87c22eea664896d684b223c51d3c23db655c26e7af162d1a1a4f7f762a1f0658"} Sep 30 12:52:35 crc kubenswrapper[4672]: I0930 12:52:35.583935 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-29qlm" event={"ID":"847b8779-d63c-4bbb-9b51-94a2c102e36d","Type":"ContainerStarted","Data":"942f72563aa5e225793923f25691bd155f603781ab7af5ce98980fa09ac21973"} Sep 30 12:52:35 crc kubenswrapper[4672]: I0930 12:52:35.612820 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-29qlm" podStartSLOduration=2.181633665 podStartE2EDuration="2.612791554s" podCreationTimestamp="2025-09-30 12:52:33 +0000 UTC" firstStartedPulling="2025-09-30 12:52:34.536242262 +0000 UTC 
m=+1845.805479938" lastFinishedPulling="2025-09-30 12:52:34.967400181 +0000 UTC m=+1846.236637827" observedRunningTime="2025-09-30 12:52:35.603717062 +0000 UTC m=+1846.872954728" watchObservedRunningTime="2025-09-30 12:52:35.612791554 +0000 UTC m=+1846.882029230" Sep 30 12:52:48 crc kubenswrapper[4672]: I0930 12:52:48.417703 4672 scope.go:117] "RemoveContainer" containerID="221a226e9b221f859d2addc5ed9e63e60339b2ca4ab8eb0848430a16de2a187e" Sep 30 12:52:48 crc kubenswrapper[4672]: E0930 12:52:48.419769 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dpqrd_openshift-machine-config-operator(95794952-d817-48f2-8956-f7a310f8d1d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" Sep 30 12:53:01 crc kubenswrapper[4672]: I0930 12:53:01.112139 4672 scope.go:117] "RemoveContainer" containerID="14803b06d879f0811e4ee0ec40174d5baed3839d680e7c6d0030e652fec94f14" Sep 30 12:53:03 crc kubenswrapper[4672]: I0930 12:53:03.418688 4672 scope.go:117] "RemoveContainer" containerID="221a226e9b221f859d2addc5ed9e63e60339b2ca4ab8eb0848430a16de2a187e" Sep 30 12:53:03 crc kubenswrapper[4672]: E0930 12:53:03.419637 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dpqrd_openshift-machine-config-operator(95794952-d817-48f2-8956-f7a310f8d1d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" Sep 30 12:53:16 crc kubenswrapper[4672]: I0930 12:53:16.418013 4672 scope.go:117] "RemoveContainer" containerID="221a226e9b221f859d2addc5ed9e63e60339b2ca4ab8eb0848430a16de2a187e" Sep 30 12:53:16 crc kubenswrapper[4672]: E0930 12:53:16.418872 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dpqrd_openshift-machine-config-operator(95794952-d817-48f2-8956-f7a310f8d1d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" Sep 30 12:53:29 crc kubenswrapper[4672]: I0930 12:53:29.425429 4672 scope.go:117] "RemoveContainer" containerID="221a226e9b221f859d2addc5ed9e63e60339b2ca4ab8eb0848430a16de2a187e" Sep 30 12:53:29 crc kubenswrapper[4672]: E0930 12:53:29.426442 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dpqrd_openshift-machine-config-operator(95794952-d817-48f2-8956-f7a310f8d1d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" Sep 30 12:53:32 crc kubenswrapper[4672]: I0930 12:53:32.143070 4672 generic.go:334] "Generic (PLEG): container finished" podID="847b8779-d63c-4bbb-9b51-94a2c102e36d" containerID="942f72563aa5e225793923f25691bd155f603781ab7af5ce98980fa09ac21973" exitCode=2 Sep 30 12:53:32 crc kubenswrapper[4672]: I0930 12:53:32.143389 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-29qlm" event={"ID":"847b8779-d63c-4bbb-9b51-94a2c102e36d","Type":"ContainerDied","Data":"942f72563aa5e225793923f25691bd155f603781ab7af5ce98980fa09ac21973"} Sep 30 12:53:33 crc kubenswrapper[4672]: I0930 12:53:33.570773 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-29qlm" Sep 30 12:53:33 crc kubenswrapper[4672]: I0930 12:53:33.704083 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5dznp\" (UniqueName: \"kubernetes.io/projected/847b8779-d63c-4bbb-9b51-94a2c102e36d-kube-api-access-5dznp\") pod \"847b8779-d63c-4bbb-9b51-94a2c102e36d\" (UID: \"847b8779-d63c-4bbb-9b51-94a2c102e36d\") " Sep 30 12:53:33 crc kubenswrapper[4672]: I0930 12:53:33.704467 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/847b8779-d63c-4bbb-9b51-94a2c102e36d-ssh-key\") pod \"847b8779-d63c-4bbb-9b51-94a2c102e36d\" (UID: \"847b8779-d63c-4bbb-9b51-94a2c102e36d\") " Sep 30 12:53:33 crc kubenswrapper[4672]: I0930 12:53:33.704541 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/847b8779-d63c-4bbb-9b51-94a2c102e36d-inventory\") pod \"847b8779-d63c-4bbb-9b51-94a2c102e36d\" (UID: \"847b8779-d63c-4bbb-9b51-94a2c102e36d\") " Sep 30 12:53:33 crc kubenswrapper[4672]: I0930 12:53:33.709931 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/847b8779-d63c-4bbb-9b51-94a2c102e36d-kube-api-access-5dznp" (OuterVolumeSpecName: "kube-api-access-5dznp") pod "847b8779-d63c-4bbb-9b51-94a2c102e36d" (UID: "847b8779-d63c-4bbb-9b51-94a2c102e36d"). InnerVolumeSpecName "kube-api-access-5dznp". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 12:53:33 crc kubenswrapper[4672]: I0930 12:53:33.737667 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/847b8779-d63c-4bbb-9b51-94a2c102e36d-inventory" (OuterVolumeSpecName: "inventory") pod "847b8779-d63c-4bbb-9b51-94a2c102e36d" (UID: "847b8779-d63c-4bbb-9b51-94a2c102e36d"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:53:33 crc kubenswrapper[4672]: I0930 12:53:33.742881 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/847b8779-d63c-4bbb-9b51-94a2c102e36d-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "847b8779-d63c-4bbb-9b51-94a2c102e36d" (UID: "847b8779-d63c-4bbb-9b51-94a2c102e36d"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:53:33 crc kubenswrapper[4672]: I0930 12:53:33.807505 4672 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/847b8779-d63c-4bbb-9b51-94a2c102e36d-ssh-key\") on node \"crc\" DevicePath \"\"" Sep 30 12:53:33 crc kubenswrapper[4672]: I0930 12:53:33.807552 4672 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/847b8779-d63c-4bbb-9b51-94a2c102e36d-inventory\") on node \"crc\" DevicePath \"\"" Sep 30 12:53:33 crc kubenswrapper[4672]: I0930 12:53:33.807567 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5dznp\" (UniqueName: \"kubernetes.io/projected/847b8779-d63c-4bbb-9b51-94a2c102e36d-kube-api-access-5dznp\") on node \"crc\" DevicePath \"\"" Sep 30 12:53:34 crc kubenswrapper[4672]: I0930 12:53:34.163537 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-29qlm" event={"ID":"847b8779-d63c-4bbb-9b51-94a2c102e36d","Type":"ContainerDied","Data":"87c22eea664896d684b223c51d3c23db655c26e7af162d1a1a4f7f762a1f0658"} Sep 30 12:53:34 crc kubenswrapper[4672]: I0930 12:53:34.163578 4672 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="87c22eea664896d684b223c51d3c23db655c26e7af162d1a1a4f7f762a1f0658" Sep 30 12:53:34 crc kubenswrapper[4672]: I0930 12:53:34.163618 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-29qlm" Sep 30 12:53:41 crc kubenswrapper[4672]: I0930 12:53:41.035173 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kxjhd"] Sep 30 12:53:41 crc kubenswrapper[4672]: E0930 12:53:41.036196 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="847b8779-d63c-4bbb-9b51-94a2c102e36d" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Sep 30 12:53:41 crc kubenswrapper[4672]: I0930 12:53:41.036210 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="847b8779-d63c-4bbb-9b51-94a2c102e36d" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Sep 30 12:53:41 crc kubenswrapper[4672]: I0930 12:53:41.036457 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="847b8779-d63c-4bbb-9b51-94a2c102e36d" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Sep 30 12:53:41 crc kubenswrapper[4672]: I0930 12:53:41.037126 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kxjhd" Sep 30 12:53:41 crc kubenswrapper[4672]: I0930 12:53:41.039490 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-b6dbr" Sep 30 12:53:41 crc kubenswrapper[4672]: I0930 12:53:41.043877 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Sep 30 12:53:41 crc kubenswrapper[4672]: I0930 12:53:41.044050 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Sep 30 12:53:41 crc kubenswrapper[4672]: I0930 12:53:41.045783 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Sep 30 12:53:41 crc kubenswrapper[4672]: I0930 12:53:41.053181 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sf7mw\" (UniqueName: \"kubernetes.io/projected/29628904-dd3c-4ce7-a114-552159673def-kube-api-access-sf7mw\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-kxjhd\" (UID: \"29628904-dd3c-4ce7-a114-552159673def\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kxjhd" Sep 30 12:53:41 crc kubenswrapper[4672]: I0930 12:53:41.053224 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/29628904-dd3c-4ce7-a114-552159673def-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-kxjhd\" (UID: \"29628904-dd3c-4ce7-a114-552159673def\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kxjhd" Sep 30 12:53:41 crc kubenswrapper[4672]: I0930 12:53:41.053362 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/29628904-dd3c-4ce7-a114-552159673def-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-kxjhd\" (UID: \"29628904-dd3c-4ce7-a114-552159673def\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kxjhd" Sep 30 12:53:41 crc kubenswrapper[4672]: I0930 12:53:41.062643 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kxjhd"] Sep 30 12:53:41 crc kubenswrapper[4672]: I0930 12:53:41.156521 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/29628904-dd3c-4ce7-a114-552159673def-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-kxjhd\" (UID: \"29628904-dd3c-4ce7-a114-552159673def\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kxjhd" Sep 30 12:53:41 crc kubenswrapper[4672]: I0930 12:53:41.156731 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sf7mw\" (UniqueName: \"kubernetes.io/projected/29628904-dd3c-4ce7-a114-552159673def-kube-api-access-sf7mw\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-kxjhd\" (UID: \"29628904-dd3c-4ce7-a114-552159673def\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kxjhd" Sep 30 12:53:41 crc kubenswrapper[4672]: I0930 12:53:41.156780 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/29628904-dd3c-4ce7-a114-552159673def-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-kxjhd\" 
(UID: \"29628904-dd3c-4ce7-a114-552159673def\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kxjhd" Sep 30 12:53:41 crc kubenswrapper[4672]: I0930 12:53:41.165322 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/29628904-dd3c-4ce7-a114-552159673def-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-kxjhd\" (UID: \"29628904-dd3c-4ce7-a114-552159673def\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kxjhd" Sep 30 12:53:41 crc kubenswrapper[4672]: I0930 12:53:41.165816 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/29628904-dd3c-4ce7-a114-552159673def-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-kxjhd\" (UID: \"29628904-dd3c-4ce7-a114-552159673def\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kxjhd" Sep 30 12:53:41 crc kubenswrapper[4672]: I0930 12:53:41.176173 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sf7mw\" (UniqueName: \"kubernetes.io/projected/29628904-dd3c-4ce7-a114-552159673def-kube-api-access-sf7mw\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-kxjhd\" (UID: \"29628904-dd3c-4ce7-a114-552159673def\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kxjhd" Sep 30 12:53:41 crc kubenswrapper[4672]: I0930 12:53:41.358480 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kxjhd" Sep 30 12:53:41 crc kubenswrapper[4672]: I0930 12:53:41.423749 4672 scope.go:117] "RemoveContainer" containerID="221a226e9b221f859d2addc5ed9e63e60339b2ca4ab8eb0848430a16de2a187e" Sep 30 12:53:41 crc kubenswrapper[4672]: E0930 12:53:41.424275 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dpqrd_openshift-machine-config-operator(95794952-d817-48f2-8956-f7a310f8d1d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" Sep 30 12:53:41 crc kubenswrapper[4672]: I0930 12:53:41.942062 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kxjhd"] Sep 30 12:53:41 crc kubenswrapper[4672]: I0930 12:53:41.952031 4672 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Sep 30 12:53:42 crc kubenswrapper[4672]: I0930 12:53:42.243012 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kxjhd" event={"ID":"29628904-dd3c-4ce7-a114-552159673def","Type":"ContainerStarted","Data":"fd54c176fa1e406c62b732512f447c23194b85875669221104afede7c44ee572"} Sep 30 12:53:43 crc kubenswrapper[4672]: I0930 12:53:43.265066 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kxjhd" event={"ID":"29628904-dd3c-4ce7-a114-552159673def","Type":"ContainerStarted","Data":"447147a455fb8ecfca6f1151c72c2ebeda08cb2947190b137f4f79c5c50d60fd"} Sep 30 12:53:43 crc kubenswrapper[4672]: I0930 12:53:43.297623 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kxjhd" 
podStartSLOduration=1.788835044 podStartE2EDuration="2.297599885s" podCreationTimestamp="2025-09-30 12:53:41 +0000 UTC" firstStartedPulling="2025-09-30 12:53:41.951319949 +0000 UTC m=+1913.220557615" lastFinishedPulling="2025-09-30 12:53:42.4600848 +0000 UTC m=+1913.729322456" observedRunningTime="2025-09-30 12:53:43.281970767 +0000 UTC m=+1914.551208433" watchObservedRunningTime="2025-09-30 12:53:43.297599885 +0000 UTC m=+1914.566837541" Sep 30 12:53:55 crc kubenswrapper[4672]: I0930 12:53:55.417830 4672 scope.go:117] "RemoveContainer" containerID="221a226e9b221f859d2addc5ed9e63e60339b2ca4ab8eb0848430a16de2a187e" Sep 30 12:53:55 crc kubenswrapper[4672]: E0930 12:53:55.418935 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dpqrd_openshift-machine-config-operator(95794952-d817-48f2-8956-f7a310f8d1d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" Sep 30 12:54:08 crc kubenswrapper[4672]: I0930 12:54:08.417752 4672 scope.go:117] "RemoveContainer" containerID="221a226e9b221f859d2addc5ed9e63e60339b2ca4ab8eb0848430a16de2a187e" Sep 30 12:54:08 crc kubenswrapper[4672]: E0930 12:54:08.418740 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dpqrd_openshift-machine-config-operator(95794952-d817-48f2-8956-f7a310f8d1d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" Sep 30 12:54:23 crc kubenswrapper[4672]: I0930 12:54:23.417371 4672 scope.go:117] "RemoveContainer" containerID="221a226e9b221f859d2addc5ed9e63e60339b2ca4ab8eb0848430a16de2a187e" Sep 30 12:54:23 crc kubenswrapper[4672]: E0930 12:54:23.418110 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dpqrd_openshift-machine-config-operator(95794952-d817-48f2-8956-f7a310f8d1d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" Sep 30 12:54:31 crc kubenswrapper[4672]: I0930 12:54:31.737787 4672 generic.go:334] "Generic (PLEG): container finished" podID="29628904-dd3c-4ce7-a114-552159673def" containerID="447147a455fb8ecfca6f1151c72c2ebeda08cb2947190b137f4f79c5c50d60fd" exitCode=0 Sep 30 12:54:31 crc kubenswrapper[4672]: I0930 12:54:31.738331 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kxjhd" event={"ID":"29628904-dd3c-4ce7-a114-552159673def","Type":"ContainerDied","Data":"447147a455fb8ecfca6f1151c72c2ebeda08cb2947190b137f4f79c5c50d60fd"} Sep 30 12:54:33 crc kubenswrapper[4672]: I0930 12:54:33.201620 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kxjhd" Sep 30 12:54:33 crc kubenswrapper[4672]: I0930 12:54:33.326161 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/29628904-dd3c-4ce7-a114-552159673def-inventory\") pod \"29628904-dd3c-4ce7-a114-552159673def\" (UID: \"29628904-dd3c-4ce7-a114-552159673def\") " Sep 30 12:54:33 crc kubenswrapper[4672]: I0930 12:54:33.326309 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/29628904-dd3c-4ce7-a114-552159673def-ssh-key\") pod \"29628904-dd3c-4ce7-a114-552159673def\" (UID: \"29628904-dd3c-4ce7-a114-552159673def\") " Sep 30 12:54:33 crc kubenswrapper[4672]: I0930 12:54:33.326369 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sf7mw\" (UniqueName: \"kubernetes.io/projected/29628904-dd3c-4ce7-a114-552159673def-kube-api-access-sf7mw\") pod \"29628904-dd3c-4ce7-a114-552159673def\" (UID: \"29628904-dd3c-4ce7-a114-552159673def\") " Sep 30 12:54:33 crc kubenswrapper[4672]: I0930 12:54:33.331831 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29628904-dd3c-4ce7-a114-552159673def-kube-api-access-sf7mw" (OuterVolumeSpecName: "kube-api-access-sf7mw") pod "29628904-dd3c-4ce7-a114-552159673def" (UID: "29628904-dd3c-4ce7-a114-552159673def"). InnerVolumeSpecName "kube-api-access-sf7mw". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 12:54:33 crc kubenswrapper[4672]: I0930 12:54:33.359138 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29628904-dd3c-4ce7-a114-552159673def-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "29628904-dd3c-4ce7-a114-552159673def" (UID: "29628904-dd3c-4ce7-a114-552159673def"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:54:33 crc kubenswrapper[4672]: I0930 12:54:33.378062 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29628904-dd3c-4ce7-a114-552159673def-inventory" (OuterVolumeSpecName: "inventory") pod "29628904-dd3c-4ce7-a114-552159673def" (UID: "29628904-dd3c-4ce7-a114-552159673def"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:54:33 crc kubenswrapper[4672]: I0930 12:54:33.428311 4672 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/29628904-dd3c-4ce7-a114-552159673def-ssh-key\") on node \"crc\" DevicePath \"\"" Sep 30 12:54:33 crc kubenswrapper[4672]: I0930 12:54:33.428350 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sf7mw\" (UniqueName: \"kubernetes.io/projected/29628904-dd3c-4ce7-a114-552159673def-kube-api-access-sf7mw\") on node \"crc\" DevicePath \"\"" Sep 30 12:54:33 crc kubenswrapper[4672]: I0930 12:54:33.428397 4672 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/29628904-dd3c-4ce7-a114-552159673def-inventory\") on node \"crc\" DevicePath \"\"" Sep 30 12:54:33 crc kubenswrapper[4672]: I0930 12:54:33.763592 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kxjhd" event={"ID":"29628904-dd3c-4ce7-a114-552159673def","Type":"ContainerDied","Data":"fd54c176fa1e406c62b732512f447c23194b85875669221104afede7c44ee572"} Sep 30 12:54:33 crc kubenswrapper[4672]: I0930 12:54:33.763635 4672 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fd54c176fa1e406c62b732512f447c23194b85875669221104afede7c44ee572" Sep 30 12:54:33 crc kubenswrapper[4672]: I0930 12:54:33.763739 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kxjhd" Sep 30 12:54:33 crc kubenswrapper[4672]: I0930 12:54:33.852277 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-9csw2"] Sep 30 12:54:33 crc kubenswrapper[4672]: E0930 12:54:33.852834 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29628904-dd3c-4ce7-a114-552159673def" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Sep 30 12:54:33 crc kubenswrapper[4672]: I0930 12:54:33.852859 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="29628904-dd3c-4ce7-a114-552159673def" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Sep 30 12:54:33 crc kubenswrapper[4672]: I0930 12:54:33.853131 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="29628904-dd3c-4ce7-a114-552159673def" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Sep 30 12:54:33 crc kubenswrapper[4672]: I0930 12:54:33.853957 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-9csw2" Sep 30 12:54:33 crc kubenswrapper[4672]: I0930 12:54:33.859144 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-b6dbr" Sep 30 12:54:33 crc kubenswrapper[4672]: I0930 12:54:33.859197 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Sep 30 12:54:33 crc kubenswrapper[4672]: I0930 12:54:33.859701 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Sep 30 12:54:33 crc kubenswrapper[4672]: I0930 12:54:33.859853 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Sep 30 12:54:33 crc kubenswrapper[4672]: I0930 12:54:33.863527 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-9csw2"] Sep 30 12:54:33 crc kubenswrapper[4672]: I0930 12:54:33.938691 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c303b53d-3c71-498b-99fb-432610f75b61-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-9csw2\" (UID: \"c303b53d-3c71-498b-99fb-432610f75b61\") " pod="openstack/ssh-known-hosts-edpm-deployment-9csw2" Sep 30 12:54:33 crc kubenswrapper[4672]: I0930 12:54:33.938751 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6mzjv\" (UniqueName: \"kubernetes.io/projected/c303b53d-3c71-498b-99fb-432610f75b61-kube-api-access-6mzjv\") pod \"ssh-known-hosts-edpm-deployment-9csw2\" (UID: \"c303b53d-3c71-498b-99fb-432610f75b61\") " pod="openstack/ssh-known-hosts-edpm-deployment-9csw2" Sep 30 12:54:33 crc kubenswrapper[4672]: I0930 12:54:33.938781 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/c303b53d-3c71-498b-99fb-432610f75b61-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-9csw2\" (UID: \"c303b53d-3c71-498b-99fb-432610f75b61\") " pod="openstack/ssh-known-hosts-edpm-deployment-9csw2" Sep 30 12:54:34 crc kubenswrapper[4672]: I0930 12:54:34.041604 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c303b53d-3c71-498b-99fb-432610f75b61-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-9csw2\" (UID: \"c303b53d-3c71-498b-99fb-432610f75b61\") " pod="openstack/ssh-known-hosts-edpm-deployment-9csw2" Sep 30 12:54:34 crc kubenswrapper[4672]: I0930 12:54:34.041690 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6mzjv\" (UniqueName: \"kubernetes.io/projected/c303b53d-3c71-498b-99fb-432610f75b61-kube-api-access-6mzjv\") pod \"ssh-known-hosts-edpm-deployment-9csw2\" (UID: \"c303b53d-3c71-498b-99fb-432610f75b61\") " pod="openstack/ssh-known-hosts-edpm-deployment-9csw2" Sep 30 12:54:34 crc kubenswrapper[4672]: I0930 12:54:34.041833 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/c303b53d-3c71-498b-99fb-432610f75b61-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-9csw2\" (UID: \"c303b53d-3c71-498b-99fb-432610f75b61\") " pod="openstack/ssh-known-hosts-edpm-deployment-9csw2" Sep 30 12:54:34 crc 
kubenswrapper[4672]: I0930 12:54:34.047199 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c303b53d-3c71-498b-99fb-432610f75b61-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-9csw2\" (UID: \"c303b53d-3c71-498b-99fb-432610f75b61\") " pod="openstack/ssh-known-hosts-edpm-deployment-9csw2" Sep 30 12:54:34 crc kubenswrapper[4672]: I0930 12:54:34.048822 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/c303b53d-3c71-498b-99fb-432610f75b61-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-9csw2\" (UID: \"c303b53d-3c71-498b-99fb-432610f75b61\") " pod="openstack/ssh-known-hosts-edpm-deployment-9csw2" Sep 30 12:54:34 crc kubenswrapper[4672]: I0930 12:54:34.066137 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6mzjv\" (UniqueName: \"kubernetes.io/projected/c303b53d-3c71-498b-99fb-432610f75b61-kube-api-access-6mzjv\") pod \"ssh-known-hosts-edpm-deployment-9csw2\" (UID: \"c303b53d-3c71-498b-99fb-432610f75b61\") " pod="openstack/ssh-known-hosts-edpm-deployment-9csw2" Sep 30 12:54:34 crc kubenswrapper[4672]: I0930 12:54:34.171192 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-9csw2" Sep 30 12:54:34 crc kubenswrapper[4672]: I0930 12:54:34.794457 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-9csw2"] Sep 30 12:54:35 crc kubenswrapper[4672]: I0930 12:54:35.787252 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-9csw2" event={"ID":"c303b53d-3c71-498b-99fb-432610f75b61","Type":"ContainerStarted","Data":"f5269325fa80709354369235615f6be97a3bf6dc0b15dcff28ba23b343c14cc0"} Sep 30 12:54:35 crc kubenswrapper[4672]: I0930 12:54:35.787639 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-9csw2" event={"ID":"c303b53d-3c71-498b-99fb-432610f75b61","Type":"ContainerStarted","Data":"385457611a1686850269849979a94456e47c388411370c612f8b74fbcc6045b8"} Sep 30 12:54:35 crc kubenswrapper[4672]: I0930 12:54:35.808553 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-9csw2" podStartSLOduration=2.393658198 podStartE2EDuration="2.808536767s" podCreationTimestamp="2025-09-30 12:54:33 +0000 UTC" firstStartedPulling="2025-09-30 12:54:34.798768393 +0000 UTC m=+1966.068006039" lastFinishedPulling="2025-09-30 12:54:35.213646932 +0000 UTC m=+1966.482884608" observedRunningTime="2025-09-30 12:54:35.800911162 +0000 UTC m=+1967.070148828" watchObservedRunningTime="2025-09-30 12:54:35.808536767 +0000 UTC m=+1967.077774413" Sep 30 12:54:38 crc kubenswrapper[4672]: I0930 12:54:38.416729 4672 scope.go:117] "RemoveContainer" containerID="221a226e9b221f859d2addc5ed9e63e60339b2ca4ab8eb0848430a16de2a187e" Sep 30 12:54:38 crc kubenswrapper[4672]: I0930 12:54:38.825540 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" event={"ID":"95794952-d817-48f2-8956-f7a310f8d1d9","Type":"ContainerStarted","Data":"c2787254d23886a5d5ca32405d4feb4336d5e1be3c00953ebe36b2065fda6f64"} Sep 30 12:54:42 crc kubenswrapper[4672]: I0930 12:54:42.866517 4672 generic.go:334] "Generic (PLEG): container finished" podID="c303b53d-3c71-498b-99fb-432610f75b61" 
containerID="f5269325fa80709354369235615f6be97a3bf6dc0b15dcff28ba23b343c14cc0" exitCode=0 Sep 30 12:54:42 crc kubenswrapper[4672]: I0930 12:54:42.866657 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-9csw2" event={"ID":"c303b53d-3c71-498b-99fb-432610f75b61","Type":"ContainerDied","Data":"f5269325fa80709354369235615f6be97a3bf6dc0b15dcff28ba23b343c14cc0"} Sep 30 12:54:44 crc kubenswrapper[4672]: I0930 12:54:44.331018 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-9csw2" Sep 30 12:54:44 crc kubenswrapper[4672]: I0930 12:54:44.473966 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/c303b53d-3c71-498b-99fb-432610f75b61-inventory-0\") pod \"c303b53d-3c71-498b-99fb-432610f75b61\" (UID: \"c303b53d-3c71-498b-99fb-432610f75b61\") " Sep 30 12:54:44 crc kubenswrapper[4672]: I0930 12:54:44.474149 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c303b53d-3c71-498b-99fb-432610f75b61-ssh-key-openstack-edpm-ipam\") pod \"c303b53d-3c71-498b-99fb-432610f75b61\" (UID: \"c303b53d-3c71-498b-99fb-432610f75b61\") " Sep 30 12:54:44 crc kubenswrapper[4672]: I0930 12:54:44.474204 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6mzjv\" (UniqueName: \"kubernetes.io/projected/c303b53d-3c71-498b-99fb-432610f75b61-kube-api-access-6mzjv\") pod \"c303b53d-3c71-498b-99fb-432610f75b61\" (UID: \"c303b53d-3c71-498b-99fb-432610f75b61\") " Sep 30 12:54:44 crc kubenswrapper[4672]: I0930 12:54:44.480927 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c303b53d-3c71-498b-99fb-432610f75b61-kube-api-access-6mzjv" (OuterVolumeSpecName: "kube-api-access-6mzjv") pod "c303b53d-3c71-498b-99fb-432610f75b61" (UID: "c303b53d-3c71-498b-99fb-432610f75b61"). InnerVolumeSpecName "kube-api-access-6mzjv". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 12:54:44 crc kubenswrapper[4672]: I0930 12:54:44.505032 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c303b53d-3c71-498b-99fb-432610f75b61-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "c303b53d-3c71-498b-99fb-432610f75b61" (UID: "c303b53d-3c71-498b-99fb-432610f75b61"). InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:54:44 crc kubenswrapper[4672]: I0930 12:54:44.532787 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c303b53d-3c71-498b-99fb-432610f75b61-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "c303b53d-3c71-498b-99fb-432610f75b61" (UID: "c303b53d-3c71-498b-99fb-432610f75b61"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:54:44 crc kubenswrapper[4672]: I0930 12:54:44.576538 4672 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/c303b53d-3c71-498b-99fb-432610f75b61-inventory-0\") on node \"crc\" DevicePath \"\"" Sep 30 12:54:44 crc kubenswrapper[4672]: I0930 12:54:44.576582 4672 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c303b53d-3c71-498b-99fb-432610f75b61-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Sep 30 12:54:44 crc kubenswrapper[4672]: I0930 12:54:44.576601 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6mzjv\" (UniqueName: \"kubernetes.io/projected/c303b53d-3c71-498b-99fb-432610f75b61-kube-api-access-6mzjv\") on node \"crc\" DevicePath \"\"" Sep 30 12:54:44 crc kubenswrapper[4672]: I0930 12:54:44.891031 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-9csw2" event={"ID":"c303b53d-3c71-498b-99fb-432610f75b61","Type":"ContainerDied","Data":"385457611a1686850269849979a94456e47c388411370c612f8b74fbcc6045b8"} Sep 30 12:54:44 crc kubenswrapper[4672]: I0930 12:54:44.891094 4672 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="385457611a1686850269849979a94456e47c388411370c612f8b74fbcc6045b8" Sep 30 12:54:44 crc kubenswrapper[4672]: I0930 12:54:44.891506 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-9csw2" Sep 30 12:54:44 crc kubenswrapper[4672]: I0930 12:54:44.991073 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-bh9j6"] Sep 30 12:54:44 crc kubenswrapper[4672]: E0930 12:54:44.992988 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c303b53d-3c71-498b-99fb-432610f75b61" containerName="ssh-known-hosts-edpm-deployment" Sep 30 12:54:44 crc kubenswrapper[4672]: I0930 12:54:44.993222 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="c303b53d-3c71-498b-99fb-432610f75b61" containerName="ssh-known-hosts-edpm-deployment" Sep 30 12:54:45 crc kubenswrapper[4672]: I0930 12:54:45.004559 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="c303b53d-3c71-498b-99fb-432610f75b61" containerName="ssh-known-hosts-edpm-deployment" Sep 30 12:54:45 crc kubenswrapper[4672]: I0930 12:54:45.013066 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bh9j6" Sep 30 12:54:45 crc kubenswrapper[4672]: I0930 12:54:45.015928 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-bh9j6"] Sep 30 12:54:45 crc kubenswrapper[4672]: I0930 12:54:45.018398 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Sep 30 12:54:45 crc kubenswrapper[4672]: I0930 12:54:45.018624 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-b6dbr" Sep 30 12:54:45 crc kubenswrapper[4672]: I0930 12:54:45.025752 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Sep 30 12:54:45 crc kubenswrapper[4672]: I0930 12:54:45.034990 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Sep 30 12:54:45 crc kubenswrapper[4672]: I0930 12:54:45.085662 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/754f9f15-2c0e-4279-aec1-589d1b23eb75-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-bh9j6\" (UID: \"754f9f15-2c0e-4279-aec1-589d1b23eb75\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bh9j6" Sep 30 12:54:45 crc kubenswrapper[4672]: I0930 12:54:45.085785 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/754f9f15-2c0e-4279-aec1-589d1b23eb75-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-bh9j6\" (UID: \"754f9f15-2c0e-4279-aec1-589d1b23eb75\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bh9j6" Sep 30 12:54:45 crc kubenswrapper[4672]: I0930 12:54:45.085819 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w2x2s\" (UniqueName: \"kubernetes.io/projected/754f9f15-2c0e-4279-aec1-589d1b23eb75-kube-api-access-w2x2s\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-bh9j6\" (UID: \"754f9f15-2c0e-4279-aec1-589d1b23eb75\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bh9j6" Sep 30 12:54:45 crc kubenswrapper[4672]: I0930 12:54:45.188057 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/754f9f15-2c0e-4279-aec1-589d1b23eb75-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-bh9j6\" (UID: \"754f9f15-2c0e-4279-aec1-589d1b23eb75\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bh9j6" Sep 30 12:54:45 crc kubenswrapper[4672]: I0930 12:54:45.188111 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w2x2s\" (UniqueName: \"kubernetes.io/projected/754f9f15-2c0e-4279-aec1-589d1b23eb75-kube-api-access-w2x2s\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-bh9j6\" (UID: \"754f9f15-2c0e-4279-aec1-589d1b23eb75\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bh9j6" Sep 30 12:54:45 crc kubenswrapper[4672]: I0930 12:54:45.188314 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/754f9f15-2c0e-4279-aec1-589d1b23eb75-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-bh9j6\" (UID: \"754f9f15-2c0e-4279-aec1-589d1b23eb75\") " 
pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bh9j6" Sep 30 12:54:45 crc kubenswrapper[4672]: I0930 12:54:45.193624 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/754f9f15-2c0e-4279-aec1-589d1b23eb75-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-bh9j6\" (UID: \"754f9f15-2c0e-4279-aec1-589d1b23eb75\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bh9j6" Sep 30 12:54:45 crc kubenswrapper[4672]: I0930 12:54:45.194954 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/754f9f15-2c0e-4279-aec1-589d1b23eb75-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-bh9j6\" (UID: \"754f9f15-2c0e-4279-aec1-589d1b23eb75\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bh9j6" Sep 30 12:54:45 crc kubenswrapper[4672]: I0930 12:54:45.211392 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w2x2s\" (UniqueName: \"kubernetes.io/projected/754f9f15-2c0e-4279-aec1-589d1b23eb75-kube-api-access-w2x2s\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-bh9j6\" (UID: \"754f9f15-2c0e-4279-aec1-589d1b23eb75\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bh9j6" Sep 30 12:54:45 crc kubenswrapper[4672]: I0930 12:54:45.340038 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bh9j6" Sep 30 12:54:45 crc kubenswrapper[4672]: I0930 12:54:45.846407 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-bh9j6"] Sep 30 12:54:45 crc kubenswrapper[4672]: I0930 12:54:45.899892 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bh9j6" event={"ID":"754f9f15-2c0e-4279-aec1-589d1b23eb75","Type":"ContainerStarted","Data":"234c3aa72c70537d0d5b750c8326dab36a6763047ea99be39b5fa75d443dc2b4"} Sep 30 12:54:46 crc kubenswrapper[4672]: I0930 12:54:46.910161 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bh9j6" event={"ID":"754f9f15-2c0e-4279-aec1-589d1b23eb75","Type":"ContainerStarted","Data":"a9e63f03b2c2bd81dd21e366197e347ba9809342f5e5663527e06acaa9f5d7ee"} Sep 30 12:54:46 crc kubenswrapper[4672]: I0930 12:54:46.936308 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bh9j6" podStartSLOduration=2.139916517 podStartE2EDuration="2.936286285s" podCreationTimestamp="2025-09-30 12:54:44 +0000 UTC" firstStartedPulling="2025-09-30 12:54:45.854193298 +0000 UTC m=+1977.123430944" lastFinishedPulling="2025-09-30 12:54:46.650563036 +0000 UTC m=+1977.919800712" observedRunningTime="2025-09-30 12:54:46.930540238 +0000 UTC m=+1978.199777914" watchObservedRunningTime="2025-09-30 12:54:46.936286285 +0000 UTC m=+1978.205523941" Sep 30 12:54:56 crc kubenswrapper[4672]: I0930 12:54:56.012342 4672 generic.go:334] "Generic (PLEG): container finished" podID="754f9f15-2c0e-4279-aec1-589d1b23eb75" containerID="a9e63f03b2c2bd81dd21e366197e347ba9809342f5e5663527e06acaa9f5d7ee" exitCode=0 Sep 30 12:54:56 crc kubenswrapper[4672]: I0930 12:54:56.012465 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bh9j6" 
event={"ID":"754f9f15-2c0e-4279-aec1-589d1b23eb75","Type":"ContainerDied","Data":"a9e63f03b2c2bd81dd21e366197e347ba9809342f5e5663527e06acaa9f5d7ee"} Sep 30 12:54:57 crc kubenswrapper[4672]: I0930 12:54:57.481680 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bh9j6" Sep 30 12:54:57 crc kubenswrapper[4672]: I0930 12:54:57.540753 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/754f9f15-2c0e-4279-aec1-589d1b23eb75-inventory\") pod \"754f9f15-2c0e-4279-aec1-589d1b23eb75\" (UID: \"754f9f15-2c0e-4279-aec1-589d1b23eb75\") " Sep 30 12:54:57 crc kubenswrapper[4672]: I0930 12:54:57.540856 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w2x2s\" (UniqueName: \"kubernetes.io/projected/754f9f15-2c0e-4279-aec1-589d1b23eb75-kube-api-access-w2x2s\") pod \"754f9f15-2c0e-4279-aec1-589d1b23eb75\" (UID: \"754f9f15-2c0e-4279-aec1-589d1b23eb75\") " Sep 30 12:54:57 crc kubenswrapper[4672]: I0930 12:54:57.540946 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/754f9f15-2c0e-4279-aec1-589d1b23eb75-ssh-key\") pod \"754f9f15-2c0e-4279-aec1-589d1b23eb75\" (UID: \"754f9f15-2c0e-4279-aec1-589d1b23eb75\") " Sep 30 12:54:57 crc kubenswrapper[4672]: I0930 12:54:57.547473 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/754f9f15-2c0e-4279-aec1-589d1b23eb75-kube-api-access-w2x2s" (OuterVolumeSpecName: "kube-api-access-w2x2s") pod "754f9f15-2c0e-4279-aec1-589d1b23eb75" (UID: "754f9f15-2c0e-4279-aec1-589d1b23eb75"). InnerVolumeSpecName "kube-api-access-w2x2s". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 12:54:57 crc kubenswrapper[4672]: I0930 12:54:57.566544 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/754f9f15-2c0e-4279-aec1-589d1b23eb75-inventory" (OuterVolumeSpecName: "inventory") pod "754f9f15-2c0e-4279-aec1-589d1b23eb75" (UID: "754f9f15-2c0e-4279-aec1-589d1b23eb75"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:54:57 crc kubenswrapper[4672]: I0930 12:54:57.571209 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/754f9f15-2c0e-4279-aec1-589d1b23eb75-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "754f9f15-2c0e-4279-aec1-589d1b23eb75" (UID: "754f9f15-2c0e-4279-aec1-589d1b23eb75"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:54:57 crc kubenswrapper[4672]: I0930 12:54:57.644061 4672 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/754f9f15-2c0e-4279-aec1-589d1b23eb75-inventory\") on node \"crc\" DevicePath \"\"" Sep 30 12:54:57 crc kubenswrapper[4672]: I0930 12:54:57.644096 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w2x2s\" (UniqueName: \"kubernetes.io/projected/754f9f15-2c0e-4279-aec1-589d1b23eb75-kube-api-access-w2x2s\") on node \"crc\" DevicePath \"\"" Sep 30 12:54:57 crc kubenswrapper[4672]: I0930 12:54:57.644108 4672 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/754f9f15-2c0e-4279-aec1-589d1b23eb75-ssh-key\") on node \"crc\" DevicePath \"\"" Sep 30 12:54:58 crc kubenswrapper[4672]: I0930 12:54:58.038223 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bh9j6" event={"ID":"754f9f15-2c0e-4279-aec1-589d1b23eb75","Type":"ContainerDied","Data":"234c3aa72c70537d0d5b750c8326dab36a6763047ea99be39b5fa75d443dc2b4"} Sep 30 12:54:58 crc kubenswrapper[4672]: I0930 12:54:58.038275 4672 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="234c3aa72c70537d0d5b750c8326dab36a6763047ea99be39b5fa75d443dc2b4" Sep 30 12:54:58 crc kubenswrapper[4672]: I0930 12:54:58.038304 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bh9j6" Sep 30 12:54:58 crc kubenswrapper[4672]: I0930 12:54:58.115440 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-f2mbl"] Sep 30 12:54:58 crc kubenswrapper[4672]: E0930 12:54:58.115934 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="754f9f15-2c0e-4279-aec1-589d1b23eb75" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Sep 30 12:54:58 crc kubenswrapper[4672]: I0930 12:54:58.115955 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="754f9f15-2c0e-4279-aec1-589d1b23eb75" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Sep 30 12:54:58 crc kubenswrapper[4672]: I0930 12:54:58.116160 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="754f9f15-2c0e-4279-aec1-589d1b23eb75" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Sep 30 12:54:58 crc kubenswrapper[4672]: I0930 12:54:58.116804 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-f2mbl" Sep 30 12:54:58 crc kubenswrapper[4672]: I0930 12:54:58.118567 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-b6dbr" Sep 30 12:54:58 crc kubenswrapper[4672]: I0930 12:54:58.118620 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Sep 30 12:54:58 crc kubenswrapper[4672]: I0930 12:54:58.119495 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Sep 30 12:54:58 crc kubenswrapper[4672]: I0930 12:54:58.124112 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Sep 30 12:54:58 crc kubenswrapper[4672]: I0930 12:54:58.134149 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-f2mbl"] Sep 30 12:54:58 crc kubenswrapper[4672]: I0930 12:54:58.153805 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jzgz4\" (UniqueName: \"kubernetes.io/projected/c3a5c5b9-3aa7-4fef-9450-0e82ab08bb3a-kube-api-access-jzgz4\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-f2mbl\" (UID: \"c3a5c5b9-3aa7-4fef-9450-0e82ab08bb3a\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-f2mbl" Sep 30 12:54:58 crc kubenswrapper[4672]: I0930 12:54:58.153904 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c3a5c5b9-3aa7-4fef-9450-0e82ab08bb3a-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-f2mbl\" (UID: \"c3a5c5b9-3aa7-4fef-9450-0e82ab08bb3a\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-f2mbl" Sep 30 12:54:58 crc kubenswrapper[4672]: I0930 12:54:58.153968 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c3a5c5b9-3aa7-4fef-9450-0e82ab08bb3a-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-f2mbl\" (UID: \"c3a5c5b9-3aa7-4fef-9450-0e82ab08bb3a\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-f2mbl" Sep 30 12:54:58 crc kubenswrapper[4672]: I0930 12:54:58.255951 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c3a5c5b9-3aa7-4fef-9450-0e82ab08bb3a-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-f2mbl\" (UID: \"c3a5c5b9-3aa7-4fef-9450-0e82ab08bb3a\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-f2mbl" Sep 30 12:54:58 crc kubenswrapper[4672]: I0930 12:54:58.256207 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jzgz4\" (UniqueName: \"kubernetes.io/projected/c3a5c5b9-3aa7-4fef-9450-0e82ab08bb3a-kube-api-access-jzgz4\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-f2mbl\" (UID: \"c3a5c5b9-3aa7-4fef-9450-0e82ab08bb3a\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-f2mbl" Sep 30 12:54:58 crc kubenswrapper[4672]: I0930 12:54:58.256301 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c3a5c5b9-3aa7-4fef-9450-0e82ab08bb3a-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-f2mbl\" (UID: 
\"c3a5c5b9-3aa7-4fef-9450-0e82ab08bb3a\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-f2mbl" Sep 30 12:54:58 crc kubenswrapper[4672]: I0930 12:54:58.259685 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c3a5c5b9-3aa7-4fef-9450-0e82ab08bb3a-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-f2mbl\" (UID: \"c3a5c5b9-3aa7-4fef-9450-0e82ab08bb3a\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-f2mbl" Sep 30 12:54:58 crc kubenswrapper[4672]: I0930 12:54:58.263950 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c3a5c5b9-3aa7-4fef-9450-0e82ab08bb3a-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-f2mbl\" (UID: \"c3a5c5b9-3aa7-4fef-9450-0e82ab08bb3a\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-f2mbl" Sep 30 12:54:58 crc kubenswrapper[4672]: I0930 12:54:58.277369 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jzgz4\" (UniqueName: \"kubernetes.io/projected/c3a5c5b9-3aa7-4fef-9450-0e82ab08bb3a-kube-api-access-jzgz4\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-f2mbl\" (UID: \"c3a5c5b9-3aa7-4fef-9450-0e82ab08bb3a\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-f2mbl" Sep 30 12:54:58 crc kubenswrapper[4672]: I0930 12:54:58.432344 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-f2mbl" Sep 30 12:54:58 crc kubenswrapper[4672]: I0930 12:54:58.943870 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-f2mbl"] Sep 30 12:54:59 crc kubenswrapper[4672]: I0930 12:54:59.047576 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-f2mbl" event={"ID":"c3a5c5b9-3aa7-4fef-9450-0e82ab08bb3a","Type":"ContainerStarted","Data":"2f1ce28ff92706dd346e0ad13d7ee5320d10ca08008c0cb5667c1b72e62d3cf8"} Sep 30 12:55:00 crc kubenswrapper[4672]: I0930 12:55:00.056902 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-f2mbl" event={"ID":"c3a5c5b9-3aa7-4fef-9450-0e82ab08bb3a","Type":"ContainerStarted","Data":"b0770b65547cdcb56e787e43cea89c5bbff1ca572aa551d71cb10a8789177b76"} Sep 30 12:55:00 crc kubenswrapper[4672]: I0930 12:55:00.080873 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-f2mbl" podStartSLOduration=1.6262446499999998 podStartE2EDuration="2.08085096s" podCreationTimestamp="2025-09-30 12:54:58 +0000 UTC" firstStartedPulling="2025-09-30 12:54:58.948380601 +0000 UTC m=+1990.217618247" lastFinishedPulling="2025-09-30 12:54:59.402986901 +0000 UTC m=+1990.672224557" observedRunningTime="2025-09-30 12:55:00.072181619 +0000 UTC m=+1991.341419275" watchObservedRunningTime="2025-09-30 12:55:00.08085096 +0000 UTC m=+1991.350088616" Sep 30 12:55:10 crc kubenswrapper[4672]: I0930 12:55:10.165497 4672 generic.go:334] "Generic (PLEG): container finished" podID="c3a5c5b9-3aa7-4fef-9450-0e82ab08bb3a" containerID="b0770b65547cdcb56e787e43cea89c5bbff1ca572aa551d71cb10a8789177b76" exitCode=0 Sep 30 12:55:10 crc kubenswrapper[4672]: I0930 12:55:10.165595 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-f2mbl" 
event={"ID":"c3a5c5b9-3aa7-4fef-9450-0e82ab08bb3a","Type":"ContainerDied","Data":"b0770b65547cdcb56e787e43cea89c5bbff1ca572aa551d71cb10a8789177b76"} Sep 30 12:55:11 crc kubenswrapper[4672]: I0930 12:55:11.627083 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-f2mbl" Sep 30 12:55:11 crc kubenswrapper[4672]: I0930 12:55:11.726580 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jzgz4\" (UniqueName: \"kubernetes.io/projected/c3a5c5b9-3aa7-4fef-9450-0e82ab08bb3a-kube-api-access-jzgz4\") pod \"c3a5c5b9-3aa7-4fef-9450-0e82ab08bb3a\" (UID: \"c3a5c5b9-3aa7-4fef-9450-0e82ab08bb3a\") " Sep 30 12:55:11 crc kubenswrapper[4672]: I0930 12:55:11.726691 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c3a5c5b9-3aa7-4fef-9450-0e82ab08bb3a-ssh-key\") pod \"c3a5c5b9-3aa7-4fef-9450-0e82ab08bb3a\" (UID: \"c3a5c5b9-3aa7-4fef-9450-0e82ab08bb3a\") " Sep 30 12:55:11 crc kubenswrapper[4672]: I0930 12:55:11.726784 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c3a5c5b9-3aa7-4fef-9450-0e82ab08bb3a-inventory\") pod \"c3a5c5b9-3aa7-4fef-9450-0e82ab08bb3a\" (UID: \"c3a5c5b9-3aa7-4fef-9450-0e82ab08bb3a\") " Sep 30 12:55:11 crc kubenswrapper[4672]: I0930 12:55:11.732691 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c3a5c5b9-3aa7-4fef-9450-0e82ab08bb3a-kube-api-access-jzgz4" (OuterVolumeSpecName: "kube-api-access-jzgz4") pod "c3a5c5b9-3aa7-4fef-9450-0e82ab08bb3a" (UID: "c3a5c5b9-3aa7-4fef-9450-0e82ab08bb3a"). InnerVolumeSpecName "kube-api-access-jzgz4". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 12:55:11 crc kubenswrapper[4672]: I0930 12:55:11.754977 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3a5c5b9-3aa7-4fef-9450-0e82ab08bb3a-inventory" (OuterVolumeSpecName: "inventory") pod "c3a5c5b9-3aa7-4fef-9450-0e82ab08bb3a" (UID: "c3a5c5b9-3aa7-4fef-9450-0e82ab08bb3a"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:55:11 crc kubenswrapper[4672]: I0930 12:55:11.768280 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3a5c5b9-3aa7-4fef-9450-0e82ab08bb3a-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "c3a5c5b9-3aa7-4fef-9450-0e82ab08bb3a" (UID: "c3a5c5b9-3aa7-4fef-9450-0e82ab08bb3a"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:55:11 crc kubenswrapper[4672]: I0930 12:55:11.829101 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jzgz4\" (UniqueName: \"kubernetes.io/projected/c3a5c5b9-3aa7-4fef-9450-0e82ab08bb3a-kube-api-access-jzgz4\") on node \"crc\" DevicePath \"\"" Sep 30 12:55:11 crc kubenswrapper[4672]: I0930 12:55:11.829132 4672 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c3a5c5b9-3aa7-4fef-9450-0e82ab08bb3a-ssh-key\") on node \"crc\" DevicePath \"\"" Sep 30 12:55:11 crc kubenswrapper[4672]: I0930 12:55:11.829143 4672 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c3a5c5b9-3aa7-4fef-9450-0e82ab08bb3a-inventory\") on node \"crc\" DevicePath \"\"" Sep 30 12:55:12 crc kubenswrapper[4672]: I0930 12:55:12.203283 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-f2mbl" event={"ID":"c3a5c5b9-3aa7-4fef-9450-0e82ab08bb3a","Type":"ContainerDied","Data":"2f1ce28ff92706dd346e0ad13d7ee5320d10ca08008c0cb5667c1b72e62d3cf8"} Sep 30 12:55:12 crc kubenswrapper[4672]: I0930 12:55:12.203677 4672 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2f1ce28ff92706dd346e0ad13d7ee5320d10ca08008c0cb5667c1b72e62d3cf8" Sep 30 12:55:12 crc kubenswrapper[4672]: I0930 12:55:12.203522 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-f2mbl" Sep 30 12:55:12 crc kubenswrapper[4672]: E0930 12:55:12.286781 4672 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc3a5c5b9_3aa7_4fef_9450_0e82ab08bb3a.slice\": RecentStats: unable to find data in memory cache]" Sep 30 12:55:12 crc kubenswrapper[4672]: I0930 12:55:12.309141 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qjc69"] Sep 30 12:55:12 crc kubenswrapper[4672]: E0930 12:55:12.309557 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3a5c5b9-3aa7-4fef-9450-0e82ab08bb3a" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Sep 30 12:55:12 crc kubenswrapper[4672]: I0930 12:55:12.309573 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3a5c5b9-3aa7-4fef-9450-0e82ab08bb3a" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Sep 30 12:55:12 crc kubenswrapper[4672]: I0930 12:55:12.310587 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="c3a5c5b9-3aa7-4fef-9450-0e82ab08bb3a" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Sep 30 12:55:12 crc kubenswrapper[4672]: I0930 12:55:12.311243 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qjc69" Sep 30 12:55:12 crc kubenswrapper[4672]: I0930 12:55:12.313565 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0" Sep 30 12:55:12 crc kubenswrapper[4672]: I0930 12:55:12.314788 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Sep 30 12:55:12 crc kubenswrapper[4672]: I0930 12:55:12.315071 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0" Sep 30 12:55:12 crc kubenswrapper[4672]: I0930 12:55:12.315197 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Sep 30 12:55:12 crc kubenswrapper[4672]: I0930 12:55:12.315341 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-default-certs-0" Sep 30 12:55:12 crc kubenswrapper[4672]: I0930 12:55:12.315480 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Sep 30 12:55:12 crc kubenswrapper[4672]: I0930 12:55:12.315938 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-b6dbr" Sep 30 12:55:12 crc kubenswrapper[4672]: I0930 12:55:12.316696 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0" Sep 30 12:55:12 crc kubenswrapper[4672]: I0930 12:55:12.330581 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qjc69"] Sep 30 12:55:12 crc kubenswrapper[4672]: I0930 12:55:12.444294 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lw7xw\" (UniqueName: \"kubernetes.io/projected/e44e3b27-d209-49da-93b2-ed646da0650e-kube-api-access-lw7xw\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qjc69\" (UID: \"e44e3b27-d209-49da-93b2-ed646da0650e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qjc69" Sep 30 12:55:12 crc kubenswrapper[4672]: I0930 12:55:12.444344 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e44e3b27-d209-49da-93b2-ed646da0650e-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qjc69\" (UID: \"e44e3b27-d209-49da-93b2-ed646da0650e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qjc69" Sep 30 12:55:12 crc kubenswrapper[4672]: I0930 12:55:12.444376 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e44e3b27-d209-49da-93b2-ed646da0650e-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qjc69\" (UID: \"e44e3b27-d209-49da-93b2-ed646da0650e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qjc69" Sep 30 12:55:12 crc kubenswrapper[4672]: I0930 12:55:12.444400 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e44e3b27-d209-49da-93b2-ed646da0650e-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-qjc69\" (UID: \"e44e3b27-d209-49da-93b2-ed646da0650e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qjc69" Sep 30 12:55:12 crc kubenswrapper[4672]: I0930 12:55:12.444435 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e44e3b27-d209-49da-93b2-ed646da0650e-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qjc69\" (UID: \"e44e3b27-d209-49da-93b2-ed646da0650e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qjc69" Sep 30 12:55:12 crc kubenswrapper[4672]: I0930 12:55:12.444456 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e44e3b27-d209-49da-93b2-ed646da0650e-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qjc69\" (UID: \"e44e3b27-d209-49da-93b2-ed646da0650e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qjc69" Sep 30 12:55:12 crc kubenswrapper[4672]: I0930 12:55:12.444657 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e44e3b27-d209-49da-93b2-ed646da0650e-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qjc69\" (UID: \"e44e3b27-d209-49da-93b2-ed646da0650e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qjc69" Sep 30 12:55:12 crc kubenswrapper[4672]: I0930 12:55:12.444722 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e44e3b27-d209-49da-93b2-ed646da0650e-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qjc69\" (UID: \"e44e3b27-d209-49da-93b2-ed646da0650e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qjc69" Sep 30 12:55:12 crc kubenswrapper[4672]: I0930 12:55:12.444778 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e44e3b27-d209-49da-93b2-ed646da0650e-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qjc69\" (UID: \"e44e3b27-d209-49da-93b2-ed646da0650e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qjc69" Sep 30 12:55:12 crc kubenswrapper[4672]: I0930 12:55:12.444846 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e44e3b27-d209-49da-93b2-ed646da0650e-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qjc69\" (UID: \"e44e3b27-d209-49da-93b2-ed646da0650e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qjc69" Sep 30 12:55:12 crc kubenswrapper[4672]: I0930 12:55:12.445021 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e44e3b27-d209-49da-93b2-ed646da0650e-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qjc69\" (UID: \"e44e3b27-d209-49da-93b2-ed646da0650e\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qjc69" Sep 30 12:55:12 crc kubenswrapper[4672]: I0930 12:55:12.445052 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e44e3b27-d209-49da-93b2-ed646da0650e-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qjc69\" (UID: \"e44e3b27-d209-49da-93b2-ed646da0650e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qjc69" Sep 30 12:55:12 crc kubenswrapper[4672]: I0930 12:55:12.445116 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e44e3b27-d209-49da-93b2-ed646da0650e-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qjc69\" (UID: \"e44e3b27-d209-49da-93b2-ed646da0650e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qjc69" Sep 30 12:55:12 crc kubenswrapper[4672]: I0930 12:55:12.445157 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e44e3b27-d209-49da-93b2-ed646da0650e-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qjc69\" (UID: \"e44e3b27-d209-49da-93b2-ed646da0650e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qjc69" Sep 30 12:55:12 crc kubenswrapper[4672]: I0930 12:55:12.546546 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e44e3b27-d209-49da-93b2-ed646da0650e-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qjc69\" (UID: \"e44e3b27-d209-49da-93b2-ed646da0650e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qjc69" Sep 30 12:55:12 crc kubenswrapper[4672]: I0930 12:55:12.546683 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e44e3b27-d209-49da-93b2-ed646da0650e-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qjc69\" (UID: \"e44e3b27-d209-49da-93b2-ed646da0650e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qjc69" Sep 30 12:55:12 crc kubenswrapper[4672]: I0930 12:55:12.548421 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e44e3b27-d209-49da-93b2-ed646da0650e-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qjc69\" (UID: \"e44e3b27-d209-49da-93b2-ed646da0650e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qjc69" Sep 30 12:55:12 crc kubenswrapper[4672]: I0930 12:55:12.548481 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e44e3b27-d209-49da-93b2-ed646da0650e-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qjc69\" (UID: \"e44e3b27-d209-49da-93b2-ed646da0650e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qjc69" Sep 30 12:55:12 crc kubenswrapper[4672]: I0930 12:55:12.548504 4672 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e44e3b27-d209-49da-93b2-ed646da0650e-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qjc69\" (UID: \"e44e3b27-d209-49da-93b2-ed646da0650e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qjc69" Sep 30 12:55:12 crc kubenswrapper[4672]: I0930 12:55:12.548578 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lw7xw\" (UniqueName: \"kubernetes.io/projected/e44e3b27-d209-49da-93b2-ed646da0650e-kube-api-access-lw7xw\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qjc69\" (UID: \"e44e3b27-d209-49da-93b2-ed646da0650e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qjc69" Sep 30 12:55:12 crc kubenswrapper[4672]: I0930 12:55:12.548643 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e44e3b27-d209-49da-93b2-ed646da0650e-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qjc69\" (UID: \"e44e3b27-d209-49da-93b2-ed646da0650e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qjc69" Sep 30 12:55:12 crc kubenswrapper[4672]: I0930 12:55:12.548683 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e44e3b27-d209-49da-93b2-ed646da0650e-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qjc69\" (UID: \"e44e3b27-d209-49da-93b2-ed646da0650e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qjc69" Sep 30 12:55:12 crc kubenswrapper[4672]: I0930 12:55:12.548702 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e44e3b27-d209-49da-93b2-ed646da0650e-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qjc69\" (UID: \"e44e3b27-d209-49da-93b2-ed646da0650e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qjc69" Sep 30 12:55:12 crc kubenswrapper[4672]: I0930 12:55:12.548752 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e44e3b27-d209-49da-93b2-ed646da0650e-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qjc69\" (UID: \"e44e3b27-d209-49da-93b2-ed646da0650e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qjc69" Sep 30 12:55:12 crc kubenswrapper[4672]: I0930 12:55:12.548781 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e44e3b27-d209-49da-93b2-ed646da0650e-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qjc69\" (UID: \"e44e3b27-d209-49da-93b2-ed646da0650e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qjc69" Sep 30 12:55:12 crc kubenswrapper[4672]: I0930 12:55:12.548844 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e44e3b27-d209-49da-93b2-ed646da0650e-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qjc69\" 
(UID: \"e44e3b27-d209-49da-93b2-ed646da0650e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qjc69" Sep 30 12:55:12 crc kubenswrapper[4672]: I0930 12:55:12.548869 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e44e3b27-d209-49da-93b2-ed646da0650e-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qjc69\" (UID: \"e44e3b27-d209-49da-93b2-ed646da0650e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qjc69" Sep 30 12:55:12 crc kubenswrapper[4672]: I0930 12:55:12.548893 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e44e3b27-d209-49da-93b2-ed646da0650e-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qjc69\" (UID: \"e44e3b27-d209-49da-93b2-ed646da0650e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qjc69" Sep 30 12:55:12 crc kubenswrapper[4672]: I0930 12:55:12.554062 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e44e3b27-d209-49da-93b2-ed646da0650e-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qjc69\" (UID: \"e44e3b27-d209-49da-93b2-ed646da0650e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qjc69" Sep 30 12:55:12 crc kubenswrapper[4672]: I0930 12:55:12.555097 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e44e3b27-d209-49da-93b2-ed646da0650e-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qjc69\" (UID: \"e44e3b27-d209-49da-93b2-ed646da0650e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qjc69" Sep 30 12:55:12 crc kubenswrapper[4672]: I0930 12:55:12.556137 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e44e3b27-d209-49da-93b2-ed646da0650e-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qjc69\" (UID: \"e44e3b27-d209-49da-93b2-ed646da0650e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qjc69" Sep 30 12:55:12 crc kubenswrapper[4672]: I0930 12:55:12.556507 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e44e3b27-d209-49da-93b2-ed646da0650e-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qjc69\" (UID: \"e44e3b27-d209-49da-93b2-ed646da0650e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qjc69" Sep 30 12:55:12 crc kubenswrapper[4672]: I0930 12:55:12.556817 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e44e3b27-d209-49da-93b2-ed646da0650e-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qjc69\" (UID: \"e44e3b27-d209-49da-93b2-ed646da0650e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qjc69" Sep 30 12:55:12 crc kubenswrapper[4672]: I0930 12:55:12.557789 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/e44e3b27-d209-49da-93b2-ed646da0650e-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qjc69\" (UID: \"e44e3b27-d209-49da-93b2-ed646da0650e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qjc69" Sep 30 12:55:12 crc kubenswrapper[4672]: I0930 12:55:12.557820 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e44e3b27-d209-49da-93b2-ed646da0650e-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qjc69\" (UID: \"e44e3b27-d209-49da-93b2-ed646da0650e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qjc69" Sep 30 12:55:12 crc kubenswrapper[4672]: I0930 12:55:12.558903 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e44e3b27-d209-49da-93b2-ed646da0650e-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qjc69\" (UID: \"e44e3b27-d209-49da-93b2-ed646da0650e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qjc69" Sep 30 12:55:12 crc kubenswrapper[4672]: I0930 12:55:12.559165 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e44e3b27-d209-49da-93b2-ed646da0650e-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qjc69\" (UID: \"e44e3b27-d209-49da-93b2-ed646da0650e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qjc69" Sep 30 12:55:12 crc kubenswrapper[4672]: I0930 12:55:12.561943 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e44e3b27-d209-49da-93b2-ed646da0650e-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qjc69\" (UID: \"e44e3b27-d209-49da-93b2-ed646da0650e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qjc69" Sep 30 12:55:12 crc kubenswrapper[4672]: I0930 12:55:12.562003 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e44e3b27-d209-49da-93b2-ed646da0650e-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qjc69\" (UID: \"e44e3b27-d209-49da-93b2-ed646da0650e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qjc69" Sep 30 12:55:12 crc kubenswrapper[4672]: I0930 12:55:12.563501 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e44e3b27-d209-49da-93b2-ed646da0650e-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qjc69\" (UID: \"e44e3b27-d209-49da-93b2-ed646da0650e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qjc69" Sep 30 12:55:12 crc kubenswrapper[4672]: I0930 12:55:12.564549 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e44e3b27-d209-49da-93b2-ed646da0650e-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qjc69\" (UID: \"e44e3b27-d209-49da-93b2-ed646da0650e\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qjc69" Sep 30 12:55:12 crc kubenswrapper[4672]: I0930 12:55:12.572674 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lw7xw\" (UniqueName: \"kubernetes.io/projected/e44e3b27-d209-49da-93b2-ed646da0650e-kube-api-access-lw7xw\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qjc69\" (UID: \"e44e3b27-d209-49da-93b2-ed646da0650e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qjc69" Sep 30 12:55:12 crc kubenswrapper[4672]: I0930 12:55:12.677421 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qjc69" Sep 30 12:55:13 crc kubenswrapper[4672]: I0930 12:55:13.185504 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qjc69"] Sep 30 12:55:13 crc kubenswrapper[4672]: I0930 12:55:13.213402 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qjc69" event={"ID":"e44e3b27-d209-49da-93b2-ed646da0650e","Type":"ContainerStarted","Data":"34ef0ddb6540d97053f82db5b7deb3492a867cb2d8800024aa0e07c4bda95234"} Sep 30 12:55:14 crc kubenswrapper[4672]: I0930 12:55:14.224024 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qjc69" event={"ID":"e44e3b27-d209-49da-93b2-ed646da0650e","Type":"ContainerStarted","Data":"e382b4cc2a3939e8d9119020117ec2dba1cdf4a1080fc061d3c1ae5df35dcf16"} Sep 30 12:55:14 crc kubenswrapper[4672]: I0930 12:55:14.249227 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qjc69" podStartSLOduration=1.6670268639999999 podStartE2EDuration="2.249210734s" podCreationTimestamp="2025-09-30 12:55:12 +0000 UTC" firstStartedPulling="2025-09-30 12:55:13.194944998 +0000 UTC m=+2004.464182654" lastFinishedPulling="2025-09-30 12:55:13.777128848 +0000 UTC m=+2005.046366524" observedRunningTime="2025-09-30 12:55:14.246655179 +0000 UTC m=+2005.515892825" watchObservedRunningTime="2025-09-30 12:55:14.249210734 +0000 UTC m=+2005.518448380" Sep 30 12:55:55 crc kubenswrapper[4672]: I0930 12:55:55.710585 4672 generic.go:334] "Generic (PLEG): container finished" podID="e44e3b27-d209-49da-93b2-ed646da0650e" containerID="e382b4cc2a3939e8d9119020117ec2dba1cdf4a1080fc061d3c1ae5df35dcf16" exitCode=0 Sep 30 12:55:55 crc kubenswrapper[4672]: I0930 12:55:55.711336 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qjc69" event={"ID":"e44e3b27-d209-49da-93b2-ed646da0650e","Type":"ContainerDied","Data":"e382b4cc2a3939e8d9119020117ec2dba1cdf4a1080fc061d3c1ae5df35dcf16"} Sep 30 12:55:57 crc kubenswrapper[4672]: I0930 12:55:57.208672 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qjc69" Sep 30 12:55:57 crc kubenswrapper[4672]: I0930 12:55:57.386046 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e44e3b27-d209-49da-93b2-ed646da0650e-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"e44e3b27-d209-49da-93b2-ed646da0650e\" (UID: \"e44e3b27-d209-49da-93b2-ed646da0650e\") " Sep 30 12:55:57 crc kubenswrapper[4672]: I0930 12:55:57.386453 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e44e3b27-d209-49da-93b2-ed646da0650e-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"e44e3b27-d209-49da-93b2-ed646da0650e\" (UID: \"e44e3b27-d209-49da-93b2-ed646da0650e\") " Sep 30 12:55:57 crc kubenswrapper[4672]: I0930 12:55:57.386479 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e44e3b27-d209-49da-93b2-ed646da0650e-libvirt-combined-ca-bundle\") pod \"e44e3b27-d209-49da-93b2-ed646da0650e\" (UID: \"e44e3b27-d209-49da-93b2-ed646da0650e\") " Sep 30 12:55:57 crc kubenswrapper[4672]: I0930 12:55:57.386506 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e44e3b27-d209-49da-93b2-ed646da0650e-bootstrap-combined-ca-bundle\") pod \"e44e3b27-d209-49da-93b2-ed646da0650e\" (UID: \"e44e3b27-d209-49da-93b2-ed646da0650e\") " Sep 30 12:55:57 crc kubenswrapper[4672]: I0930 12:55:57.386539 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e44e3b27-d209-49da-93b2-ed646da0650e-openstack-edpm-ipam-ovn-default-certs-0\") pod \"e44e3b27-d209-49da-93b2-ed646da0650e\" (UID: \"e44e3b27-d209-49da-93b2-ed646da0650e\") " Sep 30 12:55:57 crc kubenswrapper[4672]: I0930 12:55:57.386606 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e44e3b27-d209-49da-93b2-ed646da0650e-telemetry-combined-ca-bundle\") pod \"e44e3b27-d209-49da-93b2-ed646da0650e\" (UID: \"e44e3b27-d209-49da-93b2-ed646da0650e\") " Sep 30 12:55:57 crc kubenswrapper[4672]: I0930 12:55:57.386626 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e44e3b27-d209-49da-93b2-ed646da0650e-ssh-key\") pod \"e44e3b27-d209-49da-93b2-ed646da0650e\" (UID: \"e44e3b27-d209-49da-93b2-ed646da0650e\") " Sep 30 12:55:57 crc kubenswrapper[4672]: I0930 12:55:57.386665 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e44e3b27-d209-49da-93b2-ed646da0650e-neutron-metadata-combined-ca-bundle\") pod \"e44e3b27-d209-49da-93b2-ed646da0650e\" (UID: \"e44e3b27-d209-49da-93b2-ed646da0650e\") " Sep 30 12:55:57 crc kubenswrapper[4672]: I0930 12:55:57.386731 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e44e3b27-d209-49da-93b2-ed646da0650e-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod 
\"e44e3b27-d209-49da-93b2-ed646da0650e\" (UID: \"e44e3b27-d209-49da-93b2-ed646da0650e\") " Sep 30 12:55:57 crc kubenswrapper[4672]: I0930 12:55:57.386764 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e44e3b27-d209-49da-93b2-ed646da0650e-nova-combined-ca-bundle\") pod \"e44e3b27-d209-49da-93b2-ed646da0650e\" (UID: \"e44e3b27-d209-49da-93b2-ed646da0650e\") " Sep 30 12:55:57 crc kubenswrapper[4672]: I0930 12:55:57.386807 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lw7xw\" (UniqueName: \"kubernetes.io/projected/e44e3b27-d209-49da-93b2-ed646da0650e-kube-api-access-lw7xw\") pod \"e44e3b27-d209-49da-93b2-ed646da0650e\" (UID: \"e44e3b27-d209-49da-93b2-ed646da0650e\") " Sep 30 12:55:57 crc kubenswrapper[4672]: I0930 12:55:57.386845 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e44e3b27-d209-49da-93b2-ed646da0650e-ovn-combined-ca-bundle\") pod \"e44e3b27-d209-49da-93b2-ed646da0650e\" (UID: \"e44e3b27-d209-49da-93b2-ed646da0650e\") " Sep 30 12:55:57 crc kubenswrapper[4672]: I0930 12:55:57.386906 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e44e3b27-d209-49da-93b2-ed646da0650e-repo-setup-combined-ca-bundle\") pod \"e44e3b27-d209-49da-93b2-ed646da0650e\" (UID: \"e44e3b27-d209-49da-93b2-ed646da0650e\") " Sep 30 12:55:57 crc kubenswrapper[4672]: I0930 12:55:57.386941 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e44e3b27-d209-49da-93b2-ed646da0650e-inventory\") pod \"e44e3b27-d209-49da-93b2-ed646da0650e\" (UID: \"e44e3b27-d209-49da-93b2-ed646da0650e\") " Sep 30 12:55:57 crc kubenswrapper[4672]: I0930 12:55:57.393591 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e44e3b27-d209-49da-93b2-ed646da0650e-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "e44e3b27-d209-49da-93b2-ed646da0650e" (UID: "e44e3b27-d209-49da-93b2-ed646da0650e"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:55:57 crc kubenswrapper[4672]: I0930 12:55:57.394604 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e44e3b27-d209-49da-93b2-ed646da0650e-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "e44e3b27-d209-49da-93b2-ed646da0650e" (UID: "e44e3b27-d209-49da-93b2-ed646da0650e"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:55:57 crc kubenswrapper[4672]: I0930 12:55:57.395148 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e44e3b27-d209-49da-93b2-ed646da0650e-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "e44e3b27-d209-49da-93b2-ed646da0650e" (UID: "e44e3b27-d209-49da-93b2-ed646da0650e"). InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 12:55:57 crc kubenswrapper[4672]: I0930 12:55:57.395316 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e44e3b27-d209-49da-93b2-ed646da0650e-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "e44e3b27-d209-49da-93b2-ed646da0650e" (UID: "e44e3b27-d209-49da-93b2-ed646da0650e"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 12:55:57 crc kubenswrapper[4672]: I0930 12:55:57.395401 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e44e3b27-d209-49da-93b2-ed646da0650e-kube-api-access-lw7xw" (OuterVolumeSpecName: "kube-api-access-lw7xw") pod "e44e3b27-d209-49da-93b2-ed646da0650e" (UID: "e44e3b27-d209-49da-93b2-ed646da0650e"). InnerVolumeSpecName "kube-api-access-lw7xw". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 12:55:57 crc kubenswrapper[4672]: I0930 12:55:57.396642 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e44e3b27-d209-49da-93b2-ed646da0650e-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "e44e3b27-d209-49da-93b2-ed646da0650e" (UID: "e44e3b27-d209-49da-93b2-ed646da0650e"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:55:57 crc kubenswrapper[4672]: I0930 12:55:57.396647 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e44e3b27-d209-49da-93b2-ed646da0650e-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "e44e3b27-d209-49da-93b2-ed646da0650e" (UID: "e44e3b27-d209-49da-93b2-ed646da0650e"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 12:55:57 crc kubenswrapper[4672]: I0930 12:55:57.397814 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e44e3b27-d209-49da-93b2-ed646da0650e-openstack-edpm-ipam-telemetry-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-default-certs-0") pod "e44e3b27-d209-49da-93b2-ed646da0650e" (UID: "e44e3b27-d209-49da-93b2-ed646da0650e"). InnerVolumeSpecName "openstack-edpm-ipam-telemetry-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 12:55:57 crc kubenswrapper[4672]: I0930 12:55:57.398692 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e44e3b27-d209-49da-93b2-ed646da0650e-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "e44e3b27-d209-49da-93b2-ed646da0650e" (UID: "e44e3b27-d209-49da-93b2-ed646da0650e"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:55:57 crc kubenswrapper[4672]: I0930 12:55:57.399419 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e44e3b27-d209-49da-93b2-ed646da0650e-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "e44e3b27-d209-49da-93b2-ed646da0650e" (UID: "e44e3b27-d209-49da-93b2-ed646da0650e"). InnerVolumeSpecName "ovn-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:55:57 crc kubenswrapper[4672]: I0930 12:55:57.400183 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e44e3b27-d209-49da-93b2-ed646da0650e-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "e44e3b27-d209-49da-93b2-ed646da0650e" (UID: "e44e3b27-d209-49da-93b2-ed646da0650e"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:55:57 crc kubenswrapper[4672]: I0930 12:55:57.406841 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e44e3b27-d209-49da-93b2-ed646da0650e-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "e44e3b27-d209-49da-93b2-ed646da0650e" (UID: "e44e3b27-d209-49da-93b2-ed646da0650e"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:55:57 crc kubenswrapper[4672]: I0930 12:55:57.425166 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e44e3b27-d209-49da-93b2-ed646da0650e-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "e44e3b27-d209-49da-93b2-ed646da0650e" (UID: "e44e3b27-d209-49da-93b2-ed646da0650e"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:55:57 crc kubenswrapper[4672]: I0930 12:55:57.428972 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e44e3b27-d209-49da-93b2-ed646da0650e-inventory" (OuterVolumeSpecName: "inventory") pod "e44e3b27-d209-49da-93b2-ed646da0650e" (UID: "e44e3b27-d209-49da-93b2-ed646da0650e"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:55:57 crc kubenswrapper[4672]: I0930 12:55:57.497376 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lw7xw\" (UniqueName: \"kubernetes.io/projected/e44e3b27-d209-49da-93b2-ed646da0650e-kube-api-access-lw7xw\") on node \"crc\" DevicePath \"\"" Sep 30 12:55:57 crc kubenswrapper[4672]: I0930 12:55:57.497417 4672 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e44e3b27-d209-49da-93b2-ed646da0650e-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 12:55:57 crc kubenswrapper[4672]: I0930 12:55:57.497430 4672 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e44e3b27-d209-49da-93b2-ed646da0650e-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 12:55:57 crc kubenswrapper[4672]: I0930 12:55:57.497443 4672 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e44e3b27-d209-49da-93b2-ed646da0650e-inventory\") on node \"crc\" DevicePath \"\"" Sep 30 12:55:57 crc kubenswrapper[4672]: I0930 12:55:57.497457 4672 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e44e3b27-d209-49da-93b2-ed646da0650e-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Sep 30 12:55:57 crc kubenswrapper[4672]: I0930 12:55:57.497471 4672 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e44e3b27-d209-49da-93b2-ed646da0650e-openstack-edpm-ipam-telemetry-default-certs-0\") on node \"crc\" DevicePath \"\"" Sep 30 12:55:57 crc kubenswrapper[4672]: I0930 12:55:57.497485 4672 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e44e3b27-d209-49da-93b2-ed646da0650e-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 12:55:57 crc kubenswrapper[4672]: I0930 12:55:57.497503 4672 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e44e3b27-d209-49da-93b2-ed646da0650e-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 12:55:57 crc kubenswrapper[4672]: I0930 12:55:57.497518 4672 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e44e3b27-d209-49da-93b2-ed646da0650e-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Sep 30 12:55:57 crc kubenswrapper[4672]: I0930 12:55:57.497535 4672 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e44e3b27-d209-49da-93b2-ed646da0650e-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 12:55:57 crc kubenswrapper[4672]: I0930 12:55:57.497547 4672 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e44e3b27-d209-49da-93b2-ed646da0650e-ssh-key\") on node \"crc\" DevicePath \"\"" Sep 30 12:55:57 crc kubenswrapper[4672]: I0930 12:55:57.497561 4672 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e44e3b27-d209-49da-93b2-ed646da0650e-neutron-metadata-combined-ca-bundle\") on node 
\"crc\" DevicePath \"\"" Sep 30 12:55:57 crc kubenswrapper[4672]: I0930 12:55:57.497573 4672 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e44e3b27-d209-49da-93b2-ed646da0650e-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Sep 30 12:55:57 crc kubenswrapper[4672]: I0930 12:55:57.497585 4672 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e44e3b27-d209-49da-93b2-ed646da0650e-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 12:55:57 crc kubenswrapper[4672]: I0930 12:55:57.736546 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qjc69" event={"ID":"e44e3b27-d209-49da-93b2-ed646da0650e","Type":"ContainerDied","Data":"34ef0ddb6540d97053f82db5b7deb3492a867cb2d8800024aa0e07c4bda95234"} Sep 30 12:55:57 crc kubenswrapper[4672]: I0930 12:55:57.736876 4672 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="34ef0ddb6540d97053f82db5b7deb3492a867cb2d8800024aa0e07c4bda95234" Sep 30 12:55:57 crc kubenswrapper[4672]: I0930 12:55:57.736774 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qjc69" Sep 30 12:55:57 crc kubenswrapper[4672]: I0930 12:55:57.823851 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-mfrgw"] Sep 30 12:55:57 crc kubenswrapper[4672]: E0930 12:55:57.824232 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e44e3b27-d209-49da-93b2-ed646da0650e" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Sep 30 12:55:57 crc kubenswrapper[4672]: I0930 12:55:57.824251 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="e44e3b27-d209-49da-93b2-ed646da0650e" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Sep 30 12:55:57 crc kubenswrapper[4672]: I0930 12:55:57.824483 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="e44e3b27-d209-49da-93b2-ed646da0650e" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Sep 30 12:55:57 crc kubenswrapper[4672]: I0930 12:55:57.825116 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-mfrgw" Sep 30 12:55:57 crc kubenswrapper[4672]: I0930 12:55:57.827614 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Sep 30 12:55:57 crc kubenswrapper[4672]: I0930 12:55:57.827697 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Sep 30 12:55:57 crc kubenswrapper[4672]: I0930 12:55:57.827738 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Sep 30 12:55:57 crc kubenswrapper[4672]: I0930 12:55:57.828516 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-b6dbr" Sep 30 12:55:57 crc kubenswrapper[4672]: I0930 12:55:57.828868 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Sep 30 12:55:57 crc kubenswrapper[4672]: I0930 12:55:57.841429 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-mfrgw"] Sep 30 12:55:57 crc kubenswrapper[4672]: I0930 12:55:57.905150 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dm5pv\" (UniqueName: \"kubernetes.io/projected/de8ad7f9-8f0f-46fb-8649-9dd7bb8abc16-kube-api-access-dm5pv\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-mfrgw\" (UID: \"de8ad7f9-8f0f-46fb-8649-9dd7bb8abc16\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-mfrgw" Sep 30 12:55:57 crc kubenswrapper[4672]: I0930 12:55:57.905278 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/de8ad7f9-8f0f-46fb-8649-9dd7bb8abc16-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-mfrgw\" (UID: \"de8ad7f9-8f0f-46fb-8649-9dd7bb8abc16\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-mfrgw" Sep 30 12:55:57 crc kubenswrapper[4672]: I0930 12:55:57.905328 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de8ad7f9-8f0f-46fb-8649-9dd7bb8abc16-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-mfrgw\" (UID: \"de8ad7f9-8f0f-46fb-8649-9dd7bb8abc16\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-mfrgw" Sep 30 12:55:57 crc kubenswrapper[4672]: I0930 12:55:57.905558 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/de8ad7f9-8f0f-46fb-8649-9dd7bb8abc16-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-mfrgw\" (UID: \"de8ad7f9-8f0f-46fb-8649-9dd7bb8abc16\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-mfrgw" Sep 30 12:55:57 crc kubenswrapper[4672]: I0930 12:55:57.905676 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/de8ad7f9-8f0f-46fb-8649-9dd7bb8abc16-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-mfrgw\" (UID: \"de8ad7f9-8f0f-46fb-8649-9dd7bb8abc16\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-mfrgw" Sep 30 12:55:58 crc kubenswrapper[4672]: I0930 12:55:58.007568 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dm5pv\" 
(UniqueName: \"kubernetes.io/projected/de8ad7f9-8f0f-46fb-8649-9dd7bb8abc16-kube-api-access-dm5pv\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-mfrgw\" (UID: \"de8ad7f9-8f0f-46fb-8649-9dd7bb8abc16\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-mfrgw" Sep 30 12:55:58 crc kubenswrapper[4672]: I0930 12:55:58.007685 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/de8ad7f9-8f0f-46fb-8649-9dd7bb8abc16-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-mfrgw\" (UID: \"de8ad7f9-8f0f-46fb-8649-9dd7bb8abc16\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-mfrgw" Sep 30 12:55:58 crc kubenswrapper[4672]: I0930 12:55:58.007744 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de8ad7f9-8f0f-46fb-8649-9dd7bb8abc16-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-mfrgw\" (UID: \"de8ad7f9-8f0f-46fb-8649-9dd7bb8abc16\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-mfrgw" Sep 30 12:55:58 crc kubenswrapper[4672]: I0930 12:55:58.007816 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/de8ad7f9-8f0f-46fb-8649-9dd7bb8abc16-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-mfrgw\" (UID: \"de8ad7f9-8f0f-46fb-8649-9dd7bb8abc16\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-mfrgw" Sep 30 12:55:58 crc kubenswrapper[4672]: I0930 12:55:58.007857 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/de8ad7f9-8f0f-46fb-8649-9dd7bb8abc16-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-mfrgw\" (UID: \"de8ad7f9-8f0f-46fb-8649-9dd7bb8abc16\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-mfrgw" Sep 30 12:55:58 crc kubenswrapper[4672]: I0930 12:55:58.008775 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/de8ad7f9-8f0f-46fb-8649-9dd7bb8abc16-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-mfrgw\" (UID: \"de8ad7f9-8f0f-46fb-8649-9dd7bb8abc16\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-mfrgw" Sep 30 12:55:58 crc kubenswrapper[4672]: I0930 12:55:58.013195 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/de8ad7f9-8f0f-46fb-8649-9dd7bb8abc16-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-mfrgw\" (UID: \"de8ad7f9-8f0f-46fb-8649-9dd7bb8abc16\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-mfrgw" Sep 30 12:55:58 crc kubenswrapper[4672]: I0930 12:55:58.013433 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/de8ad7f9-8f0f-46fb-8649-9dd7bb8abc16-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-mfrgw\" (UID: \"de8ad7f9-8f0f-46fb-8649-9dd7bb8abc16\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-mfrgw" Sep 30 12:55:58 crc kubenswrapper[4672]: I0930 12:55:58.013897 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de8ad7f9-8f0f-46fb-8649-9dd7bb8abc16-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-mfrgw\" (UID: \"de8ad7f9-8f0f-46fb-8649-9dd7bb8abc16\") " 
pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-mfrgw" Sep 30 12:55:58 crc kubenswrapper[4672]: I0930 12:55:58.037790 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dm5pv\" (UniqueName: \"kubernetes.io/projected/de8ad7f9-8f0f-46fb-8649-9dd7bb8abc16-kube-api-access-dm5pv\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-mfrgw\" (UID: \"de8ad7f9-8f0f-46fb-8649-9dd7bb8abc16\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-mfrgw" Sep 30 12:55:58 crc kubenswrapper[4672]: I0930 12:55:58.142381 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-mfrgw" Sep 30 12:55:58 crc kubenswrapper[4672]: I0930 12:55:58.710752 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-mfrgw"] Sep 30 12:55:58 crc kubenswrapper[4672]: W0930 12:55:58.722666 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podde8ad7f9_8f0f_46fb_8649_9dd7bb8abc16.slice/crio-2f1c7abb862431d317e3cc77807bfb45fe9c6420a80393abd46ae5737992bcb9 WatchSource:0}: Error finding container 2f1c7abb862431d317e3cc77807bfb45fe9c6420a80393abd46ae5737992bcb9: Status 404 returned error can't find the container with id 2f1c7abb862431d317e3cc77807bfb45fe9c6420a80393abd46ae5737992bcb9 Sep 30 12:55:58 crc kubenswrapper[4672]: I0930 12:55:58.746570 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-mfrgw" event={"ID":"de8ad7f9-8f0f-46fb-8649-9dd7bb8abc16","Type":"ContainerStarted","Data":"2f1c7abb862431d317e3cc77807bfb45fe9c6420a80393abd46ae5737992bcb9"} Sep 30 12:55:59 crc kubenswrapper[4672]: I0930 12:55:59.755528 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-mfrgw" event={"ID":"de8ad7f9-8f0f-46fb-8649-9dd7bb8abc16","Type":"ContainerStarted","Data":"4fb9dbbee8c139194d01ba7d7d7a49c3dfd741807aad680debfaee968738c285"} Sep 30 12:55:59 crc kubenswrapper[4672]: I0930 12:55:59.775543 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-mfrgw" podStartSLOduration=2.137074192 podStartE2EDuration="2.775523467s" podCreationTimestamp="2025-09-30 12:55:57 +0000 UTC" firstStartedPulling="2025-09-30 12:55:58.724672376 +0000 UTC m=+2049.993910032" lastFinishedPulling="2025-09-30 12:55:59.363121661 +0000 UTC m=+2050.632359307" observedRunningTime="2025-09-30 12:55:59.771057853 +0000 UTC m=+2051.040295499" watchObservedRunningTime="2025-09-30 12:55:59.775523467 +0000 UTC m=+2051.044761123" Sep 30 12:56:17 crc kubenswrapper[4672]: I0930 12:56:17.638061 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-7jsj2"] Sep 30 12:56:17 crc kubenswrapper[4672]: I0930 12:56:17.642694 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-7jsj2" Sep 30 12:56:17 crc kubenswrapper[4672]: I0930 12:56:17.657549 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7jsj2"] Sep 30 12:56:17 crc kubenswrapper[4672]: I0930 12:56:17.699498 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-npv5x\" (UniqueName: \"kubernetes.io/projected/b08ef310-99b2-49b9-965a-8a9cea33d97d-kube-api-access-npv5x\") pod \"certified-operators-7jsj2\" (UID: \"b08ef310-99b2-49b9-965a-8a9cea33d97d\") " pod="openshift-marketplace/certified-operators-7jsj2" Sep 30 12:56:17 crc kubenswrapper[4672]: I0930 12:56:17.699556 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b08ef310-99b2-49b9-965a-8a9cea33d97d-utilities\") pod \"certified-operators-7jsj2\" (UID: \"b08ef310-99b2-49b9-965a-8a9cea33d97d\") " pod="openshift-marketplace/certified-operators-7jsj2" Sep 30 12:56:17 crc kubenswrapper[4672]: I0930 12:56:17.699634 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b08ef310-99b2-49b9-965a-8a9cea33d97d-catalog-content\") pod \"certified-operators-7jsj2\" (UID: \"b08ef310-99b2-49b9-965a-8a9cea33d97d\") " pod="openshift-marketplace/certified-operators-7jsj2" Sep 30 12:56:17 crc kubenswrapper[4672]: I0930 12:56:17.801669 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b08ef310-99b2-49b9-965a-8a9cea33d97d-catalog-content\") pod \"certified-operators-7jsj2\" (UID: \"b08ef310-99b2-49b9-965a-8a9cea33d97d\") " pod="openshift-marketplace/certified-operators-7jsj2" Sep 30 12:56:17 crc kubenswrapper[4672]: I0930 12:56:17.801851 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-npv5x\" (UniqueName: \"kubernetes.io/projected/b08ef310-99b2-49b9-965a-8a9cea33d97d-kube-api-access-npv5x\") pod \"certified-operators-7jsj2\" (UID: \"b08ef310-99b2-49b9-965a-8a9cea33d97d\") " pod="openshift-marketplace/certified-operators-7jsj2" Sep 30 12:56:17 crc kubenswrapper[4672]: I0930 12:56:17.801905 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b08ef310-99b2-49b9-965a-8a9cea33d97d-utilities\") pod \"certified-operators-7jsj2\" (UID: \"b08ef310-99b2-49b9-965a-8a9cea33d97d\") " pod="openshift-marketplace/certified-operators-7jsj2" Sep 30 12:56:17 crc kubenswrapper[4672]: I0930 12:56:17.802474 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b08ef310-99b2-49b9-965a-8a9cea33d97d-utilities\") pod \"certified-operators-7jsj2\" (UID: \"b08ef310-99b2-49b9-965a-8a9cea33d97d\") " pod="openshift-marketplace/certified-operators-7jsj2" Sep 30 12:56:17 crc kubenswrapper[4672]: I0930 12:56:17.802694 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b08ef310-99b2-49b9-965a-8a9cea33d97d-catalog-content\") pod \"certified-operators-7jsj2\" (UID: \"b08ef310-99b2-49b9-965a-8a9cea33d97d\") " pod="openshift-marketplace/certified-operators-7jsj2" Sep 30 12:56:17 crc kubenswrapper[4672]: I0930 12:56:17.824560 4672 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-npv5x\" (UniqueName: \"kubernetes.io/projected/b08ef310-99b2-49b9-965a-8a9cea33d97d-kube-api-access-npv5x\") pod \"certified-operators-7jsj2\" (UID: \"b08ef310-99b2-49b9-965a-8a9cea33d97d\") " pod="openshift-marketplace/certified-operators-7jsj2" Sep 30 12:56:17 crc kubenswrapper[4672]: I0930 12:56:17.990423 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7jsj2" Sep 30 12:56:18 crc kubenswrapper[4672]: I0930 12:56:18.513931 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7jsj2"] Sep 30 12:56:18 crc kubenswrapper[4672]: I0930 12:56:18.962672 4672 generic.go:334] "Generic (PLEG): container finished" podID="b08ef310-99b2-49b9-965a-8a9cea33d97d" containerID="62e6fe05a98318a0de17f59fe71a40b5e78d3d2d5db01afa2c8da1a71a3ab6df" exitCode=0 Sep 30 12:56:18 crc kubenswrapper[4672]: I0930 12:56:18.962740 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7jsj2" event={"ID":"b08ef310-99b2-49b9-965a-8a9cea33d97d","Type":"ContainerDied","Data":"62e6fe05a98318a0de17f59fe71a40b5e78d3d2d5db01afa2c8da1a71a3ab6df"} Sep 30 12:56:18 crc kubenswrapper[4672]: I0930 12:56:18.963036 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7jsj2" event={"ID":"b08ef310-99b2-49b9-965a-8a9cea33d97d","Type":"ContainerStarted","Data":"df355bc266fe2d57145b44a7d5d65c0934221f13a2115b639cfcb9a3aed51cb6"} Sep 30 12:56:19 crc kubenswrapper[4672]: I0930 12:56:19.976417 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7jsj2" event={"ID":"b08ef310-99b2-49b9-965a-8a9cea33d97d","Type":"ContainerStarted","Data":"24f8380e50b7f8dfe05dd61d9daf13cb005606eac38279806f1ec52250e11452"} Sep 30 12:56:20 crc kubenswrapper[4672]: I0930 12:56:20.985864 4672 generic.go:334] "Generic (PLEG): container finished" podID="b08ef310-99b2-49b9-965a-8a9cea33d97d" containerID="24f8380e50b7f8dfe05dd61d9daf13cb005606eac38279806f1ec52250e11452" exitCode=0 Sep 30 12:56:20 crc kubenswrapper[4672]: I0930 12:56:20.985935 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7jsj2" event={"ID":"b08ef310-99b2-49b9-965a-8a9cea33d97d","Type":"ContainerDied","Data":"24f8380e50b7f8dfe05dd61d9daf13cb005606eac38279806f1ec52250e11452"} Sep 30 12:56:21 crc kubenswrapper[4672]: I0930 12:56:21.995767 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7jsj2" event={"ID":"b08ef310-99b2-49b9-965a-8a9cea33d97d","Type":"ContainerStarted","Data":"b6f11456107da803a9e964da8feb6ddf9d69dbce7d9ab0cd69e7ee4370d4d5ed"} Sep 30 12:56:22 crc kubenswrapper[4672]: I0930 12:56:22.017310 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-7jsj2" podStartSLOduration=2.542976748 podStartE2EDuration="5.017292521s" podCreationTimestamp="2025-09-30 12:56:17 +0000 UTC" firstStartedPulling="2025-09-30 12:56:18.966890603 +0000 UTC m=+2070.236128279" lastFinishedPulling="2025-09-30 12:56:21.441206406 +0000 UTC m=+2072.710444052" observedRunningTime="2025-09-30 12:56:22.013905955 +0000 UTC m=+2073.283143621" watchObservedRunningTime="2025-09-30 12:56:22.017292521 +0000 UTC m=+2073.286530157" Sep 30 12:56:27 crc kubenswrapper[4672]: I0930 12:56:27.990967 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-marketplace/certified-operators-7jsj2" Sep 30 12:56:27 crc kubenswrapper[4672]: I0930 12:56:27.992406 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-7jsj2" Sep 30 12:56:28 crc kubenswrapper[4672]: I0930 12:56:28.045754 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-7jsj2" Sep 30 12:56:28 crc kubenswrapper[4672]: I0930 12:56:28.133713 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-7jsj2" Sep 30 12:56:28 crc kubenswrapper[4672]: I0930 12:56:28.290399 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7jsj2"] Sep 30 12:56:30 crc kubenswrapper[4672]: I0930 12:56:30.081859 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-7jsj2" podUID="b08ef310-99b2-49b9-965a-8a9cea33d97d" containerName="registry-server" containerID="cri-o://b6f11456107da803a9e964da8feb6ddf9d69dbce7d9ab0cd69e7ee4370d4d5ed" gracePeriod=2 Sep 30 12:56:30 crc kubenswrapper[4672]: I0930 12:56:30.582723 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7jsj2" Sep 30 12:56:30 crc kubenswrapper[4672]: I0930 12:56:30.724534 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-npv5x\" (UniqueName: \"kubernetes.io/projected/b08ef310-99b2-49b9-965a-8a9cea33d97d-kube-api-access-npv5x\") pod \"b08ef310-99b2-49b9-965a-8a9cea33d97d\" (UID: \"b08ef310-99b2-49b9-965a-8a9cea33d97d\") " Sep 30 12:56:30 crc kubenswrapper[4672]: I0930 12:56:30.724622 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b08ef310-99b2-49b9-965a-8a9cea33d97d-utilities\") pod \"b08ef310-99b2-49b9-965a-8a9cea33d97d\" (UID: \"b08ef310-99b2-49b9-965a-8a9cea33d97d\") " Sep 30 12:56:30 crc kubenswrapper[4672]: I0930 12:56:30.724854 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b08ef310-99b2-49b9-965a-8a9cea33d97d-catalog-content\") pod \"b08ef310-99b2-49b9-965a-8a9cea33d97d\" (UID: \"b08ef310-99b2-49b9-965a-8a9cea33d97d\") " Sep 30 12:56:30 crc kubenswrapper[4672]: I0930 12:56:30.726854 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b08ef310-99b2-49b9-965a-8a9cea33d97d-utilities" (OuterVolumeSpecName: "utilities") pod "b08ef310-99b2-49b9-965a-8a9cea33d97d" (UID: "b08ef310-99b2-49b9-965a-8a9cea33d97d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 12:56:30 crc kubenswrapper[4672]: I0930 12:56:30.730231 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b08ef310-99b2-49b9-965a-8a9cea33d97d-kube-api-access-npv5x" (OuterVolumeSpecName: "kube-api-access-npv5x") pod "b08ef310-99b2-49b9-965a-8a9cea33d97d" (UID: "b08ef310-99b2-49b9-965a-8a9cea33d97d"). InnerVolumeSpecName "kube-api-access-npv5x". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 12:56:30 crc kubenswrapper[4672]: I0930 12:56:30.770500 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b08ef310-99b2-49b9-965a-8a9cea33d97d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b08ef310-99b2-49b9-965a-8a9cea33d97d" (UID: "b08ef310-99b2-49b9-965a-8a9cea33d97d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 12:56:30 crc kubenswrapper[4672]: I0930 12:56:30.826847 4672 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b08ef310-99b2-49b9-965a-8a9cea33d97d-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 12:56:30 crc kubenswrapper[4672]: I0930 12:56:30.826892 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-npv5x\" (UniqueName: \"kubernetes.io/projected/b08ef310-99b2-49b9-965a-8a9cea33d97d-kube-api-access-npv5x\") on node \"crc\" DevicePath \"\"" Sep 30 12:56:30 crc kubenswrapper[4672]: I0930 12:56:30.826910 4672 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b08ef310-99b2-49b9-965a-8a9cea33d97d-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 12:56:31 crc kubenswrapper[4672]: I0930 12:56:31.098631 4672 generic.go:334] "Generic (PLEG): container finished" podID="b08ef310-99b2-49b9-965a-8a9cea33d97d" containerID="b6f11456107da803a9e964da8feb6ddf9d69dbce7d9ab0cd69e7ee4370d4d5ed" exitCode=0 Sep 30 12:56:31 crc kubenswrapper[4672]: I0930 12:56:31.098711 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7jsj2" event={"ID":"b08ef310-99b2-49b9-965a-8a9cea33d97d","Type":"ContainerDied","Data":"b6f11456107da803a9e964da8feb6ddf9d69dbce7d9ab0cd69e7ee4370d4d5ed"} Sep 30 12:56:31 crc kubenswrapper[4672]: I0930 12:56:31.098822 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7jsj2" event={"ID":"b08ef310-99b2-49b9-965a-8a9cea33d97d","Type":"ContainerDied","Data":"df355bc266fe2d57145b44a7d5d65c0934221f13a2115b639cfcb9a3aed51cb6"} Sep 30 12:56:31 crc kubenswrapper[4672]: I0930 12:56:31.098857 4672 scope.go:117] "RemoveContainer" containerID="b6f11456107da803a9e964da8feb6ddf9d69dbce7d9ab0cd69e7ee4370d4d5ed" Sep 30 12:56:31 crc kubenswrapper[4672]: I0930 12:56:31.099450 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-7jsj2" Sep 30 12:56:31 crc kubenswrapper[4672]: I0930 12:56:31.150086 4672 scope.go:117] "RemoveContainer" containerID="24f8380e50b7f8dfe05dd61d9daf13cb005606eac38279806f1ec52250e11452" Sep 30 12:56:31 crc kubenswrapper[4672]: I0930 12:56:31.160286 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7jsj2"] Sep 30 12:56:31 crc kubenswrapper[4672]: I0930 12:56:31.173683 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-7jsj2"] Sep 30 12:56:31 crc kubenswrapper[4672]: I0930 12:56:31.179499 4672 scope.go:117] "RemoveContainer" containerID="62e6fe05a98318a0de17f59fe71a40b5e78d3d2d5db01afa2c8da1a71a3ab6df" Sep 30 12:56:31 crc kubenswrapper[4672]: I0930 12:56:31.217876 4672 scope.go:117] "RemoveContainer" containerID="b6f11456107da803a9e964da8feb6ddf9d69dbce7d9ab0cd69e7ee4370d4d5ed" Sep 30 12:56:31 crc kubenswrapper[4672]: E0930 12:56:31.218367 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b6f11456107da803a9e964da8feb6ddf9d69dbce7d9ab0cd69e7ee4370d4d5ed\": container with ID starting with b6f11456107da803a9e964da8feb6ddf9d69dbce7d9ab0cd69e7ee4370d4d5ed not found: ID does not exist" containerID="b6f11456107da803a9e964da8feb6ddf9d69dbce7d9ab0cd69e7ee4370d4d5ed" Sep 30 12:56:31 crc kubenswrapper[4672]: I0930 12:56:31.218400 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b6f11456107da803a9e964da8feb6ddf9d69dbce7d9ab0cd69e7ee4370d4d5ed"} err="failed to get container status \"b6f11456107da803a9e964da8feb6ddf9d69dbce7d9ab0cd69e7ee4370d4d5ed\": rpc error: code = NotFound desc = could not find container \"b6f11456107da803a9e964da8feb6ddf9d69dbce7d9ab0cd69e7ee4370d4d5ed\": container with ID starting with b6f11456107da803a9e964da8feb6ddf9d69dbce7d9ab0cd69e7ee4370d4d5ed not found: ID does not exist" Sep 30 12:56:31 crc kubenswrapper[4672]: I0930 12:56:31.218422 4672 scope.go:117] "RemoveContainer" containerID="24f8380e50b7f8dfe05dd61d9daf13cb005606eac38279806f1ec52250e11452" Sep 30 12:56:31 crc kubenswrapper[4672]: E0930 12:56:31.218740 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"24f8380e50b7f8dfe05dd61d9daf13cb005606eac38279806f1ec52250e11452\": container with ID starting with 24f8380e50b7f8dfe05dd61d9daf13cb005606eac38279806f1ec52250e11452 not found: ID does not exist" containerID="24f8380e50b7f8dfe05dd61d9daf13cb005606eac38279806f1ec52250e11452" Sep 30 12:56:31 crc kubenswrapper[4672]: I0930 12:56:31.218775 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"24f8380e50b7f8dfe05dd61d9daf13cb005606eac38279806f1ec52250e11452"} err="failed to get container status \"24f8380e50b7f8dfe05dd61d9daf13cb005606eac38279806f1ec52250e11452\": rpc error: code = NotFound desc = could not find container \"24f8380e50b7f8dfe05dd61d9daf13cb005606eac38279806f1ec52250e11452\": container with ID starting with 24f8380e50b7f8dfe05dd61d9daf13cb005606eac38279806f1ec52250e11452 not found: ID does not exist" Sep 30 12:56:31 crc kubenswrapper[4672]: I0930 12:56:31.218793 4672 scope.go:117] "RemoveContainer" containerID="62e6fe05a98318a0de17f59fe71a40b5e78d3d2d5db01afa2c8da1a71a3ab6df" Sep 30 12:56:31 crc kubenswrapper[4672]: E0930 12:56:31.219045 4672 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"62e6fe05a98318a0de17f59fe71a40b5e78d3d2d5db01afa2c8da1a71a3ab6df\": container with ID starting with 62e6fe05a98318a0de17f59fe71a40b5e78d3d2d5db01afa2c8da1a71a3ab6df not found: ID does not exist" containerID="62e6fe05a98318a0de17f59fe71a40b5e78d3d2d5db01afa2c8da1a71a3ab6df" Sep 30 12:56:31 crc kubenswrapper[4672]: I0930 12:56:31.219074 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"62e6fe05a98318a0de17f59fe71a40b5e78d3d2d5db01afa2c8da1a71a3ab6df"} err="failed to get container status \"62e6fe05a98318a0de17f59fe71a40b5e78d3d2d5db01afa2c8da1a71a3ab6df\": rpc error: code = NotFound desc = could not find container \"62e6fe05a98318a0de17f59fe71a40b5e78d3d2d5db01afa2c8da1a71a3ab6df\": container with ID starting with 62e6fe05a98318a0de17f59fe71a40b5e78d3d2d5db01afa2c8da1a71a3ab6df not found: ID does not exist" Sep 30 12:56:31 crc kubenswrapper[4672]: I0930 12:56:31.429226 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b08ef310-99b2-49b9-965a-8a9cea33d97d" path="/var/lib/kubelet/pods/b08ef310-99b2-49b9-965a-8a9cea33d97d/volumes" Sep 30 12:56:39 crc kubenswrapper[4672]: I0930 12:56:39.006909 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-v27qr"] Sep 30 12:56:39 crc kubenswrapper[4672]: E0930 12:56:39.008030 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b08ef310-99b2-49b9-965a-8a9cea33d97d" containerName="extract-content" Sep 30 12:56:39 crc kubenswrapper[4672]: I0930 12:56:39.008048 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="b08ef310-99b2-49b9-965a-8a9cea33d97d" containerName="extract-content" Sep 30 12:56:39 crc kubenswrapper[4672]: E0930 12:56:39.008071 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b08ef310-99b2-49b9-965a-8a9cea33d97d" containerName="registry-server" Sep 30 12:56:39 crc kubenswrapper[4672]: I0930 12:56:39.008080 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="b08ef310-99b2-49b9-965a-8a9cea33d97d" containerName="registry-server" Sep 30 12:56:39 crc kubenswrapper[4672]: E0930 12:56:39.008109 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b08ef310-99b2-49b9-965a-8a9cea33d97d" containerName="extract-utilities" Sep 30 12:56:39 crc kubenswrapper[4672]: I0930 12:56:39.008118 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="b08ef310-99b2-49b9-965a-8a9cea33d97d" containerName="extract-utilities" Sep 30 12:56:39 crc kubenswrapper[4672]: I0930 12:56:39.008362 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="b08ef310-99b2-49b9-965a-8a9cea33d97d" containerName="registry-server" Sep 30 12:56:39 crc kubenswrapper[4672]: I0930 12:56:39.010437 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-v27qr" Sep 30 12:56:39 crc kubenswrapper[4672]: I0930 12:56:39.019634 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-v27qr"] Sep 30 12:56:39 crc kubenswrapper[4672]: I0930 12:56:39.118196 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cee97e22-15e3-47f2-a776-66689a174c11-catalog-content\") pod \"redhat-operators-v27qr\" (UID: \"cee97e22-15e3-47f2-a776-66689a174c11\") " pod="openshift-marketplace/redhat-operators-v27qr" Sep 30 12:56:39 crc kubenswrapper[4672]: I0930 12:56:39.118408 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4zvgb\" (UniqueName: \"kubernetes.io/projected/cee97e22-15e3-47f2-a776-66689a174c11-kube-api-access-4zvgb\") pod \"redhat-operators-v27qr\" (UID: \"cee97e22-15e3-47f2-a776-66689a174c11\") " pod="openshift-marketplace/redhat-operators-v27qr" Sep 30 12:56:39 crc kubenswrapper[4672]: I0930 12:56:39.118448 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cee97e22-15e3-47f2-a776-66689a174c11-utilities\") pod \"redhat-operators-v27qr\" (UID: \"cee97e22-15e3-47f2-a776-66689a174c11\") " pod="openshift-marketplace/redhat-operators-v27qr" Sep 30 12:56:39 crc kubenswrapper[4672]: I0930 12:56:39.220227 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4zvgb\" (UniqueName: \"kubernetes.io/projected/cee97e22-15e3-47f2-a776-66689a174c11-kube-api-access-4zvgb\") pod \"redhat-operators-v27qr\" (UID: \"cee97e22-15e3-47f2-a776-66689a174c11\") " pod="openshift-marketplace/redhat-operators-v27qr" Sep 30 12:56:39 crc kubenswrapper[4672]: I0930 12:56:39.220301 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cee97e22-15e3-47f2-a776-66689a174c11-utilities\") pod \"redhat-operators-v27qr\" (UID: \"cee97e22-15e3-47f2-a776-66689a174c11\") " pod="openshift-marketplace/redhat-operators-v27qr" Sep 30 12:56:39 crc kubenswrapper[4672]: I0930 12:56:39.220418 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cee97e22-15e3-47f2-a776-66689a174c11-catalog-content\") pod \"redhat-operators-v27qr\" (UID: \"cee97e22-15e3-47f2-a776-66689a174c11\") " pod="openshift-marketplace/redhat-operators-v27qr" Sep 30 12:56:39 crc kubenswrapper[4672]: I0930 12:56:39.220803 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cee97e22-15e3-47f2-a776-66689a174c11-utilities\") pod \"redhat-operators-v27qr\" (UID: \"cee97e22-15e3-47f2-a776-66689a174c11\") " pod="openshift-marketplace/redhat-operators-v27qr" Sep 30 12:56:39 crc kubenswrapper[4672]: I0930 12:56:39.220855 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cee97e22-15e3-47f2-a776-66689a174c11-catalog-content\") pod \"redhat-operators-v27qr\" (UID: \"cee97e22-15e3-47f2-a776-66689a174c11\") " pod="openshift-marketplace/redhat-operators-v27qr" Sep 30 12:56:39 crc kubenswrapper[4672]: I0930 12:56:39.240585 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-4zvgb\" (UniqueName: \"kubernetes.io/projected/cee97e22-15e3-47f2-a776-66689a174c11-kube-api-access-4zvgb\") pod \"redhat-operators-v27qr\" (UID: \"cee97e22-15e3-47f2-a776-66689a174c11\") " pod="openshift-marketplace/redhat-operators-v27qr" Sep 30 12:56:39 crc kubenswrapper[4672]: I0930 12:56:39.339788 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-v27qr" Sep 30 12:56:39 crc kubenswrapper[4672]: I0930 12:56:39.790417 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-v27qr"] Sep 30 12:56:40 crc kubenswrapper[4672]: I0930 12:56:40.221768 4672 generic.go:334] "Generic (PLEG): container finished" podID="cee97e22-15e3-47f2-a776-66689a174c11" containerID="047931717dbeb2bbac71840f67079ccc8965c459f99dc178d61f8f9fc0f383db" exitCode=0 Sep 30 12:56:40 crc kubenswrapper[4672]: I0930 12:56:40.221829 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v27qr" event={"ID":"cee97e22-15e3-47f2-a776-66689a174c11","Type":"ContainerDied","Data":"047931717dbeb2bbac71840f67079ccc8965c459f99dc178d61f8f9fc0f383db"} Sep 30 12:56:40 crc kubenswrapper[4672]: I0930 12:56:40.221873 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v27qr" event={"ID":"cee97e22-15e3-47f2-a776-66689a174c11","Type":"ContainerStarted","Data":"c8e753587874929a6d01ede8926c26265d637909210705a07f6efe7dcfb80fe4"} Sep 30 12:56:42 crc kubenswrapper[4672]: I0930 12:56:42.249474 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v27qr" event={"ID":"cee97e22-15e3-47f2-a776-66689a174c11","Type":"ContainerStarted","Data":"029d05c94d08970043016825c533b286a508811ea3f0f3f7f19922536c87b3d9"} Sep 30 12:56:43 crc kubenswrapper[4672]: I0930 12:56:43.262642 4672 generic.go:334] "Generic (PLEG): container finished" podID="cee97e22-15e3-47f2-a776-66689a174c11" containerID="029d05c94d08970043016825c533b286a508811ea3f0f3f7f19922536c87b3d9" exitCode=0 Sep 30 12:56:43 crc kubenswrapper[4672]: I0930 12:56:43.262743 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v27qr" event={"ID":"cee97e22-15e3-47f2-a776-66689a174c11","Type":"ContainerDied","Data":"029d05c94d08970043016825c533b286a508811ea3f0f3f7f19922536c87b3d9"} Sep 30 12:56:44 crc kubenswrapper[4672]: I0930 12:56:44.275330 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v27qr" event={"ID":"cee97e22-15e3-47f2-a776-66689a174c11","Type":"ContainerStarted","Data":"9efdbd144337bc7b38614cda868f5bdef357bc03916a22ef7298ec37c3ad22be"} Sep 30 12:56:44 crc kubenswrapper[4672]: I0930 12:56:44.308019 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-v27qr" podStartSLOduration=2.605805588 podStartE2EDuration="6.30797855s" podCreationTimestamp="2025-09-30 12:56:38 +0000 UTC" firstStartedPulling="2025-09-30 12:56:40.224062833 +0000 UTC m=+2091.493300479" lastFinishedPulling="2025-09-30 12:56:43.926235795 +0000 UTC m=+2095.195473441" observedRunningTime="2025-09-30 12:56:44.295587834 +0000 UTC m=+2095.564825480" watchObservedRunningTime="2025-09-30 12:56:44.30797855 +0000 UTC m=+2095.577216206" Sep 30 12:56:49 crc kubenswrapper[4672]: I0930 12:56:49.341619 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-v27qr" Sep 
30 12:56:49 crc kubenswrapper[4672]: I0930 12:56:49.342179 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-v27qr" Sep 30 12:56:50 crc kubenswrapper[4672]: I0930 12:56:50.413071 4672 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-v27qr" podUID="cee97e22-15e3-47f2-a776-66689a174c11" containerName="registry-server" probeResult="failure" output=< Sep 30 12:56:50 crc kubenswrapper[4672]: timeout: failed to connect service ":50051" within 1s Sep 30 12:56:50 crc kubenswrapper[4672]: > Sep 30 12:56:54 crc kubenswrapper[4672]: I0930 12:56:54.740366 4672 patch_prober.go:28] interesting pod/machine-config-daemon-dpqrd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 12:56:54 crc kubenswrapper[4672]: I0930 12:56:54.741102 4672 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 12:56:59 crc kubenswrapper[4672]: I0930 12:56:59.387497 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-v27qr" Sep 30 12:56:59 crc kubenswrapper[4672]: I0930 12:56:59.448951 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-v27qr" Sep 30 12:56:59 crc kubenswrapper[4672]: I0930 12:56:59.630353 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-v27qr"] Sep 30 12:57:00 crc kubenswrapper[4672]: I0930 12:57:00.457511 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-v27qr" podUID="cee97e22-15e3-47f2-a776-66689a174c11" containerName="registry-server" containerID="cri-o://9efdbd144337bc7b38614cda868f5bdef357bc03916a22ef7298ec37c3ad22be" gracePeriod=2 Sep 30 12:57:00 crc kubenswrapper[4672]: I0930 12:57:00.961470 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-v27qr" Sep 30 12:57:01 crc kubenswrapper[4672]: I0930 12:57:01.026905 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cee97e22-15e3-47f2-a776-66689a174c11-utilities\") pod \"cee97e22-15e3-47f2-a776-66689a174c11\" (UID: \"cee97e22-15e3-47f2-a776-66689a174c11\") " Sep 30 12:57:01 crc kubenswrapper[4672]: I0930 12:57:01.027060 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cee97e22-15e3-47f2-a776-66689a174c11-catalog-content\") pod \"cee97e22-15e3-47f2-a776-66689a174c11\" (UID: \"cee97e22-15e3-47f2-a776-66689a174c11\") " Sep 30 12:57:01 crc kubenswrapper[4672]: I0930 12:57:01.027197 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4zvgb\" (UniqueName: \"kubernetes.io/projected/cee97e22-15e3-47f2-a776-66689a174c11-kube-api-access-4zvgb\") pod \"cee97e22-15e3-47f2-a776-66689a174c11\" (UID: \"cee97e22-15e3-47f2-a776-66689a174c11\") " Sep 30 12:57:01 crc kubenswrapper[4672]: I0930 12:57:01.027906 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cee97e22-15e3-47f2-a776-66689a174c11-utilities" (OuterVolumeSpecName: "utilities") pod "cee97e22-15e3-47f2-a776-66689a174c11" (UID: "cee97e22-15e3-47f2-a776-66689a174c11"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 12:57:01 crc kubenswrapper[4672]: I0930 12:57:01.032453 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cee97e22-15e3-47f2-a776-66689a174c11-kube-api-access-4zvgb" (OuterVolumeSpecName: "kube-api-access-4zvgb") pod "cee97e22-15e3-47f2-a776-66689a174c11" (UID: "cee97e22-15e3-47f2-a776-66689a174c11"). InnerVolumeSpecName "kube-api-access-4zvgb". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 12:57:01 crc kubenswrapper[4672]: I0930 12:57:01.107592 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cee97e22-15e3-47f2-a776-66689a174c11-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cee97e22-15e3-47f2-a776-66689a174c11" (UID: "cee97e22-15e3-47f2-a776-66689a174c11"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 12:57:01 crc kubenswrapper[4672]: I0930 12:57:01.130129 4672 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cee97e22-15e3-47f2-a776-66689a174c11-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 12:57:01 crc kubenswrapper[4672]: I0930 12:57:01.130155 4672 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cee97e22-15e3-47f2-a776-66689a174c11-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 12:57:01 crc kubenswrapper[4672]: I0930 12:57:01.130166 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4zvgb\" (UniqueName: \"kubernetes.io/projected/cee97e22-15e3-47f2-a776-66689a174c11-kube-api-access-4zvgb\") on node \"crc\" DevicePath \"\"" Sep 30 12:57:01 crc kubenswrapper[4672]: I0930 12:57:01.467510 4672 generic.go:334] "Generic (PLEG): container finished" podID="cee97e22-15e3-47f2-a776-66689a174c11" containerID="9efdbd144337bc7b38614cda868f5bdef357bc03916a22ef7298ec37c3ad22be" exitCode=0 Sep 30 12:57:01 crc kubenswrapper[4672]: I0930 12:57:01.467561 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v27qr" event={"ID":"cee97e22-15e3-47f2-a776-66689a174c11","Type":"ContainerDied","Data":"9efdbd144337bc7b38614cda868f5bdef357bc03916a22ef7298ec37c3ad22be"} Sep 30 12:57:01 crc kubenswrapper[4672]: I0930 12:57:01.467599 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v27qr" event={"ID":"cee97e22-15e3-47f2-a776-66689a174c11","Type":"ContainerDied","Data":"c8e753587874929a6d01ede8926c26265d637909210705a07f6efe7dcfb80fe4"} Sep 30 12:57:01 crc kubenswrapper[4672]: I0930 12:57:01.467614 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-v27qr" Sep 30 12:57:01 crc kubenswrapper[4672]: I0930 12:57:01.467620 4672 scope.go:117] "RemoveContainer" containerID="9efdbd144337bc7b38614cda868f5bdef357bc03916a22ef7298ec37c3ad22be" Sep 30 12:57:01 crc kubenswrapper[4672]: I0930 12:57:01.499743 4672 scope.go:117] "RemoveContainer" containerID="029d05c94d08970043016825c533b286a508811ea3f0f3f7f19922536c87b3d9" Sep 30 12:57:01 crc kubenswrapper[4672]: I0930 12:57:01.505483 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-v27qr"] Sep 30 12:57:01 crc kubenswrapper[4672]: I0930 12:57:01.519907 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-v27qr"] Sep 30 12:57:01 crc kubenswrapper[4672]: I0930 12:57:01.523952 4672 scope.go:117] "RemoveContainer" containerID="047931717dbeb2bbac71840f67079ccc8965c459f99dc178d61f8f9fc0f383db" Sep 30 12:57:01 crc kubenswrapper[4672]: I0930 12:57:01.565675 4672 scope.go:117] "RemoveContainer" containerID="9efdbd144337bc7b38614cda868f5bdef357bc03916a22ef7298ec37c3ad22be" Sep 30 12:57:01 crc kubenswrapper[4672]: E0930 12:57:01.566220 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9efdbd144337bc7b38614cda868f5bdef357bc03916a22ef7298ec37c3ad22be\": container with ID starting with 9efdbd144337bc7b38614cda868f5bdef357bc03916a22ef7298ec37c3ad22be not found: ID does not exist" containerID="9efdbd144337bc7b38614cda868f5bdef357bc03916a22ef7298ec37c3ad22be" Sep 30 12:57:01 crc kubenswrapper[4672]: I0930 12:57:01.566280 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9efdbd144337bc7b38614cda868f5bdef357bc03916a22ef7298ec37c3ad22be"} err="failed to get container status \"9efdbd144337bc7b38614cda868f5bdef357bc03916a22ef7298ec37c3ad22be\": rpc error: code = NotFound desc = could not find container \"9efdbd144337bc7b38614cda868f5bdef357bc03916a22ef7298ec37c3ad22be\": container with ID starting with 9efdbd144337bc7b38614cda868f5bdef357bc03916a22ef7298ec37c3ad22be not found: ID does not exist" Sep 30 12:57:01 crc kubenswrapper[4672]: I0930 12:57:01.566309 4672 scope.go:117] "RemoveContainer" containerID="029d05c94d08970043016825c533b286a508811ea3f0f3f7f19922536c87b3d9" Sep 30 12:57:01 crc kubenswrapper[4672]: E0930 12:57:01.566722 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"029d05c94d08970043016825c533b286a508811ea3f0f3f7f19922536c87b3d9\": container with ID starting with 029d05c94d08970043016825c533b286a508811ea3f0f3f7f19922536c87b3d9 not found: ID does not exist" containerID="029d05c94d08970043016825c533b286a508811ea3f0f3f7f19922536c87b3d9" Sep 30 12:57:01 crc kubenswrapper[4672]: I0930 12:57:01.566744 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"029d05c94d08970043016825c533b286a508811ea3f0f3f7f19922536c87b3d9"} err="failed to get container status \"029d05c94d08970043016825c533b286a508811ea3f0f3f7f19922536c87b3d9\": rpc error: code = NotFound desc = could not find container \"029d05c94d08970043016825c533b286a508811ea3f0f3f7f19922536c87b3d9\": container with ID starting with 029d05c94d08970043016825c533b286a508811ea3f0f3f7f19922536c87b3d9 not found: ID does not exist" Sep 30 12:57:01 crc kubenswrapper[4672]: I0930 12:57:01.566758 4672 scope.go:117] "RemoveContainer" 
containerID="047931717dbeb2bbac71840f67079ccc8965c459f99dc178d61f8f9fc0f383db" Sep 30 12:57:01 crc kubenswrapper[4672]: E0930 12:57:01.567054 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"047931717dbeb2bbac71840f67079ccc8965c459f99dc178d61f8f9fc0f383db\": container with ID starting with 047931717dbeb2bbac71840f67079ccc8965c459f99dc178d61f8f9fc0f383db not found: ID does not exist" containerID="047931717dbeb2bbac71840f67079ccc8965c459f99dc178d61f8f9fc0f383db" Sep 30 12:57:01 crc kubenswrapper[4672]: I0930 12:57:01.567089 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"047931717dbeb2bbac71840f67079ccc8965c459f99dc178d61f8f9fc0f383db"} err="failed to get container status \"047931717dbeb2bbac71840f67079ccc8965c459f99dc178d61f8f9fc0f383db\": rpc error: code = NotFound desc = could not find container \"047931717dbeb2bbac71840f67079ccc8965c459f99dc178d61f8f9fc0f383db\": container with ID starting with 047931717dbeb2bbac71840f67079ccc8965c459f99dc178d61f8f9fc0f383db not found: ID does not exist" Sep 30 12:57:03 crc kubenswrapper[4672]: I0930 12:57:03.439359 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cee97e22-15e3-47f2-a776-66689a174c11" path="/var/lib/kubelet/pods/cee97e22-15e3-47f2-a776-66689a174c11/volumes" Sep 30 12:57:08 crc kubenswrapper[4672]: I0930 12:57:08.547705 4672 generic.go:334] "Generic (PLEG): container finished" podID="de8ad7f9-8f0f-46fb-8649-9dd7bb8abc16" containerID="4fb9dbbee8c139194d01ba7d7d7a49c3dfd741807aad680debfaee968738c285" exitCode=0 Sep 30 12:57:08 crc kubenswrapper[4672]: I0930 12:57:08.547809 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-mfrgw" event={"ID":"de8ad7f9-8f0f-46fb-8649-9dd7bb8abc16","Type":"ContainerDied","Data":"4fb9dbbee8c139194d01ba7d7d7a49c3dfd741807aad680debfaee968738c285"} Sep 30 12:57:09 crc kubenswrapper[4672]: I0930 12:57:09.958569 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-mfrgw" Sep 30 12:57:10 crc kubenswrapper[4672]: I0930 12:57:10.110424 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/de8ad7f9-8f0f-46fb-8649-9dd7bb8abc16-ovncontroller-config-0\") pod \"de8ad7f9-8f0f-46fb-8649-9dd7bb8abc16\" (UID: \"de8ad7f9-8f0f-46fb-8649-9dd7bb8abc16\") " Sep 30 12:57:10 crc kubenswrapper[4672]: I0930 12:57:10.110574 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de8ad7f9-8f0f-46fb-8649-9dd7bb8abc16-ovn-combined-ca-bundle\") pod \"de8ad7f9-8f0f-46fb-8649-9dd7bb8abc16\" (UID: \"de8ad7f9-8f0f-46fb-8649-9dd7bb8abc16\") " Sep 30 12:57:10 crc kubenswrapper[4672]: I0930 12:57:10.110601 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/de8ad7f9-8f0f-46fb-8649-9dd7bb8abc16-ssh-key\") pod \"de8ad7f9-8f0f-46fb-8649-9dd7bb8abc16\" (UID: \"de8ad7f9-8f0f-46fb-8649-9dd7bb8abc16\") " Sep 30 12:57:10 crc kubenswrapper[4672]: I0930 12:57:10.110655 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dm5pv\" (UniqueName: \"kubernetes.io/projected/de8ad7f9-8f0f-46fb-8649-9dd7bb8abc16-kube-api-access-dm5pv\") pod \"de8ad7f9-8f0f-46fb-8649-9dd7bb8abc16\" (UID: \"de8ad7f9-8f0f-46fb-8649-9dd7bb8abc16\") " Sep 30 12:57:10 crc kubenswrapper[4672]: I0930 12:57:10.111193 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/de8ad7f9-8f0f-46fb-8649-9dd7bb8abc16-inventory\") pod \"de8ad7f9-8f0f-46fb-8649-9dd7bb8abc16\" (UID: \"de8ad7f9-8f0f-46fb-8649-9dd7bb8abc16\") " Sep 30 12:57:10 crc kubenswrapper[4672]: I0930 12:57:10.115592 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de8ad7f9-8f0f-46fb-8649-9dd7bb8abc16-kube-api-access-dm5pv" (OuterVolumeSpecName: "kube-api-access-dm5pv") pod "de8ad7f9-8f0f-46fb-8649-9dd7bb8abc16" (UID: "de8ad7f9-8f0f-46fb-8649-9dd7bb8abc16"). InnerVolumeSpecName "kube-api-access-dm5pv". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 12:57:10 crc kubenswrapper[4672]: I0930 12:57:10.115799 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de8ad7f9-8f0f-46fb-8649-9dd7bb8abc16-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "de8ad7f9-8f0f-46fb-8649-9dd7bb8abc16" (UID: "de8ad7f9-8f0f-46fb-8649-9dd7bb8abc16"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:57:10 crc kubenswrapper[4672]: I0930 12:57:10.151488 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de8ad7f9-8f0f-46fb-8649-9dd7bb8abc16-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "de8ad7f9-8f0f-46fb-8649-9dd7bb8abc16" (UID: "de8ad7f9-8f0f-46fb-8649-9dd7bb8abc16"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:57:10 crc kubenswrapper[4672]: I0930 12:57:10.155225 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/de8ad7f9-8f0f-46fb-8649-9dd7bb8abc16-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "de8ad7f9-8f0f-46fb-8649-9dd7bb8abc16" (UID: "de8ad7f9-8f0f-46fb-8649-9dd7bb8abc16"). InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 12:57:10 crc kubenswrapper[4672]: I0930 12:57:10.162329 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de8ad7f9-8f0f-46fb-8649-9dd7bb8abc16-inventory" (OuterVolumeSpecName: "inventory") pod "de8ad7f9-8f0f-46fb-8649-9dd7bb8abc16" (UID: "de8ad7f9-8f0f-46fb-8649-9dd7bb8abc16"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:57:10 crc kubenswrapper[4672]: I0930 12:57:10.213307 4672 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/de8ad7f9-8f0f-46fb-8649-9dd7bb8abc16-inventory\") on node \"crc\" DevicePath \"\"" Sep 30 12:57:10 crc kubenswrapper[4672]: I0930 12:57:10.213342 4672 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/de8ad7f9-8f0f-46fb-8649-9dd7bb8abc16-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Sep 30 12:57:10 crc kubenswrapper[4672]: I0930 12:57:10.213357 4672 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de8ad7f9-8f0f-46fb-8649-9dd7bb8abc16-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 12:57:10 crc kubenswrapper[4672]: I0930 12:57:10.213368 4672 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/de8ad7f9-8f0f-46fb-8649-9dd7bb8abc16-ssh-key\") on node \"crc\" DevicePath \"\"" Sep 30 12:57:10 crc kubenswrapper[4672]: I0930 12:57:10.213381 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dm5pv\" (UniqueName: \"kubernetes.io/projected/de8ad7f9-8f0f-46fb-8649-9dd7bb8abc16-kube-api-access-dm5pv\") on node \"crc\" DevicePath \"\"" Sep 30 12:57:10 crc kubenswrapper[4672]: I0930 12:57:10.573069 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-mfrgw" event={"ID":"de8ad7f9-8f0f-46fb-8649-9dd7bb8abc16","Type":"ContainerDied","Data":"2f1c7abb862431d317e3cc77807bfb45fe9c6420a80393abd46ae5737992bcb9"} Sep 30 12:57:10 crc kubenswrapper[4672]: I0930 12:57:10.573408 4672 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2f1c7abb862431d317e3cc77807bfb45fe9c6420a80393abd46ae5737992bcb9" Sep 30 12:57:10 crc kubenswrapper[4672]: I0930 12:57:10.573135 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-mfrgw" Sep 30 12:57:10 crc kubenswrapper[4672]: I0930 12:57:10.670447 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-4fh64"] Sep 30 12:57:10 crc kubenswrapper[4672]: E0930 12:57:10.670907 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cee97e22-15e3-47f2-a776-66689a174c11" containerName="registry-server" Sep 30 12:57:10 crc kubenswrapper[4672]: I0930 12:57:10.670926 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="cee97e22-15e3-47f2-a776-66689a174c11" containerName="registry-server" Sep 30 12:57:10 crc kubenswrapper[4672]: E0930 12:57:10.670957 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de8ad7f9-8f0f-46fb-8649-9dd7bb8abc16" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Sep 30 12:57:10 crc kubenswrapper[4672]: I0930 12:57:10.670963 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="de8ad7f9-8f0f-46fb-8649-9dd7bb8abc16" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Sep 30 12:57:10 crc kubenswrapper[4672]: E0930 12:57:10.670979 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cee97e22-15e3-47f2-a776-66689a174c11" containerName="extract-content" Sep 30 12:57:10 crc kubenswrapper[4672]: I0930 12:57:10.670985 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="cee97e22-15e3-47f2-a776-66689a174c11" containerName="extract-content" Sep 30 12:57:10 crc kubenswrapper[4672]: E0930 12:57:10.670999 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cee97e22-15e3-47f2-a776-66689a174c11" containerName="extract-utilities" Sep 30 12:57:10 crc kubenswrapper[4672]: I0930 12:57:10.671005 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="cee97e22-15e3-47f2-a776-66689a174c11" containerName="extract-utilities" Sep 30 12:57:10 crc kubenswrapper[4672]: I0930 12:57:10.671203 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="de8ad7f9-8f0f-46fb-8649-9dd7bb8abc16" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Sep 30 12:57:10 crc kubenswrapper[4672]: I0930 12:57:10.671235 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="cee97e22-15e3-47f2-a776-66689a174c11" containerName="registry-server" Sep 30 12:57:10 crc kubenswrapper[4672]: I0930 12:57:10.671914 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-4fh64" Sep 30 12:57:10 crc kubenswrapper[4672]: I0930 12:57:10.674411 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-b6dbr" Sep 30 12:57:10 crc kubenswrapper[4672]: I0930 12:57:10.679917 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Sep 30 12:57:10 crc kubenswrapper[4672]: I0930 12:57:10.680109 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Sep 30 12:57:10 crc kubenswrapper[4672]: I0930 12:57:10.681367 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Sep 30 12:57:10 crc kubenswrapper[4672]: I0930 12:57:10.681418 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Sep 30 12:57:10 crc kubenswrapper[4672]: I0930 12:57:10.682069 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Sep 30 12:57:10 crc kubenswrapper[4672]: I0930 12:57:10.693340 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-4fh64"] Sep 30 12:57:10 crc kubenswrapper[4672]: I0930 12:57:10.724039 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ebb7b5d2-78ee-459d-a8c2-4a5ffb193df6-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-4fh64\" (UID: \"ebb7b5d2-78ee-459d-a8c2-4a5ffb193df6\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-4fh64" Sep 30 12:57:10 crc kubenswrapper[4672]: I0930 12:57:10.724098 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/ebb7b5d2-78ee-459d-a8c2-4a5ffb193df6-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-4fh64\" (UID: \"ebb7b5d2-78ee-459d-a8c2-4a5ffb193df6\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-4fh64" Sep 30 12:57:10 crc kubenswrapper[4672]: I0930 12:57:10.724126 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8t69b\" (UniqueName: \"kubernetes.io/projected/ebb7b5d2-78ee-459d-a8c2-4a5ffb193df6-kube-api-access-8t69b\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-4fh64\" (UID: \"ebb7b5d2-78ee-459d-a8c2-4a5ffb193df6\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-4fh64" Sep 30 12:57:10 crc kubenswrapper[4672]: I0930 12:57:10.724215 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ebb7b5d2-78ee-459d-a8c2-4a5ffb193df6-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-4fh64\" (UID: \"ebb7b5d2-78ee-459d-a8c2-4a5ffb193df6\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-4fh64" Sep 30 12:57:10 crc kubenswrapper[4672]: I0930 12:57:10.724311 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: 
\"kubernetes.io/secret/ebb7b5d2-78ee-459d-a8c2-4a5ffb193df6-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-4fh64\" (UID: \"ebb7b5d2-78ee-459d-a8c2-4a5ffb193df6\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-4fh64" Sep 30 12:57:10 crc kubenswrapper[4672]: I0930 12:57:10.724419 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ebb7b5d2-78ee-459d-a8c2-4a5ffb193df6-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-4fh64\" (UID: \"ebb7b5d2-78ee-459d-a8c2-4a5ffb193df6\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-4fh64" Sep 30 12:57:10 crc kubenswrapper[4672]: I0930 12:57:10.825828 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ebb7b5d2-78ee-459d-a8c2-4a5ffb193df6-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-4fh64\" (UID: \"ebb7b5d2-78ee-459d-a8c2-4a5ffb193df6\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-4fh64" Sep 30 12:57:10 crc kubenswrapper[4672]: I0930 12:57:10.825885 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ebb7b5d2-78ee-459d-a8c2-4a5ffb193df6-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-4fh64\" (UID: \"ebb7b5d2-78ee-459d-a8c2-4a5ffb193df6\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-4fh64" Sep 30 12:57:10 crc kubenswrapper[4672]: I0930 12:57:10.825926 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/ebb7b5d2-78ee-459d-a8c2-4a5ffb193df6-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-4fh64\" (UID: \"ebb7b5d2-78ee-459d-a8c2-4a5ffb193df6\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-4fh64" Sep 30 12:57:10 crc kubenswrapper[4672]: I0930 12:57:10.825954 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8t69b\" (UniqueName: \"kubernetes.io/projected/ebb7b5d2-78ee-459d-a8c2-4a5ffb193df6-kube-api-access-8t69b\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-4fh64\" (UID: \"ebb7b5d2-78ee-459d-a8c2-4a5ffb193df6\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-4fh64" Sep 30 12:57:10 crc kubenswrapper[4672]: I0930 12:57:10.826016 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ebb7b5d2-78ee-459d-a8c2-4a5ffb193df6-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-4fh64\" (UID: \"ebb7b5d2-78ee-459d-a8c2-4a5ffb193df6\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-4fh64" Sep 30 12:57:10 crc kubenswrapper[4672]: I0930 12:57:10.826078 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/ebb7b5d2-78ee-459d-a8c2-4a5ffb193df6-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-4fh64\" (UID: \"ebb7b5d2-78ee-459d-a8c2-4a5ffb193df6\") " 
pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-4fh64" Sep 30 12:57:10 crc kubenswrapper[4672]: I0930 12:57:10.830571 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ebb7b5d2-78ee-459d-a8c2-4a5ffb193df6-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-4fh64\" (UID: \"ebb7b5d2-78ee-459d-a8c2-4a5ffb193df6\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-4fh64" Sep 30 12:57:10 crc kubenswrapper[4672]: I0930 12:57:10.831312 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/ebb7b5d2-78ee-459d-a8c2-4a5ffb193df6-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-4fh64\" (UID: \"ebb7b5d2-78ee-459d-a8c2-4a5ffb193df6\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-4fh64" Sep 30 12:57:10 crc kubenswrapper[4672]: I0930 12:57:10.831430 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ebb7b5d2-78ee-459d-a8c2-4a5ffb193df6-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-4fh64\" (UID: \"ebb7b5d2-78ee-459d-a8c2-4a5ffb193df6\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-4fh64" Sep 30 12:57:10 crc kubenswrapper[4672]: I0930 12:57:10.832523 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ebb7b5d2-78ee-459d-a8c2-4a5ffb193df6-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-4fh64\" (UID: \"ebb7b5d2-78ee-459d-a8c2-4a5ffb193df6\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-4fh64" Sep 30 12:57:10 crc kubenswrapper[4672]: I0930 12:57:10.832716 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/ebb7b5d2-78ee-459d-a8c2-4a5ffb193df6-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-4fh64\" (UID: \"ebb7b5d2-78ee-459d-a8c2-4a5ffb193df6\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-4fh64" Sep 30 12:57:10 crc kubenswrapper[4672]: I0930 12:57:10.841427 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8t69b\" (UniqueName: \"kubernetes.io/projected/ebb7b5d2-78ee-459d-a8c2-4a5ffb193df6-kube-api-access-8t69b\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-4fh64\" (UID: \"ebb7b5d2-78ee-459d-a8c2-4a5ffb193df6\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-4fh64" Sep 30 12:57:11 crc kubenswrapper[4672]: I0930 12:57:11.007859 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-4fh64" Sep 30 12:57:11 crc kubenswrapper[4672]: I0930 12:57:11.541137 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-4fh64"] Sep 30 12:57:12 crc kubenswrapper[4672]: I0930 12:57:12.599482 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-4fh64" event={"ID":"ebb7b5d2-78ee-459d-a8c2-4a5ffb193df6","Type":"ContainerStarted","Data":"5977e1e3934f3dd2634a3c2a11bc23fc6fe5eae6739e967d202b86cd389b9ef9"} Sep 30 12:57:12 crc kubenswrapper[4672]: I0930 12:57:12.599830 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-4fh64" event={"ID":"ebb7b5d2-78ee-459d-a8c2-4a5ffb193df6","Type":"ContainerStarted","Data":"65bdd29c0bdddfb5a25af95ede9e64b05c49a6201e156bf5f59d0c9a0a032562"} Sep 30 12:57:12 crc kubenswrapper[4672]: I0930 12:57:12.626680 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-4fh64" podStartSLOduration=2.01305047 podStartE2EDuration="2.626656852s" podCreationTimestamp="2025-09-30 12:57:10 +0000 UTC" firstStartedPulling="2025-09-30 12:57:11.578395328 +0000 UTC m=+2122.847632974" lastFinishedPulling="2025-09-30 12:57:12.19200167 +0000 UTC m=+2123.461239356" observedRunningTime="2025-09-30 12:57:12.619685435 +0000 UTC m=+2123.888923081" watchObservedRunningTime="2025-09-30 12:57:12.626656852 +0000 UTC m=+2123.895894518" Sep 30 12:57:13 crc kubenswrapper[4672]: I0930 12:57:13.936256 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-vgxl2"] Sep 30 12:57:13 crc kubenswrapper[4672]: I0930 12:57:13.938136 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vgxl2" Sep 30 12:57:14 crc kubenswrapper[4672]: I0930 12:57:14.016556 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-vgxl2"] Sep 30 12:57:14 crc kubenswrapper[4672]: I0930 12:57:14.026156 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3ac9662b-b58d-4663-8632-2b7c467b21f4-catalog-content\") pod \"redhat-marketplace-vgxl2\" (UID: \"3ac9662b-b58d-4663-8632-2b7c467b21f4\") " pod="openshift-marketplace/redhat-marketplace-vgxl2" Sep 30 12:57:14 crc kubenswrapper[4672]: I0930 12:57:14.026222 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-296p4\" (UniqueName: \"kubernetes.io/projected/3ac9662b-b58d-4663-8632-2b7c467b21f4-kube-api-access-296p4\") pod \"redhat-marketplace-vgxl2\" (UID: \"3ac9662b-b58d-4663-8632-2b7c467b21f4\") " pod="openshift-marketplace/redhat-marketplace-vgxl2" Sep 30 12:57:14 crc kubenswrapper[4672]: I0930 12:57:14.026471 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3ac9662b-b58d-4663-8632-2b7c467b21f4-utilities\") pod \"redhat-marketplace-vgxl2\" (UID: \"3ac9662b-b58d-4663-8632-2b7c467b21f4\") " pod="openshift-marketplace/redhat-marketplace-vgxl2" Sep 30 12:57:14 crc kubenswrapper[4672]: I0930 12:57:14.128801 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3ac9662b-b58d-4663-8632-2b7c467b21f4-utilities\") pod \"redhat-marketplace-vgxl2\" (UID: \"3ac9662b-b58d-4663-8632-2b7c467b21f4\") " pod="openshift-marketplace/redhat-marketplace-vgxl2" Sep 30 12:57:14 crc kubenswrapper[4672]: I0930 12:57:14.128932 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3ac9662b-b58d-4663-8632-2b7c467b21f4-catalog-content\") pod \"redhat-marketplace-vgxl2\" (UID: \"3ac9662b-b58d-4663-8632-2b7c467b21f4\") " pod="openshift-marketplace/redhat-marketplace-vgxl2" Sep 30 12:57:14 crc kubenswrapper[4672]: I0930 12:57:14.128995 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-296p4\" (UniqueName: \"kubernetes.io/projected/3ac9662b-b58d-4663-8632-2b7c467b21f4-kube-api-access-296p4\") pod \"redhat-marketplace-vgxl2\" (UID: \"3ac9662b-b58d-4663-8632-2b7c467b21f4\") " pod="openshift-marketplace/redhat-marketplace-vgxl2" Sep 30 12:57:14 crc kubenswrapper[4672]: I0930 12:57:14.129309 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3ac9662b-b58d-4663-8632-2b7c467b21f4-utilities\") pod \"redhat-marketplace-vgxl2\" (UID: \"3ac9662b-b58d-4663-8632-2b7c467b21f4\") " pod="openshift-marketplace/redhat-marketplace-vgxl2" Sep 30 12:57:14 crc kubenswrapper[4672]: I0930 12:57:14.129373 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3ac9662b-b58d-4663-8632-2b7c467b21f4-catalog-content\") pod \"redhat-marketplace-vgxl2\" (UID: \"3ac9662b-b58d-4663-8632-2b7c467b21f4\") " pod="openshift-marketplace/redhat-marketplace-vgxl2" Sep 30 12:57:14 crc kubenswrapper[4672]: I0930 12:57:14.156780 4672 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-296p4\" (UniqueName: \"kubernetes.io/projected/3ac9662b-b58d-4663-8632-2b7c467b21f4-kube-api-access-296p4\") pod \"redhat-marketplace-vgxl2\" (UID: \"3ac9662b-b58d-4663-8632-2b7c467b21f4\") " pod="openshift-marketplace/redhat-marketplace-vgxl2" Sep 30 12:57:14 crc kubenswrapper[4672]: I0930 12:57:14.265332 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vgxl2" Sep 30 12:57:14 crc kubenswrapper[4672]: I0930 12:57:14.758238 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-vgxl2"] Sep 30 12:57:15 crc kubenswrapper[4672]: I0930 12:57:15.627911 4672 generic.go:334] "Generic (PLEG): container finished" podID="3ac9662b-b58d-4663-8632-2b7c467b21f4" containerID="0957230a56811b6e198760931b815405ee90d25e823763062adbb9f4086959f5" exitCode=0 Sep 30 12:57:15 crc kubenswrapper[4672]: I0930 12:57:15.628318 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vgxl2" event={"ID":"3ac9662b-b58d-4663-8632-2b7c467b21f4","Type":"ContainerDied","Data":"0957230a56811b6e198760931b815405ee90d25e823763062adbb9f4086959f5"} Sep 30 12:57:15 crc kubenswrapper[4672]: I0930 12:57:15.628349 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vgxl2" event={"ID":"3ac9662b-b58d-4663-8632-2b7c467b21f4","Type":"ContainerStarted","Data":"351ac71ca413c72d288815761ad65ead7d55afe3de8c2f59123884e736a90f57"} Sep 30 12:57:17 crc kubenswrapper[4672]: I0930 12:57:17.648095 4672 generic.go:334] "Generic (PLEG): container finished" podID="3ac9662b-b58d-4663-8632-2b7c467b21f4" containerID="635578625bdfb30018adba855fadf3dfe668421f9199111e7d834dc57cf6936c" exitCode=0 Sep 30 12:57:17 crc kubenswrapper[4672]: I0930 12:57:17.648183 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vgxl2" event={"ID":"3ac9662b-b58d-4663-8632-2b7c467b21f4","Type":"ContainerDied","Data":"635578625bdfb30018adba855fadf3dfe668421f9199111e7d834dc57cf6936c"} Sep 30 12:57:18 crc kubenswrapper[4672]: I0930 12:57:18.671727 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vgxl2" event={"ID":"3ac9662b-b58d-4663-8632-2b7c467b21f4","Type":"ContainerStarted","Data":"22e8c134f09c56f78573f89cb8de9e3e660916c310c130093ab7f6755dfaf389"} Sep 30 12:57:18 crc kubenswrapper[4672]: I0930 12:57:18.697249 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-vgxl2" podStartSLOduration=3.143416441 podStartE2EDuration="5.697233899s" podCreationTimestamp="2025-09-30 12:57:13 +0000 UTC" firstStartedPulling="2025-09-30 12:57:15.630767442 +0000 UTC m=+2126.900005078" lastFinishedPulling="2025-09-30 12:57:18.18458487 +0000 UTC m=+2129.453822536" observedRunningTime="2025-09-30 12:57:18.689322468 +0000 UTC m=+2129.958560134" watchObservedRunningTime="2025-09-30 12:57:18.697233899 +0000 UTC m=+2129.966471545" Sep 30 12:57:24 crc kubenswrapper[4672]: I0930 12:57:24.266167 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-vgxl2" Sep 30 12:57:24 crc kubenswrapper[4672]: I0930 12:57:24.266776 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-vgxl2" Sep 30 12:57:24 crc kubenswrapper[4672]: I0930 12:57:24.312999 4672 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-vgxl2" Sep 30 12:57:24 crc kubenswrapper[4672]: I0930 12:57:24.739501 4672 patch_prober.go:28] interesting pod/machine-config-daemon-dpqrd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 12:57:24 crc kubenswrapper[4672]: I0930 12:57:24.739573 4672 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 12:57:24 crc kubenswrapper[4672]: I0930 12:57:24.784753 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-vgxl2" Sep 30 12:57:24 crc kubenswrapper[4672]: I0930 12:57:24.839798 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-vgxl2"] Sep 30 12:57:26 crc kubenswrapper[4672]: I0930 12:57:26.759018 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-vgxl2" podUID="3ac9662b-b58d-4663-8632-2b7c467b21f4" containerName="registry-server" containerID="cri-o://22e8c134f09c56f78573f89cb8de9e3e660916c310c130093ab7f6755dfaf389" gracePeriod=2 Sep 30 12:57:27 crc kubenswrapper[4672]: I0930 12:57:27.223010 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vgxl2" Sep 30 12:57:27 crc kubenswrapper[4672]: I0930 12:57:27.239822 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-296p4\" (UniqueName: \"kubernetes.io/projected/3ac9662b-b58d-4663-8632-2b7c467b21f4-kube-api-access-296p4\") pod \"3ac9662b-b58d-4663-8632-2b7c467b21f4\" (UID: \"3ac9662b-b58d-4663-8632-2b7c467b21f4\") " Sep 30 12:57:27 crc kubenswrapper[4672]: I0930 12:57:27.239915 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3ac9662b-b58d-4663-8632-2b7c467b21f4-utilities\") pod \"3ac9662b-b58d-4663-8632-2b7c467b21f4\" (UID: \"3ac9662b-b58d-4663-8632-2b7c467b21f4\") " Sep 30 12:57:27 crc kubenswrapper[4672]: I0930 12:57:27.240183 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3ac9662b-b58d-4663-8632-2b7c467b21f4-catalog-content\") pod \"3ac9662b-b58d-4663-8632-2b7c467b21f4\" (UID: \"3ac9662b-b58d-4663-8632-2b7c467b21f4\") " Sep 30 12:57:27 crc kubenswrapper[4672]: I0930 12:57:27.241138 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3ac9662b-b58d-4663-8632-2b7c467b21f4-utilities" (OuterVolumeSpecName: "utilities") pod "3ac9662b-b58d-4663-8632-2b7c467b21f4" (UID: "3ac9662b-b58d-4663-8632-2b7c467b21f4"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 12:57:27 crc kubenswrapper[4672]: I0930 12:57:27.248558 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ac9662b-b58d-4663-8632-2b7c467b21f4-kube-api-access-296p4" (OuterVolumeSpecName: "kube-api-access-296p4") pod "3ac9662b-b58d-4663-8632-2b7c467b21f4" (UID: "3ac9662b-b58d-4663-8632-2b7c467b21f4"). InnerVolumeSpecName "kube-api-access-296p4". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 12:57:27 crc kubenswrapper[4672]: I0930 12:57:27.256129 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3ac9662b-b58d-4663-8632-2b7c467b21f4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3ac9662b-b58d-4663-8632-2b7c467b21f4" (UID: "3ac9662b-b58d-4663-8632-2b7c467b21f4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 12:57:27 crc kubenswrapper[4672]: I0930 12:57:27.342236 4672 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3ac9662b-b58d-4663-8632-2b7c467b21f4-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 12:57:27 crc kubenswrapper[4672]: I0930 12:57:27.342288 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-296p4\" (UniqueName: \"kubernetes.io/projected/3ac9662b-b58d-4663-8632-2b7c467b21f4-kube-api-access-296p4\") on node \"crc\" DevicePath \"\"" Sep 30 12:57:27 crc kubenswrapper[4672]: I0930 12:57:27.342299 4672 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3ac9662b-b58d-4663-8632-2b7c467b21f4-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 12:57:27 crc kubenswrapper[4672]: I0930 12:57:27.769341 4672 generic.go:334] "Generic (PLEG): container finished" podID="3ac9662b-b58d-4663-8632-2b7c467b21f4" containerID="22e8c134f09c56f78573f89cb8de9e3e660916c310c130093ab7f6755dfaf389" exitCode=0 Sep 30 12:57:27 crc kubenswrapper[4672]: I0930 12:57:27.769411 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vgxl2" event={"ID":"3ac9662b-b58d-4663-8632-2b7c467b21f4","Type":"ContainerDied","Data":"22e8c134f09c56f78573f89cb8de9e3e660916c310c130093ab7f6755dfaf389"} Sep 30 12:57:27 crc kubenswrapper[4672]: I0930 12:57:27.769667 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vgxl2" event={"ID":"3ac9662b-b58d-4663-8632-2b7c467b21f4","Type":"ContainerDied","Data":"351ac71ca413c72d288815761ad65ead7d55afe3de8c2f59123884e736a90f57"} Sep 30 12:57:27 crc kubenswrapper[4672]: I0930 12:57:27.769703 4672 scope.go:117] "RemoveContainer" containerID="22e8c134f09c56f78573f89cb8de9e3e660916c310c130093ab7f6755dfaf389" Sep 30 12:57:27 crc kubenswrapper[4672]: I0930 12:57:27.769430 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vgxl2" Sep 30 12:57:27 crc kubenswrapper[4672]: I0930 12:57:27.791580 4672 scope.go:117] "RemoveContainer" containerID="635578625bdfb30018adba855fadf3dfe668421f9199111e7d834dc57cf6936c" Sep 30 12:57:27 crc kubenswrapper[4672]: I0930 12:57:27.800814 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-vgxl2"] Sep 30 12:57:27 crc kubenswrapper[4672]: I0930 12:57:27.809394 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-vgxl2"] Sep 30 12:57:27 crc kubenswrapper[4672]: I0930 12:57:27.825780 4672 scope.go:117] "RemoveContainer" containerID="0957230a56811b6e198760931b815405ee90d25e823763062adbb9f4086959f5" Sep 30 12:57:27 crc kubenswrapper[4672]: I0930 12:57:27.863964 4672 scope.go:117] "RemoveContainer" containerID="22e8c134f09c56f78573f89cb8de9e3e660916c310c130093ab7f6755dfaf389" Sep 30 12:57:27 crc kubenswrapper[4672]: E0930 12:57:27.864438 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"22e8c134f09c56f78573f89cb8de9e3e660916c310c130093ab7f6755dfaf389\": container with ID starting with 22e8c134f09c56f78573f89cb8de9e3e660916c310c130093ab7f6755dfaf389 not found: ID does not exist" containerID="22e8c134f09c56f78573f89cb8de9e3e660916c310c130093ab7f6755dfaf389" Sep 30 12:57:27 crc kubenswrapper[4672]: I0930 12:57:27.864468 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"22e8c134f09c56f78573f89cb8de9e3e660916c310c130093ab7f6755dfaf389"} err="failed to get container status \"22e8c134f09c56f78573f89cb8de9e3e660916c310c130093ab7f6755dfaf389\": rpc error: code = NotFound desc = could not find container \"22e8c134f09c56f78573f89cb8de9e3e660916c310c130093ab7f6755dfaf389\": container with ID starting with 22e8c134f09c56f78573f89cb8de9e3e660916c310c130093ab7f6755dfaf389 not found: ID does not exist" Sep 30 12:57:27 crc kubenswrapper[4672]: I0930 12:57:27.864489 4672 scope.go:117] "RemoveContainer" containerID="635578625bdfb30018adba855fadf3dfe668421f9199111e7d834dc57cf6936c" Sep 30 12:57:27 crc kubenswrapper[4672]: E0930 12:57:27.864910 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"635578625bdfb30018adba855fadf3dfe668421f9199111e7d834dc57cf6936c\": container with ID starting with 635578625bdfb30018adba855fadf3dfe668421f9199111e7d834dc57cf6936c not found: ID does not exist" containerID="635578625bdfb30018adba855fadf3dfe668421f9199111e7d834dc57cf6936c" Sep 30 12:57:27 crc kubenswrapper[4672]: I0930 12:57:27.864957 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"635578625bdfb30018adba855fadf3dfe668421f9199111e7d834dc57cf6936c"} err="failed to get container status \"635578625bdfb30018adba855fadf3dfe668421f9199111e7d834dc57cf6936c\": rpc error: code = NotFound desc = could not find container \"635578625bdfb30018adba855fadf3dfe668421f9199111e7d834dc57cf6936c\": container with ID starting with 635578625bdfb30018adba855fadf3dfe668421f9199111e7d834dc57cf6936c not found: ID does not exist" Sep 30 12:57:27 crc kubenswrapper[4672]: I0930 12:57:27.864982 4672 scope.go:117] "RemoveContainer" containerID="0957230a56811b6e198760931b815405ee90d25e823763062adbb9f4086959f5" Sep 30 12:57:27 crc kubenswrapper[4672]: E0930 12:57:27.865273 4672 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"0957230a56811b6e198760931b815405ee90d25e823763062adbb9f4086959f5\": container with ID starting with 0957230a56811b6e198760931b815405ee90d25e823763062adbb9f4086959f5 not found: ID does not exist" containerID="0957230a56811b6e198760931b815405ee90d25e823763062adbb9f4086959f5" Sep 30 12:57:27 crc kubenswrapper[4672]: I0930 12:57:27.865299 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0957230a56811b6e198760931b815405ee90d25e823763062adbb9f4086959f5"} err="failed to get container status \"0957230a56811b6e198760931b815405ee90d25e823763062adbb9f4086959f5\": rpc error: code = NotFound desc = could not find container \"0957230a56811b6e198760931b815405ee90d25e823763062adbb9f4086959f5\": container with ID starting with 0957230a56811b6e198760931b815405ee90d25e823763062adbb9f4086959f5 not found: ID does not exist" Sep 30 12:57:29 crc kubenswrapper[4672]: I0930 12:57:29.430980 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ac9662b-b58d-4663-8632-2b7c467b21f4" path="/var/lib/kubelet/pods/3ac9662b-b58d-4663-8632-2b7c467b21f4/volumes" Sep 30 12:57:54 crc kubenswrapper[4672]: I0930 12:57:54.739928 4672 patch_prober.go:28] interesting pod/machine-config-daemon-dpqrd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 12:57:54 crc kubenswrapper[4672]: I0930 12:57:54.740685 4672 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 12:57:54 crc kubenswrapper[4672]: I0930 12:57:54.740753 4672 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" Sep 30 12:57:54 crc kubenswrapper[4672]: I0930 12:57:54.741916 4672 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c2787254d23886a5d5ca32405d4feb4336d5e1be3c00953ebe36b2065fda6f64"} pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 30 12:57:54 crc kubenswrapper[4672]: I0930 12:57:54.742021 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" containerName="machine-config-daemon" containerID="cri-o://c2787254d23886a5d5ca32405d4feb4336d5e1be3c00953ebe36b2065fda6f64" gracePeriod=600 Sep 30 12:57:55 crc kubenswrapper[4672]: I0930 12:57:55.081788 4672 generic.go:334] "Generic (PLEG): container finished" podID="95794952-d817-48f2-8956-f7a310f8d1d9" containerID="c2787254d23886a5d5ca32405d4feb4336d5e1be3c00953ebe36b2065fda6f64" exitCode=0 Sep 30 12:57:55 crc kubenswrapper[4672]: I0930 12:57:55.082092 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" 
event={"ID":"95794952-d817-48f2-8956-f7a310f8d1d9","Type":"ContainerDied","Data":"c2787254d23886a5d5ca32405d4feb4336d5e1be3c00953ebe36b2065fda6f64"} Sep 30 12:57:55 crc kubenswrapper[4672]: I0930 12:57:55.082120 4672 scope.go:117] "RemoveContainer" containerID="221a226e9b221f859d2addc5ed9e63e60339b2ca4ab8eb0848430a16de2a187e" Sep 30 12:57:56 crc kubenswrapper[4672]: I0930 12:57:56.092162 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" event={"ID":"95794952-d817-48f2-8956-f7a310f8d1d9","Type":"ContainerStarted","Data":"8f51e0141ddbaede996117cc2645ec9b4c66d7a7e718e482c491c40151095fbe"} Sep 30 12:58:07 crc kubenswrapper[4672]: I0930 12:58:07.226356 4672 generic.go:334] "Generic (PLEG): container finished" podID="ebb7b5d2-78ee-459d-a8c2-4a5ffb193df6" containerID="5977e1e3934f3dd2634a3c2a11bc23fc6fe5eae6739e967d202b86cd389b9ef9" exitCode=0 Sep 30 12:58:07 crc kubenswrapper[4672]: I0930 12:58:07.226458 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-4fh64" event={"ID":"ebb7b5d2-78ee-459d-a8c2-4a5ffb193df6","Type":"ContainerDied","Data":"5977e1e3934f3dd2634a3c2a11bc23fc6fe5eae6739e967d202b86cd389b9ef9"} Sep 30 12:58:08 crc kubenswrapper[4672]: I0930 12:58:08.680297 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-4fh64" Sep 30 12:58:08 crc kubenswrapper[4672]: I0930 12:58:08.792039 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/ebb7b5d2-78ee-459d-a8c2-4a5ffb193df6-neutron-ovn-metadata-agent-neutron-config-0\") pod \"ebb7b5d2-78ee-459d-a8c2-4a5ffb193df6\" (UID: \"ebb7b5d2-78ee-459d-a8c2-4a5ffb193df6\") " Sep 30 12:58:08 crc kubenswrapper[4672]: I0930 12:58:08.792240 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8t69b\" (UniqueName: \"kubernetes.io/projected/ebb7b5d2-78ee-459d-a8c2-4a5ffb193df6-kube-api-access-8t69b\") pod \"ebb7b5d2-78ee-459d-a8c2-4a5ffb193df6\" (UID: \"ebb7b5d2-78ee-459d-a8c2-4a5ffb193df6\") " Sep 30 12:58:08 crc kubenswrapper[4672]: I0930 12:58:08.792395 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ebb7b5d2-78ee-459d-a8c2-4a5ffb193df6-ssh-key\") pod \"ebb7b5d2-78ee-459d-a8c2-4a5ffb193df6\" (UID: \"ebb7b5d2-78ee-459d-a8c2-4a5ffb193df6\") " Sep 30 12:58:08 crc kubenswrapper[4672]: I0930 12:58:08.792464 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ebb7b5d2-78ee-459d-a8c2-4a5ffb193df6-neutron-metadata-combined-ca-bundle\") pod \"ebb7b5d2-78ee-459d-a8c2-4a5ffb193df6\" (UID: \"ebb7b5d2-78ee-459d-a8c2-4a5ffb193df6\") " Sep 30 12:58:08 crc kubenswrapper[4672]: I0930 12:58:08.792688 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/ebb7b5d2-78ee-459d-a8c2-4a5ffb193df6-nova-metadata-neutron-config-0\") pod \"ebb7b5d2-78ee-459d-a8c2-4a5ffb193df6\" (UID: \"ebb7b5d2-78ee-459d-a8c2-4a5ffb193df6\") " Sep 30 12:58:08 crc kubenswrapper[4672]: I0930 12:58:08.792719 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/ebb7b5d2-78ee-459d-a8c2-4a5ffb193df6-inventory\") pod \"ebb7b5d2-78ee-459d-a8c2-4a5ffb193df6\" (UID: \"ebb7b5d2-78ee-459d-a8c2-4a5ffb193df6\") " Sep 30 12:58:08 crc kubenswrapper[4672]: I0930 12:58:08.800040 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ebb7b5d2-78ee-459d-a8c2-4a5ffb193df6-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "ebb7b5d2-78ee-459d-a8c2-4a5ffb193df6" (UID: "ebb7b5d2-78ee-459d-a8c2-4a5ffb193df6"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:58:08 crc kubenswrapper[4672]: I0930 12:58:08.817729 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ebb7b5d2-78ee-459d-a8c2-4a5ffb193df6-kube-api-access-8t69b" (OuterVolumeSpecName: "kube-api-access-8t69b") pod "ebb7b5d2-78ee-459d-a8c2-4a5ffb193df6" (UID: "ebb7b5d2-78ee-459d-a8c2-4a5ffb193df6"). InnerVolumeSpecName "kube-api-access-8t69b". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 12:58:08 crc kubenswrapper[4672]: I0930 12:58:08.828561 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ebb7b5d2-78ee-459d-a8c2-4a5ffb193df6-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "ebb7b5d2-78ee-459d-a8c2-4a5ffb193df6" (UID: "ebb7b5d2-78ee-459d-a8c2-4a5ffb193df6"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:58:08 crc kubenswrapper[4672]: I0930 12:58:08.833936 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ebb7b5d2-78ee-459d-a8c2-4a5ffb193df6-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "ebb7b5d2-78ee-459d-a8c2-4a5ffb193df6" (UID: "ebb7b5d2-78ee-459d-a8c2-4a5ffb193df6"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:58:08 crc kubenswrapper[4672]: I0930 12:58:08.834471 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ebb7b5d2-78ee-459d-a8c2-4a5ffb193df6-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "ebb7b5d2-78ee-459d-a8c2-4a5ffb193df6" (UID: "ebb7b5d2-78ee-459d-a8c2-4a5ffb193df6"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:58:08 crc kubenswrapper[4672]: I0930 12:58:08.834908 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ebb7b5d2-78ee-459d-a8c2-4a5ffb193df6-inventory" (OuterVolumeSpecName: "inventory") pod "ebb7b5d2-78ee-459d-a8c2-4a5ffb193df6" (UID: "ebb7b5d2-78ee-459d-a8c2-4a5ffb193df6"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 12:58:08 crc kubenswrapper[4672]: I0930 12:58:08.895080 4672 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/ebb7b5d2-78ee-459d-a8c2-4a5ffb193df6-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Sep 30 12:58:08 crc kubenswrapper[4672]: I0930 12:58:08.895114 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8t69b\" (UniqueName: \"kubernetes.io/projected/ebb7b5d2-78ee-459d-a8c2-4a5ffb193df6-kube-api-access-8t69b\") on node \"crc\" DevicePath \"\"" Sep 30 12:58:08 crc kubenswrapper[4672]: I0930 12:58:08.895124 4672 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ebb7b5d2-78ee-459d-a8c2-4a5ffb193df6-ssh-key\") on node \"crc\" DevicePath \"\"" Sep 30 12:58:08 crc kubenswrapper[4672]: I0930 12:58:08.895133 4672 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ebb7b5d2-78ee-459d-a8c2-4a5ffb193df6-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 12:58:08 crc kubenswrapper[4672]: I0930 12:58:08.895143 4672 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/ebb7b5d2-78ee-459d-a8c2-4a5ffb193df6-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Sep 30 12:58:08 crc kubenswrapper[4672]: I0930 12:58:08.895151 4672 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ebb7b5d2-78ee-459d-a8c2-4a5ffb193df6-inventory\") on node \"crc\" DevicePath \"\"" Sep 30 12:58:09 crc kubenswrapper[4672]: I0930 12:58:09.254720 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-4fh64" event={"ID":"ebb7b5d2-78ee-459d-a8c2-4a5ffb193df6","Type":"ContainerDied","Data":"65bdd29c0bdddfb5a25af95ede9e64b05c49a6201e156bf5f59d0c9a0a032562"} Sep 30 12:58:09 crc kubenswrapper[4672]: I0930 12:58:09.254784 4672 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="65bdd29c0bdddfb5a25af95ede9e64b05c49a6201e156bf5f59d0c9a0a032562" Sep 30 12:58:09 crc kubenswrapper[4672]: I0930 12:58:09.254840 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-4fh64" Sep 30 12:58:09 crc kubenswrapper[4672]: I0930 12:58:09.355875 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-zq47z"] Sep 30 12:58:09 crc kubenswrapper[4672]: E0930 12:58:09.356296 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ac9662b-b58d-4663-8632-2b7c467b21f4" containerName="extract-utilities" Sep 30 12:58:09 crc kubenswrapper[4672]: I0930 12:58:09.356312 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ac9662b-b58d-4663-8632-2b7c467b21f4" containerName="extract-utilities" Sep 30 12:58:09 crc kubenswrapper[4672]: E0930 12:58:09.356322 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ac9662b-b58d-4663-8632-2b7c467b21f4" containerName="extract-content" Sep 30 12:58:09 crc kubenswrapper[4672]: I0930 12:58:09.356328 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ac9662b-b58d-4663-8632-2b7c467b21f4" containerName="extract-content" Sep 30 12:58:09 crc kubenswrapper[4672]: E0930 12:58:09.356357 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ac9662b-b58d-4663-8632-2b7c467b21f4" containerName="registry-server" Sep 30 12:58:09 crc kubenswrapper[4672]: I0930 12:58:09.356362 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ac9662b-b58d-4663-8632-2b7c467b21f4" containerName="registry-server" Sep 30 12:58:09 crc kubenswrapper[4672]: E0930 12:58:09.356392 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ebb7b5d2-78ee-459d-a8c2-4a5ffb193df6" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Sep 30 12:58:09 crc kubenswrapper[4672]: I0930 12:58:09.356401 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebb7b5d2-78ee-459d-a8c2-4a5ffb193df6" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Sep 30 12:58:09 crc kubenswrapper[4672]: I0930 12:58:09.356573 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="ebb7b5d2-78ee-459d-a8c2-4a5ffb193df6" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Sep 30 12:58:09 crc kubenswrapper[4672]: I0930 12:58:09.356597 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ac9662b-b58d-4663-8632-2b7c467b21f4" containerName="registry-server" Sep 30 12:58:09 crc kubenswrapper[4672]: I0930 12:58:09.357250 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-zq47z" Sep 30 12:58:09 crc kubenswrapper[4672]: I0930 12:58:09.359116 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Sep 30 12:58:09 crc kubenswrapper[4672]: I0930 12:58:09.359583 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Sep 30 12:58:09 crc kubenswrapper[4672]: I0930 12:58:09.360562 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-b6dbr" Sep 30 12:58:09 crc kubenswrapper[4672]: I0930 12:58:09.360679 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Sep 30 12:58:09 crc kubenswrapper[4672]: I0930 12:58:09.361314 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Sep 30 12:58:09 crc kubenswrapper[4672]: I0930 12:58:09.378463 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-zq47z"] Sep 30 12:58:09 crc kubenswrapper[4672]: I0930 12:58:09.507655 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/883bcbaa-0233-4f4d-8463-f451155bc618-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-zq47z\" (UID: \"883bcbaa-0233-4f4d-8463-f451155bc618\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-zq47z" Sep 30 12:58:09 crc kubenswrapper[4672]: I0930 12:58:09.507741 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/883bcbaa-0233-4f4d-8463-f451155bc618-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-zq47z\" (UID: \"883bcbaa-0233-4f4d-8463-f451155bc618\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-zq47z" Sep 30 12:58:09 crc kubenswrapper[4672]: I0930 12:58:09.507794 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/883bcbaa-0233-4f4d-8463-f451155bc618-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-zq47z\" (UID: \"883bcbaa-0233-4f4d-8463-f451155bc618\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-zq47z" Sep 30 12:58:09 crc kubenswrapper[4672]: I0930 12:58:09.507970 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bwgcf\" (UniqueName: \"kubernetes.io/projected/883bcbaa-0233-4f4d-8463-f451155bc618-kube-api-access-bwgcf\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-zq47z\" (UID: \"883bcbaa-0233-4f4d-8463-f451155bc618\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-zq47z" Sep 30 12:58:09 crc kubenswrapper[4672]: I0930 12:58:09.508053 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/883bcbaa-0233-4f4d-8463-f451155bc618-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-zq47z\" (UID: \"883bcbaa-0233-4f4d-8463-f451155bc618\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-zq47z" Sep 30 12:58:09 crc kubenswrapper[4672]: I0930 12:58:09.609571 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ssh-key\" (UniqueName: \"kubernetes.io/secret/883bcbaa-0233-4f4d-8463-f451155bc618-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-zq47z\" (UID: \"883bcbaa-0233-4f4d-8463-f451155bc618\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-zq47z" Sep 30 12:58:09 crc kubenswrapper[4672]: I0930 12:58:09.609654 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/883bcbaa-0233-4f4d-8463-f451155bc618-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-zq47z\" (UID: \"883bcbaa-0233-4f4d-8463-f451155bc618\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-zq47z" Sep 30 12:58:09 crc kubenswrapper[4672]: I0930 12:58:09.609734 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/883bcbaa-0233-4f4d-8463-f451155bc618-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-zq47z\" (UID: \"883bcbaa-0233-4f4d-8463-f451155bc618\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-zq47z" Sep 30 12:58:09 crc kubenswrapper[4672]: I0930 12:58:09.609822 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bwgcf\" (UniqueName: \"kubernetes.io/projected/883bcbaa-0233-4f4d-8463-f451155bc618-kube-api-access-bwgcf\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-zq47z\" (UID: \"883bcbaa-0233-4f4d-8463-f451155bc618\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-zq47z" Sep 30 12:58:09 crc kubenswrapper[4672]: I0930 12:58:09.609899 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/883bcbaa-0233-4f4d-8463-f451155bc618-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-zq47z\" (UID: \"883bcbaa-0233-4f4d-8463-f451155bc618\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-zq47z" Sep 30 12:58:09 crc kubenswrapper[4672]: I0930 12:58:09.615326 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/883bcbaa-0233-4f4d-8463-f451155bc618-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-zq47z\" (UID: \"883bcbaa-0233-4f4d-8463-f451155bc618\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-zq47z" Sep 30 12:58:09 crc kubenswrapper[4672]: I0930 12:58:09.615326 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/883bcbaa-0233-4f4d-8463-f451155bc618-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-zq47z\" (UID: \"883bcbaa-0233-4f4d-8463-f451155bc618\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-zq47z" Sep 30 12:58:09 crc kubenswrapper[4672]: I0930 12:58:09.615389 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/883bcbaa-0233-4f4d-8463-f451155bc618-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-zq47z\" (UID: \"883bcbaa-0233-4f4d-8463-f451155bc618\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-zq47z" Sep 30 12:58:09 crc kubenswrapper[4672]: I0930 12:58:09.617444 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/883bcbaa-0233-4f4d-8463-f451155bc618-libvirt-combined-ca-bundle\") pod 
\"libvirt-edpm-deployment-openstack-edpm-ipam-zq47z\" (UID: \"883bcbaa-0233-4f4d-8463-f451155bc618\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-zq47z" Sep 30 12:58:09 crc kubenswrapper[4672]: I0930 12:58:09.643009 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bwgcf\" (UniqueName: \"kubernetes.io/projected/883bcbaa-0233-4f4d-8463-f451155bc618-kube-api-access-bwgcf\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-zq47z\" (UID: \"883bcbaa-0233-4f4d-8463-f451155bc618\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-zq47z" Sep 30 12:58:09 crc kubenswrapper[4672]: I0930 12:58:09.712968 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-zq47z" Sep 30 12:58:10 crc kubenswrapper[4672]: I0930 12:58:10.263406 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-zq47z"] Sep 30 12:58:10 crc kubenswrapper[4672]: W0930 12:58:10.266668 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod883bcbaa_0233_4f4d_8463_f451155bc618.slice/crio-77d462e48b7cba520184be81fa7c64a925db57870f819d2eeed740c1126de299 WatchSource:0}: Error finding container 77d462e48b7cba520184be81fa7c64a925db57870f819d2eeed740c1126de299: Status 404 returned error can't find the container with id 77d462e48b7cba520184be81fa7c64a925db57870f819d2eeed740c1126de299 Sep 30 12:58:11 crc kubenswrapper[4672]: I0930 12:58:11.281453 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-zq47z" event={"ID":"883bcbaa-0233-4f4d-8463-f451155bc618","Type":"ContainerStarted","Data":"8883d0655966ee5e00cced8c287e2cbd4182fe6f564e365fce56c6b557b933c6"} Sep 30 12:58:11 crc kubenswrapper[4672]: I0930 12:58:11.281755 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-zq47z" event={"ID":"883bcbaa-0233-4f4d-8463-f451155bc618","Type":"ContainerStarted","Data":"77d462e48b7cba520184be81fa7c64a925db57870f819d2eeed740c1126de299"} Sep 30 12:58:11 crc kubenswrapper[4672]: I0930 12:58:11.304307 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-zq47z" podStartSLOduration=1.798022883 podStartE2EDuration="2.304248095s" podCreationTimestamp="2025-09-30 12:58:09 +0000 UTC" firstStartedPulling="2025-09-30 12:58:10.26978679 +0000 UTC m=+2181.539024456" lastFinishedPulling="2025-09-30 12:58:10.776012022 +0000 UTC m=+2182.045249668" observedRunningTime="2025-09-30 12:58:11.302401878 +0000 UTC m=+2182.571639534" watchObservedRunningTime="2025-09-30 12:58:11.304248095 +0000 UTC m=+2182.573485751" Sep 30 13:00:00 crc kubenswrapper[4672]: I0930 13:00:00.156054 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320620-lknc2"] Sep 30 13:00:00 crc kubenswrapper[4672]: I0930 13:00:00.158790 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320620-lknc2" Sep 30 13:00:00 crc kubenswrapper[4672]: I0930 13:00:00.160933 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Sep 30 13:00:00 crc kubenswrapper[4672]: I0930 13:00:00.161212 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Sep 30 13:00:00 crc kubenswrapper[4672]: I0930 13:00:00.181387 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320620-lknc2"] Sep 30 13:00:00 crc kubenswrapper[4672]: I0930 13:00:00.207508 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-44952\" (UniqueName: \"kubernetes.io/projected/81f77c7a-8774-40ea-87ad-faefa20f03b8-kube-api-access-44952\") pod \"collect-profiles-29320620-lknc2\" (UID: \"81f77c7a-8774-40ea-87ad-faefa20f03b8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320620-lknc2" Sep 30 13:00:00 crc kubenswrapper[4672]: I0930 13:00:00.207817 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/81f77c7a-8774-40ea-87ad-faefa20f03b8-config-volume\") pod \"collect-profiles-29320620-lknc2\" (UID: \"81f77c7a-8774-40ea-87ad-faefa20f03b8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320620-lknc2" Sep 30 13:00:00 crc kubenswrapper[4672]: I0930 13:00:00.208094 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/81f77c7a-8774-40ea-87ad-faefa20f03b8-secret-volume\") pod \"collect-profiles-29320620-lknc2\" (UID: \"81f77c7a-8774-40ea-87ad-faefa20f03b8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320620-lknc2" Sep 30 13:00:00 crc kubenswrapper[4672]: I0930 13:00:00.309144 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/81f77c7a-8774-40ea-87ad-faefa20f03b8-secret-volume\") pod \"collect-profiles-29320620-lknc2\" (UID: \"81f77c7a-8774-40ea-87ad-faefa20f03b8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320620-lknc2" Sep 30 13:00:00 crc kubenswrapper[4672]: I0930 13:00:00.309203 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-44952\" (UniqueName: \"kubernetes.io/projected/81f77c7a-8774-40ea-87ad-faefa20f03b8-kube-api-access-44952\") pod \"collect-profiles-29320620-lknc2\" (UID: \"81f77c7a-8774-40ea-87ad-faefa20f03b8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320620-lknc2" Sep 30 13:00:00 crc kubenswrapper[4672]: I0930 13:00:00.309230 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/81f77c7a-8774-40ea-87ad-faefa20f03b8-config-volume\") pod \"collect-profiles-29320620-lknc2\" (UID: \"81f77c7a-8774-40ea-87ad-faefa20f03b8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320620-lknc2" Sep 30 13:00:00 crc kubenswrapper[4672]: I0930 13:00:00.310194 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/81f77c7a-8774-40ea-87ad-faefa20f03b8-config-volume\") pod 
\"collect-profiles-29320620-lknc2\" (UID: \"81f77c7a-8774-40ea-87ad-faefa20f03b8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320620-lknc2" Sep 30 13:00:00 crc kubenswrapper[4672]: I0930 13:00:00.317566 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/81f77c7a-8774-40ea-87ad-faefa20f03b8-secret-volume\") pod \"collect-profiles-29320620-lknc2\" (UID: \"81f77c7a-8774-40ea-87ad-faefa20f03b8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320620-lknc2" Sep 30 13:00:00 crc kubenswrapper[4672]: I0930 13:00:00.346894 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-44952\" (UniqueName: \"kubernetes.io/projected/81f77c7a-8774-40ea-87ad-faefa20f03b8-kube-api-access-44952\") pod \"collect-profiles-29320620-lknc2\" (UID: \"81f77c7a-8774-40ea-87ad-faefa20f03b8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320620-lknc2" Sep 30 13:00:00 crc kubenswrapper[4672]: I0930 13:00:00.483688 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320620-lknc2" Sep 30 13:00:00 crc kubenswrapper[4672]: I0930 13:00:00.942188 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320620-lknc2"] Sep 30 13:00:01 crc kubenswrapper[4672]: I0930 13:00:01.420164 4672 generic.go:334] "Generic (PLEG): container finished" podID="81f77c7a-8774-40ea-87ad-faefa20f03b8" containerID="d80b3db4a416dd0f18afe0d63e2ccfb72e2af28fbf229c1d062e7e22d2091935" exitCode=0 Sep 30 13:00:01 crc kubenswrapper[4672]: I0930 13:00:01.430136 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320620-lknc2" event={"ID":"81f77c7a-8774-40ea-87ad-faefa20f03b8","Type":"ContainerDied","Data":"d80b3db4a416dd0f18afe0d63e2ccfb72e2af28fbf229c1d062e7e22d2091935"} Sep 30 13:00:01 crc kubenswrapper[4672]: I0930 13:00:01.430185 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320620-lknc2" event={"ID":"81f77c7a-8774-40ea-87ad-faefa20f03b8","Type":"ContainerStarted","Data":"367ba3178cf4f599547f87d0e80169c781b3ed63c5c9ce6e54f4f6977e1ee48f"} Sep 30 13:00:02 crc kubenswrapper[4672]: I0930 13:00:02.850804 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320620-lknc2" Sep 30 13:00:02 crc kubenswrapper[4672]: I0930 13:00:02.957347 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-44952\" (UniqueName: \"kubernetes.io/projected/81f77c7a-8774-40ea-87ad-faefa20f03b8-kube-api-access-44952\") pod \"81f77c7a-8774-40ea-87ad-faefa20f03b8\" (UID: \"81f77c7a-8774-40ea-87ad-faefa20f03b8\") " Sep 30 13:00:02 crc kubenswrapper[4672]: I0930 13:00:02.957482 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/81f77c7a-8774-40ea-87ad-faefa20f03b8-config-volume\") pod \"81f77c7a-8774-40ea-87ad-faefa20f03b8\" (UID: \"81f77c7a-8774-40ea-87ad-faefa20f03b8\") " Sep 30 13:00:02 crc kubenswrapper[4672]: I0930 13:00:02.957603 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/81f77c7a-8774-40ea-87ad-faefa20f03b8-secret-volume\") pod \"81f77c7a-8774-40ea-87ad-faefa20f03b8\" (UID: \"81f77c7a-8774-40ea-87ad-faefa20f03b8\") " Sep 30 13:00:02 crc kubenswrapper[4672]: I0930 13:00:02.959237 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/81f77c7a-8774-40ea-87ad-faefa20f03b8-config-volume" (OuterVolumeSpecName: "config-volume") pod "81f77c7a-8774-40ea-87ad-faefa20f03b8" (UID: "81f77c7a-8774-40ea-87ad-faefa20f03b8"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:00:02 crc kubenswrapper[4672]: I0930 13:00:02.964444 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/81f77c7a-8774-40ea-87ad-faefa20f03b8-kube-api-access-44952" (OuterVolumeSpecName: "kube-api-access-44952") pod "81f77c7a-8774-40ea-87ad-faefa20f03b8" (UID: "81f77c7a-8774-40ea-87ad-faefa20f03b8"). InnerVolumeSpecName "kube-api-access-44952". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:00:02 crc kubenswrapper[4672]: I0930 13:00:02.964844 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81f77c7a-8774-40ea-87ad-faefa20f03b8-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "81f77c7a-8774-40ea-87ad-faefa20f03b8" (UID: "81f77c7a-8774-40ea-87ad-faefa20f03b8"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:00:03 crc kubenswrapper[4672]: I0930 13:00:03.061356 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-44952\" (UniqueName: \"kubernetes.io/projected/81f77c7a-8774-40ea-87ad-faefa20f03b8-kube-api-access-44952\") on node \"crc\" DevicePath \"\"" Sep 30 13:00:03 crc kubenswrapper[4672]: I0930 13:00:03.061388 4672 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/81f77c7a-8774-40ea-87ad-faefa20f03b8-config-volume\") on node \"crc\" DevicePath \"\"" Sep 30 13:00:03 crc kubenswrapper[4672]: I0930 13:00:03.061396 4672 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/81f77c7a-8774-40ea-87ad-faefa20f03b8-secret-volume\") on node \"crc\" DevicePath \"\"" Sep 30 13:00:03 crc kubenswrapper[4672]: I0930 13:00:03.440748 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320620-lknc2" event={"ID":"81f77c7a-8774-40ea-87ad-faefa20f03b8","Type":"ContainerDied","Data":"367ba3178cf4f599547f87d0e80169c781b3ed63c5c9ce6e54f4f6977e1ee48f"} Sep 30 13:00:03 crc kubenswrapper[4672]: I0930 13:00:03.441100 4672 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="367ba3178cf4f599547f87d0e80169c781b3ed63c5c9ce6e54f4f6977e1ee48f" Sep 30 13:00:03 crc kubenswrapper[4672]: I0930 13:00:03.440817 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320620-lknc2" Sep 30 13:00:03 crc kubenswrapper[4672]: I0930 13:00:03.930922 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320575-tsgk2"] Sep 30 13:00:03 crc kubenswrapper[4672]: I0930 13:00:03.941772 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320575-tsgk2"] Sep 30 13:00:05 crc kubenswrapper[4672]: I0930 13:00:05.437587 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2acf05e1-f152-4432-b0b2-a44b242d0308" path="/var/lib/kubelet/pods/2acf05e1-f152-4432-b0b2-a44b242d0308/volumes" Sep 30 13:00:24 crc kubenswrapper[4672]: I0930 13:00:24.740339 4672 patch_prober.go:28] interesting pod/machine-config-daemon-dpqrd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 13:00:24 crc kubenswrapper[4672]: I0930 13:00:24.740945 4672 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 13:00:53 crc kubenswrapper[4672]: I0930 13:00:53.789708 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-b2lh4"] Sep 30 13:00:53 crc kubenswrapper[4672]: E0930 13:00:53.793082 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81f77c7a-8774-40ea-87ad-faefa20f03b8" containerName="collect-profiles" Sep 30 13:00:53 crc kubenswrapper[4672]: I0930 13:00:53.793278 4672 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="81f77c7a-8774-40ea-87ad-faefa20f03b8" containerName="collect-profiles" Sep 30 13:00:53 crc kubenswrapper[4672]: I0930 13:00:53.793637 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="81f77c7a-8774-40ea-87ad-faefa20f03b8" containerName="collect-profiles" Sep 30 13:00:53 crc kubenswrapper[4672]: I0930 13:00:53.796357 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-b2lh4" Sep 30 13:00:53 crc kubenswrapper[4672]: I0930 13:00:53.809317 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-b2lh4"] Sep 30 13:00:53 crc kubenswrapper[4672]: I0930 13:00:53.940442 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7184a4fd-1911-473b-83c4-c5c224130bb3-utilities\") pod \"community-operators-b2lh4\" (UID: \"7184a4fd-1911-473b-83c4-c5c224130bb3\") " pod="openshift-marketplace/community-operators-b2lh4" Sep 30 13:00:53 crc kubenswrapper[4672]: I0930 13:00:53.940501 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7184a4fd-1911-473b-83c4-c5c224130bb3-catalog-content\") pod \"community-operators-b2lh4\" (UID: \"7184a4fd-1911-473b-83c4-c5c224130bb3\") " pod="openshift-marketplace/community-operators-b2lh4" Sep 30 13:00:53 crc kubenswrapper[4672]: I0930 13:00:53.940784 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vgllv\" (UniqueName: \"kubernetes.io/projected/7184a4fd-1911-473b-83c4-c5c224130bb3-kube-api-access-vgllv\") pod \"community-operators-b2lh4\" (UID: \"7184a4fd-1911-473b-83c4-c5c224130bb3\") " pod="openshift-marketplace/community-operators-b2lh4" Sep 30 13:00:54 crc kubenswrapper[4672]: I0930 13:00:54.042623 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7184a4fd-1911-473b-83c4-c5c224130bb3-utilities\") pod \"community-operators-b2lh4\" (UID: \"7184a4fd-1911-473b-83c4-c5c224130bb3\") " pod="openshift-marketplace/community-operators-b2lh4" Sep 30 13:00:54 crc kubenswrapper[4672]: I0930 13:00:54.042670 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7184a4fd-1911-473b-83c4-c5c224130bb3-catalog-content\") pod \"community-operators-b2lh4\" (UID: \"7184a4fd-1911-473b-83c4-c5c224130bb3\") " pod="openshift-marketplace/community-operators-b2lh4" Sep 30 13:00:54 crc kubenswrapper[4672]: I0930 13:00:54.042800 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vgllv\" (UniqueName: \"kubernetes.io/projected/7184a4fd-1911-473b-83c4-c5c224130bb3-kube-api-access-vgllv\") pod \"community-operators-b2lh4\" (UID: \"7184a4fd-1911-473b-83c4-c5c224130bb3\") " pod="openshift-marketplace/community-operators-b2lh4" Sep 30 13:00:54 crc kubenswrapper[4672]: I0930 13:00:54.043390 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7184a4fd-1911-473b-83c4-c5c224130bb3-utilities\") pod \"community-operators-b2lh4\" (UID: \"7184a4fd-1911-473b-83c4-c5c224130bb3\") " pod="openshift-marketplace/community-operators-b2lh4" Sep 30 13:00:54 crc kubenswrapper[4672]: I0930 13:00:54.043450 4672 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7184a4fd-1911-473b-83c4-c5c224130bb3-catalog-content\") pod \"community-operators-b2lh4\" (UID: \"7184a4fd-1911-473b-83c4-c5c224130bb3\") " pod="openshift-marketplace/community-operators-b2lh4" Sep 30 13:00:54 crc kubenswrapper[4672]: I0930 13:00:54.065254 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vgllv\" (UniqueName: \"kubernetes.io/projected/7184a4fd-1911-473b-83c4-c5c224130bb3-kube-api-access-vgllv\") pod \"community-operators-b2lh4\" (UID: \"7184a4fd-1911-473b-83c4-c5c224130bb3\") " pod="openshift-marketplace/community-operators-b2lh4" Sep 30 13:00:54 crc kubenswrapper[4672]: I0930 13:00:54.178824 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-b2lh4" Sep 30 13:00:54 crc kubenswrapper[4672]: I0930 13:00:54.633112 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-b2lh4"] Sep 30 13:00:54 crc kubenswrapper[4672]: I0930 13:00:54.739595 4672 patch_prober.go:28] interesting pod/machine-config-daemon-dpqrd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 13:00:54 crc kubenswrapper[4672]: I0930 13:00:54.739668 4672 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 13:00:55 crc kubenswrapper[4672]: I0930 13:00:55.014794 4672 generic.go:334] "Generic (PLEG): container finished" podID="7184a4fd-1911-473b-83c4-c5c224130bb3" containerID="6cc75bae19500db1051305e566fac1cf23a62d1c77fe9ec077591626b9e5af40" exitCode=0 Sep 30 13:00:55 crc kubenswrapper[4672]: I0930 13:00:55.014849 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b2lh4" event={"ID":"7184a4fd-1911-473b-83c4-c5c224130bb3","Type":"ContainerDied","Data":"6cc75bae19500db1051305e566fac1cf23a62d1c77fe9ec077591626b9e5af40"} Sep 30 13:00:55 crc kubenswrapper[4672]: I0930 13:00:55.014880 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b2lh4" event={"ID":"7184a4fd-1911-473b-83c4-c5c224130bb3","Type":"ContainerStarted","Data":"5a987bdf8696de104b4d798ac812f462c3e6c7c9209aafc07ed03b2062107aba"} Sep 30 13:00:55 crc kubenswrapper[4672]: I0930 13:00:55.017078 4672 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Sep 30 13:01:00 crc kubenswrapper[4672]: I0930 13:01:00.191974 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29320621-r2bv7"] Sep 30 13:01:00 crc kubenswrapper[4672]: I0930 13:01:00.200816 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29320621-r2bv7" Sep 30 13:01:00 crc kubenswrapper[4672]: I0930 13:01:00.211536 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29320621-r2bv7"] Sep 30 13:01:00 crc kubenswrapper[4672]: I0930 13:01:00.387643 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0837a49-9f57-447b-8da5-feef49bf42f0-config-data\") pod \"keystone-cron-29320621-r2bv7\" (UID: \"a0837a49-9f57-447b-8da5-feef49bf42f0\") " pod="openstack/keystone-cron-29320621-r2bv7" Sep 30 13:01:00 crc kubenswrapper[4672]: I0930 13:01:00.387697 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0837a49-9f57-447b-8da5-feef49bf42f0-combined-ca-bundle\") pod \"keystone-cron-29320621-r2bv7\" (UID: \"a0837a49-9f57-447b-8da5-feef49bf42f0\") " pod="openstack/keystone-cron-29320621-r2bv7" Sep 30 13:01:00 crc kubenswrapper[4672]: I0930 13:01:00.388109 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a0837a49-9f57-447b-8da5-feef49bf42f0-fernet-keys\") pod \"keystone-cron-29320621-r2bv7\" (UID: \"a0837a49-9f57-447b-8da5-feef49bf42f0\") " pod="openstack/keystone-cron-29320621-r2bv7" Sep 30 13:01:00 crc kubenswrapper[4672]: I0930 13:01:00.388752 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d2fvh\" (UniqueName: \"kubernetes.io/projected/a0837a49-9f57-447b-8da5-feef49bf42f0-kube-api-access-d2fvh\") pod \"keystone-cron-29320621-r2bv7\" (UID: \"a0837a49-9f57-447b-8da5-feef49bf42f0\") " pod="openstack/keystone-cron-29320621-r2bv7" Sep 30 13:01:00 crc kubenswrapper[4672]: I0930 13:01:00.490750 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a0837a49-9f57-447b-8da5-feef49bf42f0-fernet-keys\") pod \"keystone-cron-29320621-r2bv7\" (UID: \"a0837a49-9f57-447b-8da5-feef49bf42f0\") " pod="openstack/keystone-cron-29320621-r2bv7" Sep 30 13:01:00 crc kubenswrapper[4672]: I0930 13:01:00.490931 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d2fvh\" (UniqueName: \"kubernetes.io/projected/a0837a49-9f57-447b-8da5-feef49bf42f0-kube-api-access-d2fvh\") pod \"keystone-cron-29320621-r2bv7\" (UID: \"a0837a49-9f57-447b-8da5-feef49bf42f0\") " pod="openstack/keystone-cron-29320621-r2bv7" Sep 30 13:01:00 crc kubenswrapper[4672]: I0930 13:01:00.490972 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0837a49-9f57-447b-8da5-feef49bf42f0-config-data\") pod \"keystone-cron-29320621-r2bv7\" (UID: \"a0837a49-9f57-447b-8da5-feef49bf42f0\") " pod="openstack/keystone-cron-29320621-r2bv7" Sep 30 13:01:00 crc kubenswrapper[4672]: I0930 13:01:00.490996 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0837a49-9f57-447b-8da5-feef49bf42f0-combined-ca-bundle\") pod \"keystone-cron-29320621-r2bv7\" (UID: \"a0837a49-9f57-447b-8da5-feef49bf42f0\") " pod="openstack/keystone-cron-29320621-r2bv7" Sep 30 13:01:00 crc kubenswrapper[4672]: I0930 13:01:00.499624 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a0837a49-9f57-447b-8da5-feef49bf42f0-fernet-keys\") pod \"keystone-cron-29320621-r2bv7\" (UID: \"a0837a49-9f57-447b-8da5-feef49bf42f0\") " pod="openstack/keystone-cron-29320621-r2bv7" Sep 30 13:01:00 crc kubenswrapper[4672]: I0930 13:01:00.500584 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0837a49-9f57-447b-8da5-feef49bf42f0-config-data\") pod \"keystone-cron-29320621-r2bv7\" (UID: \"a0837a49-9f57-447b-8da5-feef49bf42f0\") " pod="openstack/keystone-cron-29320621-r2bv7" Sep 30 13:01:00 crc kubenswrapper[4672]: I0930 13:01:00.511542 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0837a49-9f57-447b-8da5-feef49bf42f0-combined-ca-bundle\") pod \"keystone-cron-29320621-r2bv7\" (UID: \"a0837a49-9f57-447b-8da5-feef49bf42f0\") " pod="openstack/keystone-cron-29320621-r2bv7" Sep 30 13:01:00 crc kubenswrapper[4672]: I0930 13:01:00.515302 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d2fvh\" (UniqueName: \"kubernetes.io/projected/a0837a49-9f57-447b-8da5-feef49bf42f0-kube-api-access-d2fvh\") pod \"keystone-cron-29320621-r2bv7\" (UID: \"a0837a49-9f57-447b-8da5-feef49bf42f0\") " pod="openstack/keystone-cron-29320621-r2bv7" Sep 30 13:01:00 crc kubenswrapper[4672]: I0930 13:01:00.578999 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29320621-r2bv7" Sep 30 13:01:00 crc kubenswrapper[4672]: I0930 13:01:00.898954 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29320621-r2bv7"] Sep 30 13:01:00 crc kubenswrapper[4672]: W0930 13:01:00.905253 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda0837a49_9f57_447b_8da5_feef49bf42f0.slice/crio-549fce96de129a3f9f86a3bbd0bb2b05d4801910001ef6454b165db398977106 WatchSource:0}: Error finding container 549fce96de129a3f9f86a3bbd0bb2b05d4801910001ef6454b165db398977106: Status 404 returned error can't find the container with id 549fce96de129a3f9f86a3bbd0bb2b05d4801910001ef6454b165db398977106 Sep 30 13:01:01 crc kubenswrapper[4672]: I0930 13:01:01.077982 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29320621-r2bv7" event={"ID":"a0837a49-9f57-447b-8da5-feef49bf42f0","Type":"ContainerStarted","Data":"549fce96de129a3f9f86a3bbd0bb2b05d4801910001ef6454b165db398977106"} Sep 30 13:01:01 crc kubenswrapper[4672]: I0930 13:01:01.081131 4672 generic.go:334] "Generic (PLEG): container finished" podID="7184a4fd-1911-473b-83c4-c5c224130bb3" containerID="d65b58d02968bddcf6e58df5c096952dcc34b18fa15586cdb3bd4bfa09c10adf" exitCode=0 Sep 30 13:01:01 crc kubenswrapper[4672]: I0930 13:01:01.081181 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b2lh4" event={"ID":"7184a4fd-1911-473b-83c4-c5c224130bb3","Type":"ContainerDied","Data":"d65b58d02968bddcf6e58df5c096952dcc34b18fa15586cdb3bd4bfa09c10adf"} Sep 30 13:01:01 crc kubenswrapper[4672]: I0930 13:01:01.430213 4672 scope.go:117] "RemoveContainer" containerID="2b5948e4f052ce096dd3fca3d3b8f8343dc8d91ed7163efd1bc1fbd512cd3647" Sep 30 13:01:02 crc kubenswrapper[4672]: I0930 13:01:02.098419 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29320621-r2bv7" 
event={"ID":"a0837a49-9f57-447b-8da5-feef49bf42f0","Type":"ContainerStarted","Data":"53669880eab2785d8e71198f47bedaff451af04edc1e01ec5c0ee570e645d4a6"} Sep 30 13:01:02 crc kubenswrapper[4672]: I0930 13:01:02.126930 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29320621-r2bv7" podStartSLOduration=2.126911111 podStartE2EDuration="2.126911111s" podCreationTimestamp="2025-09-30 13:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:01:02.120734153 +0000 UTC m=+2353.389971799" watchObservedRunningTime="2025-09-30 13:01:02.126911111 +0000 UTC m=+2353.396148777" Sep 30 13:01:03 crc kubenswrapper[4672]: I0930 13:01:03.110750 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b2lh4" event={"ID":"7184a4fd-1911-473b-83c4-c5c224130bb3","Type":"ContainerStarted","Data":"cbfdd3db9f55b5f5b75179715482a2a70834b4d325cd9321bacbe063f2843dea"} Sep 30 13:01:03 crc kubenswrapper[4672]: I0930 13:01:03.129595 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-b2lh4" podStartSLOduration=3.183133387 podStartE2EDuration="10.129575314s" podCreationTimestamp="2025-09-30 13:00:53 +0000 UTC" firstStartedPulling="2025-09-30 13:00:55.016849364 +0000 UTC m=+2346.286087010" lastFinishedPulling="2025-09-30 13:01:01.963291291 +0000 UTC m=+2353.232528937" observedRunningTime="2025-09-30 13:01:03.127516002 +0000 UTC m=+2354.396753668" watchObservedRunningTime="2025-09-30 13:01:03.129575314 +0000 UTC m=+2354.398812970" Sep 30 13:01:04 crc kubenswrapper[4672]: I0930 13:01:04.122441 4672 generic.go:334] "Generic (PLEG): container finished" podID="a0837a49-9f57-447b-8da5-feef49bf42f0" containerID="53669880eab2785d8e71198f47bedaff451af04edc1e01ec5c0ee570e645d4a6" exitCode=0 Sep 30 13:01:04 crc kubenswrapper[4672]: I0930 13:01:04.122534 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29320621-r2bv7" event={"ID":"a0837a49-9f57-447b-8da5-feef49bf42f0","Type":"ContainerDied","Data":"53669880eab2785d8e71198f47bedaff451af04edc1e01ec5c0ee570e645d4a6"} Sep 30 13:01:04 crc kubenswrapper[4672]: I0930 13:01:04.179716 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-b2lh4" Sep 30 13:01:04 crc kubenswrapper[4672]: I0930 13:01:04.179777 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-b2lh4" Sep 30 13:01:04 crc kubenswrapper[4672]: I0930 13:01:04.236462 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-b2lh4" Sep 30 13:01:05 crc kubenswrapper[4672]: I0930 13:01:05.541714 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29320621-r2bv7" Sep 30 13:01:05 crc kubenswrapper[4672]: I0930 13:01:05.713305 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a0837a49-9f57-447b-8da5-feef49bf42f0-fernet-keys\") pod \"a0837a49-9f57-447b-8da5-feef49bf42f0\" (UID: \"a0837a49-9f57-447b-8da5-feef49bf42f0\") " Sep 30 13:01:05 crc kubenswrapper[4672]: I0930 13:01:05.713356 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0837a49-9f57-447b-8da5-feef49bf42f0-config-data\") pod \"a0837a49-9f57-447b-8da5-feef49bf42f0\" (UID: \"a0837a49-9f57-447b-8da5-feef49bf42f0\") " Sep 30 13:01:05 crc kubenswrapper[4672]: I0930 13:01:05.713537 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d2fvh\" (UniqueName: \"kubernetes.io/projected/a0837a49-9f57-447b-8da5-feef49bf42f0-kube-api-access-d2fvh\") pod \"a0837a49-9f57-447b-8da5-feef49bf42f0\" (UID: \"a0837a49-9f57-447b-8da5-feef49bf42f0\") " Sep 30 13:01:05 crc kubenswrapper[4672]: I0930 13:01:05.713618 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0837a49-9f57-447b-8da5-feef49bf42f0-combined-ca-bundle\") pod \"a0837a49-9f57-447b-8da5-feef49bf42f0\" (UID: \"a0837a49-9f57-447b-8da5-feef49bf42f0\") " Sep 30 13:01:05 crc kubenswrapper[4672]: I0930 13:01:05.735906 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0837a49-9f57-447b-8da5-feef49bf42f0-kube-api-access-d2fvh" (OuterVolumeSpecName: "kube-api-access-d2fvh") pod "a0837a49-9f57-447b-8da5-feef49bf42f0" (UID: "a0837a49-9f57-447b-8da5-feef49bf42f0"). InnerVolumeSpecName "kube-api-access-d2fvh". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:01:05 crc kubenswrapper[4672]: I0930 13:01:05.736286 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0837a49-9f57-447b-8da5-feef49bf42f0-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "a0837a49-9f57-447b-8da5-feef49bf42f0" (UID: "a0837a49-9f57-447b-8da5-feef49bf42f0"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:01:05 crc kubenswrapper[4672]: I0930 13:01:05.766524 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0837a49-9f57-447b-8da5-feef49bf42f0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a0837a49-9f57-447b-8da5-feef49bf42f0" (UID: "a0837a49-9f57-447b-8da5-feef49bf42f0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:01:05 crc kubenswrapper[4672]: I0930 13:01:05.787494 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0837a49-9f57-447b-8da5-feef49bf42f0-config-data" (OuterVolumeSpecName: "config-data") pod "a0837a49-9f57-447b-8da5-feef49bf42f0" (UID: "a0837a49-9f57-447b-8da5-feef49bf42f0"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:01:05 crc kubenswrapper[4672]: I0930 13:01:05.816958 4672 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a0837a49-9f57-447b-8da5-feef49bf42f0-fernet-keys\") on node \"crc\" DevicePath \"\"" Sep 30 13:01:05 crc kubenswrapper[4672]: I0930 13:01:05.816998 4672 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0837a49-9f57-447b-8da5-feef49bf42f0-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 13:01:05 crc kubenswrapper[4672]: I0930 13:01:05.817008 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d2fvh\" (UniqueName: \"kubernetes.io/projected/a0837a49-9f57-447b-8da5-feef49bf42f0-kube-api-access-d2fvh\") on node \"crc\" DevicePath \"\"" Sep 30 13:01:05 crc kubenswrapper[4672]: I0930 13:01:05.817023 4672 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0837a49-9f57-447b-8da5-feef49bf42f0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 13:01:06 crc kubenswrapper[4672]: I0930 13:01:06.147012 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29320621-r2bv7" Sep 30 13:01:06 crc kubenswrapper[4672]: I0930 13:01:06.147083 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29320621-r2bv7" event={"ID":"a0837a49-9f57-447b-8da5-feef49bf42f0","Type":"ContainerDied","Data":"549fce96de129a3f9f86a3bbd0bb2b05d4801910001ef6454b165db398977106"} Sep 30 13:01:06 crc kubenswrapper[4672]: I0930 13:01:06.147807 4672 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="549fce96de129a3f9f86a3bbd0bb2b05d4801910001ef6454b165db398977106" Sep 30 13:01:14 crc kubenswrapper[4672]: I0930 13:01:14.234066 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-b2lh4" Sep 30 13:01:14 crc kubenswrapper[4672]: I0930 13:01:14.319961 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-b2lh4"] Sep 30 13:01:14 crc kubenswrapper[4672]: I0930 13:01:14.370697 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-khgj2"] Sep 30 13:01:14 crc kubenswrapper[4672]: I0930 13:01:14.371007 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-khgj2" podUID="c706065f-5cf6-4719-96a3-ce442d33c58a" containerName="registry-server" containerID="cri-o://b21e8fa3fd4b11f7c5d091e24482505faff123d0c868f487a7fadac3c554069a" gracePeriod=2 Sep 30 13:01:14 crc kubenswrapper[4672]: I0930 13:01:14.917122 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-khgj2" Sep 30 13:01:15 crc kubenswrapper[4672]: I0930 13:01:15.023247 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c706065f-5cf6-4719-96a3-ce442d33c58a-utilities\") pod \"c706065f-5cf6-4719-96a3-ce442d33c58a\" (UID: \"c706065f-5cf6-4719-96a3-ce442d33c58a\") " Sep 30 13:01:15 crc kubenswrapper[4672]: I0930 13:01:15.023309 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c706065f-5cf6-4719-96a3-ce442d33c58a-catalog-content\") pod \"c706065f-5cf6-4719-96a3-ce442d33c58a\" (UID: \"c706065f-5cf6-4719-96a3-ce442d33c58a\") " Sep 30 13:01:15 crc kubenswrapper[4672]: I0930 13:01:15.023586 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qw7vb\" (UniqueName: \"kubernetes.io/projected/c706065f-5cf6-4719-96a3-ce442d33c58a-kube-api-access-qw7vb\") pod \"c706065f-5cf6-4719-96a3-ce442d33c58a\" (UID: \"c706065f-5cf6-4719-96a3-ce442d33c58a\") " Sep 30 13:01:15 crc kubenswrapper[4672]: I0930 13:01:15.026128 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c706065f-5cf6-4719-96a3-ce442d33c58a-utilities" (OuterVolumeSpecName: "utilities") pod "c706065f-5cf6-4719-96a3-ce442d33c58a" (UID: "c706065f-5cf6-4719-96a3-ce442d33c58a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 13:01:15 crc kubenswrapper[4672]: I0930 13:01:15.039069 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c706065f-5cf6-4719-96a3-ce442d33c58a-kube-api-access-qw7vb" (OuterVolumeSpecName: "kube-api-access-qw7vb") pod "c706065f-5cf6-4719-96a3-ce442d33c58a" (UID: "c706065f-5cf6-4719-96a3-ce442d33c58a"). InnerVolumeSpecName "kube-api-access-qw7vb". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:01:15 crc kubenswrapper[4672]: I0930 13:01:15.104949 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c706065f-5cf6-4719-96a3-ce442d33c58a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c706065f-5cf6-4719-96a3-ce442d33c58a" (UID: "c706065f-5cf6-4719-96a3-ce442d33c58a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 13:01:15 crc kubenswrapper[4672]: I0930 13:01:15.125584 4672 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c706065f-5cf6-4719-96a3-ce442d33c58a-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 13:01:15 crc kubenswrapper[4672]: I0930 13:01:15.125619 4672 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c706065f-5cf6-4719-96a3-ce442d33c58a-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 13:01:15 crc kubenswrapper[4672]: I0930 13:01:15.125631 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qw7vb\" (UniqueName: \"kubernetes.io/projected/c706065f-5cf6-4719-96a3-ce442d33c58a-kube-api-access-qw7vb\") on node \"crc\" DevicePath \"\"" Sep 30 13:01:15 crc kubenswrapper[4672]: I0930 13:01:15.245700 4672 generic.go:334] "Generic (PLEG): container finished" podID="c706065f-5cf6-4719-96a3-ce442d33c58a" containerID="b21e8fa3fd4b11f7c5d091e24482505faff123d0c868f487a7fadac3c554069a" exitCode=0 Sep 30 13:01:15 crc kubenswrapper[4672]: I0930 13:01:15.245748 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-khgj2" Sep 30 13:01:15 crc kubenswrapper[4672]: I0930 13:01:15.245759 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-khgj2" event={"ID":"c706065f-5cf6-4719-96a3-ce442d33c58a","Type":"ContainerDied","Data":"b21e8fa3fd4b11f7c5d091e24482505faff123d0c868f487a7fadac3c554069a"} Sep 30 13:01:15 crc kubenswrapper[4672]: I0930 13:01:15.245804 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-khgj2" event={"ID":"c706065f-5cf6-4719-96a3-ce442d33c58a","Type":"ContainerDied","Data":"bc55e211b8e0c0d053448e20a52d30727543a69f007f5a5d747d7ba2f73a9e79"} Sep 30 13:01:15 crc kubenswrapper[4672]: I0930 13:01:15.245826 4672 scope.go:117] "RemoveContainer" containerID="b21e8fa3fd4b11f7c5d091e24482505faff123d0c868f487a7fadac3c554069a" Sep 30 13:01:15 crc kubenswrapper[4672]: I0930 13:01:15.279463 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-khgj2"] Sep 30 13:01:15 crc kubenswrapper[4672]: I0930 13:01:15.297808 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-khgj2"] Sep 30 13:01:15 crc kubenswrapper[4672]: I0930 13:01:15.306080 4672 scope.go:117] "RemoveContainer" containerID="210ba3a9bb72870093f7357eafc74b919b9f24e752ebbb52bdc4d33b4231a1ca" Sep 30 13:01:15 crc kubenswrapper[4672]: I0930 13:01:15.336523 4672 scope.go:117] "RemoveContainer" containerID="ca03667ffcd86234bd5ead67533f1d9d3ac2d16e52025613f5f28d14bbf08fd8" Sep 30 13:01:15 crc kubenswrapper[4672]: I0930 13:01:15.389443 4672 scope.go:117] "RemoveContainer" containerID="b21e8fa3fd4b11f7c5d091e24482505faff123d0c868f487a7fadac3c554069a" Sep 30 13:01:15 crc kubenswrapper[4672]: E0930 13:01:15.389891 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b21e8fa3fd4b11f7c5d091e24482505faff123d0c868f487a7fadac3c554069a\": container with ID starting with b21e8fa3fd4b11f7c5d091e24482505faff123d0c868f487a7fadac3c554069a not found: ID does not exist" containerID="b21e8fa3fd4b11f7c5d091e24482505faff123d0c868f487a7fadac3c554069a" Sep 30 13:01:15 crc kubenswrapper[4672]: I0930 13:01:15.389933 
4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b21e8fa3fd4b11f7c5d091e24482505faff123d0c868f487a7fadac3c554069a"} err="failed to get container status \"b21e8fa3fd4b11f7c5d091e24482505faff123d0c868f487a7fadac3c554069a\": rpc error: code = NotFound desc = could not find container \"b21e8fa3fd4b11f7c5d091e24482505faff123d0c868f487a7fadac3c554069a\": container with ID starting with b21e8fa3fd4b11f7c5d091e24482505faff123d0c868f487a7fadac3c554069a not found: ID does not exist" Sep 30 13:01:15 crc kubenswrapper[4672]: I0930 13:01:15.389961 4672 scope.go:117] "RemoveContainer" containerID="210ba3a9bb72870093f7357eafc74b919b9f24e752ebbb52bdc4d33b4231a1ca" Sep 30 13:01:15 crc kubenswrapper[4672]: E0930 13:01:15.390210 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"210ba3a9bb72870093f7357eafc74b919b9f24e752ebbb52bdc4d33b4231a1ca\": container with ID starting with 210ba3a9bb72870093f7357eafc74b919b9f24e752ebbb52bdc4d33b4231a1ca not found: ID does not exist" containerID="210ba3a9bb72870093f7357eafc74b919b9f24e752ebbb52bdc4d33b4231a1ca" Sep 30 13:01:15 crc kubenswrapper[4672]: I0930 13:01:15.390227 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"210ba3a9bb72870093f7357eafc74b919b9f24e752ebbb52bdc4d33b4231a1ca"} err="failed to get container status \"210ba3a9bb72870093f7357eafc74b919b9f24e752ebbb52bdc4d33b4231a1ca\": rpc error: code = NotFound desc = could not find container \"210ba3a9bb72870093f7357eafc74b919b9f24e752ebbb52bdc4d33b4231a1ca\": container with ID starting with 210ba3a9bb72870093f7357eafc74b919b9f24e752ebbb52bdc4d33b4231a1ca not found: ID does not exist" Sep 30 13:01:15 crc kubenswrapper[4672]: I0930 13:01:15.390251 4672 scope.go:117] "RemoveContainer" containerID="ca03667ffcd86234bd5ead67533f1d9d3ac2d16e52025613f5f28d14bbf08fd8" Sep 30 13:01:15 crc kubenswrapper[4672]: E0930 13:01:15.390501 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ca03667ffcd86234bd5ead67533f1d9d3ac2d16e52025613f5f28d14bbf08fd8\": container with ID starting with ca03667ffcd86234bd5ead67533f1d9d3ac2d16e52025613f5f28d14bbf08fd8 not found: ID does not exist" containerID="ca03667ffcd86234bd5ead67533f1d9d3ac2d16e52025613f5f28d14bbf08fd8" Sep 30 13:01:15 crc kubenswrapper[4672]: I0930 13:01:15.390528 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca03667ffcd86234bd5ead67533f1d9d3ac2d16e52025613f5f28d14bbf08fd8"} err="failed to get container status \"ca03667ffcd86234bd5ead67533f1d9d3ac2d16e52025613f5f28d14bbf08fd8\": rpc error: code = NotFound desc = could not find container \"ca03667ffcd86234bd5ead67533f1d9d3ac2d16e52025613f5f28d14bbf08fd8\": container with ID starting with ca03667ffcd86234bd5ead67533f1d9d3ac2d16e52025613f5f28d14bbf08fd8 not found: ID does not exist" Sep 30 13:01:15 crc kubenswrapper[4672]: I0930 13:01:15.427115 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c706065f-5cf6-4719-96a3-ce442d33c58a" path="/var/lib/kubelet/pods/c706065f-5cf6-4719-96a3-ce442d33c58a/volumes" Sep 30 13:01:24 crc kubenswrapper[4672]: I0930 13:01:24.739554 4672 patch_prober.go:28] interesting pod/machine-config-daemon-dpqrd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 13:01:24 crc kubenswrapper[4672]: I0930 13:01:24.740225 4672 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 13:01:24 crc kubenswrapper[4672]: I0930 13:01:24.740291 4672 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" Sep 30 13:01:24 crc kubenswrapper[4672]: I0930 13:01:24.741090 4672 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8f51e0141ddbaede996117cc2645ec9b4c66d7a7e718e482c491c40151095fbe"} pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 30 13:01:24 crc kubenswrapper[4672]: I0930 13:01:24.741159 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" containerName="machine-config-daemon" containerID="cri-o://8f51e0141ddbaede996117cc2645ec9b4c66d7a7e718e482c491c40151095fbe" gracePeriod=600 Sep 30 13:01:24 crc kubenswrapper[4672]: E0930 13:01:24.871822 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dpqrd_openshift-machine-config-operator(95794952-d817-48f2-8956-f7a310f8d1d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" Sep 30 13:01:25 crc kubenswrapper[4672]: I0930 13:01:25.386379 4672 generic.go:334] "Generic (PLEG): container finished" podID="95794952-d817-48f2-8956-f7a310f8d1d9" containerID="8f51e0141ddbaede996117cc2645ec9b4c66d7a7e718e482c491c40151095fbe" exitCode=0 Sep 30 13:01:25 crc kubenswrapper[4672]: I0930 13:01:25.386774 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" event={"ID":"95794952-d817-48f2-8956-f7a310f8d1d9","Type":"ContainerDied","Data":"8f51e0141ddbaede996117cc2645ec9b4c66d7a7e718e482c491c40151095fbe"} Sep 30 13:01:25 crc kubenswrapper[4672]: I0930 13:01:25.386820 4672 scope.go:117] "RemoveContainer" containerID="c2787254d23886a5d5ca32405d4feb4336d5e1be3c00953ebe36b2065fda6f64" Sep 30 13:01:25 crc kubenswrapper[4672]: I0930 13:01:25.387776 4672 scope.go:117] "RemoveContainer" containerID="8f51e0141ddbaede996117cc2645ec9b4c66d7a7e718e482c491c40151095fbe" Sep 30 13:01:25 crc kubenswrapper[4672]: E0930 13:01:25.388101 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dpqrd_openshift-machine-config-operator(95794952-d817-48f2-8956-f7a310f8d1d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" Sep 30 13:01:37 crc kubenswrapper[4672]: I0930 13:01:37.418489 4672 scope.go:117] "RemoveContainer" 
containerID="8f51e0141ddbaede996117cc2645ec9b4c66d7a7e718e482c491c40151095fbe" Sep 30 13:01:37 crc kubenswrapper[4672]: E0930 13:01:37.419351 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dpqrd_openshift-machine-config-operator(95794952-d817-48f2-8956-f7a310f8d1d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" Sep 30 13:01:52 crc kubenswrapper[4672]: I0930 13:01:52.417104 4672 scope.go:117] "RemoveContainer" containerID="8f51e0141ddbaede996117cc2645ec9b4c66d7a7e718e482c491c40151095fbe" Sep 30 13:01:52 crc kubenswrapper[4672]: E0930 13:01:52.418228 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dpqrd_openshift-machine-config-operator(95794952-d817-48f2-8956-f7a310f8d1d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" Sep 30 13:02:03 crc kubenswrapper[4672]: I0930 13:02:03.417355 4672 scope.go:117] "RemoveContainer" containerID="8f51e0141ddbaede996117cc2645ec9b4c66d7a7e718e482c491c40151095fbe" Sep 30 13:02:03 crc kubenswrapper[4672]: E0930 13:02:03.418383 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dpqrd_openshift-machine-config-operator(95794952-d817-48f2-8956-f7a310f8d1d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" Sep 30 13:02:17 crc kubenswrapper[4672]: I0930 13:02:17.418038 4672 scope.go:117] "RemoveContainer" containerID="8f51e0141ddbaede996117cc2645ec9b4c66d7a7e718e482c491c40151095fbe" Sep 30 13:02:17 crc kubenswrapper[4672]: E0930 13:02:17.418997 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dpqrd_openshift-machine-config-operator(95794952-d817-48f2-8956-f7a310f8d1d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" Sep 30 13:02:28 crc kubenswrapper[4672]: I0930 13:02:28.418007 4672 scope.go:117] "RemoveContainer" containerID="8f51e0141ddbaede996117cc2645ec9b4c66d7a7e718e482c491c40151095fbe" Sep 30 13:02:28 crc kubenswrapper[4672]: E0930 13:02:28.419161 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dpqrd_openshift-machine-config-operator(95794952-d817-48f2-8956-f7a310f8d1d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" Sep 30 13:02:32 crc kubenswrapper[4672]: I0930 13:02:32.132177 4672 generic.go:334] "Generic (PLEG): container finished" podID="883bcbaa-0233-4f4d-8463-f451155bc618" containerID="8883d0655966ee5e00cced8c287e2cbd4182fe6f564e365fce56c6b557b933c6" exitCode=0 Sep 30 13:02:32 crc 
kubenswrapper[4672]: I0930 13:02:32.132349 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-zq47z" event={"ID":"883bcbaa-0233-4f4d-8463-f451155bc618","Type":"ContainerDied","Data":"8883d0655966ee5e00cced8c287e2cbd4182fe6f564e365fce56c6b557b933c6"} Sep 30 13:02:33 crc kubenswrapper[4672]: I0930 13:02:33.570677 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-zq47z" Sep 30 13:02:33 crc kubenswrapper[4672]: I0930 13:02:33.603253 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/883bcbaa-0233-4f4d-8463-f451155bc618-libvirt-secret-0\") pod \"883bcbaa-0233-4f4d-8463-f451155bc618\" (UID: \"883bcbaa-0233-4f4d-8463-f451155bc618\") " Sep 30 13:02:33 crc kubenswrapper[4672]: I0930 13:02:33.604729 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/883bcbaa-0233-4f4d-8463-f451155bc618-libvirt-combined-ca-bundle\") pod \"883bcbaa-0233-4f4d-8463-f451155bc618\" (UID: \"883bcbaa-0233-4f4d-8463-f451155bc618\") " Sep 30 13:02:33 crc kubenswrapper[4672]: I0930 13:02:33.604913 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bwgcf\" (UniqueName: \"kubernetes.io/projected/883bcbaa-0233-4f4d-8463-f451155bc618-kube-api-access-bwgcf\") pod \"883bcbaa-0233-4f4d-8463-f451155bc618\" (UID: \"883bcbaa-0233-4f4d-8463-f451155bc618\") " Sep 30 13:02:33 crc kubenswrapper[4672]: I0930 13:02:33.604969 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/883bcbaa-0233-4f4d-8463-f451155bc618-ssh-key\") pod \"883bcbaa-0233-4f4d-8463-f451155bc618\" (UID: \"883bcbaa-0233-4f4d-8463-f451155bc618\") " Sep 30 13:02:33 crc kubenswrapper[4672]: I0930 13:02:33.605028 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/883bcbaa-0233-4f4d-8463-f451155bc618-inventory\") pod \"883bcbaa-0233-4f4d-8463-f451155bc618\" (UID: \"883bcbaa-0233-4f4d-8463-f451155bc618\") " Sep 30 13:02:33 crc kubenswrapper[4672]: I0930 13:02:33.625176 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/883bcbaa-0233-4f4d-8463-f451155bc618-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "883bcbaa-0233-4f4d-8463-f451155bc618" (UID: "883bcbaa-0233-4f4d-8463-f451155bc618"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:02:33 crc kubenswrapper[4672]: I0930 13:02:33.625651 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/883bcbaa-0233-4f4d-8463-f451155bc618-kube-api-access-bwgcf" (OuterVolumeSpecName: "kube-api-access-bwgcf") pod "883bcbaa-0233-4f4d-8463-f451155bc618" (UID: "883bcbaa-0233-4f4d-8463-f451155bc618"). InnerVolumeSpecName "kube-api-access-bwgcf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:02:33 crc kubenswrapper[4672]: I0930 13:02:33.647856 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/883bcbaa-0233-4f4d-8463-f451155bc618-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "883bcbaa-0233-4f4d-8463-f451155bc618" (UID: "883bcbaa-0233-4f4d-8463-f451155bc618"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:02:33 crc kubenswrapper[4672]: I0930 13:02:33.650858 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/883bcbaa-0233-4f4d-8463-f451155bc618-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "883bcbaa-0233-4f4d-8463-f451155bc618" (UID: "883bcbaa-0233-4f4d-8463-f451155bc618"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:02:33 crc kubenswrapper[4672]: I0930 13:02:33.652166 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/883bcbaa-0233-4f4d-8463-f451155bc618-inventory" (OuterVolumeSpecName: "inventory") pod "883bcbaa-0233-4f4d-8463-f451155bc618" (UID: "883bcbaa-0233-4f4d-8463-f451155bc618"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:02:33 crc kubenswrapper[4672]: I0930 13:02:33.708846 4672 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/883bcbaa-0233-4f4d-8463-f451155bc618-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Sep 30 13:02:33 crc kubenswrapper[4672]: I0930 13:02:33.708886 4672 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/883bcbaa-0233-4f4d-8463-f451155bc618-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 13:02:33 crc kubenswrapper[4672]: I0930 13:02:33.708902 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bwgcf\" (UniqueName: \"kubernetes.io/projected/883bcbaa-0233-4f4d-8463-f451155bc618-kube-api-access-bwgcf\") on node \"crc\" DevicePath \"\"" Sep 30 13:02:33 crc kubenswrapper[4672]: I0930 13:02:33.708916 4672 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/883bcbaa-0233-4f4d-8463-f451155bc618-ssh-key\") on node \"crc\" DevicePath \"\"" Sep 30 13:02:33 crc kubenswrapper[4672]: I0930 13:02:33.708929 4672 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/883bcbaa-0233-4f4d-8463-f451155bc618-inventory\") on node \"crc\" DevicePath \"\"" Sep 30 13:02:34 crc kubenswrapper[4672]: I0930 13:02:34.154321 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-zq47z" event={"ID":"883bcbaa-0233-4f4d-8463-f451155bc618","Type":"ContainerDied","Data":"77d462e48b7cba520184be81fa7c64a925db57870f819d2eeed740c1126de299"} Sep 30 13:02:34 crc kubenswrapper[4672]: I0930 13:02:34.154367 4672 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="77d462e48b7cba520184be81fa7c64a925db57870f819d2eeed740c1126de299" Sep 30 13:02:34 crc kubenswrapper[4672]: I0930 13:02:34.154456 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-zq47z" Sep 30 13:02:34 crc kubenswrapper[4672]: I0930 13:02:34.286969 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-qmh4q"] Sep 30 13:02:34 crc kubenswrapper[4672]: E0930 13:02:34.287911 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c706065f-5cf6-4719-96a3-ce442d33c58a" containerName="extract-content" Sep 30 13:02:34 crc kubenswrapper[4672]: I0930 13:02:34.287942 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="c706065f-5cf6-4719-96a3-ce442d33c58a" containerName="extract-content" Sep 30 13:02:34 crc kubenswrapper[4672]: E0930 13:02:34.287983 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0837a49-9f57-447b-8da5-feef49bf42f0" containerName="keystone-cron" Sep 30 13:02:34 crc kubenswrapper[4672]: I0930 13:02:34.287997 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0837a49-9f57-447b-8da5-feef49bf42f0" containerName="keystone-cron" Sep 30 13:02:34 crc kubenswrapper[4672]: E0930 13:02:34.288014 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c706065f-5cf6-4719-96a3-ce442d33c58a" containerName="registry-server" Sep 30 13:02:34 crc kubenswrapper[4672]: I0930 13:02:34.288027 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="c706065f-5cf6-4719-96a3-ce442d33c58a" containerName="registry-server" Sep 30 13:02:34 crc kubenswrapper[4672]: E0930 13:02:34.288047 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c706065f-5cf6-4719-96a3-ce442d33c58a" containerName="extract-utilities" Sep 30 13:02:34 crc kubenswrapper[4672]: I0930 13:02:34.288061 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="c706065f-5cf6-4719-96a3-ce442d33c58a" containerName="extract-utilities" Sep 30 13:02:34 crc kubenswrapper[4672]: E0930 13:02:34.288141 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="883bcbaa-0233-4f4d-8463-f451155bc618" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Sep 30 13:02:34 crc kubenswrapper[4672]: I0930 13:02:34.288156 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="883bcbaa-0233-4f4d-8463-f451155bc618" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Sep 30 13:02:34 crc kubenswrapper[4672]: I0930 13:02:34.288540 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0837a49-9f57-447b-8da5-feef49bf42f0" containerName="keystone-cron" Sep 30 13:02:34 crc kubenswrapper[4672]: I0930 13:02:34.288572 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="c706065f-5cf6-4719-96a3-ce442d33c58a" containerName="registry-server" Sep 30 13:02:34 crc kubenswrapper[4672]: I0930 13:02:34.288610 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="883bcbaa-0233-4f4d-8463-f451155bc618" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Sep 30 13:02:34 crc kubenswrapper[4672]: I0930 13:02:34.289821 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qmh4q" Sep 30 13:02:34 crc kubenswrapper[4672]: I0930 13:02:34.294082 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Sep 30 13:02:34 crc kubenswrapper[4672]: I0930 13:02:34.294380 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Sep 30 13:02:34 crc kubenswrapper[4672]: I0930 13:02:34.294613 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Sep 30 13:02:34 crc kubenswrapper[4672]: I0930 13:02:34.294798 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Sep 30 13:02:34 crc kubenswrapper[4672]: I0930 13:02:34.295655 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-b6dbr" Sep 30 13:02:34 crc kubenswrapper[4672]: I0930 13:02:34.295819 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config" Sep 30 13:02:34 crc kubenswrapper[4672]: I0930 13:02:34.295981 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Sep 30 13:02:34 crc kubenswrapper[4672]: I0930 13:02:34.298323 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-qmh4q"] Sep 30 13:02:34 crc kubenswrapper[4672]: I0930 13:02:34.347876 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/727d1f8a-6b85-4184-b669-3fe8b94c608a-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-qmh4q\" (UID: \"727d1f8a-6b85-4184-b669-3fe8b94c608a\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qmh4q" Sep 30 13:02:34 crc kubenswrapper[4672]: I0930 13:02:34.347959 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/727d1f8a-6b85-4184-b669-3fe8b94c608a-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-qmh4q\" (UID: \"727d1f8a-6b85-4184-b669-3fe8b94c608a\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qmh4q" Sep 30 13:02:34 crc kubenswrapper[4672]: I0930 13:02:34.348030 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/727d1f8a-6b85-4184-b669-3fe8b94c608a-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-qmh4q\" (UID: \"727d1f8a-6b85-4184-b669-3fe8b94c608a\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qmh4q" Sep 30 13:02:34 crc kubenswrapper[4672]: I0930 13:02:34.348077 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/727d1f8a-6b85-4184-b669-3fe8b94c608a-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-qmh4q\" (UID: \"727d1f8a-6b85-4184-b669-3fe8b94c608a\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qmh4q" Sep 30 13:02:34 crc kubenswrapper[4672]: I0930 13:02:34.348096 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/727d1f8a-6b85-4184-b669-3fe8b94c608a-nova-combined-ca-bundle\") 
pod \"nova-edpm-deployment-openstack-edpm-ipam-qmh4q\" (UID: \"727d1f8a-6b85-4184-b669-3fe8b94c608a\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qmh4q" Sep 30 13:02:34 crc kubenswrapper[4672]: I0930 13:02:34.348132 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/727d1f8a-6b85-4184-b669-3fe8b94c608a-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-qmh4q\" (UID: \"727d1f8a-6b85-4184-b669-3fe8b94c608a\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qmh4q" Sep 30 13:02:34 crc kubenswrapper[4672]: I0930 13:02:34.348219 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/727d1f8a-6b85-4184-b669-3fe8b94c608a-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-qmh4q\" (UID: \"727d1f8a-6b85-4184-b669-3fe8b94c608a\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qmh4q" Sep 30 13:02:34 crc kubenswrapper[4672]: I0930 13:02:34.348529 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/727d1f8a-6b85-4184-b669-3fe8b94c608a-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-qmh4q\" (UID: \"727d1f8a-6b85-4184-b669-3fe8b94c608a\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qmh4q" Sep 30 13:02:34 crc kubenswrapper[4672]: I0930 13:02:34.348574 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zv2l6\" (UniqueName: \"kubernetes.io/projected/727d1f8a-6b85-4184-b669-3fe8b94c608a-kube-api-access-zv2l6\") pod \"nova-edpm-deployment-openstack-edpm-ipam-qmh4q\" (UID: \"727d1f8a-6b85-4184-b669-3fe8b94c608a\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qmh4q" Sep 30 13:02:34 crc kubenswrapper[4672]: I0930 13:02:34.450319 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/727d1f8a-6b85-4184-b669-3fe8b94c608a-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-qmh4q\" (UID: \"727d1f8a-6b85-4184-b669-3fe8b94c608a\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qmh4q" Sep 30 13:02:34 crc kubenswrapper[4672]: I0930 13:02:34.450817 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/727d1f8a-6b85-4184-b669-3fe8b94c608a-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-qmh4q\" (UID: \"727d1f8a-6b85-4184-b669-3fe8b94c608a\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qmh4q" Sep 30 13:02:34 crc kubenswrapper[4672]: I0930 13:02:34.450878 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zv2l6\" (UniqueName: \"kubernetes.io/projected/727d1f8a-6b85-4184-b669-3fe8b94c608a-kube-api-access-zv2l6\") pod \"nova-edpm-deployment-openstack-edpm-ipam-qmh4q\" (UID: \"727d1f8a-6b85-4184-b669-3fe8b94c608a\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qmh4q" Sep 30 13:02:34 crc kubenswrapper[4672]: I0930 13:02:34.451027 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: 
\"kubernetes.io/secret/727d1f8a-6b85-4184-b669-3fe8b94c608a-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-qmh4q\" (UID: \"727d1f8a-6b85-4184-b669-3fe8b94c608a\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qmh4q" Sep 30 13:02:34 crc kubenswrapper[4672]: I0930 13:02:34.451086 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/727d1f8a-6b85-4184-b669-3fe8b94c608a-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-qmh4q\" (UID: \"727d1f8a-6b85-4184-b669-3fe8b94c608a\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qmh4q" Sep 30 13:02:34 crc kubenswrapper[4672]: I0930 13:02:34.451161 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/727d1f8a-6b85-4184-b669-3fe8b94c608a-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-qmh4q\" (UID: \"727d1f8a-6b85-4184-b669-3fe8b94c608a\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qmh4q" Sep 30 13:02:34 crc kubenswrapper[4672]: I0930 13:02:34.451249 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/727d1f8a-6b85-4184-b669-3fe8b94c608a-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-qmh4q\" (UID: \"727d1f8a-6b85-4184-b669-3fe8b94c608a\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qmh4q" Sep 30 13:02:34 crc kubenswrapper[4672]: I0930 13:02:34.451307 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/727d1f8a-6b85-4184-b669-3fe8b94c608a-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-qmh4q\" (UID: \"727d1f8a-6b85-4184-b669-3fe8b94c608a\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qmh4q" Sep 30 13:02:34 crc kubenswrapper[4672]: I0930 13:02:34.451374 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/727d1f8a-6b85-4184-b669-3fe8b94c608a-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-qmh4q\" (UID: \"727d1f8a-6b85-4184-b669-3fe8b94c608a\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qmh4q" Sep 30 13:02:34 crc kubenswrapper[4672]: I0930 13:02:34.452604 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/727d1f8a-6b85-4184-b669-3fe8b94c608a-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-qmh4q\" (UID: \"727d1f8a-6b85-4184-b669-3fe8b94c608a\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qmh4q" Sep 30 13:02:34 crc kubenswrapper[4672]: I0930 13:02:34.457155 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/727d1f8a-6b85-4184-b669-3fe8b94c608a-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-qmh4q\" (UID: \"727d1f8a-6b85-4184-b669-3fe8b94c608a\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qmh4q" Sep 30 13:02:34 crc kubenswrapper[4672]: I0930 13:02:34.457476 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/727d1f8a-6b85-4184-b669-3fe8b94c608a-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-qmh4q\" (UID: 
\"727d1f8a-6b85-4184-b669-3fe8b94c608a\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qmh4q" Sep 30 13:02:34 crc kubenswrapper[4672]: I0930 13:02:34.457690 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/727d1f8a-6b85-4184-b669-3fe8b94c608a-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-qmh4q\" (UID: \"727d1f8a-6b85-4184-b669-3fe8b94c608a\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qmh4q" Sep 30 13:02:34 crc kubenswrapper[4672]: I0930 13:02:34.457894 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/727d1f8a-6b85-4184-b669-3fe8b94c608a-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-qmh4q\" (UID: \"727d1f8a-6b85-4184-b669-3fe8b94c608a\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qmh4q" Sep 30 13:02:34 crc kubenswrapper[4672]: I0930 13:02:34.458314 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/727d1f8a-6b85-4184-b669-3fe8b94c608a-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-qmh4q\" (UID: \"727d1f8a-6b85-4184-b669-3fe8b94c608a\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qmh4q" Sep 30 13:02:34 crc kubenswrapper[4672]: I0930 13:02:34.458755 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/727d1f8a-6b85-4184-b669-3fe8b94c608a-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-qmh4q\" (UID: \"727d1f8a-6b85-4184-b669-3fe8b94c608a\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qmh4q" Sep 30 13:02:34 crc kubenswrapper[4672]: I0930 13:02:34.459241 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/727d1f8a-6b85-4184-b669-3fe8b94c608a-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-qmh4q\" (UID: \"727d1f8a-6b85-4184-b669-3fe8b94c608a\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qmh4q" Sep 30 13:02:34 crc kubenswrapper[4672]: I0930 13:02:34.483658 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zv2l6\" (UniqueName: \"kubernetes.io/projected/727d1f8a-6b85-4184-b669-3fe8b94c608a-kube-api-access-zv2l6\") pod \"nova-edpm-deployment-openstack-edpm-ipam-qmh4q\" (UID: \"727d1f8a-6b85-4184-b669-3fe8b94c608a\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qmh4q" Sep 30 13:02:34 crc kubenswrapper[4672]: I0930 13:02:34.670058 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qmh4q" Sep 30 13:02:35 crc kubenswrapper[4672]: I0930 13:02:35.287486 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-qmh4q"] Sep 30 13:02:36 crc kubenswrapper[4672]: I0930 13:02:36.180506 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qmh4q" event={"ID":"727d1f8a-6b85-4184-b669-3fe8b94c608a","Type":"ContainerStarted","Data":"3fc55ce2d92a4575e701259bcc6448c33a2dcc2321c78ba35dec1208fdeebf4f"} Sep 30 13:02:37 crc kubenswrapper[4672]: I0930 13:02:37.200630 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qmh4q" event={"ID":"727d1f8a-6b85-4184-b669-3fe8b94c608a","Type":"ContainerStarted","Data":"a307f84e79c15a39218bd42a0c1c445d6e484780d2a4308499f9dbf867bfe3bc"} Sep 30 13:02:37 crc kubenswrapper[4672]: I0930 13:02:37.223479 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qmh4q" podStartSLOduration=2.455672563 podStartE2EDuration="3.223461305s" podCreationTimestamp="2025-09-30 13:02:34 +0000 UTC" firstStartedPulling="2025-09-30 13:02:35.323697668 +0000 UTC m=+2446.592935314" lastFinishedPulling="2025-09-30 13:02:36.09148641 +0000 UTC m=+2447.360724056" observedRunningTime="2025-09-30 13:02:37.220588152 +0000 UTC m=+2448.489825808" watchObservedRunningTime="2025-09-30 13:02:37.223461305 +0000 UTC m=+2448.492698951" Sep 30 13:02:41 crc kubenswrapper[4672]: I0930 13:02:41.419153 4672 scope.go:117] "RemoveContainer" containerID="8f51e0141ddbaede996117cc2645ec9b4c66d7a7e718e482c491c40151095fbe" Sep 30 13:02:41 crc kubenswrapper[4672]: E0930 13:02:41.420057 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dpqrd_openshift-machine-config-operator(95794952-d817-48f2-8956-f7a310f8d1d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" Sep 30 13:02:56 crc kubenswrapper[4672]: I0930 13:02:56.417093 4672 scope.go:117] "RemoveContainer" containerID="8f51e0141ddbaede996117cc2645ec9b4c66d7a7e718e482c491c40151095fbe" Sep 30 13:02:56 crc kubenswrapper[4672]: E0930 13:02:56.417832 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dpqrd_openshift-machine-config-operator(95794952-d817-48f2-8956-f7a310f8d1d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" Sep 30 13:03:08 crc kubenswrapper[4672]: I0930 13:03:08.417585 4672 scope.go:117] "RemoveContainer" containerID="8f51e0141ddbaede996117cc2645ec9b4c66d7a7e718e482c491c40151095fbe" Sep 30 13:03:08 crc kubenswrapper[4672]: E0930 13:03:08.418628 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dpqrd_openshift-machine-config-operator(95794952-d817-48f2-8956-f7a310f8d1d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" 
podUID="95794952-d817-48f2-8956-f7a310f8d1d9" Sep 30 13:03:23 crc kubenswrapper[4672]: I0930 13:03:23.417368 4672 scope.go:117] "RemoveContainer" containerID="8f51e0141ddbaede996117cc2645ec9b4c66d7a7e718e482c491c40151095fbe" Sep 30 13:03:23 crc kubenswrapper[4672]: E0930 13:03:23.418475 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dpqrd_openshift-machine-config-operator(95794952-d817-48f2-8956-f7a310f8d1d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" Sep 30 13:03:36 crc kubenswrapper[4672]: I0930 13:03:36.416901 4672 scope.go:117] "RemoveContainer" containerID="8f51e0141ddbaede996117cc2645ec9b4c66d7a7e718e482c491c40151095fbe" Sep 30 13:03:36 crc kubenswrapper[4672]: E0930 13:03:36.417925 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dpqrd_openshift-machine-config-operator(95794952-d817-48f2-8956-f7a310f8d1d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" Sep 30 13:03:51 crc kubenswrapper[4672]: I0930 13:03:51.418505 4672 scope.go:117] "RemoveContainer" containerID="8f51e0141ddbaede996117cc2645ec9b4c66d7a7e718e482c491c40151095fbe" Sep 30 13:03:51 crc kubenswrapper[4672]: E0930 13:03:51.419856 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dpqrd_openshift-machine-config-operator(95794952-d817-48f2-8956-f7a310f8d1d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" Sep 30 13:04:06 crc kubenswrapper[4672]: I0930 13:04:06.417979 4672 scope.go:117] "RemoveContainer" containerID="8f51e0141ddbaede996117cc2645ec9b4c66d7a7e718e482c491c40151095fbe" Sep 30 13:04:06 crc kubenswrapper[4672]: E0930 13:04:06.419183 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dpqrd_openshift-machine-config-operator(95794952-d817-48f2-8956-f7a310f8d1d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" Sep 30 13:04:20 crc kubenswrapper[4672]: I0930 13:04:20.417048 4672 scope.go:117] "RemoveContainer" containerID="8f51e0141ddbaede996117cc2645ec9b4c66d7a7e718e482c491c40151095fbe" Sep 30 13:04:20 crc kubenswrapper[4672]: E0930 13:04:20.418945 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dpqrd_openshift-machine-config-operator(95794952-d817-48f2-8956-f7a310f8d1d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" Sep 30 13:04:31 crc kubenswrapper[4672]: I0930 13:04:31.417574 4672 scope.go:117] "RemoveContainer" 
containerID="8f51e0141ddbaede996117cc2645ec9b4c66d7a7e718e482c491c40151095fbe" Sep 30 13:04:31 crc kubenswrapper[4672]: E0930 13:04:31.418554 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dpqrd_openshift-machine-config-operator(95794952-d817-48f2-8956-f7a310f8d1d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" Sep 30 13:04:46 crc kubenswrapper[4672]: I0930 13:04:46.417723 4672 scope.go:117] "RemoveContainer" containerID="8f51e0141ddbaede996117cc2645ec9b4c66d7a7e718e482c491c40151095fbe" Sep 30 13:04:46 crc kubenswrapper[4672]: E0930 13:04:46.437663 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dpqrd_openshift-machine-config-operator(95794952-d817-48f2-8956-f7a310f8d1d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" Sep 30 13:04:59 crc kubenswrapper[4672]: I0930 13:04:59.430340 4672 scope.go:117] "RemoveContainer" containerID="8f51e0141ddbaede996117cc2645ec9b4c66d7a7e718e482c491c40151095fbe" Sep 30 13:04:59 crc kubenswrapper[4672]: E0930 13:04:59.431960 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dpqrd_openshift-machine-config-operator(95794952-d817-48f2-8956-f7a310f8d1d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" Sep 30 13:05:13 crc kubenswrapper[4672]: I0930 13:05:13.418541 4672 scope.go:117] "RemoveContainer" containerID="8f51e0141ddbaede996117cc2645ec9b4c66d7a7e718e482c491c40151095fbe" Sep 30 13:05:13 crc kubenswrapper[4672]: E0930 13:05:13.419640 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dpqrd_openshift-machine-config-operator(95794952-d817-48f2-8956-f7a310f8d1d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" Sep 30 13:05:24 crc kubenswrapper[4672]: I0930 13:05:24.418331 4672 scope.go:117] "RemoveContainer" containerID="8f51e0141ddbaede996117cc2645ec9b4c66d7a7e718e482c491c40151095fbe" Sep 30 13:05:24 crc kubenswrapper[4672]: E0930 13:05:24.419856 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dpqrd_openshift-machine-config-operator(95794952-d817-48f2-8956-f7a310f8d1d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" Sep 30 13:05:38 crc kubenswrapper[4672]: I0930 13:05:38.417181 4672 scope.go:117] "RemoveContainer" containerID="8f51e0141ddbaede996117cc2645ec9b4c66d7a7e718e482c491c40151095fbe" Sep 30 13:05:38 crc kubenswrapper[4672]: E0930 13:05:38.418019 4672 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dpqrd_openshift-machine-config-operator(95794952-d817-48f2-8956-f7a310f8d1d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" Sep 30 13:05:49 crc kubenswrapper[4672]: I0930 13:05:49.426340 4672 scope.go:117] "RemoveContainer" containerID="8f51e0141ddbaede996117cc2645ec9b4c66d7a7e718e482c491c40151095fbe" Sep 30 13:05:49 crc kubenswrapper[4672]: E0930 13:05:49.427472 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dpqrd_openshift-machine-config-operator(95794952-d817-48f2-8956-f7a310f8d1d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" Sep 30 13:06:04 crc kubenswrapper[4672]: I0930 13:06:04.417367 4672 scope.go:117] "RemoveContainer" containerID="8f51e0141ddbaede996117cc2645ec9b4c66d7a7e718e482c491c40151095fbe" Sep 30 13:06:04 crc kubenswrapper[4672]: E0930 13:06:04.418400 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dpqrd_openshift-machine-config-operator(95794952-d817-48f2-8956-f7a310f8d1d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" Sep 30 13:06:16 crc kubenswrapper[4672]: I0930 13:06:16.620064 4672 generic.go:334] "Generic (PLEG): container finished" podID="727d1f8a-6b85-4184-b669-3fe8b94c608a" containerID="a307f84e79c15a39218bd42a0c1c445d6e484780d2a4308499f9dbf867bfe3bc" exitCode=0 Sep 30 13:06:16 crc kubenswrapper[4672]: I0930 13:06:16.620149 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qmh4q" event={"ID":"727d1f8a-6b85-4184-b669-3fe8b94c608a","Type":"ContainerDied","Data":"a307f84e79c15a39218bd42a0c1c445d6e484780d2a4308499f9dbf867bfe3bc"} Sep 30 13:06:18 crc kubenswrapper[4672]: I0930 13:06:18.090907 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qmh4q" Sep 30 13:06:18 crc kubenswrapper[4672]: I0930 13:06:18.205873 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/727d1f8a-6b85-4184-b669-3fe8b94c608a-inventory\") pod \"727d1f8a-6b85-4184-b669-3fe8b94c608a\" (UID: \"727d1f8a-6b85-4184-b669-3fe8b94c608a\") " Sep 30 13:06:18 crc kubenswrapper[4672]: I0930 13:06:18.205945 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/727d1f8a-6b85-4184-b669-3fe8b94c608a-nova-extra-config-0\") pod \"727d1f8a-6b85-4184-b669-3fe8b94c608a\" (UID: \"727d1f8a-6b85-4184-b669-3fe8b94c608a\") " Sep 30 13:06:18 crc kubenswrapper[4672]: I0930 13:06:18.205981 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zv2l6\" (UniqueName: \"kubernetes.io/projected/727d1f8a-6b85-4184-b669-3fe8b94c608a-kube-api-access-zv2l6\") pod \"727d1f8a-6b85-4184-b669-3fe8b94c608a\" (UID: \"727d1f8a-6b85-4184-b669-3fe8b94c608a\") " Sep 30 13:06:18 crc kubenswrapper[4672]: I0930 13:06:18.206025 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/727d1f8a-6b85-4184-b669-3fe8b94c608a-ssh-key\") pod \"727d1f8a-6b85-4184-b669-3fe8b94c608a\" (UID: \"727d1f8a-6b85-4184-b669-3fe8b94c608a\") " Sep 30 13:06:18 crc kubenswrapper[4672]: I0930 13:06:18.206092 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/727d1f8a-6b85-4184-b669-3fe8b94c608a-nova-combined-ca-bundle\") pod \"727d1f8a-6b85-4184-b669-3fe8b94c608a\" (UID: \"727d1f8a-6b85-4184-b669-3fe8b94c608a\") " Sep 30 13:06:18 crc kubenswrapper[4672]: I0930 13:06:18.206228 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/727d1f8a-6b85-4184-b669-3fe8b94c608a-nova-migration-ssh-key-0\") pod \"727d1f8a-6b85-4184-b669-3fe8b94c608a\" (UID: \"727d1f8a-6b85-4184-b669-3fe8b94c608a\") " Sep 30 13:06:18 crc kubenswrapper[4672]: I0930 13:06:18.206258 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/727d1f8a-6b85-4184-b669-3fe8b94c608a-nova-cell1-compute-config-1\") pod \"727d1f8a-6b85-4184-b669-3fe8b94c608a\" (UID: \"727d1f8a-6b85-4184-b669-3fe8b94c608a\") " Sep 30 13:06:18 crc kubenswrapper[4672]: I0930 13:06:18.206341 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/727d1f8a-6b85-4184-b669-3fe8b94c608a-nova-cell1-compute-config-0\") pod \"727d1f8a-6b85-4184-b669-3fe8b94c608a\" (UID: \"727d1f8a-6b85-4184-b669-3fe8b94c608a\") " Sep 30 13:06:18 crc kubenswrapper[4672]: I0930 13:06:18.206422 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/727d1f8a-6b85-4184-b669-3fe8b94c608a-nova-migration-ssh-key-1\") pod \"727d1f8a-6b85-4184-b669-3fe8b94c608a\" (UID: \"727d1f8a-6b85-4184-b669-3fe8b94c608a\") " Sep 30 13:06:18 crc kubenswrapper[4672]: I0930 13:06:18.220579 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/727d1f8a-6b85-4184-b669-3fe8b94c608a-kube-api-access-zv2l6" (OuterVolumeSpecName: "kube-api-access-zv2l6") pod "727d1f8a-6b85-4184-b669-3fe8b94c608a" (UID: "727d1f8a-6b85-4184-b669-3fe8b94c608a"). InnerVolumeSpecName "kube-api-access-zv2l6". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:06:18 crc kubenswrapper[4672]: I0930 13:06:18.232657 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/727d1f8a-6b85-4184-b669-3fe8b94c608a-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "727d1f8a-6b85-4184-b669-3fe8b94c608a" (UID: "727d1f8a-6b85-4184-b669-3fe8b94c608a"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:06:18 crc kubenswrapper[4672]: I0930 13:06:18.256616 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/727d1f8a-6b85-4184-b669-3fe8b94c608a-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "727d1f8a-6b85-4184-b669-3fe8b94c608a" (UID: "727d1f8a-6b85-4184-b669-3fe8b94c608a"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:06:18 crc kubenswrapper[4672]: I0930 13:06:18.260785 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/727d1f8a-6b85-4184-b669-3fe8b94c608a-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "727d1f8a-6b85-4184-b669-3fe8b94c608a" (UID: "727d1f8a-6b85-4184-b669-3fe8b94c608a"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:06:18 crc kubenswrapper[4672]: I0930 13:06:18.262617 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/727d1f8a-6b85-4184-b669-3fe8b94c608a-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "727d1f8a-6b85-4184-b669-3fe8b94c608a" (UID: "727d1f8a-6b85-4184-b669-3fe8b94c608a"). InnerVolumeSpecName "nova-extra-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:06:18 crc kubenswrapper[4672]: I0930 13:06:18.263513 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/727d1f8a-6b85-4184-b669-3fe8b94c608a-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "727d1f8a-6b85-4184-b669-3fe8b94c608a" (UID: "727d1f8a-6b85-4184-b669-3fe8b94c608a"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:06:18 crc kubenswrapper[4672]: I0930 13:06:18.266406 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/727d1f8a-6b85-4184-b669-3fe8b94c608a-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "727d1f8a-6b85-4184-b669-3fe8b94c608a" (UID: "727d1f8a-6b85-4184-b669-3fe8b94c608a"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:06:18 crc kubenswrapper[4672]: I0930 13:06:18.266792 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/727d1f8a-6b85-4184-b669-3fe8b94c608a-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "727d1f8a-6b85-4184-b669-3fe8b94c608a" (UID: "727d1f8a-6b85-4184-b669-3fe8b94c608a"). InnerVolumeSpecName "nova-cell1-compute-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:06:18 crc kubenswrapper[4672]: I0930 13:06:18.279376 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/727d1f8a-6b85-4184-b669-3fe8b94c608a-inventory" (OuterVolumeSpecName: "inventory") pod "727d1f8a-6b85-4184-b669-3fe8b94c608a" (UID: "727d1f8a-6b85-4184-b669-3fe8b94c608a"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:06:18 crc kubenswrapper[4672]: I0930 13:06:18.308422 4672 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/727d1f8a-6b85-4184-b669-3fe8b94c608a-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Sep 30 13:06:18 crc kubenswrapper[4672]: I0930 13:06:18.308468 4672 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/727d1f8a-6b85-4184-b669-3fe8b94c608a-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Sep 30 13:06:18 crc kubenswrapper[4672]: I0930 13:06:18.308487 4672 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/727d1f8a-6b85-4184-b669-3fe8b94c608a-inventory\") on node \"crc\" DevicePath \"\"" Sep 30 13:06:18 crc kubenswrapper[4672]: I0930 13:06:18.308502 4672 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/727d1f8a-6b85-4184-b669-3fe8b94c608a-nova-extra-config-0\") on node \"crc\" DevicePath \"\"" Sep 30 13:06:18 crc kubenswrapper[4672]: I0930 13:06:18.308515 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zv2l6\" (UniqueName: \"kubernetes.io/projected/727d1f8a-6b85-4184-b669-3fe8b94c608a-kube-api-access-zv2l6\") on node \"crc\" DevicePath \"\"" Sep 30 13:06:18 crc kubenswrapper[4672]: I0930 13:06:18.308524 4672 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/727d1f8a-6b85-4184-b669-3fe8b94c608a-ssh-key\") on node \"crc\" DevicePath \"\"" Sep 30 13:06:18 crc kubenswrapper[4672]: I0930 13:06:18.308535 4672 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/727d1f8a-6b85-4184-b669-3fe8b94c608a-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 13:06:18 crc kubenswrapper[4672]: I0930 13:06:18.308547 4672 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/727d1f8a-6b85-4184-b669-3fe8b94c608a-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Sep 30 13:06:18 crc kubenswrapper[4672]: I0930 13:06:18.308562 4672 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/727d1f8a-6b85-4184-b669-3fe8b94c608a-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Sep 30 13:06:18 crc kubenswrapper[4672]: I0930 13:06:18.417917 4672 scope.go:117] "RemoveContainer" containerID="8f51e0141ddbaede996117cc2645ec9b4c66d7a7e718e482c491c40151095fbe" Sep 30 13:06:18 crc kubenswrapper[4672]: E0930 13:06:18.418323 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dpqrd_openshift-machine-config-operator(95794952-d817-48f2-8956-f7a310f8d1d9)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" Sep 30 13:06:18 crc kubenswrapper[4672]: I0930 13:06:18.659686 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qmh4q" event={"ID":"727d1f8a-6b85-4184-b669-3fe8b94c608a","Type":"ContainerDied","Data":"3fc55ce2d92a4575e701259bcc6448c33a2dcc2321c78ba35dec1208fdeebf4f"} Sep 30 13:06:18 crc kubenswrapper[4672]: I0930 13:06:18.659735 4672 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3fc55ce2d92a4575e701259bcc6448c33a2dcc2321c78ba35dec1208fdeebf4f" Sep 30 13:06:18 crc kubenswrapper[4672]: I0930 13:06:18.659745 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qmh4q" Sep 30 13:06:18 crc kubenswrapper[4672]: I0930 13:06:18.782124 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-kqdc5"] Sep 30 13:06:18 crc kubenswrapper[4672]: E0930 13:06:18.782694 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="727d1f8a-6b85-4184-b669-3fe8b94c608a" containerName="nova-edpm-deployment-openstack-edpm-ipam" Sep 30 13:06:18 crc kubenswrapper[4672]: I0930 13:06:18.782717 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="727d1f8a-6b85-4184-b669-3fe8b94c608a" containerName="nova-edpm-deployment-openstack-edpm-ipam" Sep 30 13:06:18 crc kubenswrapper[4672]: I0930 13:06:18.782980 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="727d1f8a-6b85-4184-b669-3fe8b94c608a" containerName="nova-edpm-deployment-openstack-edpm-ipam" Sep 30 13:06:18 crc kubenswrapper[4672]: I0930 13:06:18.783821 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-kqdc5" Sep 30 13:06:18 crc kubenswrapper[4672]: I0930 13:06:18.787150 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data" Sep 30 13:06:18 crc kubenswrapper[4672]: I0930 13:06:18.787837 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Sep 30 13:06:18 crc kubenswrapper[4672]: I0930 13:06:18.788014 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Sep 30 13:06:18 crc kubenswrapper[4672]: I0930 13:06:18.788159 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-b6dbr" Sep 30 13:06:18 crc kubenswrapper[4672]: I0930 13:06:18.803011 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Sep 30 13:06:18 crc kubenswrapper[4672]: I0930 13:06:18.808643 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-kqdc5"] Sep 30 13:06:18 crc kubenswrapper[4672]: I0930 13:06:18.867481 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38c0d8da-6872-4108-aaa8-1b8fa2611fe5-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-kqdc5\" (UID: \"38c0d8da-6872-4108-aaa8-1b8fa2611fe5\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-kqdc5" Sep 30 13:06:18 crc kubenswrapper[4672]: I0930 13:06:18.867823 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/38c0d8da-6872-4108-aaa8-1b8fa2611fe5-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-kqdc5\" (UID: \"38c0d8da-6872-4108-aaa8-1b8fa2611fe5\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-kqdc5" Sep 30 13:06:18 crc kubenswrapper[4672]: I0930 13:06:18.867976 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/38c0d8da-6872-4108-aaa8-1b8fa2611fe5-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-kqdc5\" (UID: \"38c0d8da-6872-4108-aaa8-1b8fa2611fe5\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-kqdc5" Sep 30 13:06:18 crc kubenswrapper[4672]: I0930 13:06:18.868081 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/38c0d8da-6872-4108-aaa8-1b8fa2611fe5-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-kqdc5\" (UID: \"38c0d8da-6872-4108-aaa8-1b8fa2611fe5\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-kqdc5" Sep 30 13:06:18 crc kubenswrapper[4672]: I0930 13:06:18.868142 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/38c0d8da-6872-4108-aaa8-1b8fa2611fe5-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-kqdc5\" (UID: \"38c0d8da-6872-4108-aaa8-1b8fa2611fe5\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-kqdc5" 
Sep 30 13:06:18 crc kubenswrapper[4672]: I0930 13:06:18.868200 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2l9hq\" (UniqueName: \"kubernetes.io/projected/38c0d8da-6872-4108-aaa8-1b8fa2611fe5-kube-api-access-2l9hq\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-kqdc5\" (UID: \"38c0d8da-6872-4108-aaa8-1b8fa2611fe5\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-kqdc5" Sep 30 13:06:18 crc kubenswrapper[4672]: I0930 13:06:18.868561 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/38c0d8da-6872-4108-aaa8-1b8fa2611fe5-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-kqdc5\" (UID: \"38c0d8da-6872-4108-aaa8-1b8fa2611fe5\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-kqdc5" Sep 30 13:06:18 crc kubenswrapper[4672]: I0930 13:06:18.970430 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38c0d8da-6872-4108-aaa8-1b8fa2611fe5-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-kqdc5\" (UID: \"38c0d8da-6872-4108-aaa8-1b8fa2611fe5\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-kqdc5" Sep 30 13:06:18 crc kubenswrapper[4672]: I0930 13:06:18.970562 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/38c0d8da-6872-4108-aaa8-1b8fa2611fe5-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-kqdc5\" (UID: \"38c0d8da-6872-4108-aaa8-1b8fa2611fe5\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-kqdc5" Sep 30 13:06:18 crc kubenswrapper[4672]: I0930 13:06:18.971169 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/38c0d8da-6872-4108-aaa8-1b8fa2611fe5-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-kqdc5\" (UID: \"38c0d8da-6872-4108-aaa8-1b8fa2611fe5\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-kqdc5" Sep 30 13:06:18 crc kubenswrapper[4672]: I0930 13:06:18.971234 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/38c0d8da-6872-4108-aaa8-1b8fa2611fe5-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-kqdc5\" (UID: \"38c0d8da-6872-4108-aaa8-1b8fa2611fe5\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-kqdc5" Sep 30 13:06:18 crc kubenswrapper[4672]: I0930 13:06:18.971259 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/38c0d8da-6872-4108-aaa8-1b8fa2611fe5-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-kqdc5\" (UID: \"38c0d8da-6872-4108-aaa8-1b8fa2611fe5\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-kqdc5" Sep 30 13:06:18 crc kubenswrapper[4672]: I0930 13:06:18.971304 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2l9hq\" (UniqueName: \"kubernetes.io/projected/38c0d8da-6872-4108-aaa8-1b8fa2611fe5-kube-api-access-2l9hq\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-kqdc5\" (UID: 
\"38c0d8da-6872-4108-aaa8-1b8fa2611fe5\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-kqdc5" Sep 30 13:06:18 crc kubenswrapper[4672]: I0930 13:06:18.971373 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/38c0d8da-6872-4108-aaa8-1b8fa2611fe5-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-kqdc5\" (UID: \"38c0d8da-6872-4108-aaa8-1b8fa2611fe5\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-kqdc5" Sep 30 13:06:18 crc kubenswrapper[4672]: I0930 13:06:18.975641 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/38c0d8da-6872-4108-aaa8-1b8fa2611fe5-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-kqdc5\" (UID: \"38c0d8da-6872-4108-aaa8-1b8fa2611fe5\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-kqdc5" Sep 30 13:06:18 crc kubenswrapper[4672]: I0930 13:06:18.975714 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/38c0d8da-6872-4108-aaa8-1b8fa2611fe5-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-kqdc5\" (UID: \"38c0d8da-6872-4108-aaa8-1b8fa2611fe5\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-kqdc5" Sep 30 13:06:18 crc kubenswrapper[4672]: I0930 13:06:18.976084 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/38c0d8da-6872-4108-aaa8-1b8fa2611fe5-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-kqdc5\" (UID: \"38c0d8da-6872-4108-aaa8-1b8fa2611fe5\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-kqdc5" Sep 30 13:06:18 crc kubenswrapper[4672]: I0930 13:06:18.976566 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/38c0d8da-6872-4108-aaa8-1b8fa2611fe5-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-kqdc5\" (UID: \"38c0d8da-6872-4108-aaa8-1b8fa2611fe5\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-kqdc5" Sep 30 13:06:18 crc kubenswrapper[4672]: I0930 13:06:18.977153 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38c0d8da-6872-4108-aaa8-1b8fa2611fe5-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-kqdc5\" (UID: \"38c0d8da-6872-4108-aaa8-1b8fa2611fe5\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-kqdc5" Sep 30 13:06:18 crc kubenswrapper[4672]: I0930 13:06:18.978124 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/38c0d8da-6872-4108-aaa8-1b8fa2611fe5-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-kqdc5\" (UID: \"38c0d8da-6872-4108-aaa8-1b8fa2611fe5\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-kqdc5" Sep 30 13:06:18 crc kubenswrapper[4672]: I0930 13:06:18.997597 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2l9hq\" (UniqueName: \"kubernetes.io/projected/38c0d8da-6872-4108-aaa8-1b8fa2611fe5-kube-api-access-2l9hq\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-kqdc5\" (UID: 
\"38c0d8da-6872-4108-aaa8-1b8fa2611fe5\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-kqdc5" Sep 30 13:06:19 crc kubenswrapper[4672]: I0930 13:06:19.098695 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-kqdc5" Sep 30 13:06:19 crc kubenswrapper[4672]: I0930 13:06:19.747103 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-kqdc5"] Sep 30 13:06:19 crc kubenswrapper[4672]: I0930 13:06:19.770042 4672 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Sep 30 13:06:20 crc kubenswrapper[4672]: I0930 13:06:20.681296 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-kqdc5" event={"ID":"38c0d8da-6872-4108-aaa8-1b8fa2611fe5","Type":"ContainerStarted","Data":"da67a88449210ca6dc8bb93516c2059cfe1d99949d9f047883267c53f936bfec"} Sep 30 13:06:20 crc kubenswrapper[4672]: I0930 13:06:20.681888 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-kqdc5" event={"ID":"38c0d8da-6872-4108-aaa8-1b8fa2611fe5","Type":"ContainerStarted","Data":"cac3ee5646bad80c82789f97e08c990ee606ce82f1397811fd655be71752f93c"} Sep 30 13:06:20 crc kubenswrapper[4672]: I0930 13:06:20.707886 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-kqdc5" podStartSLOduration=2.102634539 podStartE2EDuration="2.70786146s" podCreationTimestamp="2025-09-30 13:06:18 +0000 UTC" firstStartedPulling="2025-09-30 13:06:19.769851209 +0000 UTC m=+2671.039088855" lastFinishedPulling="2025-09-30 13:06:20.37507809 +0000 UTC m=+2671.644315776" observedRunningTime="2025-09-30 13:06:20.696003048 +0000 UTC m=+2671.965240734" watchObservedRunningTime="2025-09-30 13:06:20.70786146 +0000 UTC m=+2671.977099136" Sep 30 13:06:32 crc kubenswrapper[4672]: I0930 13:06:32.418137 4672 scope.go:117] "RemoveContainer" containerID="8f51e0141ddbaede996117cc2645ec9b4c66d7a7e718e482c491c40151095fbe" Sep 30 13:06:32 crc kubenswrapper[4672]: I0930 13:06:32.847217 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" event={"ID":"95794952-d817-48f2-8956-f7a310f8d1d9","Type":"ContainerStarted","Data":"f8b15102ab66fbc60ee39b82c175461202eddaf181c58d28542d8bf751915c67"} Sep 30 13:07:16 crc kubenswrapper[4672]: I0930 13:07:16.244155 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-4576g"] Sep 30 13:07:16 crc kubenswrapper[4672]: I0930 13:07:16.246881 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-4576g" Sep 30 13:07:16 crc kubenswrapper[4672]: I0930 13:07:16.254943 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4576g"] Sep 30 13:07:16 crc kubenswrapper[4672]: I0930 13:07:16.318467 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e1966e8b-c91e-4a95-b4ee-4590db07ae16-catalog-content\") pod \"certified-operators-4576g\" (UID: \"e1966e8b-c91e-4a95-b4ee-4590db07ae16\") " pod="openshift-marketplace/certified-operators-4576g" Sep 30 13:07:16 crc kubenswrapper[4672]: I0930 13:07:16.318579 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z7nlv\" (UniqueName: \"kubernetes.io/projected/e1966e8b-c91e-4a95-b4ee-4590db07ae16-kube-api-access-z7nlv\") pod \"certified-operators-4576g\" (UID: \"e1966e8b-c91e-4a95-b4ee-4590db07ae16\") " pod="openshift-marketplace/certified-operators-4576g" Sep 30 13:07:16 crc kubenswrapper[4672]: I0930 13:07:16.318615 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e1966e8b-c91e-4a95-b4ee-4590db07ae16-utilities\") pod \"certified-operators-4576g\" (UID: \"e1966e8b-c91e-4a95-b4ee-4590db07ae16\") " pod="openshift-marketplace/certified-operators-4576g" Sep 30 13:07:16 crc kubenswrapper[4672]: I0930 13:07:16.420115 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e1966e8b-c91e-4a95-b4ee-4590db07ae16-catalog-content\") pod \"certified-operators-4576g\" (UID: \"e1966e8b-c91e-4a95-b4ee-4590db07ae16\") " pod="openshift-marketplace/certified-operators-4576g" Sep 30 13:07:16 crc kubenswrapper[4672]: I0930 13:07:16.420517 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z7nlv\" (UniqueName: \"kubernetes.io/projected/e1966e8b-c91e-4a95-b4ee-4590db07ae16-kube-api-access-z7nlv\") pod \"certified-operators-4576g\" (UID: \"e1966e8b-c91e-4a95-b4ee-4590db07ae16\") " pod="openshift-marketplace/certified-operators-4576g" Sep 30 13:07:16 crc kubenswrapper[4672]: I0930 13:07:16.420561 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e1966e8b-c91e-4a95-b4ee-4590db07ae16-utilities\") pod \"certified-operators-4576g\" (UID: \"e1966e8b-c91e-4a95-b4ee-4590db07ae16\") " pod="openshift-marketplace/certified-operators-4576g" Sep 30 13:07:16 crc kubenswrapper[4672]: I0930 13:07:16.420783 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e1966e8b-c91e-4a95-b4ee-4590db07ae16-catalog-content\") pod \"certified-operators-4576g\" (UID: \"e1966e8b-c91e-4a95-b4ee-4590db07ae16\") " pod="openshift-marketplace/certified-operators-4576g" Sep 30 13:07:16 crc kubenswrapper[4672]: I0930 13:07:16.421007 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e1966e8b-c91e-4a95-b4ee-4590db07ae16-utilities\") pod \"certified-operators-4576g\" (UID: \"e1966e8b-c91e-4a95-b4ee-4590db07ae16\") " pod="openshift-marketplace/certified-operators-4576g" Sep 30 13:07:16 crc kubenswrapper[4672]: I0930 13:07:16.448251 4672 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-z7nlv\" (UniqueName: \"kubernetes.io/projected/e1966e8b-c91e-4a95-b4ee-4590db07ae16-kube-api-access-z7nlv\") pod \"certified-operators-4576g\" (UID: \"e1966e8b-c91e-4a95-b4ee-4590db07ae16\") " pod="openshift-marketplace/certified-operators-4576g" Sep 30 13:07:16 crc kubenswrapper[4672]: I0930 13:07:16.616479 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4576g" Sep 30 13:07:17 crc kubenswrapper[4672]: I0930 13:07:17.196976 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4576g"] Sep 30 13:07:17 crc kubenswrapper[4672]: I0930 13:07:17.336949 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4576g" event={"ID":"e1966e8b-c91e-4a95-b4ee-4590db07ae16","Type":"ContainerStarted","Data":"e3e9a3b0495defd2aef9fd94202eb61aaba30cf5ba7d29d68656388108e6e8c4"} Sep 30 13:07:18 crc kubenswrapper[4672]: I0930 13:07:18.355475 4672 generic.go:334] "Generic (PLEG): container finished" podID="e1966e8b-c91e-4a95-b4ee-4590db07ae16" containerID="7f5f1a8f3164088528a6b668f2966123d32fd2df1b4b265879975b1a86584302" exitCode=0 Sep 30 13:07:18 crc kubenswrapper[4672]: I0930 13:07:18.356027 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4576g" event={"ID":"e1966e8b-c91e-4a95-b4ee-4590db07ae16","Type":"ContainerDied","Data":"7f5f1a8f3164088528a6b668f2966123d32fd2df1b4b265879975b1a86584302"} Sep 30 13:07:20 crc kubenswrapper[4672]: I0930 13:07:20.385457 4672 generic.go:334] "Generic (PLEG): container finished" podID="e1966e8b-c91e-4a95-b4ee-4590db07ae16" containerID="5378c533b797f3fb4a36e89fb83a2c0f36e1c82333ff8734ccff9b2a00e91dd3" exitCode=0 Sep 30 13:07:20 crc kubenswrapper[4672]: I0930 13:07:20.385960 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4576g" event={"ID":"e1966e8b-c91e-4a95-b4ee-4590db07ae16","Type":"ContainerDied","Data":"5378c533b797f3fb4a36e89fb83a2c0f36e1c82333ff8734ccff9b2a00e91dd3"} Sep 30 13:07:22 crc kubenswrapper[4672]: I0930 13:07:22.408608 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4576g" event={"ID":"e1966e8b-c91e-4a95-b4ee-4590db07ae16","Type":"ContainerStarted","Data":"22aeab881aef8bbc783067fd0758d42057dd6223b48c8e8972afcc512c3fe217"} Sep 30 13:07:22 crc kubenswrapper[4672]: I0930 13:07:22.435075 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-4576g" podStartSLOduration=3.513000561 podStartE2EDuration="6.435055673s" podCreationTimestamp="2025-09-30 13:07:16 +0000 UTC" firstStartedPulling="2025-09-30 13:07:18.368611388 +0000 UTC m=+2729.637849034" lastFinishedPulling="2025-09-30 13:07:21.2906665 +0000 UTC m=+2732.559904146" observedRunningTime="2025-09-30 13:07:22.429496971 +0000 UTC m=+2733.698734617" watchObservedRunningTime="2025-09-30 13:07:22.435055673 +0000 UTC m=+2733.704293319" Sep 30 13:07:24 crc kubenswrapper[4672]: I0930 13:07:24.720799 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-g7ntq"] Sep 30 13:07:24 crc kubenswrapper[4672]: I0930 13:07:24.725864 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-g7ntq" Sep 30 13:07:24 crc kubenswrapper[4672]: I0930 13:07:24.741496 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-g7ntq"] Sep 30 13:07:24 crc kubenswrapper[4672]: I0930 13:07:24.810898 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0f8c57e9-cdd1-4ae4-afb4-4664662587f1-catalog-content\") pod \"redhat-operators-g7ntq\" (UID: \"0f8c57e9-cdd1-4ae4-afb4-4664662587f1\") " pod="openshift-marketplace/redhat-operators-g7ntq" Sep 30 13:07:24 crc kubenswrapper[4672]: I0930 13:07:24.811200 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0f8c57e9-cdd1-4ae4-afb4-4664662587f1-utilities\") pod \"redhat-operators-g7ntq\" (UID: \"0f8c57e9-cdd1-4ae4-afb4-4664662587f1\") " pod="openshift-marketplace/redhat-operators-g7ntq" Sep 30 13:07:24 crc kubenswrapper[4672]: I0930 13:07:24.811648 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zr8w5\" (UniqueName: \"kubernetes.io/projected/0f8c57e9-cdd1-4ae4-afb4-4664662587f1-kube-api-access-zr8w5\") pod \"redhat-operators-g7ntq\" (UID: \"0f8c57e9-cdd1-4ae4-afb4-4664662587f1\") " pod="openshift-marketplace/redhat-operators-g7ntq" Sep 30 13:07:24 crc kubenswrapper[4672]: I0930 13:07:24.913707 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zr8w5\" (UniqueName: \"kubernetes.io/projected/0f8c57e9-cdd1-4ae4-afb4-4664662587f1-kube-api-access-zr8w5\") pod \"redhat-operators-g7ntq\" (UID: \"0f8c57e9-cdd1-4ae4-afb4-4664662587f1\") " pod="openshift-marketplace/redhat-operators-g7ntq" Sep 30 13:07:24 crc kubenswrapper[4672]: I0930 13:07:24.914060 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0f8c57e9-cdd1-4ae4-afb4-4664662587f1-catalog-content\") pod \"redhat-operators-g7ntq\" (UID: \"0f8c57e9-cdd1-4ae4-afb4-4664662587f1\") " pod="openshift-marketplace/redhat-operators-g7ntq" Sep 30 13:07:24 crc kubenswrapper[4672]: I0930 13:07:24.914294 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0f8c57e9-cdd1-4ae4-afb4-4664662587f1-utilities\") pod \"redhat-operators-g7ntq\" (UID: \"0f8c57e9-cdd1-4ae4-afb4-4664662587f1\") " pod="openshift-marketplace/redhat-operators-g7ntq" Sep 30 13:07:24 crc kubenswrapper[4672]: I0930 13:07:24.914645 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0f8c57e9-cdd1-4ae4-afb4-4664662587f1-catalog-content\") pod \"redhat-operators-g7ntq\" (UID: \"0f8c57e9-cdd1-4ae4-afb4-4664662587f1\") " pod="openshift-marketplace/redhat-operators-g7ntq" Sep 30 13:07:24 crc kubenswrapper[4672]: I0930 13:07:24.914774 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0f8c57e9-cdd1-4ae4-afb4-4664662587f1-utilities\") pod \"redhat-operators-g7ntq\" (UID: \"0f8c57e9-cdd1-4ae4-afb4-4664662587f1\") " pod="openshift-marketplace/redhat-operators-g7ntq" Sep 30 13:07:24 crc kubenswrapper[4672]: I0930 13:07:24.942238 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-zr8w5\" (UniqueName: \"kubernetes.io/projected/0f8c57e9-cdd1-4ae4-afb4-4664662587f1-kube-api-access-zr8w5\") pod \"redhat-operators-g7ntq\" (UID: \"0f8c57e9-cdd1-4ae4-afb4-4664662587f1\") " pod="openshift-marketplace/redhat-operators-g7ntq" Sep 30 13:07:25 crc kubenswrapper[4672]: I0930 13:07:25.054319 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-g7ntq" Sep 30 13:07:25 crc kubenswrapper[4672]: I0930 13:07:25.553799 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-g7ntq"] Sep 30 13:07:25 crc kubenswrapper[4672]: W0930 13:07:25.564200 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0f8c57e9_cdd1_4ae4_afb4_4664662587f1.slice/crio-8ad90e827eafd4f943e4a1dd072c6ef975f75e2ef0103f11c078f58b6cd0c9f9 WatchSource:0}: Error finding container 8ad90e827eafd4f943e4a1dd072c6ef975f75e2ef0103f11c078f58b6cd0c9f9: Status 404 returned error can't find the container with id 8ad90e827eafd4f943e4a1dd072c6ef975f75e2ef0103f11c078f58b6cd0c9f9 Sep 30 13:07:26 crc kubenswrapper[4672]: I0930 13:07:26.444528 4672 generic.go:334] "Generic (PLEG): container finished" podID="0f8c57e9-cdd1-4ae4-afb4-4664662587f1" containerID="376e1ee15fe5e6e8cca65e8e04d89b75912c7a407a281897aa083d253ff44639" exitCode=0 Sep 30 13:07:26 crc kubenswrapper[4672]: I0930 13:07:26.444588 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g7ntq" event={"ID":"0f8c57e9-cdd1-4ae4-afb4-4664662587f1","Type":"ContainerDied","Data":"376e1ee15fe5e6e8cca65e8e04d89b75912c7a407a281897aa083d253ff44639"} Sep 30 13:07:26 crc kubenswrapper[4672]: I0930 13:07:26.444834 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g7ntq" event={"ID":"0f8c57e9-cdd1-4ae4-afb4-4664662587f1","Type":"ContainerStarted","Data":"8ad90e827eafd4f943e4a1dd072c6ef975f75e2ef0103f11c078f58b6cd0c9f9"} Sep 30 13:07:26 crc kubenswrapper[4672]: I0930 13:07:26.617859 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-4576g" Sep 30 13:07:26 crc kubenswrapper[4672]: I0930 13:07:26.617932 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-4576g" Sep 30 13:07:26 crc kubenswrapper[4672]: I0930 13:07:26.668489 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-4576g" Sep 30 13:07:27 crc kubenswrapper[4672]: I0930 13:07:27.455408 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g7ntq" event={"ID":"0f8c57e9-cdd1-4ae4-afb4-4664662587f1","Type":"ContainerStarted","Data":"05103f88a869bfc27783739da22c414893fc1bbc58d5448d56c1048686205f75"} Sep 30 13:07:27 crc kubenswrapper[4672]: I0930 13:07:27.515621 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-4576g" Sep 30 13:07:28 crc kubenswrapper[4672]: I0930 13:07:28.465658 4672 generic.go:334] "Generic (PLEG): container finished" podID="0f8c57e9-cdd1-4ae4-afb4-4664662587f1" containerID="05103f88a869bfc27783739da22c414893fc1bbc58d5448d56c1048686205f75" exitCode=0 Sep 30 13:07:28 crc kubenswrapper[4672]: I0930 13:07:28.465800 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-g7ntq" event={"ID":"0f8c57e9-cdd1-4ae4-afb4-4664662587f1","Type":"ContainerDied","Data":"05103f88a869bfc27783739da22c414893fc1bbc58d5448d56c1048686205f75"} Sep 30 13:07:29 crc kubenswrapper[4672]: I0930 13:07:29.103287 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-4576g"] Sep 30 13:07:29 crc kubenswrapper[4672]: I0930 13:07:29.477822 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g7ntq" event={"ID":"0f8c57e9-cdd1-4ae4-afb4-4664662587f1","Type":"ContainerStarted","Data":"a945278a05a7a363a9897316c7f4d1a1ce8626311c14b5daf9a6ef53bc1cbeaf"} Sep 30 13:07:29 crc kubenswrapper[4672]: I0930 13:07:29.478000 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-4576g" podUID="e1966e8b-c91e-4a95-b4ee-4590db07ae16" containerName="registry-server" containerID="cri-o://22aeab881aef8bbc783067fd0758d42057dd6223b48c8e8972afcc512c3fe217" gracePeriod=2 Sep 30 13:07:29 crc kubenswrapper[4672]: I0930 13:07:29.502253 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-g7ntq" podStartSLOduration=2.978646266 podStartE2EDuration="5.502228181s" podCreationTimestamp="2025-09-30 13:07:24 +0000 UTC" firstStartedPulling="2025-09-30 13:07:26.446427471 +0000 UTC m=+2737.715665137" lastFinishedPulling="2025-09-30 13:07:28.970009406 +0000 UTC m=+2740.239247052" observedRunningTime="2025-09-30 13:07:29.499689556 +0000 UTC m=+2740.768927222" watchObservedRunningTime="2025-09-30 13:07:29.502228181 +0000 UTC m=+2740.771465827" Sep 30 13:07:29 crc kubenswrapper[4672]: I0930 13:07:29.987709 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4576g" Sep 30 13:07:30 crc kubenswrapper[4672]: I0930 13:07:30.118560 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e1966e8b-c91e-4a95-b4ee-4590db07ae16-utilities\") pod \"e1966e8b-c91e-4a95-b4ee-4590db07ae16\" (UID: \"e1966e8b-c91e-4a95-b4ee-4590db07ae16\") " Sep 30 13:07:30 crc kubenswrapper[4672]: I0930 13:07:30.118791 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z7nlv\" (UniqueName: \"kubernetes.io/projected/e1966e8b-c91e-4a95-b4ee-4590db07ae16-kube-api-access-z7nlv\") pod \"e1966e8b-c91e-4a95-b4ee-4590db07ae16\" (UID: \"e1966e8b-c91e-4a95-b4ee-4590db07ae16\") " Sep 30 13:07:30 crc kubenswrapper[4672]: I0930 13:07:30.118913 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e1966e8b-c91e-4a95-b4ee-4590db07ae16-catalog-content\") pod \"e1966e8b-c91e-4a95-b4ee-4590db07ae16\" (UID: \"e1966e8b-c91e-4a95-b4ee-4590db07ae16\") " Sep 30 13:07:30 crc kubenswrapper[4672]: I0930 13:07:30.119213 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e1966e8b-c91e-4a95-b4ee-4590db07ae16-utilities" (OuterVolumeSpecName: "utilities") pod "e1966e8b-c91e-4a95-b4ee-4590db07ae16" (UID: "e1966e8b-c91e-4a95-b4ee-4590db07ae16"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 13:07:30 crc kubenswrapper[4672]: I0930 13:07:30.119485 4672 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e1966e8b-c91e-4a95-b4ee-4590db07ae16-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 13:07:30 crc kubenswrapper[4672]: I0930 13:07:30.124754 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e1966e8b-c91e-4a95-b4ee-4590db07ae16-kube-api-access-z7nlv" (OuterVolumeSpecName: "kube-api-access-z7nlv") pod "e1966e8b-c91e-4a95-b4ee-4590db07ae16" (UID: "e1966e8b-c91e-4a95-b4ee-4590db07ae16"). InnerVolumeSpecName "kube-api-access-z7nlv". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:07:30 crc kubenswrapper[4672]: I0930 13:07:30.171874 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e1966e8b-c91e-4a95-b4ee-4590db07ae16-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e1966e8b-c91e-4a95-b4ee-4590db07ae16" (UID: "e1966e8b-c91e-4a95-b4ee-4590db07ae16"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 13:07:30 crc kubenswrapper[4672]: I0930 13:07:30.221855 4672 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e1966e8b-c91e-4a95-b4ee-4590db07ae16-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 13:07:30 crc kubenswrapper[4672]: I0930 13:07:30.221913 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z7nlv\" (UniqueName: \"kubernetes.io/projected/e1966e8b-c91e-4a95-b4ee-4590db07ae16-kube-api-access-z7nlv\") on node \"crc\" DevicePath \"\"" Sep 30 13:07:30 crc kubenswrapper[4672]: I0930 13:07:30.487655 4672 generic.go:334] "Generic (PLEG): container finished" podID="e1966e8b-c91e-4a95-b4ee-4590db07ae16" containerID="22aeab881aef8bbc783067fd0758d42057dd6223b48c8e8972afcc512c3fe217" exitCode=0 Sep 30 13:07:30 crc kubenswrapper[4672]: I0930 13:07:30.487694 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-4576g" Sep 30 13:07:30 crc kubenswrapper[4672]: I0930 13:07:30.487713 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4576g" event={"ID":"e1966e8b-c91e-4a95-b4ee-4590db07ae16","Type":"ContainerDied","Data":"22aeab881aef8bbc783067fd0758d42057dd6223b48c8e8972afcc512c3fe217"} Sep 30 13:07:30 crc kubenswrapper[4672]: I0930 13:07:30.488141 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4576g" event={"ID":"e1966e8b-c91e-4a95-b4ee-4590db07ae16","Type":"ContainerDied","Data":"e3e9a3b0495defd2aef9fd94202eb61aaba30cf5ba7d29d68656388108e6e8c4"} Sep 30 13:07:30 crc kubenswrapper[4672]: I0930 13:07:30.488161 4672 scope.go:117] "RemoveContainer" containerID="22aeab881aef8bbc783067fd0758d42057dd6223b48c8e8972afcc512c3fe217" Sep 30 13:07:30 crc kubenswrapper[4672]: I0930 13:07:30.506409 4672 scope.go:117] "RemoveContainer" containerID="5378c533b797f3fb4a36e89fb83a2c0f36e1c82333ff8734ccff9b2a00e91dd3" Sep 30 13:07:30 crc kubenswrapper[4672]: I0930 13:07:30.535989 4672 scope.go:117] "RemoveContainer" containerID="7f5f1a8f3164088528a6b668f2966123d32fd2df1b4b265879975b1a86584302" Sep 30 13:07:30 crc kubenswrapper[4672]: I0930 13:07:30.554752 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-4576g"] Sep 30 13:07:30 crc kubenswrapper[4672]: I0930 13:07:30.572666 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-4576g"] Sep 30 13:07:30 crc kubenswrapper[4672]: I0930 13:07:30.588670 4672 scope.go:117] "RemoveContainer" containerID="22aeab881aef8bbc783067fd0758d42057dd6223b48c8e8972afcc512c3fe217" Sep 30 13:07:30 crc kubenswrapper[4672]: E0930 13:07:30.589422 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"22aeab881aef8bbc783067fd0758d42057dd6223b48c8e8972afcc512c3fe217\": container with ID starting with 22aeab881aef8bbc783067fd0758d42057dd6223b48c8e8972afcc512c3fe217 not found: ID does not exist" containerID="22aeab881aef8bbc783067fd0758d42057dd6223b48c8e8972afcc512c3fe217" Sep 30 13:07:30 crc kubenswrapper[4672]: I0930 13:07:30.589456 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"22aeab881aef8bbc783067fd0758d42057dd6223b48c8e8972afcc512c3fe217"} err="failed to get container status \"22aeab881aef8bbc783067fd0758d42057dd6223b48c8e8972afcc512c3fe217\": rpc error: code = NotFound desc = could not find container \"22aeab881aef8bbc783067fd0758d42057dd6223b48c8e8972afcc512c3fe217\": container with ID starting with 22aeab881aef8bbc783067fd0758d42057dd6223b48c8e8972afcc512c3fe217 not found: ID does not exist" Sep 30 13:07:30 crc kubenswrapper[4672]: I0930 13:07:30.589478 4672 scope.go:117] "RemoveContainer" containerID="5378c533b797f3fb4a36e89fb83a2c0f36e1c82333ff8734ccff9b2a00e91dd3" Sep 30 13:07:30 crc kubenswrapper[4672]: E0930 13:07:30.589765 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5378c533b797f3fb4a36e89fb83a2c0f36e1c82333ff8734ccff9b2a00e91dd3\": container with ID starting with 5378c533b797f3fb4a36e89fb83a2c0f36e1c82333ff8734ccff9b2a00e91dd3 not found: ID does not exist" containerID="5378c533b797f3fb4a36e89fb83a2c0f36e1c82333ff8734ccff9b2a00e91dd3" Sep 30 13:07:30 crc kubenswrapper[4672]: I0930 13:07:30.589786 4672 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5378c533b797f3fb4a36e89fb83a2c0f36e1c82333ff8734ccff9b2a00e91dd3"} err="failed to get container status \"5378c533b797f3fb4a36e89fb83a2c0f36e1c82333ff8734ccff9b2a00e91dd3\": rpc error: code = NotFound desc = could not find container \"5378c533b797f3fb4a36e89fb83a2c0f36e1c82333ff8734ccff9b2a00e91dd3\": container with ID starting with 5378c533b797f3fb4a36e89fb83a2c0f36e1c82333ff8734ccff9b2a00e91dd3 not found: ID does not exist" Sep 30 13:07:30 crc kubenswrapper[4672]: I0930 13:07:30.589800 4672 scope.go:117] "RemoveContainer" containerID="7f5f1a8f3164088528a6b668f2966123d32fd2df1b4b265879975b1a86584302" Sep 30 13:07:30 crc kubenswrapper[4672]: E0930 13:07:30.590068 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7f5f1a8f3164088528a6b668f2966123d32fd2df1b4b265879975b1a86584302\": container with ID starting with 7f5f1a8f3164088528a6b668f2966123d32fd2df1b4b265879975b1a86584302 not found: ID does not exist" containerID="7f5f1a8f3164088528a6b668f2966123d32fd2df1b4b265879975b1a86584302" Sep 30 13:07:30 crc kubenswrapper[4672]: I0930 13:07:30.590093 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7f5f1a8f3164088528a6b668f2966123d32fd2df1b4b265879975b1a86584302"} err="failed to get container status \"7f5f1a8f3164088528a6b668f2966123d32fd2df1b4b265879975b1a86584302\": rpc error: code = NotFound desc = could not find container \"7f5f1a8f3164088528a6b668f2966123d32fd2df1b4b265879975b1a86584302\": container with ID starting with 7f5f1a8f3164088528a6b668f2966123d32fd2df1b4b265879975b1a86584302 not found: ID does not exist" Sep 30 13:07:31 crc kubenswrapper[4672]: I0930 13:07:31.430239 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e1966e8b-c91e-4a95-b4ee-4590db07ae16" path="/var/lib/kubelet/pods/e1966e8b-c91e-4a95-b4ee-4590db07ae16/volumes" Sep 30 13:07:35 crc kubenswrapper[4672]: I0930 13:07:35.055431 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-g7ntq" Sep 30 13:07:35 crc kubenswrapper[4672]: I0930 13:07:35.055759 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-g7ntq" Sep 30 13:07:36 crc kubenswrapper[4672]: I0930 13:07:36.099371 4672 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-g7ntq" podUID="0f8c57e9-cdd1-4ae4-afb4-4664662587f1" containerName="registry-server" probeResult="failure" output=< Sep 30 13:07:36 crc kubenswrapper[4672]: timeout: failed to connect service ":50051" within 1s Sep 30 13:07:36 crc kubenswrapper[4672]: > Sep 30 13:07:45 crc kubenswrapper[4672]: I0930 13:07:45.103488 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-g7ntq" Sep 30 13:07:45 crc kubenswrapper[4672]: I0930 13:07:45.153253 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-g7ntq" Sep 30 13:07:45 crc kubenswrapper[4672]: I0930 13:07:45.343901 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-g7ntq"] Sep 30 13:07:46 crc kubenswrapper[4672]: I0930 13:07:46.661596 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-g7ntq" 
podUID="0f8c57e9-cdd1-4ae4-afb4-4664662587f1" containerName="registry-server" containerID="cri-o://a945278a05a7a363a9897316c7f4d1a1ce8626311c14b5daf9a6ef53bc1cbeaf" gracePeriod=2 Sep 30 13:07:47 crc kubenswrapper[4672]: I0930 13:07:47.112931 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-g7ntq" Sep 30 13:07:47 crc kubenswrapper[4672]: I0930 13:07:47.277343 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0f8c57e9-cdd1-4ae4-afb4-4664662587f1-utilities\") pod \"0f8c57e9-cdd1-4ae4-afb4-4664662587f1\" (UID: \"0f8c57e9-cdd1-4ae4-afb4-4664662587f1\") " Sep 30 13:07:47 crc kubenswrapper[4672]: I0930 13:07:47.277984 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0f8c57e9-cdd1-4ae4-afb4-4664662587f1-catalog-content\") pod \"0f8c57e9-cdd1-4ae4-afb4-4664662587f1\" (UID: \"0f8c57e9-cdd1-4ae4-afb4-4664662587f1\") " Sep 30 13:07:47 crc kubenswrapper[4672]: I0930 13:07:47.278199 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zr8w5\" (UniqueName: \"kubernetes.io/projected/0f8c57e9-cdd1-4ae4-afb4-4664662587f1-kube-api-access-zr8w5\") pod \"0f8c57e9-cdd1-4ae4-afb4-4664662587f1\" (UID: \"0f8c57e9-cdd1-4ae4-afb4-4664662587f1\") " Sep 30 13:07:47 crc kubenswrapper[4672]: I0930 13:07:47.278621 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0f8c57e9-cdd1-4ae4-afb4-4664662587f1-utilities" (OuterVolumeSpecName: "utilities") pod "0f8c57e9-cdd1-4ae4-afb4-4664662587f1" (UID: "0f8c57e9-cdd1-4ae4-afb4-4664662587f1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 13:07:47 crc kubenswrapper[4672]: I0930 13:07:47.279554 4672 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0f8c57e9-cdd1-4ae4-afb4-4664662587f1-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 13:07:47 crc kubenswrapper[4672]: I0930 13:07:47.288827 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f8c57e9-cdd1-4ae4-afb4-4664662587f1-kube-api-access-zr8w5" (OuterVolumeSpecName: "kube-api-access-zr8w5") pod "0f8c57e9-cdd1-4ae4-afb4-4664662587f1" (UID: "0f8c57e9-cdd1-4ae4-afb4-4664662587f1"). InnerVolumeSpecName "kube-api-access-zr8w5". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:07:47 crc kubenswrapper[4672]: I0930 13:07:47.377095 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0f8c57e9-cdd1-4ae4-afb4-4664662587f1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0f8c57e9-cdd1-4ae4-afb4-4664662587f1" (UID: "0f8c57e9-cdd1-4ae4-afb4-4664662587f1"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 13:07:47 crc kubenswrapper[4672]: I0930 13:07:47.381795 4672 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0f8c57e9-cdd1-4ae4-afb4-4664662587f1-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 13:07:47 crc kubenswrapper[4672]: I0930 13:07:47.381910 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zr8w5\" (UniqueName: \"kubernetes.io/projected/0f8c57e9-cdd1-4ae4-afb4-4664662587f1-kube-api-access-zr8w5\") on node \"crc\" DevicePath \"\"" Sep 30 13:07:47 crc kubenswrapper[4672]: I0930 13:07:47.677817 4672 generic.go:334] "Generic (PLEG): container finished" podID="0f8c57e9-cdd1-4ae4-afb4-4664662587f1" containerID="a945278a05a7a363a9897316c7f4d1a1ce8626311c14b5daf9a6ef53bc1cbeaf" exitCode=0 Sep 30 13:07:47 crc kubenswrapper[4672]: I0930 13:07:47.677863 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g7ntq" event={"ID":"0f8c57e9-cdd1-4ae4-afb4-4664662587f1","Type":"ContainerDied","Data":"a945278a05a7a363a9897316c7f4d1a1ce8626311c14b5daf9a6ef53bc1cbeaf"} Sep 30 13:07:47 crc kubenswrapper[4672]: I0930 13:07:47.677889 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g7ntq" event={"ID":"0f8c57e9-cdd1-4ae4-afb4-4664662587f1","Type":"ContainerDied","Data":"8ad90e827eafd4f943e4a1dd072c6ef975f75e2ef0103f11c078f58b6cd0c9f9"} Sep 30 13:07:47 crc kubenswrapper[4672]: I0930 13:07:47.677905 4672 scope.go:117] "RemoveContainer" containerID="a945278a05a7a363a9897316c7f4d1a1ce8626311c14b5daf9a6ef53bc1cbeaf" Sep 30 13:07:47 crc kubenswrapper[4672]: I0930 13:07:47.678063 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-g7ntq" Sep 30 13:07:47 crc kubenswrapper[4672]: I0930 13:07:47.702820 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-g7ntq"] Sep 30 13:07:47 crc kubenswrapper[4672]: I0930 13:07:47.703060 4672 scope.go:117] "RemoveContainer" containerID="05103f88a869bfc27783739da22c414893fc1bbc58d5448d56c1048686205f75" Sep 30 13:07:47 crc kubenswrapper[4672]: I0930 13:07:47.710054 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-g7ntq"] Sep 30 13:07:47 crc kubenswrapper[4672]: I0930 13:07:47.739604 4672 scope.go:117] "RemoveContainer" containerID="376e1ee15fe5e6e8cca65e8e04d89b75912c7a407a281897aa083d253ff44639" Sep 30 13:07:47 crc kubenswrapper[4672]: I0930 13:07:47.770383 4672 scope.go:117] "RemoveContainer" containerID="a945278a05a7a363a9897316c7f4d1a1ce8626311c14b5daf9a6ef53bc1cbeaf" Sep 30 13:07:47 crc kubenswrapper[4672]: E0930 13:07:47.770937 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a945278a05a7a363a9897316c7f4d1a1ce8626311c14b5daf9a6ef53bc1cbeaf\": container with ID starting with a945278a05a7a363a9897316c7f4d1a1ce8626311c14b5daf9a6ef53bc1cbeaf not found: ID does not exist" containerID="a945278a05a7a363a9897316c7f4d1a1ce8626311c14b5daf9a6ef53bc1cbeaf" Sep 30 13:07:47 crc kubenswrapper[4672]: I0930 13:07:47.770977 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a945278a05a7a363a9897316c7f4d1a1ce8626311c14b5daf9a6ef53bc1cbeaf"} err="failed to get container status \"a945278a05a7a363a9897316c7f4d1a1ce8626311c14b5daf9a6ef53bc1cbeaf\": rpc error: code = NotFound desc = could not find container \"a945278a05a7a363a9897316c7f4d1a1ce8626311c14b5daf9a6ef53bc1cbeaf\": container with ID starting with a945278a05a7a363a9897316c7f4d1a1ce8626311c14b5daf9a6ef53bc1cbeaf not found: ID does not exist" Sep 30 13:07:47 crc kubenswrapper[4672]: I0930 13:07:47.771004 4672 scope.go:117] "RemoveContainer" containerID="05103f88a869bfc27783739da22c414893fc1bbc58d5448d56c1048686205f75" Sep 30 13:07:47 crc kubenswrapper[4672]: E0930 13:07:47.771420 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"05103f88a869bfc27783739da22c414893fc1bbc58d5448d56c1048686205f75\": container with ID starting with 05103f88a869bfc27783739da22c414893fc1bbc58d5448d56c1048686205f75 not found: ID does not exist" containerID="05103f88a869bfc27783739da22c414893fc1bbc58d5448d56c1048686205f75" Sep 30 13:07:47 crc kubenswrapper[4672]: I0930 13:07:47.771448 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"05103f88a869bfc27783739da22c414893fc1bbc58d5448d56c1048686205f75"} err="failed to get container status \"05103f88a869bfc27783739da22c414893fc1bbc58d5448d56c1048686205f75\": rpc error: code = NotFound desc = could not find container \"05103f88a869bfc27783739da22c414893fc1bbc58d5448d56c1048686205f75\": container with ID starting with 05103f88a869bfc27783739da22c414893fc1bbc58d5448d56c1048686205f75 not found: ID does not exist" Sep 30 13:07:47 crc kubenswrapper[4672]: I0930 13:07:47.771463 4672 scope.go:117] "RemoveContainer" containerID="376e1ee15fe5e6e8cca65e8e04d89b75912c7a407a281897aa083d253ff44639" Sep 30 13:07:47 crc kubenswrapper[4672]: E0930 13:07:47.771848 4672 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"376e1ee15fe5e6e8cca65e8e04d89b75912c7a407a281897aa083d253ff44639\": container with ID starting with 376e1ee15fe5e6e8cca65e8e04d89b75912c7a407a281897aa083d253ff44639 not found: ID does not exist" containerID="376e1ee15fe5e6e8cca65e8e04d89b75912c7a407a281897aa083d253ff44639" Sep 30 13:07:47 crc kubenswrapper[4672]: I0930 13:07:47.771873 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"376e1ee15fe5e6e8cca65e8e04d89b75912c7a407a281897aa083d253ff44639"} err="failed to get container status \"376e1ee15fe5e6e8cca65e8e04d89b75912c7a407a281897aa083d253ff44639\": rpc error: code = NotFound desc = could not find container \"376e1ee15fe5e6e8cca65e8e04d89b75912c7a407a281897aa083d253ff44639\": container with ID starting with 376e1ee15fe5e6e8cca65e8e04d89b75912c7a407a281897aa083d253ff44639 not found: ID does not exist" Sep 30 13:07:49 crc kubenswrapper[4672]: I0930 13:07:49.432340 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0f8c57e9-cdd1-4ae4-afb4-4664662587f1" path="/var/lib/kubelet/pods/0f8c57e9-cdd1-4ae4-afb4-4664662587f1/volumes" Sep 30 13:07:59 crc kubenswrapper[4672]: I0930 13:07:59.383219 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-cvvgl"] Sep 30 13:07:59 crc kubenswrapper[4672]: E0930 13:07:59.384820 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f8c57e9-cdd1-4ae4-afb4-4664662587f1" containerName="registry-server" Sep 30 13:07:59 crc kubenswrapper[4672]: I0930 13:07:59.384848 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f8c57e9-cdd1-4ae4-afb4-4664662587f1" containerName="registry-server" Sep 30 13:07:59 crc kubenswrapper[4672]: E0930 13:07:59.384872 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f8c57e9-cdd1-4ae4-afb4-4664662587f1" containerName="extract-content" Sep 30 13:07:59 crc kubenswrapper[4672]: I0930 13:07:59.384885 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f8c57e9-cdd1-4ae4-afb4-4664662587f1" containerName="extract-content" Sep 30 13:07:59 crc kubenswrapper[4672]: E0930 13:07:59.384910 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1966e8b-c91e-4a95-b4ee-4590db07ae16" containerName="registry-server" Sep 30 13:07:59 crc kubenswrapper[4672]: I0930 13:07:59.384924 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1966e8b-c91e-4a95-b4ee-4590db07ae16" containerName="registry-server" Sep 30 13:07:59 crc kubenswrapper[4672]: E0930 13:07:59.384941 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f8c57e9-cdd1-4ae4-afb4-4664662587f1" containerName="extract-utilities" Sep 30 13:07:59 crc kubenswrapper[4672]: I0930 13:07:59.384954 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f8c57e9-cdd1-4ae4-afb4-4664662587f1" containerName="extract-utilities" Sep 30 13:07:59 crc kubenswrapper[4672]: E0930 13:07:59.384977 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1966e8b-c91e-4a95-b4ee-4590db07ae16" containerName="extract-utilities" Sep 30 13:07:59 crc kubenswrapper[4672]: I0930 13:07:59.384991 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1966e8b-c91e-4a95-b4ee-4590db07ae16" containerName="extract-utilities" Sep 30 13:07:59 crc kubenswrapper[4672]: E0930 13:07:59.385039 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1966e8b-c91e-4a95-b4ee-4590db07ae16" containerName="extract-content" Sep 
30 13:07:59 crc kubenswrapper[4672]: I0930 13:07:59.385053 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1966e8b-c91e-4a95-b4ee-4590db07ae16" containerName="extract-content" Sep 30 13:07:59 crc kubenswrapper[4672]: I0930 13:07:59.385485 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f8c57e9-cdd1-4ae4-afb4-4664662587f1" containerName="registry-server" Sep 30 13:07:59 crc kubenswrapper[4672]: I0930 13:07:59.385521 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1966e8b-c91e-4a95-b4ee-4590db07ae16" containerName="registry-server" Sep 30 13:07:59 crc kubenswrapper[4672]: I0930 13:07:59.388517 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cvvgl" Sep 30 13:07:59 crc kubenswrapper[4672]: I0930 13:07:59.397089 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-cvvgl"] Sep 30 13:07:59 crc kubenswrapper[4672]: I0930 13:07:59.471602 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dwl6d\" (UniqueName: \"kubernetes.io/projected/dffe357c-50c0-441b-8ea1-62e351fb9571-kube-api-access-dwl6d\") pod \"redhat-marketplace-cvvgl\" (UID: \"dffe357c-50c0-441b-8ea1-62e351fb9571\") " pod="openshift-marketplace/redhat-marketplace-cvvgl" Sep 30 13:07:59 crc kubenswrapper[4672]: I0930 13:07:59.471730 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dffe357c-50c0-441b-8ea1-62e351fb9571-utilities\") pod \"redhat-marketplace-cvvgl\" (UID: \"dffe357c-50c0-441b-8ea1-62e351fb9571\") " pod="openshift-marketplace/redhat-marketplace-cvvgl" Sep 30 13:07:59 crc kubenswrapper[4672]: I0930 13:07:59.471780 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dffe357c-50c0-441b-8ea1-62e351fb9571-catalog-content\") pod \"redhat-marketplace-cvvgl\" (UID: \"dffe357c-50c0-441b-8ea1-62e351fb9571\") " pod="openshift-marketplace/redhat-marketplace-cvvgl" Sep 30 13:07:59 crc kubenswrapper[4672]: I0930 13:07:59.573859 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dffe357c-50c0-441b-8ea1-62e351fb9571-utilities\") pod \"redhat-marketplace-cvvgl\" (UID: \"dffe357c-50c0-441b-8ea1-62e351fb9571\") " pod="openshift-marketplace/redhat-marketplace-cvvgl" Sep 30 13:07:59 crc kubenswrapper[4672]: I0930 13:07:59.574294 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dffe357c-50c0-441b-8ea1-62e351fb9571-catalog-content\") pod \"redhat-marketplace-cvvgl\" (UID: \"dffe357c-50c0-441b-8ea1-62e351fb9571\") " pod="openshift-marketplace/redhat-marketplace-cvvgl" Sep 30 13:07:59 crc kubenswrapper[4672]: I0930 13:07:59.574400 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dffe357c-50c0-441b-8ea1-62e351fb9571-utilities\") pod \"redhat-marketplace-cvvgl\" (UID: \"dffe357c-50c0-441b-8ea1-62e351fb9571\") " pod="openshift-marketplace/redhat-marketplace-cvvgl" Sep 30 13:07:59 crc kubenswrapper[4672]: I0930 13:07:59.574497 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dwl6d\" (UniqueName: 
\"kubernetes.io/projected/dffe357c-50c0-441b-8ea1-62e351fb9571-kube-api-access-dwl6d\") pod \"redhat-marketplace-cvvgl\" (UID: \"dffe357c-50c0-441b-8ea1-62e351fb9571\") " pod="openshift-marketplace/redhat-marketplace-cvvgl" Sep 30 13:07:59 crc kubenswrapper[4672]: I0930 13:07:59.574860 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dffe357c-50c0-441b-8ea1-62e351fb9571-catalog-content\") pod \"redhat-marketplace-cvvgl\" (UID: \"dffe357c-50c0-441b-8ea1-62e351fb9571\") " pod="openshift-marketplace/redhat-marketplace-cvvgl" Sep 30 13:07:59 crc kubenswrapper[4672]: I0930 13:07:59.603538 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dwl6d\" (UniqueName: \"kubernetes.io/projected/dffe357c-50c0-441b-8ea1-62e351fb9571-kube-api-access-dwl6d\") pod \"redhat-marketplace-cvvgl\" (UID: \"dffe357c-50c0-441b-8ea1-62e351fb9571\") " pod="openshift-marketplace/redhat-marketplace-cvvgl" Sep 30 13:07:59 crc kubenswrapper[4672]: I0930 13:07:59.727110 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cvvgl" Sep 30 13:08:00 crc kubenswrapper[4672]: I0930 13:08:00.218288 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-cvvgl"] Sep 30 13:08:00 crc kubenswrapper[4672]: I0930 13:08:00.819813 4672 generic.go:334] "Generic (PLEG): container finished" podID="dffe357c-50c0-441b-8ea1-62e351fb9571" containerID="a4dff6403248f4b81e0e6482e51553fcdae4565471f0ada1e044798c529a7527" exitCode=0 Sep 30 13:08:00 crc kubenswrapper[4672]: I0930 13:08:00.820048 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cvvgl" event={"ID":"dffe357c-50c0-441b-8ea1-62e351fb9571","Type":"ContainerDied","Data":"a4dff6403248f4b81e0e6482e51553fcdae4565471f0ada1e044798c529a7527"} Sep 30 13:08:00 crc kubenswrapper[4672]: I0930 13:08:00.820145 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cvvgl" event={"ID":"dffe357c-50c0-441b-8ea1-62e351fb9571","Type":"ContainerStarted","Data":"d6c73424f9d7c67dea030ebc3fc37d6d70355f6f088d1a5c59dc1619c2bf72f7"} Sep 30 13:08:01 crc kubenswrapper[4672]: I0930 13:08:01.830641 4672 generic.go:334] "Generic (PLEG): container finished" podID="dffe357c-50c0-441b-8ea1-62e351fb9571" containerID="e63a5e70f683da3adab2d60a26ddd739a643646020a6c490512f34829d63ad9a" exitCode=0 Sep 30 13:08:01 crc kubenswrapper[4672]: I0930 13:08:01.830727 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cvvgl" event={"ID":"dffe357c-50c0-441b-8ea1-62e351fb9571","Type":"ContainerDied","Data":"e63a5e70f683da3adab2d60a26ddd739a643646020a6c490512f34829d63ad9a"} Sep 30 13:08:02 crc kubenswrapper[4672]: I0930 13:08:02.848830 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cvvgl" event={"ID":"dffe357c-50c0-441b-8ea1-62e351fb9571","Type":"ContainerStarted","Data":"4e5e6583314cda2aa816d9721bb25c7b02d6d0683449a734c2e7d1d0a8853457"} Sep 30 13:08:02 crc kubenswrapper[4672]: I0930 13:08:02.873693 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-cvvgl" podStartSLOduration=2.400363649 podStartE2EDuration="3.873677304s" podCreationTimestamp="2025-09-30 13:07:59 +0000 UTC" firstStartedPulling="2025-09-30 13:08:00.821876962 +0000 UTC 
m=+2772.091114608" lastFinishedPulling="2025-09-30 13:08:02.295190617 +0000 UTC m=+2773.564428263" observedRunningTime="2025-09-30 13:08:02.869525737 +0000 UTC m=+2774.138763393" watchObservedRunningTime="2025-09-30 13:08:02.873677304 +0000 UTC m=+2774.142914940" Sep 30 13:08:09 crc kubenswrapper[4672]: I0930 13:08:09.728028 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-cvvgl" Sep 30 13:08:09 crc kubenswrapper[4672]: I0930 13:08:09.728616 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-cvvgl" Sep 30 13:08:09 crc kubenswrapper[4672]: I0930 13:08:09.775343 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-cvvgl" Sep 30 13:08:09 crc kubenswrapper[4672]: I0930 13:08:09.987869 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-cvvgl" Sep 30 13:08:10 crc kubenswrapper[4672]: I0930 13:08:10.047080 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-cvvgl"] Sep 30 13:08:11 crc kubenswrapper[4672]: I0930 13:08:11.959124 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-cvvgl" podUID="dffe357c-50c0-441b-8ea1-62e351fb9571" containerName="registry-server" containerID="cri-o://4e5e6583314cda2aa816d9721bb25c7b02d6d0683449a734c2e7d1d0a8853457" gracePeriod=2 Sep 30 13:08:12 crc kubenswrapper[4672]: I0930 13:08:12.494339 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cvvgl" Sep 30 13:08:12 crc kubenswrapper[4672]: I0930 13:08:12.639516 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dwl6d\" (UniqueName: \"kubernetes.io/projected/dffe357c-50c0-441b-8ea1-62e351fb9571-kube-api-access-dwl6d\") pod \"dffe357c-50c0-441b-8ea1-62e351fb9571\" (UID: \"dffe357c-50c0-441b-8ea1-62e351fb9571\") " Sep 30 13:08:12 crc kubenswrapper[4672]: I0930 13:08:12.639575 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dffe357c-50c0-441b-8ea1-62e351fb9571-catalog-content\") pod \"dffe357c-50c0-441b-8ea1-62e351fb9571\" (UID: \"dffe357c-50c0-441b-8ea1-62e351fb9571\") " Sep 30 13:08:12 crc kubenswrapper[4672]: I0930 13:08:12.639685 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dffe357c-50c0-441b-8ea1-62e351fb9571-utilities\") pod \"dffe357c-50c0-441b-8ea1-62e351fb9571\" (UID: \"dffe357c-50c0-441b-8ea1-62e351fb9571\") " Sep 30 13:08:12 crc kubenswrapper[4672]: I0930 13:08:12.640931 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dffe357c-50c0-441b-8ea1-62e351fb9571-utilities" (OuterVolumeSpecName: "utilities") pod "dffe357c-50c0-441b-8ea1-62e351fb9571" (UID: "dffe357c-50c0-441b-8ea1-62e351fb9571"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 13:08:12 crc kubenswrapper[4672]: I0930 13:08:12.653648 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dffe357c-50c0-441b-8ea1-62e351fb9571-kube-api-access-dwl6d" (OuterVolumeSpecName: "kube-api-access-dwl6d") pod "dffe357c-50c0-441b-8ea1-62e351fb9571" (UID: "dffe357c-50c0-441b-8ea1-62e351fb9571"). InnerVolumeSpecName "kube-api-access-dwl6d". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:08:12 crc kubenswrapper[4672]: I0930 13:08:12.655315 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dffe357c-50c0-441b-8ea1-62e351fb9571-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "dffe357c-50c0-441b-8ea1-62e351fb9571" (UID: "dffe357c-50c0-441b-8ea1-62e351fb9571"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 13:08:12 crc kubenswrapper[4672]: I0930 13:08:12.742649 4672 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dffe357c-50c0-441b-8ea1-62e351fb9571-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 13:08:12 crc kubenswrapper[4672]: I0930 13:08:12.742693 4672 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dffe357c-50c0-441b-8ea1-62e351fb9571-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 13:08:12 crc kubenswrapper[4672]: I0930 13:08:12.742706 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dwl6d\" (UniqueName: \"kubernetes.io/projected/dffe357c-50c0-441b-8ea1-62e351fb9571-kube-api-access-dwl6d\") on node \"crc\" DevicePath \"\"" Sep 30 13:08:12 crc kubenswrapper[4672]: I0930 13:08:12.972738 4672 generic.go:334] "Generic (PLEG): container finished" podID="dffe357c-50c0-441b-8ea1-62e351fb9571" containerID="4e5e6583314cda2aa816d9721bb25c7b02d6d0683449a734c2e7d1d0a8853457" exitCode=0 Sep 30 13:08:12 crc kubenswrapper[4672]: I0930 13:08:12.972819 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cvvgl" Sep 30 13:08:12 crc kubenswrapper[4672]: I0930 13:08:12.972816 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cvvgl" event={"ID":"dffe357c-50c0-441b-8ea1-62e351fb9571","Type":"ContainerDied","Data":"4e5e6583314cda2aa816d9721bb25c7b02d6d0683449a734c2e7d1d0a8853457"} Sep 30 13:08:12 crc kubenswrapper[4672]: I0930 13:08:12.972974 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cvvgl" event={"ID":"dffe357c-50c0-441b-8ea1-62e351fb9571","Type":"ContainerDied","Data":"d6c73424f9d7c67dea030ebc3fc37d6d70355f6f088d1a5c59dc1619c2bf72f7"} Sep 30 13:08:12 crc kubenswrapper[4672]: I0930 13:08:12.973000 4672 scope.go:117] "RemoveContainer" containerID="4e5e6583314cda2aa816d9721bb25c7b02d6d0683449a734c2e7d1d0a8853457" Sep 30 13:08:13 crc kubenswrapper[4672]: I0930 13:08:13.013602 4672 scope.go:117] "RemoveContainer" containerID="e63a5e70f683da3adab2d60a26ddd739a643646020a6c490512f34829d63ad9a" Sep 30 13:08:13 crc kubenswrapper[4672]: I0930 13:08:13.017892 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-cvvgl"] Sep 30 13:08:13 crc kubenswrapper[4672]: I0930 13:08:13.029331 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-cvvgl"] Sep 30 13:08:13 crc kubenswrapper[4672]: I0930 13:08:13.038454 4672 scope.go:117] "RemoveContainer" containerID="a4dff6403248f4b81e0e6482e51553fcdae4565471f0ada1e044798c529a7527" Sep 30 13:08:13 crc kubenswrapper[4672]: I0930 13:08:13.091525 4672 scope.go:117] "RemoveContainer" containerID="4e5e6583314cda2aa816d9721bb25c7b02d6d0683449a734c2e7d1d0a8853457" Sep 30 13:08:13 crc kubenswrapper[4672]: E0930 13:08:13.092100 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4e5e6583314cda2aa816d9721bb25c7b02d6d0683449a734c2e7d1d0a8853457\": container with ID starting with 4e5e6583314cda2aa816d9721bb25c7b02d6d0683449a734c2e7d1d0a8853457 not found: ID does not exist" containerID="4e5e6583314cda2aa816d9721bb25c7b02d6d0683449a734c2e7d1d0a8853457" Sep 30 13:08:13 crc kubenswrapper[4672]: I0930 13:08:13.092148 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e5e6583314cda2aa816d9721bb25c7b02d6d0683449a734c2e7d1d0a8853457"} err="failed to get container status \"4e5e6583314cda2aa816d9721bb25c7b02d6d0683449a734c2e7d1d0a8853457\": rpc error: code = NotFound desc = could not find container \"4e5e6583314cda2aa816d9721bb25c7b02d6d0683449a734c2e7d1d0a8853457\": container with ID starting with 4e5e6583314cda2aa816d9721bb25c7b02d6d0683449a734c2e7d1d0a8853457 not found: ID does not exist" Sep 30 13:08:13 crc kubenswrapper[4672]: I0930 13:08:13.092175 4672 scope.go:117] "RemoveContainer" containerID="e63a5e70f683da3adab2d60a26ddd739a643646020a6c490512f34829d63ad9a" Sep 30 13:08:13 crc kubenswrapper[4672]: E0930 13:08:13.092654 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e63a5e70f683da3adab2d60a26ddd739a643646020a6c490512f34829d63ad9a\": container with ID starting with e63a5e70f683da3adab2d60a26ddd739a643646020a6c490512f34829d63ad9a not found: ID does not exist" containerID="e63a5e70f683da3adab2d60a26ddd739a643646020a6c490512f34829d63ad9a" Sep 30 13:08:13 crc kubenswrapper[4672]: I0930 13:08:13.092688 4672 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e63a5e70f683da3adab2d60a26ddd739a643646020a6c490512f34829d63ad9a"} err="failed to get container status \"e63a5e70f683da3adab2d60a26ddd739a643646020a6c490512f34829d63ad9a\": rpc error: code = NotFound desc = could not find container \"e63a5e70f683da3adab2d60a26ddd739a643646020a6c490512f34829d63ad9a\": container with ID starting with e63a5e70f683da3adab2d60a26ddd739a643646020a6c490512f34829d63ad9a not found: ID does not exist" Sep 30 13:08:13 crc kubenswrapper[4672]: I0930 13:08:13.092710 4672 scope.go:117] "RemoveContainer" containerID="a4dff6403248f4b81e0e6482e51553fcdae4565471f0ada1e044798c529a7527" Sep 30 13:08:13 crc kubenswrapper[4672]: E0930 13:08:13.093000 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a4dff6403248f4b81e0e6482e51553fcdae4565471f0ada1e044798c529a7527\": container with ID starting with a4dff6403248f4b81e0e6482e51553fcdae4565471f0ada1e044798c529a7527 not found: ID does not exist" containerID="a4dff6403248f4b81e0e6482e51553fcdae4565471f0ada1e044798c529a7527" Sep 30 13:08:13 crc kubenswrapper[4672]: I0930 13:08:13.093035 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a4dff6403248f4b81e0e6482e51553fcdae4565471f0ada1e044798c529a7527"} err="failed to get container status \"a4dff6403248f4b81e0e6482e51553fcdae4565471f0ada1e044798c529a7527\": rpc error: code = NotFound desc = could not find container \"a4dff6403248f4b81e0e6482e51553fcdae4565471f0ada1e044798c529a7527\": container with ID starting with a4dff6403248f4b81e0e6482e51553fcdae4565471f0ada1e044798c529a7527 not found: ID does not exist" Sep 30 13:08:13 crc kubenswrapper[4672]: I0930 13:08:13.428056 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dffe357c-50c0-441b-8ea1-62e351fb9571" path="/var/lib/kubelet/pods/dffe357c-50c0-441b-8ea1-62e351fb9571/volumes" Sep 30 13:08:54 crc kubenswrapper[4672]: E0930 13:08:54.386241 4672 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod38c0d8da_6872_4108_aaa8_1b8fa2611fe5.slice/crio-conmon-da67a88449210ca6dc8bb93516c2059cfe1d99949d9f047883267c53f936bfec.scope\": RecentStats: unable to find data in memory cache]" Sep 30 13:08:54 crc kubenswrapper[4672]: I0930 13:08:54.408234 4672 generic.go:334] "Generic (PLEG): container finished" podID="38c0d8da-6872-4108-aaa8-1b8fa2611fe5" containerID="da67a88449210ca6dc8bb93516c2059cfe1d99949d9f047883267c53f936bfec" exitCode=0 Sep 30 13:08:54 crc kubenswrapper[4672]: I0930 13:08:54.408289 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-kqdc5" event={"ID":"38c0d8da-6872-4108-aaa8-1b8fa2611fe5","Type":"ContainerDied","Data":"da67a88449210ca6dc8bb93516c2059cfe1d99949d9f047883267c53f936bfec"} Sep 30 13:08:54 crc kubenswrapper[4672]: I0930 13:08:54.739912 4672 patch_prober.go:28] interesting pod/machine-config-daemon-dpqrd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 13:08:54 crc kubenswrapper[4672]: I0930 13:08:54.740017 4672 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 13:08:55 crc kubenswrapper[4672]: I0930 13:08:55.848973 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-kqdc5" Sep 30 13:08:55 crc kubenswrapper[4672]: I0930 13:08:55.995568 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/38c0d8da-6872-4108-aaa8-1b8fa2611fe5-inventory\") pod \"38c0d8da-6872-4108-aaa8-1b8fa2611fe5\" (UID: \"38c0d8da-6872-4108-aaa8-1b8fa2611fe5\") " Sep 30 13:08:55 crc kubenswrapper[4672]: I0930 13:08:55.995647 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/38c0d8da-6872-4108-aaa8-1b8fa2611fe5-ssh-key\") pod \"38c0d8da-6872-4108-aaa8-1b8fa2611fe5\" (UID: \"38c0d8da-6872-4108-aaa8-1b8fa2611fe5\") " Sep 30 13:08:55 crc kubenswrapper[4672]: I0930 13:08:55.995703 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/38c0d8da-6872-4108-aaa8-1b8fa2611fe5-ceilometer-compute-config-data-2\") pod \"38c0d8da-6872-4108-aaa8-1b8fa2611fe5\" (UID: \"38c0d8da-6872-4108-aaa8-1b8fa2611fe5\") " Sep 30 13:08:55 crc kubenswrapper[4672]: I0930 13:08:55.995747 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/38c0d8da-6872-4108-aaa8-1b8fa2611fe5-ceilometer-compute-config-data-0\") pod \"38c0d8da-6872-4108-aaa8-1b8fa2611fe5\" (UID: \"38c0d8da-6872-4108-aaa8-1b8fa2611fe5\") " Sep 30 13:08:55 crc kubenswrapper[4672]: I0930 13:08:55.995802 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2l9hq\" (UniqueName: \"kubernetes.io/projected/38c0d8da-6872-4108-aaa8-1b8fa2611fe5-kube-api-access-2l9hq\") pod \"38c0d8da-6872-4108-aaa8-1b8fa2611fe5\" (UID: \"38c0d8da-6872-4108-aaa8-1b8fa2611fe5\") " Sep 30 13:08:55 crc kubenswrapper[4672]: I0930 13:08:55.995828 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38c0d8da-6872-4108-aaa8-1b8fa2611fe5-telemetry-combined-ca-bundle\") pod \"38c0d8da-6872-4108-aaa8-1b8fa2611fe5\" (UID: \"38c0d8da-6872-4108-aaa8-1b8fa2611fe5\") " Sep 30 13:08:55 crc kubenswrapper[4672]: I0930 13:08:55.995951 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/38c0d8da-6872-4108-aaa8-1b8fa2611fe5-ceilometer-compute-config-data-1\") pod \"38c0d8da-6872-4108-aaa8-1b8fa2611fe5\" (UID: \"38c0d8da-6872-4108-aaa8-1b8fa2611fe5\") " Sep 30 13:08:56 crc kubenswrapper[4672]: I0930 13:08:56.005038 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38c0d8da-6872-4108-aaa8-1b8fa2611fe5-kube-api-access-2l9hq" (OuterVolumeSpecName: "kube-api-access-2l9hq") pod "38c0d8da-6872-4108-aaa8-1b8fa2611fe5" (UID: "38c0d8da-6872-4108-aaa8-1b8fa2611fe5"). InnerVolumeSpecName "kube-api-access-2l9hq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:08:56 crc kubenswrapper[4672]: I0930 13:08:56.010580 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38c0d8da-6872-4108-aaa8-1b8fa2611fe5-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "38c0d8da-6872-4108-aaa8-1b8fa2611fe5" (UID: "38c0d8da-6872-4108-aaa8-1b8fa2611fe5"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:08:56 crc kubenswrapper[4672]: I0930 13:08:56.029819 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38c0d8da-6872-4108-aaa8-1b8fa2611fe5-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "38c0d8da-6872-4108-aaa8-1b8fa2611fe5" (UID: "38c0d8da-6872-4108-aaa8-1b8fa2611fe5"). InnerVolumeSpecName "ceilometer-compute-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:08:56 crc kubenswrapper[4672]: I0930 13:08:56.029854 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38c0d8da-6872-4108-aaa8-1b8fa2611fe5-inventory" (OuterVolumeSpecName: "inventory") pod "38c0d8da-6872-4108-aaa8-1b8fa2611fe5" (UID: "38c0d8da-6872-4108-aaa8-1b8fa2611fe5"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:08:56 crc kubenswrapper[4672]: I0930 13:08:56.036812 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38c0d8da-6872-4108-aaa8-1b8fa2611fe5-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "38c0d8da-6872-4108-aaa8-1b8fa2611fe5" (UID: "38c0d8da-6872-4108-aaa8-1b8fa2611fe5"). InnerVolumeSpecName "ceilometer-compute-config-data-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:08:56 crc kubenswrapper[4672]: I0930 13:08:56.037490 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38c0d8da-6872-4108-aaa8-1b8fa2611fe5-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "38c0d8da-6872-4108-aaa8-1b8fa2611fe5" (UID: "38c0d8da-6872-4108-aaa8-1b8fa2611fe5"). InnerVolumeSpecName "ceilometer-compute-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:08:56 crc kubenswrapper[4672]: I0930 13:08:56.039463 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38c0d8da-6872-4108-aaa8-1b8fa2611fe5-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "38c0d8da-6872-4108-aaa8-1b8fa2611fe5" (UID: "38c0d8da-6872-4108-aaa8-1b8fa2611fe5"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:08:56 crc kubenswrapper[4672]: I0930 13:08:56.098848 4672 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/38c0d8da-6872-4108-aaa8-1b8fa2611fe5-ssh-key\") on node \"crc\" DevicePath \"\"" Sep 30 13:08:56 crc kubenswrapper[4672]: I0930 13:08:56.099185 4672 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/38c0d8da-6872-4108-aaa8-1b8fa2611fe5-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\"" Sep 30 13:08:56 crc kubenswrapper[4672]: I0930 13:08:56.099343 4672 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/38c0d8da-6872-4108-aaa8-1b8fa2611fe5-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Sep 30 13:08:56 crc kubenswrapper[4672]: I0930 13:08:56.099455 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2l9hq\" (UniqueName: \"kubernetes.io/projected/38c0d8da-6872-4108-aaa8-1b8fa2611fe5-kube-api-access-2l9hq\") on node \"crc\" DevicePath \"\"" Sep 30 13:08:56 crc kubenswrapper[4672]: I0930 13:08:56.099533 4672 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38c0d8da-6872-4108-aaa8-1b8fa2611fe5-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 13:08:56 crc kubenswrapper[4672]: I0930 13:08:56.099631 4672 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/38c0d8da-6872-4108-aaa8-1b8fa2611fe5-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Sep 30 13:08:56 crc kubenswrapper[4672]: I0930 13:08:56.099714 4672 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/38c0d8da-6872-4108-aaa8-1b8fa2611fe5-inventory\") on node \"crc\" DevicePath \"\"" Sep 30 13:08:56 crc kubenswrapper[4672]: I0930 13:08:56.429243 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-kqdc5" Sep 30 13:08:56 crc kubenswrapper[4672]: I0930 13:08:56.429240 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-kqdc5" event={"ID":"38c0d8da-6872-4108-aaa8-1b8fa2611fe5","Type":"ContainerDied","Data":"cac3ee5646bad80c82789f97e08c990ee606ce82f1397811fd655be71752f93c"} Sep 30 13:08:56 crc kubenswrapper[4672]: I0930 13:08:56.429471 4672 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cac3ee5646bad80c82789f97e08c990ee606ce82f1397811fd655be71752f93c" Sep 30 13:09:24 crc kubenswrapper[4672]: I0930 13:09:24.739625 4672 patch_prober.go:28] interesting pod/machine-config-daemon-dpqrd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 13:09:24 crc kubenswrapper[4672]: I0930 13:09:24.740210 4672 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 13:09:34 crc kubenswrapper[4672]: I0930 13:09:34.908757 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Sep 30 13:09:34 crc kubenswrapper[4672]: I0930 13:09:34.909553 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="cf82624d-33e5-4298-8cd9-53ef50e87f12" containerName="prometheus" containerID="cri-o://1a79e4562c4e332c07785fa3a46d2c31bee965d0a2cf8e35e3549ba2e4379334" gracePeriod=600 Sep 30 13:09:34 crc kubenswrapper[4672]: I0930 13:09:34.909596 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="cf82624d-33e5-4298-8cd9-53ef50e87f12" containerName="thanos-sidecar" containerID="cri-o://fbb61f2821a324820bd79c008b10693dcee866e69aff58ca9caf772f6a483ae4" gracePeriod=600 Sep 30 13:09:34 crc kubenswrapper[4672]: I0930 13:09:34.909658 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="cf82624d-33e5-4298-8cd9-53ef50e87f12" containerName="config-reloader" containerID="cri-o://47aaeacfe5c09a9093d0d2b569d0c92744a18d97edfb1848c3bafa3e6462e1ef" gracePeriod=600 Sep 30 13:09:35 crc kubenswrapper[4672]: E0930 13:09:35.475630 4672 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcf82624d_33e5_4298_8cd9_53ef50e87f12.slice/crio-conmon-47aaeacfe5c09a9093d0d2b569d0c92744a18d97edfb1848c3bafa3e6462e1ef.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcf82624d_33e5_4298_8cd9_53ef50e87f12.slice/crio-47aaeacfe5c09a9093d0d2b569d0c92744a18d97edfb1848c3bafa3e6462e1ef.scope\": RecentStats: unable to find data in memory cache]" Sep 30 13:09:35 crc kubenswrapper[4672]: I0930 13:09:35.864328 4672 generic.go:334] "Generic (PLEG): container finished" podID="cf82624d-33e5-4298-8cd9-53ef50e87f12" containerID="fbb61f2821a324820bd79c008b10693dcee866e69aff58ca9caf772f6a483ae4" exitCode=0 
Sep 30 13:09:35 crc kubenswrapper[4672]: I0930 13:09:35.864676 4672 generic.go:334] "Generic (PLEG): container finished" podID="cf82624d-33e5-4298-8cd9-53ef50e87f12" containerID="47aaeacfe5c09a9093d0d2b569d0c92744a18d97edfb1848c3bafa3e6462e1ef" exitCode=0 Sep 30 13:09:35 crc kubenswrapper[4672]: I0930 13:09:35.864685 4672 generic.go:334] "Generic (PLEG): container finished" podID="cf82624d-33e5-4298-8cd9-53ef50e87f12" containerID="1a79e4562c4e332c07785fa3a46d2c31bee965d0a2cf8e35e3549ba2e4379334" exitCode=0 Sep 30 13:09:35 crc kubenswrapper[4672]: I0930 13:09:35.864705 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"cf82624d-33e5-4298-8cd9-53ef50e87f12","Type":"ContainerDied","Data":"fbb61f2821a324820bd79c008b10693dcee866e69aff58ca9caf772f6a483ae4"} Sep 30 13:09:35 crc kubenswrapper[4672]: I0930 13:09:35.864731 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"cf82624d-33e5-4298-8cd9-53ef50e87f12","Type":"ContainerDied","Data":"47aaeacfe5c09a9093d0d2b569d0c92744a18d97edfb1848c3bafa3e6462e1ef"} Sep 30 13:09:35 crc kubenswrapper[4672]: I0930 13:09:35.864743 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"cf82624d-33e5-4298-8cd9-53ef50e87f12","Type":"ContainerDied","Data":"1a79e4562c4e332c07785fa3a46d2c31bee965d0a2cf8e35e3549ba2e4379334"} Sep 30 13:09:35 crc kubenswrapper[4672]: I0930 13:09:35.864752 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"cf82624d-33e5-4298-8cd9-53ef50e87f12","Type":"ContainerDied","Data":"b32e02bce319b498b182bb6caa3a71d8c75d043cdbfa2f9863d6d1eaa1803627"} Sep 30 13:09:35 crc kubenswrapper[4672]: I0930 13:09:35.864761 4672 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b32e02bce319b498b182bb6caa3a71d8c75d043cdbfa2f9863d6d1eaa1803627" Sep 30 13:09:35 crc kubenswrapper[4672]: I0930 13:09:35.905728 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Sep 30 13:09:36 crc kubenswrapper[4672]: I0930 13:09:36.081896 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p9x6r\" (UniqueName: \"kubernetes.io/projected/cf82624d-33e5-4298-8cd9-53ef50e87f12-kube-api-access-p9x6r\") pod \"cf82624d-33e5-4298-8cd9-53ef50e87f12\" (UID: \"cf82624d-33e5-4298-8cd9-53ef50e87f12\") " Sep 30 13:09:36 crc kubenswrapper[4672]: I0930 13:09:36.082087 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf82624d-33e5-4298-8cd9-53ef50e87f12-secret-combined-ca-bundle\") pod \"cf82624d-33e5-4298-8cd9-53ef50e87f12\" (UID: \"cf82624d-33e5-4298-8cd9-53ef50e87f12\") " Sep 30 13:09:36 crc kubenswrapper[4672]: I0930 13:09:36.082123 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/cf82624d-33e5-4298-8cd9-53ef50e87f12-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"cf82624d-33e5-4298-8cd9-53ef50e87f12\" (UID: \"cf82624d-33e5-4298-8cd9-53ef50e87f12\") " Sep 30 13:09:36 crc kubenswrapper[4672]: I0930 13:09:36.082176 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/cf82624d-33e5-4298-8cd9-53ef50e87f12-config\") pod \"cf82624d-33e5-4298-8cd9-53ef50e87f12\" (UID: \"cf82624d-33e5-4298-8cd9-53ef50e87f12\") " Sep 30 13:09:36 crc kubenswrapper[4672]: I0930 13:09:36.082206 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/cf82624d-33e5-4298-8cd9-53ef50e87f12-web-config\") pod \"cf82624d-33e5-4298-8cd9-53ef50e87f12\" (UID: \"cf82624d-33e5-4298-8cd9-53ef50e87f12\") " Sep 30 13:09:36 crc kubenswrapper[4672]: I0930 13:09:36.082242 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/cf82624d-33e5-4298-8cd9-53ef50e87f12-thanos-prometheus-http-client-file\") pod \"cf82624d-33e5-4298-8cd9-53ef50e87f12\" (UID: \"cf82624d-33e5-4298-8cd9-53ef50e87f12\") " Sep 30 13:09:36 crc kubenswrapper[4672]: I0930 13:09:36.082331 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/cf82624d-33e5-4298-8cd9-53ef50e87f12-tls-assets\") pod \"cf82624d-33e5-4298-8cd9-53ef50e87f12\" (UID: \"cf82624d-33e5-4298-8cd9-53ef50e87f12\") " Sep 30 13:09:36 crc kubenswrapper[4672]: I0930 13:09:36.082385 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/cf82624d-33e5-4298-8cd9-53ef50e87f12-config-out\") pod \"cf82624d-33e5-4298-8cd9-53ef50e87f12\" (UID: \"cf82624d-33e5-4298-8cd9-53ef50e87f12\") " Sep 30 13:09:36 crc kubenswrapper[4672]: I0930 13:09:36.082544 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-db\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-590f2ab7-9519-4427-af7c-904176dbd8cb\") pod \"cf82624d-33e5-4298-8cd9-53ef50e87f12\" (UID: \"cf82624d-33e5-4298-8cd9-53ef50e87f12\") " Sep 30 13:09:36 crc kubenswrapper[4672]: I0930 13:09:36.082884 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/cf82624d-33e5-4298-8cd9-53ef50e87f12-prometheus-metric-storage-rulefiles-0\") pod \"cf82624d-33e5-4298-8cd9-53ef50e87f12\" (UID: \"cf82624d-33e5-4298-8cd9-53ef50e87f12\") " Sep 30 13:09:36 crc kubenswrapper[4672]: I0930 13:09:36.082954 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/cf82624d-33e5-4298-8cd9-53ef50e87f12-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"cf82624d-33e5-4298-8cd9-53ef50e87f12\" (UID: \"cf82624d-33e5-4298-8cd9-53ef50e87f12\") " Sep 30 13:09:36 crc kubenswrapper[4672]: I0930 13:09:36.083436 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf82624d-33e5-4298-8cd9-53ef50e87f12-prometheus-metric-storage-rulefiles-0" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-0") pod "cf82624d-33e5-4298-8cd9-53ef50e87f12" (UID: "cf82624d-33e5-4298-8cd9-53ef50e87f12"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:09:36 crc kubenswrapper[4672]: I0930 13:09:36.089477 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf82624d-33e5-4298-8cd9-53ef50e87f12-kube-api-access-p9x6r" (OuterVolumeSpecName: "kube-api-access-p9x6r") pod "cf82624d-33e5-4298-8cd9-53ef50e87f12" (UID: "cf82624d-33e5-4298-8cd9-53ef50e87f12"). InnerVolumeSpecName "kube-api-access-p9x6r". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:09:36 crc kubenswrapper[4672]: I0930 13:09:36.089511 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf82624d-33e5-4298-8cd9-53ef50e87f12-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d" (OuterVolumeSpecName: "web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d") pod "cf82624d-33e5-4298-8cd9-53ef50e87f12" (UID: "cf82624d-33e5-4298-8cd9-53ef50e87f12"). InnerVolumeSpecName "web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:09:36 crc kubenswrapper[4672]: I0930 13:09:36.089564 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf82624d-33e5-4298-8cd9-53ef50e87f12-secret-combined-ca-bundle" (OuterVolumeSpecName: "secret-combined-ca-bundle") pod "cf82624d-33e5-4298-8cd9-53ef50e87f12" (UID: "cf82624d-33e5-4298-8cd9-53ef50e87f12"). InnerVolumeSpecName "secret-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:09:36 crc kubenswrapper[4672]: I0930 13:09:36.090003 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf82624d-33e5-4298-8cd9-53ef50e87f12-config" (OuterVolumeSpecName: "config") pod "cf82624d-33e5-4298-8cd9-53ef50e87f12" (UID: "cf82624d-33e5-4298-8cd9-53ef50e87f12"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:09:36 crc kubenswrapper[4672]: I0930 13:09:36.090079 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf82624d-33e5-4298-8cd9-53ef50e87f12-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "cf82624d-33e5-4298-8cd9-53ef50e87f12" (UID: "cf82624d-33e5-4298-8cd9-53ef50e87f12"). InnerVolumeSpecName "tls-assets". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:09:36 crc kubenswrapper[4672]: I0930 13:09:36.092518 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf82624d-33e5-4298-8cd9-53ef50e87f12-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d" (OuterVolumeSpecName: "web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d") pod "cf82624d-33e5-4298-8cd9-53ef50e87f12" (UID: "cf82624d-33e5-4298-8cd9-53ef50e87f12"). InnerVolumeSpecName "web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:09:36 crc kubenswrapper[4672]: I0930 13:09:36.093786 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cf82624d-33e5-4298-8cd9-53ef50e87f12-config-out" (OuterVolumeSpecName: "config-out") pod "cf82624d-33e5-4298-8cd9-53ef50e87f12" (UID: "cf82624d-33e5-4298-8cd9-53ef50e87f12"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 13:09:36 crc kubenswrapper[4672]: I0930 13:09:36.104537 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf82624d-33e5-4298-8cd9-53ef50e87f12-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "cf82624d-33e5-4298-8cd9-53ef50e87f12" (UID: "cf82624d-33e5-4298-8cd9-53ef50e87f12"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:09:36 crc kubenswrapper[4672]: I0930 13:09:36.112178 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-590f2ab7-9519-4427-af7c-904176dbd8cb" (OuterVolumeSpecName: "prometheus-metric-storage-db") pod "cf82624d-33e5-4298-8cd9-53ef50e87f12" (UID: "cf82624d-33e5-4298-8cd9-53ef50e87f12"). InnerVolumeSpecName "pvc-590f2ab7-9519-4427-af7c-904176dbd8cb". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Sep 30 13:09:36 crc kubenswrapper[4672]: I0930 13:09:36.184923 4672 reconciler_common.go:293] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/cf82624d-33e5-4298-8cd9-53ef50e87f12-config-out\") on node \"crc\" DevicePath \"\"" Sep 30 13:09:36 crc kubenswrapper[4672]: I0930 13:09:36.184989 4672 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-590f2ab7-9519-4427-af7c-904176dbd8cb\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-590f2ab7-9519-4427-af7c-904176dbd8cb\") on node \"crc\" " Sep 30 13:09:36 crc kubenswrapper[4672]: I0930 13:09:36.185007 4672 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/cf82624d-33e5-4298-8cd9-53ef50e87f12-prometheus-metric-storage-rulefiles-0\") on node \"crc\" DevicePath \"\"" Sep 30 13:09:36 crc kubenswrapper[4672]: I0930 13:09:36.185024 4672 reconciler_common.go:293] "Volume detached for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/cf82624d-33e5-4298-8cd9-53ef50e87f12-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") on node \"crc\" DevicePath \"\"" Sep 30 13:09:36 crc kubenswrapper[4672]: I0930 13:09:36.185039 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p9x6r\" (UniqueName: \"kubernetes.io/projected/cf82624d-33e5-4298-8cd9-53ef50e87f12-kube-api-access-p9x6r\") on node \"crc\" DevicePath \"\"" Sep 30 13:09:36 crc kubenswrapper[4672]: I0930 13:09:36.185053 4672 reconciler_common.go:293] "Volume detached for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf82624d-33e5-4298-8cd9-53ef50e87f12-secret-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 13:09:36 crc kubenswrapper[4672]: I0930 13:09:36.185066 4672 reconciler_common.go:293] "Volume detached for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/cf82624d-33e5-4298-8cd9-53ef50e87f12-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") on node \"crc\" DevicePath \"\"" Sep 30 13:09:36 crc kubenswrapper[4672]: I0930 13:09:36.185078 4672 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/cf82624d-33e5-4298-8cd9-53ef50e87f12-config\") on node \"crc\" DevicePath \"\"" Sep 30 13:09:36 crc kubenswrapper[4672]: I0930 13:09:36.185090 4672 reconciler_common.go:293] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/cf82624d-33e5-4298-8cd9-53ef50e87f12-thanos-prometheus-http-client-file\") on node \"crc\" DevicePath \"\"" Sep 30 13:09:36 crc kubenswrapper[4672]: I0930 13:09:36.185103 4672 reconciler_common.go:293] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/cf82624d-33e5-4298-8cd9-53ef50e87f12-tls-assets\") on node \"crc\" DevicePath \"\"" Sep 30 13:09:36 crc kubenswrapper[4672]: I0930 13:09:36.221496 4672 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Sep 30 13:09:36 crc kubenswrapper[4672]: I0930 13:09:36.222258 4672 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-590f2ab7-9519-4427-af7c-904176dbd8cb" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-590f2ab7-9519-4427-af7c-904176dbd8cb") on node "crc" Sep 30 13:09:36 crc kubenswrapper[4672]: I0930 13:09:36.232250 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf82624d-33e5-4298-8cd9-53ef50e87f12-web-config" (OuterVolumeSpecName: "web-config") pod "cf82624d-33e5-4298-8cd9-53ef50e87f12" (UID: "cf82624d-33e5-4298-8cd9-53ef50e87f12"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:09:36 crc kubenswrapper[4672]: I0930 13:09:36.286924 4672 reconciler_common.go:293] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/cf82624d-33e5-4298-8cd9-53ef50e87f12-web-config\") on node \"crc\" DevicePath \"\"" Sep 30 13:09:36 crc kubenswrapper[4672]: I0930 13:09:36.286963 4672 reconciler_common.go:293] "Volume detached for volume \"pvc-590f2ab7-9519-4427-af7c-904176dbd8cb\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-590f2ab7-9519-4427-af7c-904176dbd8cb\") on node \"crc\" DevicePath \"\"" Sep 30 13:09:36 crc kubenswrapper[4672]: I0930 13:09:36.872329 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Sep 30 13:09:36 crc kubenswrapper[4672]: I0930 13:09:36.908791 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Sep 30 13:09:36 crc kubenswrapper[4672]: I0930 13:09:36.919709 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/prometheus-metric-storage-0"] Sep 30 13:09:36 crc kubenswrapper[4672]: I0930 13:09:36.940316 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Sep 30 13:09:36 crc kubenswrapper[4672]: E0930 13:09:36.940786 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38c0d8da-6872-4108-aaa8-1b8fa2611fe5" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Sep 30 13:09:36 crc kubenswrapper[4672]: I0930 13:09:36.940812 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="38c0d8da-6872-4108-aaa8-1b8fa2611fe5" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Sep 30 13:09:36 crc kubenswrapper[4672]: E0930 13:09:36.940829 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dffe357c-50c0-441b-8ea1-62e351fb9571" containerName="registry-server" Sep 30 13:09:36 crc kubenswrapper[4672]: I0930 13:09:36.940837 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="dffe357c-50c0-441b-8ea1-62e351fb9571" containerName="registry-server" Sep 30 13:09:36 crc kubenswrapper[4672]: E0930 13:09:36.940867 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dffe357c-50c0-441b-8ea1-62e351fb9571" containerName="extract-content" Sep 30 13:09:36 crc kubenswrapper[4672]: I0930 13:09:36.940875 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="dffe357c-50c0-441b-8ea1-62e351fb9571" containerName="extract-content" Sep 30 13:09:36 crc kubenswrapper[4672]: E0930 13:09:36.940889 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf82624d-33e5-4298-8cd9-53ef50e87f12" containerName="config-reloader" Sep 30 13:09:36 crc kubenswrapper[4672]: I0930 13:09:36.940897 4672 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="cf82624d-33e5-4298-8cd9-53ef50e87f12" containerName="config-reloader" Sep 30 13:09:36 crc kubenswrapper[4672]: E0930 13:09:36.940924 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dffe357c-50c0-441b-8ea1-62e351fb9571" containerName="extract-utilities" Sep 30 13:09:36 crc kubenswrapper[4672]: I0930 13:09:36.940933 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="dffe357c-50c0-441b-8ea1-62e351fb9571" containerName="extract-utilities" Sep 30 13:09:36 crc kubenswrapper[4672]: E0930 13:09:36.940948 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf82624d-33e5-4298-8cd9-53ef50e87f12" containerName="init-config-reloader" Sep 30 13:09:36 crc kubenswrapper[4672]: I0930 13:09:36.940957 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf82624d-33e5-4298-8cd9-53ef50e87f12" containerName="init-config-reloader" Sep 30 13:09:36 crc kubenswrapper[4672]: E0930 13:09:36.940972 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf82624d-33e5-4298-8cd9-53ef50e87f12" containerName="thanos-sidecar" Sep 30 13:09:36 crc kubenswrapper[4672]: I0930 13:09:36.940980 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf82624d-33e5-4298-8cd9-53ef50e87f12" containerName="thanos-sidecar" Sep 30 13:09:36 crc kubenswrapper[4672]: E0930 13:09:36.940990 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf82624d-33e5-4298-8cd9-53ef50e87f12" containerName="prometheus" Sep 30 13:09:36 crc kubenswrapper[4672]: I0930 13:09:36.940998 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf82624d-33e5-4298-8cd9-53ef50e87f12" containerName="prometheus" Sep 30 13:09:36 crc kubenswrapper[4672]: I0930 13:09:36.941216 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="38c0d8da-6872-4108-aaa8-1b8fa2611fe5" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Sep 30 13:09:36 crc kubenswrapper[4672]: I0930 13:09:36.941237 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf82624d-33e5-4298-8cd9-53ef50e87f12" containerName="config-reloader" Sep 30 13:09:36 crc kubenswrapper[4672]: I0930 13:09:36.941251 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf82624d-33e5-4298-8cd9-53ef50e87f12" containerName="prometheus" Sep 30 13:09:36 crc kubenswrapper[4672]: I0930 13:09:36.941284 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="dffe357c-50c0-441b-8ea1-62e351fb9571" containerName="registry-server" Sep 30 13:09:36 crc kubenswrapper[4672]: I0930 13:09:36.941296 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf82624d-33e5-4298-8cd9-53ef50e87f12" containerName="thanos-sidecar" Sep 30 13:09:36 crc kubenswrapper[4672]: I0930 13:09:36.947544 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Sep 30 13:09:36 crc kubenswrapper[4672]: I0930 13:09:36.953612 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Sep 30 13:09:36 crc kubenswrapper[4672]: I0930 13:09:36.953651 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Sep 30 13:09:36 crc kubenswrapper[4672]: I0930 13:09:36.953709 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Sep 30 13:09:36 crc kubenswrapper[4672]: I0930 13:09:36.953612 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Sep 30 13:09:36 crc kubenswrapper[4672]: I0930 13:09:36.954231 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-l8qzv" Sep 30 13:09:36 crc kubenswrapper[4672]: I0930 13:09:36.975120 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Sep 30 13:09:36 crc kubenswrapper[4672]: I0930 13:09:36.976791 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Sep 30 13:09:37 crc kubenswrapper[4672]: I0930 13:09:37.120391 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/5a41d16c-3326-4d4f-a01a-0f8c436aa9b0-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"5a41d16c-3326-4d4f-a01a-0f8c436aa9b0\") " pod="openstack/prometheus-metric-storage-0" Sep 30 13:09:37 crc kubenswrapper[4672]: I0930 13:09:37.120461 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/5a41d16c-3326-4d4f-a01a-0f8c436aa9b0-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"5a41d16c-3326-4d4f-a01a-0f8c436aa9b0\") " pod="openstack/prometheus-metric-storage-0" Sep 30 13:09:37 crc kubenswrapper[4672]: I0930 13:09:37.120494 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/5a41d16c-3326-4d4f-a01a-0f8c436aa9b0-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"5a41d16c-3326-4d4f-a01a-0f8c436aa9b0\") " pod="openstack/prometheus-metric-storage-0" Sep 30 13:09:37 crc kubenswrapper[4672]: I0930 13:09:37.120520 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/5a41d16c-3326-4d4f-a01a-0f8c436aa9b0-config\") pod \"prometheus-metric-storage-0\" (UID: \"5a41d16c-3326-4d4f-a01a-0f8c436aa9b0\") " pod="openstack/prometheus-metric-storage-0" Sep 30 13:09:37 crc kubenswrapper[4672]: I0930 13:09:37.120580 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/5a41d16c-3326-4d4f-a01a-0f8c436aa9b0-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"5a41d16c-3326-4d4f-a01a-0f8c436aa9b0\") " pod="openstack/prometheus-metric-storage-0" Sep 30 13:09:37 crc kubenswrapper[4672]: I0930 13:09:37.120673 4672 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cks7j\" (UniqueName: \"kubernetes.io/projected/5a41d16c-3326-4d4f-a01a-0f8c436aa9b0-kube-api-access-cks7j\") pod \"prometheus-metric-storage-0\" (UID: \"5a41d16c-3326-4d4f-a01a-0f8c436aa9b0\") " pod="openstack/prometheus-metric-storage-0" Sep 30 13:09:37 crc kubenswrapper[4672]: I0930 13:09:37.120694 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/5a41d16c-3326-4d4f-a01a-0f8c436aa9b0-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"5a41d16c-3326-4d4f-a01a-0f8c436aa9b0\") " pod="openstack/prometheus-metric-storage-0" Sep 30 13:09:37 crc kubenswrapper[4672]: I0930 13:09:37.120716 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/5a41d16c-3326-4d4f-a01a-0f8c436aa9b0-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"5a41d16c-3326-4d4f-a01a-0f8c436aa9b0\") " pod="openstack/prometheus-metric-storage-0" Sep 30 13:09:37 crc kubenswrapper[4672]: I0930 13:09:37.120734 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-590f2ab7-9519-4427-af7c-904176dbd8cb\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-590f2ab7-9519-4427-af7c-904176dbd8cb\") pod \"prometheus-metric-storage-0\" (UID: \"5a41d16c-3326-4d4f-a01a-0f8c436aa9b0\") " pod="openstack/prometheus-metric-storage-0" Sep 30 13:09:37 crc kubenswrapper[4672]: I0930 13:09:37.120811 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/5a41d16c-3326-4d4f-a01a-0f8c436aa9b0-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"5a41d16c-3326-4d4f-a01a-0f8c436aa9b0\") " pod="openstack/prometheus-metric-storage-0" Sep 30 13:09:37 crc kubenswrapper[4672]: I0930 13:09:37.120854 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a41d16c-3326-4d4f-a01a-0f8c436aa9b0-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"5a41d16c-3326-4d4f-a01a-0f8c436aa9b0\") " pod="openstack/prometheus-metric-storage-0" Sep 30 13:09:37 crc kubenswrapper[4672]: I0930 13:09:37.222417 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/5a41d16c-3326-4d4f-a01a-0f8c436aa9b0-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"5a41d16c-3326-4d4f-a01a-0f8c436aa9b0\") " pod="openstack/prometheus-metric-storage-0" Sep 30 13:09:37 crc kubenswrapper[4672]: I0930 13:09:37.222470 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a41d16c-3326-4d4f-a01a-0f8c436aa9b0-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"5a41d16c-3326-4d4f-a01a-0f8c436aa9b0\") " pod="openstack/prometheus-metric-storage-0" Sep 30 13:09:37 crc kubenswrapper[4672]: I0930 13:09:37.222522 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" 
(UniqueName: \"kubernetes.io/secret/5a41d16c-3326-4d4f-a01a-0f8c436aa9b0-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"5a41d16c-3326-4d4f-a01a-0f8c436aa9b0\") " pod="openstack/prometheus-metric-storage-0" Sep 30 13:09:37 crc kubenswrapper[4672]: I0930 13:09:37.222564 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/5a41d16c-3326-4d4f-a01a-0f8c436aa9b0-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"5a41d16c-3326-4d4f-a01a-0f8c436aa9b0\") " pod="openstack/prometheus-metric-storage-0" Sep 30 13:09:37 crc kubenswrapper[4672]: I0930 13:09:37.222591 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/5a41d16c-3326-4d4f-a01a-0f8c436aa9b0-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"5a41d16c-3326-4d4f-a01a-0f8c436aa9b0\") " pod="openstack/prometheus-metric-storage-0" Sep 30 13:09:37 crc kubenswrapper[4672]: I0930 13:09:37.222616 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/5a41d16c-3326-4d4f-a01a-0f8c436aa9b0-config\") pod \"prometheus-metric-storage-0\" (UID: \"5a41d16c-3326-4d4f-a01a-0f8c436aa9b0\") " pod="openstack/prometheus-metric-storage-0" Sep 30 13:09:37 crc kubenswrapper[4672]: I0930 13:09:37.222658 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/5a41d16c-3326-4d4f-a01a-0f8c436aa9b0-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"5a41d16c-3326-4d4f-a01a-0f8c436aa9b0\") " pod="openstack/prometheus-metric-storage-0" Sep 30 13:09:37 crc kubenswrapper[4672]: I0930 13:09:37.222737 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cks7j\" (UniqueName: \"kubernetes.io/projected/5a41d16c-3326-4d4f-a01a-0f8c436aa9b0-kube-api-access-cks7j\") pod \"prometheus-metric-storage-0\" (UID: \"5a41d16c-3326-4d4f-a01a-0f8c436aa9b0\") " pod="openstack/prometheus-metric-storage-0" Sep 30 13:09:37 crc kubenswrapper[4672]: I0930 13:09:37.222791 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/5a41d16c-3326-4d4f-a01a-0f8c436aa9b0-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"5a41d16c-3326-4d4f-a01a-0f8c436aa9b0\") " pod="openstack/prometheus-metric-storage-0" Sep 30 13:09:37 crc kubenswrapper[4672]: I0930 13:09:37.222820 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/5a41d16c-3326-4d4f-a01a-0f8c436aa9b0-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"5a41d16c-3326-4d4f-a01a-0f8c436aa9b0\") " pod="openstack/prometheus-metric-storage-0" Sep 30 13:09:37 crc kubenswrapper[4672]: I0930 13:09:37.222846 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-590f2ab7-9519-4427-af7c-904176dbd8cb\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-590f2ab7-9519-4427-af7c-904176dbd8cb\") pod \"prometheus-metric-storage-0\" (UID: 
\"5a41d16c-3326-4d4f-a01a-0f8c436aa9b0\") " pod="openstack/prometheus-metric-storage-0" Sep 30 13:09:37 crc kubenswrapper[4672]: I0930 13:09:37.223913 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/5a41d16c-3326-4d4f-a01a-0f8c436aa9b0-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"5a41d16c-3326-4d4f-a01a-0f8c436aa9b0\") " pod="openstack/prometheus-metric-storage-0" Sep 30 13:09:37 crc kubenswrapper[4672]: I0930 13:09:37.234852 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/5a41d16c-3326-4d4f-a01a-0f8c436aa9b0-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"5a41d16c-3326-4d4f-a01a-0f8c436aa9b0\") " pod="openstack/prometheus-metric-storage-0" Sep 30 13:09:37 crc kubenswrapper[4672]: I0930 13:09:37.235480 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/5a41d16c-3326-4d4f-a01a-0f8c436aa9b0-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"5a41d16c-3326-4d4f-a01a-0f8c436aa9b0\") " pod="openstack/prometheus-metric-storage-0" Sep 30 13:09:37 crc kubenswrapper[4672]: I0930 13:09:37.235823 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a41d16c-3326-4d4f-a01a-0f8c436aa9b0-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"5a41d16c-3326-4d4f-a01a-0f8c436aa9b0\") " pod="openstack/prometheus-metric-storage-0" Sep 30 13:09:37 crc kubenswrapper[4672]: I0930 13:09:37.236558 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/5a41d16c-3326-4d4f-a01a-0f8c436aa9b0-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"5a41d16c-3326-4d4f-a01a-0f8c436aa9b0\") " pod="openstack/prometheus-metric-storage-0" Sep 30 13:09:37 crc kubenswrapper[4672]: I0930 13:09:37.238950 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/5a41d16c-3326-4d4f-a01a-0f8c436aa9b0-config\") pod \"prometheus-metric-storage-0\" (UID: \"5a41d16c-3326-4d4f-a01a-0f8c436aa9b0\") " pod="openstack/prometheus-metric-storage-0" Sep 30 13:09:37 crc kubenswrapper[4672]: I0930 13:09:37.239243 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/5a41d16c-3326-4d4f-a01a-0f8c436aa9b0-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"5a41d16c-3326-4d4f-a01a-0f8c436aa9b0\") " pod="openstack/prometheus-metric-storage-0" Sep 30 13:09:37 crc kubenswrapper[4672]: I0930 13:09:37.239801 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/5a41d16c-3326-4d4f-a01a-0f8c436aa9b0-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"5a41d16c-3326-4d4f-a01a-0f8c436aa9b0\") " pod="openstack/prometheus-metric-storage-0" Sep 30 13:09:37 crc kubenswrapper[4672]: I0930 13:09:37.245786 4672 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Sep 30 13:09:37 crc kubenswrapper[4672]: I0930 13:09:37.245847 4672 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-590f2ab7-9519-4427-af7c-904176dbd8cb\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-590f2ab7-9519-4427-af7c-904176dbd8cb\") pod \"prometheus-metric-storage-0\" (UID: \"5a41d16c-3326-4d4f-a01a-0f8c436aa9b0\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/13f5ba172743275de48f8b63cc56ba623f099037d0437073c4c58e3661633e39/globalmount\"" pod="openstack/prometheus-metric-storage-0" Sep 30 13:09:37 crc kubenswrapper[4672]: I0930 13:09:37.246559 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/5a41d16c-3326-4d4f-a01a-0f8c436aa9b0-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"5a41d16c-3326-4d4f-a01a-0f8c436aa9b0\") " pod="openstack/prometheus-metric-storage-0" Sep 30 13:09:37 crc kubenswrapper[4672]: I0930 13:09:37.255723 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cks7j\" (UniqueName: \"kubernetes.io/projected/5a41d16c-3326-4d4f-a01a-0f8c436aa9b0-kube-api-access-cks7j\") pod \"prometheus-metric-storage-0\" (UID: \"5a41d16c-3326-4d4f-a01a-0f8c436aa9b0\") " pod="openstack/prometheus-metric-storage-0" Sep 30 13:09:37 crc kubenswrapper[4672]: I0930 13:09:37.336853 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-590f2ab7-9519-4427-af7c-904176dbd8cb\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-590f2ab7-9519-4427-af7c-904176dbd8cb\") pod \"prometheus-metric-storage-0\" (UID: \"5a41d16c-3326-4d4f-a01a-0f8c436aa9b0\") " pod="openstack/prometheus-metric-storage-0" Sep 30 13:09:37 crc kubenswrapper[4672]: I0930 13:09:37.427873 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cf82624d-33e5-4298-8cd9-53ef50e87f12" path="/var/lib/kubelet/pods/cf82624d-33e5-4298-8cd9-53ef50e87f12/volumes" Sep 30 13:09:37 crc kubenswrapper[4672]: I0930 13:09:37.577738 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Sep 30 13:09:38 crc kubenswrapper[4672]: I0930 13:09:38.109770 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Sep 30 13:09:38 crc kubenswrapper[4672]: I0930 13:09:38.898082 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"5a41d16c-3326-4d4f-a01a-0f8c436aa9b0","Type":"ContainerStarted","Data":"c435bf929959744e21943834717daaf2e004b66e43b031c9170c7dea9299a218"} Sep 30 13:09:41 crc kubenswrapper[4672]: I0930 13:09:41.949727 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"5a41d16c-3326-4d4f-a01a-0f8c436aa9b0","Type":"ContainerStarted","Data":"de2f962c083dbcb7320ca63af9e83d7d317efec48473fe9740c0f703a690af58"} Sep 30 13:09:52 crc kubenswrapper[4672]: I0930 13:09:52.073703 4672 generic.go:334] "Generic (PLEG): container finished" podID="5a41d16c-3326-4d4f-a01a-0f8c436aa9b0" containerID="de2f962c083dbcb7320ca63af9e83d7d317efec48473fe9740c0f703a690af58" exitCode=0 Sep 30 13:09:52 crc kubenswrapper[4672]: I0930 13:09:52.073833 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"5a41d16c-3326-4d4f-a01a-0f8c436aa9b0","Type":"ContainerDied","Data":"de2f962c083dbcb7320ca63af9e83d7d317efec48473fe9740c0f703a690af58"} Sep 30 13:09:53 crc kubenswrapper[4672]: I0930 13:09:53.088745 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"5a41d16c-3326-4d4f-a01a-0f8c436aa9b0","Type":"ContainerStarted","Data":"78b9db693cbdbf6e509d6bcca76c116937dfce8d35e8f31cc2ed8d6468bbfbd6"} Sep 30 13:09:54 crc kubenswrapper[4672]: I0930 13:09:54.739292 4672 patch_prober.go:28] interesting pod/machine-config-daemon-dpqrd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 13:09:54 crc kubenswrapper[4672]: I0930 13:09:54.739617 4672 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 13:09:54 crc kubenswrapper[4672]: I0930 13:09:54.739659 4672 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" Sep 30 13:09:54 crc kubenswrapper[4672]: I0930 13:09:54.740336 4672 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f8b15102ab66fbc60ee39b82c175461202eddaf181c58d28542d8bf751915c67"} pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 30 13:09:54 crc kubenswrapper[4672]: I0930 13:09:54.740396 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" containerName="machine-config-daemon" containerID="cri-o://f8b15102ab66fbc60ee39b82c175461202eddaf181c58d28542d8bf751915c67" gracePeriod=600 Sep 30 13:09:55 crc 
kubenswrapper[4672]: I0930 13:09:55.112186 4672 generic.go:334] "Generic (PLEG): container finished" podID="95794952-d817-48f2-8956-f7a310f8d1d9" containerID="f8b15102ab66fbc60ee39b82c175461202eddaf181c58d28542d8bf751915c67" exitCode=0 Sep 30 13:09:55 crc kubenswrapper[4672]: I0930 13:09:55.112222 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" event={"ID":"95794952-d817-48f2-8956-f7a310f8d1d9","Type":"ContainerDied","Data":"f8b15102ab66fbc60ee39b82c175461202eddaf181c58d28542d8bf751915c67"} Sep 30 13:09:55 crc kubenswrapper[4672]: I0930 13:09:55.112343 4672 scope.go:117] "RemoveContainer" containerID="8f51e0141ddbaede996117cc2645ec9b4c66d7a7e718e482c491c40151095fbe" Sep 30 13:09:56 crc kubenswrapper[4672]: I0930 13:09:56.130663 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" event={"ID":"95794952-d817-48f2-8956-f7a310f8d1d9","Type":"ContainerStarted","Data":"04c1596c42881000cc14ffb9f09c27a0764c7f36327532142c7052e23cf117ba"} Sep 30 13:09:56 crc kubenswrapper[4672]: I0930 13:09:56.139558 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"5a41d16c-3326-4d4f-a01a-0f8c436aa9b0","Type":"ContainerStarted","Data":"db46937637d277ae97c3533ec5051e155576dc237e7105c47818af897cb7b21d"} Sep 30 13:09:57 crc kubenswrapper[4672]: I0930 13:09:57.153495 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"5a41d16c-3326-4d4f-a01a-0f8c436aa9b0","Type":"ContainerStarted","Data":"3bcaa313d5f7f20aecdd3ba2124403e57a7dcfcda1433c0fe3daf5d16228fe01"} Sep 30 13:09:57 crc kubenswrapper[4672]: I0930 13:09:57.191412 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=21.191379788 podStartE2EDuration="21.191379788s" podCreationTimestamp="2025-09-30 13:09:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:09:57.184382929 +0000 UTC m=+2888.453620585" watchObservedRunningTime="2025-09-30 13:09:57.191379788 +0000 UTC m=+2888.460617484" Sep 30 13:09:57 crc kubenswrapper[4672]: I0930 13:09:57.578898 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Sep 30 13:10:01 crc kubenswrapper[4672]: I0930 13:10:01.746696 4672 scope.go:117] "RemoveContainer" containerID="47aaeacfe5c09a9093d0d2b569d0c92744a18d97edfb1848c3bafa3e6462e1ef" Sep 30 13:10:01 crc kubenswrapper[4672]: I0930 13:10:01.766436 4672 scope.go:117] "RemoveContainer" containerID="86da6325be6a9d6498e87f7cbf2c59e7de3ad5cf48b207dfdc31d82d1b8cf56d" Sep 30 13:10:01 crc kubenswrapper[4672]: I0930 13:10:01.797992 4672 scope.go:117] "RemoveContainer" containerID="1a79e4562c4e332c07785fa3a46d2c31bee965d0a2cf8e35e3549ba2e4379334" Sep 30 13:10:01 crc kubenswrapper[4672]: I0930 13:10:01.862243 4672 scope.go:117] "RemoveContainer" containerID="fbb61f2821a324820bd79c008b10693dcee866e69aff58ca9caf772f6a483ae4" Sep 30 13:10:07 crc kubenswrapper[4672]: I0930 13:10:07.578136 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Sep 30 13:10:07 crc kubenswrapper[4672]: I0930 13:10:07.586660 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Sep 30 
13:10:08 crc kubenswrapper[4672]: I0930 13:10:08.276766 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Sep 30 13:10:15 crc kubenswrapper[4672]: I0930 13:10:15.213007 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest"] Sep 30 13:10:15 crc kubenswrapper[4672]: I0930 13:10:15.215228 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Sep 30 13:10:15 crc kubenswrapper[4672]: I0930 13:10:15.217478 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0" Sep 30 13:10:15 crc kubenswrapper[4672]: I0930 13:10:15.217870 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key" Sep 30 13:10:15 crc kubenswrapper[4672]: I0930 13:10:15.218118 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-l6gc7" Sep 30 13:10:15 crc kubenswrapper[4672]: I0930 13:10:15.218351 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Sep 30 13:10:15 crc kubenswrapper[4672]: I0930 13:10:15.224754 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Sep 30 13:10:15 crc kubenswrapper[4672]: I0930 13:10:15.343985 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/42b7a077-06bd-4f39-a1b7-e4692592ae68-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"42b7a077-06bd-4f39-a1b7-e4692592ae68\") " pod="openstack/tempest-tests-tempest" Sep 30 13:10:15 crc kubenswrapper[4672]: I0930 13:10:15.344063 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/42b7a077-06bd-4f39-a1b7-e4692592ae68-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"42b7a077-06bd-4f39-a1b7-e4692592ae68\") " pod="openstack/tempest-tests-tempest" Sep 30 13:10:15 crc kubenswrapper[4672]: I0930 13:10:15.344092 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/42b7a077-06bd-4f39-a1b7-e4692592ae68-config-data\") pod \"tempest-tests-tempest\" (UID: \"42b7a077-06bd-4f39-a1b7-e4692592ae68\") " pod="openstack/tempest-tests-tempest" Sep 30 13:10:15 crc kubenswrapper[4672]: I0930 13:10:15.344167 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6hwks\" (UniqueName: \"kubernetes.io/projected/42b7a077-06bd-4f39-a1b7-e4692592ae68-kube-api-access-6hwks\") pod \"tempest-tests-tempest\" (UID: \"42b7a077-06bd-4f39-a1b7-e4692592ae68\") " pod="openstack/tempest-tests-tempest" Sep 30 13:10:15 crc kubenswrapper[4672]: I0930 13:10:15.344230 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"tempest-tests-tempest\" (UID: \"42b7a077-06bd-4f39-a1b7-e4692592ae68\") " pod="openstack/tempest-tests-tempest" Sep 30 13:10:15 crc kubenswrapper[4672]: I0930 13:10:15.344776 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: 
\"kubernetes.io/empty-dir/42b7a077-06bd-4f39-a1b7-e4692592ae68-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"42b7a077-06bd-4f39-a1b7-e4692592ae68\") " pod="openstack/tempest-tests-tempest" Sep 30 13:10:15 crc kubenswrapper[4672]: I0930 13:10:15.345005 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/42b7a077-06bd-4f39-a1b7-e4692592ae68-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"42b7a077-06bd-4f39-a1b7-e4692592ae68\") " pod="openstack/tempest-tests-tempest" Sep 30 13:10:15 crc kubenswrapper[4672]: I0930 13:10:15.345050 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/42b7a077-06bd-4f39-a1b7-e4692592ae68-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"42b7a077-06bd-4f39-a1b7-e4692592ae68\") " pod="openstack/tempest-tests-tempest" Sep 30 13:10:15 crc kubenswrapper[4672]: I0930 13:10:15.345129 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/42b7a077-06bd-4f39-a1b7-e4692592ae68-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"42b7a077-06bd-4f39-a1b7-e4692592ae68\") " pod="openstack/tempest-tests-tempest" Sep 30 13:10:15 crc kubenswrapper[4672]: I0930 13:10:15.447181 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/42b7a077-06bd-4f39-a1b7-e4692592ae68-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"42b7a077-06bd-4f39-a1b7-e4692592ae68\") " pod="openstack/tempest-tests-tempest" Sep 30 13:10:15 crc kubenswrapper[4672]: I0930 13:10:15.447286 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/42b7a077-06bd-4f39-a1b7-e4692592ae68-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"42b7a077-06bd-4f39-a1b7-e4692592ae68\") " pod="openstack/tempest-tests-tempest" Sep 30 13:10:15 crc kubenswrapper[4672]: I0930 13:10:15.447334 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/42b7a077-06bd-4f39-a1b7-e4692592ae68-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"42b7a077-06bd-4f39-a1b7-e4692592ae68\") " pod="openstack/tempest-tests-tempest" Sep 30 13:10:15 crc kubenswrapper[4672]: I0930 13:10:15.447466 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/42b7a077-06bd-4f39-a1b7-e4692592ae68-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"42b7a077-06bd-4f39-a1b7-e4692592ae68\") " pod="openstack/tempest-tests-tempest" Sep 30 13:10:15 crc kubenswrapper[4672]: I0930 13:10:15.447502 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/42b7a077-06bd-4f39-a1b7-e4692592ae68-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"42b7a077-06bd-4f39-a1b7-e4692592ae68\") " pod="openstack/tempest-tests-tempest" Sep 30 13:10:15 crc kubenswrapper[4672]: I0930 13:10:15.447535 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/42b7a077-06bd-4f39-a1b7-e4692592ae68-config-data\") pod \"tempest-tests-tempest\" (UID: \"42b7a077-06bd-4f39-a1b7-e4692592ae68\") " pod="openstack/tempest-tests-tempest" Sep 30 13:10:15 crc kubenswrapper[4672]: I0930 13:10:15.447664 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6hwks\" (UniqueName: \"kubernetes.io/projected/42b7a077-06bd-4f39-a1b7-e4692592ae68-kube-api-access-6hwks\") pod \"tempest-tests-tempest\" (UID: \"42b7a077-06bd-4f39-a1b7-e4692592ae68\") " pod="openstack/tempest-tests-tempest" Sep 30 13:10:15 crc kubenswrapper[4672]: I0930 13:10:15.447760 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"tempest-tests-tempest\" (UID: \"42b7a077-06bd-4f39-a1b7-e4692592ae68\") " pod="openstack/tempest-tests-tempest" Sep 30 13:10:15 crc kubenswrapper[4672]: I0930 13:10:15.447840 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/42b7a077-06bd-4f39-a1b7-e4692592ae68-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"42b7a077-06bd-4f39-a1b7-e4692592ae68\") " pod="openstack/tempest-tests-tempest" Sep 30 13:10:15 crc kubenswrapper[4672]: I0930 13:10:15.447857 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/42b7a077-06bd-4f39-a1b7-e4692592ae68-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"42b7a077-06bd-4f39-a1b7-e4692592ae68\") " pod="openstack/tempest-tests-tempest" Sep 30 13:10:15 crc kubenswrapper[4672]: I0930 13:10:15.448888 4672 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"tempest-tests-tempest\" (UID: \"42b7a077-06bd-4f39-a1b7-e4692592ae68\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/tempest-tests-tempest" Sep 30 13:10:15 crc kubenswrapper[4672]: I0930 13:10:15.449097 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/42b7a077-06bd-4f39-a1b7-e4692592ae68-config-data\") pod \"tempest-tests-tempest\" (UID: \"42b7a077-06bd-4f39-a1b7-e4692592ae68\") " pod="openstack/tempest-tests-tempest" Sep 30 13:10:15 crc kubenswrapper[4672]: I0930 13:10:15.449283 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/42b7a077-06bd-4f39-a1b7-e4692592ae68-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"42b7a077-06bd-4f39-a1b7-e4692592ae68\") " pod="openstack/tempest-tests-tempest" Sep 30 13:10:15 crc kubenswrapper[4672]: I0930 13:10:15.449485 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/42b7a077-06bd-4f39-a1b7-e4692592ae68-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"42b7a077-06bd-4f39-a1b7-e4692592ae68\") " pod="openstack/tempest-tests-tempest" Sep 30 13:10:15 crc kubenswrapper[4672]: I0930 13:10:15.453622 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/42b7a077-06bd-4f39-a1b7-e4692592ae68-ssh-key\") pod \"tempest-tests-tempest\" (UID: 
\"42b7a077-06bd-4f39-a1b7-e4692592ae68\") " pod="openstack/tempest-tests-tempest" Sep 30 13:10:15 crc kubenswrapper[4672]: I0930 13:10:15.454141 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/42b7a077-06bd-4f39-a1b7-e4692592ae68-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"42b7a077-06bd-4f39-a1b7-e4692592ae68\") " pod="openstack/tempest-tests-tempest" Sep 30 13:10:15 crc kubenswrapper[4672]: I0930 13:10:15.454655 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/42b7a077-06bd-4f39-a1b7-e4692592ae68-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"42b7a077-06bd-4f39-a1b7-e4692592ae68\") " pod="openstack/tempest-tests-tempest" Sep 30 13:10:15 crc kubenswrapper[4672]: I0930 13:10:15.479592 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6hwks\" (UniqueName: \"kubernetes.io/projected/42b7a077-06bd-4f39-a1b7-e4692592ae68-kube-api-access-6hwks\") pod \"tempest-tests-tempest\" (UID: \"42b7a077-06bd-4f39-a1b7-e4692592ae68\") " pod="openstack/tempest-tests-tempest" Sep 30 13:10:15 crc kubenswrapper[4672]: I0930 13:10:15.490159 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"tempest-tests-tempest\" (UID: \"42b7a077-06bd-4f39-a1b7-e4692592ae68\") " pod="openstack/tempest-tests-tempest" Sep 30 13:10:15 crc kubenswrapper[4672]: I0930 13:10:15.580073 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Sep 30 13:10:16 crc kubenswrapper[4672]: I0930 13:10:16.015559 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Sep 30 13:10:16 crc kubenswrapper[4672]: I0930 13:10:16.348340 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"42b7a077-06bd-4f39-a1b7-e4692592ae68","Type":"ContainerStarted","Data":"ff3f30aa39fdbee3a4f0f985bd39ed8017e9e0aaadbdb57d586dbd2b1b99f9ae"} Sep 30 13:10:30 crc kubenswrapper[4672]: I0930 13:10:30.504193 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"42b7a077-06bd-4f39-a1b7-e4692592ae68","Type":"ContainerStarted","Data":"ef890aecfd1d96f3def0725faee9450beb2714b3c6159edf916b6c99964c5054"} Sep 30 13:10:30 crc kubenswrapper[4672]: I0930 13:10:30.530881 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" podStartSLOduration=3.874013584 podStartE2EDuration="16.530863401s" podCreationTimestamp="2025-09-30 13:10:14 +0000 UTC" firstStartedPulling="2025-09-30 13:10:16.022451777 +0000 UTC m=+2907.291689423" lastFinishedPulling="2025-09-30 13:10:28.679301594 +0000 UTC m=+2919.948539240" observedRunningTime="2025-09-30 13:10:30.525137696 +0000 UTC m=+2921.794375392" watchObservedRunningTime="2025-09-30 13:10:30.530863401 +0000 UTC m=+2921.800101057" Sep 30 13:10:59 crc kubenswrapper[4672]: I0930 13:10:59.496941 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-pgz7m"] Sep 30 13:10:59 crc kubenswrapper[4672]: I0930 13:10:59.500221 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-pgz7m" Sep 30 13:10:59 crc kubenswrapper[4672]: I0930 13:10:59.509818 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-pgz7m"] Sep 30 13:10:59 crc kubenswrapper[4672]: I0930 13:10:59.691132 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f1ae903-3cea-49ee-8438-8798f24539f3-catalog-content\") pod \"community-operators-pgz7m\" (UID: \"1f1ae903-3cea-49ee-8438-8798f24539f3\") " pod="openshift-marketplace/community-operators-pgz7m" Sep 30 13:10:59 crc kubenswrapper[4672]: I0930 13:10:59.691206 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zqhtg\" (UniqueName: \"kubernetes.io/projected/1f1ae903-3cea-49ee-8438-8798f24539f3-kube-api-access-zqhtg\") pod \"community-operators-pgz7m\" (UID: \"1f1ae903-3cea-49ee-8438-8798f24539f3\") " pod="openshift-marketplace/community-operators-pgz7m" Sep 30 13:10:59 crc kubenswrapper[4672]: I0930 13:10:59.691327 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f1ae903-3cea-49ee-8438-8798f24539f3-utilities\") pod \"community-operators-pgz7m\" (UID: \"1f1ae903-3cea-49ee-8438-8798f24539f3\") " pod="openshift-marketplace/community-operators-pgz7m" Sep 30 13:10:59 crc kubenswrapper[4672]: I0930 13:10:59.792765 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f1ae903-3cea-49ee-8438-8798f24539f3-utilities\") pod \"community-operators-pgz7m\" (UID: \"1f1ae903-3cea-49ee-8438-8798f24539f3\") " pod="openshift-marketplace/community-operators-pgz7m" Sep 30 13:10:59 crc kubenswrapper[4672]: I0930 13:10:59.792943 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f1ae903-3cea-49ee-8438-8798f24539f3-catalog-content\") pod \"community-operators-pgz7m\" (UID: \"1f1ae903-3cea-49ee-8438-8798f24539f3\") " pod="openshift-marketplace/community-operators-pgz7m" Sep 30 13:10:59 crc kubenswrapper[4672]: I0930 13:10:59.793020 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zqhtg\" (UniqueName: \"kubernetes.io/projected/1f1ae903-3cea-49ee-8438-8798f24539f3-kube-api-access-zqhtg\") pod \"community-operators-pgz7m\" (UID: \"1f1ae903-3cea-49ee-8438-8798f24539f3\") " pod="openshift-marketplace/community-operators-pgz7m" Sep 30 13:10:59 crc kubenswrapper[4672]: I0930 13:10:59.793316 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f1ae903-3cea-49ee-8438-8798f24539f3-utilities\") pod \"community-operators-pgz7m\" (UID: \"1f1ae903-3cea-49ee-8438-8798f24539f3\") " pod="openshift-marketplace/community-operators-pgz7m" Sep 30 13:10:59 crc kubenswrapper[4672]: I0930 13:10:59.793437 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f1ae903-3cea-49ee-8438-8798f24539f3-catalog-content\") pod \"community-operators-pgz7m\" (UID: \"1f1ae903-3cea-49ee-8438-8798f24539f3\") " pod="openshift-marketplace/community-operators-pgz7m" Sep 30 13:10:59 crc kubenswrapper[4672]: I0930 13:10:59.812809 4672 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-zqhtg\" (UniqueName: \"kubernetes.io/projected/1f1ae903-3cea-49ee-8438-8798f24539f3-kube-api-access-zqhtg\") pod \"community-operators-pgz7m\" (UID: \"1f1ae903-3cea-49ee-8438-8798f24539f3\") " pod="openshift-marketplace/community-operators-pgz7m" Sep 30 13:10:59 crc kubenswrapper[4672]: I0930 13:10:59.822518 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-pgz7m" Sep 30 13:11:00 crc kubenswrapper[4672]: W0930 13:11:00.423329 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1f1ae903_3cea_49ee_8438_8798f24539f3.slice/crio-32b50510b7dc90a9f159bb23b426da73d49db58b869fc86071b9863fdde913b5 WatchSource:0}: Error finding container 32b50510b7dc90a9f159bb23b426da73d49db58b869fc86071b9863fdde913b5: Status 404 returned error can't find the container with id 32b50510b7dc90a9f159bb23b426da73d49db58b869fc86071b9863fdde913b5 Sep 30 13:11:00 crc kubenswrapper[4672]: I0930 13:11:00.424128 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-pgz7m"] Sep 30 13:11:00 crc kubenswrapper[4672]: I0930 13:11:00.896116 4672 generic.go:334] "Generic (PLEG): container finished" podID="1f1ae903-3cea-49ee-8438-8798f24539f3" containerID="03fd8b8a228b51caac5a97d3a579676bed19508922d01a7695e690683720a12d" exitCode=0 Sep 30 13:11:00 crc kubenswrapper[4672]: I0930 13:11:00.896159 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pgz7m" event={"ID":"1f1ae903-3cea-49ee-8438-8798f24539f3","Type":"ContainerDied","Data":"03fd8b8a228b51caac5a97d3a579676bed19508922d01a7695e690683720a12d"} Sep 30 13:11:00 crc kubenswrapper[4672]: I0930 13:11:00.896184 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pgz7m" event={"ID":"1f1ae903-3cea-49ee-8438-8798f24539f3","Type":"ContainerStarted","Data":"32b50510b7dc90a9f159bb23b426da73d49db58b869fc86071b9863fdde913b5"} Sep 30 13:11:02 crc kubenswrapper[4672]: I0930 13:11:02.917109 4672 generic.go:334] "Generic (PLEG): container finished" podID="1f1ae903-3cea-49ee-8438-8798f24539f3" containerID="f31d483688906933204f1e30bb2f6e1b515f83c36089a615814d74eb2da753c6" exitCode=0 Sep 30 13:11:02 crc kubenswrapper[4672]: I0930 13:11:02.917176 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pgz7m" event={"ID":"1f1ae903-3cea-49ee-8438-8798f24539f3","Type":"ContainerDied","Data":"f31d483688906933204f1e30bb2f6e1b515f83c36089a615814d74eb2da753c6"} Sep 30 13:11:03 crc kubenswrapper[4672]: I0930 13:11:03.932293 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pgz7m" event={"ID":"1f1ae903-3cea-49ee-8438-8798f24539f3","Type":"ContainerStarted","Data":"aeef6233a785366b837d28bf8684e8a23dccae517af4c146d84a16fe5cc81a6a"} Sep 30 13:11:03 crc kubenswrapper[4672]: I0930 13:11:03.953307 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-pgz7m" podStartSLOduration=2.43849659 podStartE2EDuration="4.953252663s" podCreationTimestamp="2025-09-30 13:10:59 +0000 UTC" firstStartedPulling="2025-09-30 13:11:00.898960496 +0000 UTC m=+2952.168198152" lastFinishedPulling="2025-09-30 13:11:03.413716579 +0000 UTC m=+2954.682954225" observedRunningTime="2025-09-30 13:11:03.951592971 +0000 UTC 
m=+2955.220830627" watchObservedRunningTime="2025-09-30 13:11:03.953252663 +0000 UTC m=+2955.222490309" Sep 30 13:11:09 crc kubenswrapper[4672]: I0930 13:11:09.823539 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-pgz7m" Sep 30 13:11:09 crc kubenswrapper[4672]: I0930 13:11:09.824372 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-pgz7m" Sep 30 13:11:09 crc kubenswrapper[4672]: I0930 13:11:09.902182 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-pgz7m" Sep 30 13:11:10 crc kubenswrapper[4672]: I0930 13:11:10.047920 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-pgz7m" Sep 30 13:11:10 crc kubenswrapper[4672]: I0930 13:11:10.147642 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-pgz7m"] Sep 30 13:11:12 crc kubenswrapper[4672]: I0930 13:11:12.011318 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-pgz7m" podUID="1f1ae903-3cea-49ee-8438-8798f24539f3" containerName="registry-server" containerID="cri-o://aeef6233a785366b837d28bf8684e8a23dccae517af4c146d84a16fe5cc81a6a" gracePeriod=2 Sep 30 13:11:12 crc kubenswrapper[4672]: I0930 13:11:12.514135 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-pgz7m" Sep 30 13:11:12 crc kubenswrapper[4672]: I0930 13:11:12.659229 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f1ae903-3cea-49ee-8438-8798f24539f3-utilities\") pod \"1f1ae903-3cea-49ee-8438-8798f24539f3\" (UID: \"1f1ae903-3cea-49ee-8438-8798f24539f3\") " Sep 30 13:11:12 crc kubenswrapper[4672]: I0930 13:11:12.659430 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zqhtg\" (UniqueName: \"kubernetes.io/projected/1f1ae903-3cea-49ee-8438-8798f24539f3-kube-api-access-zqhtg\") pod \"1f1ae903-3cea-49ee-8438-8798f24539f3\" (UID: \"1f1ae903-3cea-49ee-8438-8798f24539f3\") " Sep 30 13:11:12 crc kubenswrapper[4672]: I0930 13:11:12.660112 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1f1ae903-3cea-49ee-8438-8798f24539f3-utilities" (OuterVolumeSpecName: "utilities") pod "1f1ae903-3cea-49ee-8438-8798f24539f3" (UID: "1f1ae903-3cea-49ee-8438-8798f24539f3"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 13:11:12 crc kubenswrapper[4672]: I0930 13:11:12.660632 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f1ae903-3cea-49ee-8438-8798f24539f3-catalog-content\") pod \"1f1ae903-3cea-49ee-8438-8798f24539f3\" (UID: \"1f1ae903-3cea-49ee-8438-8798f24539f3\") " Sep 30 13:11:12 crc kubenswrapper[4672]: I0930 13:11:12.661151 4672 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f1ae903-3cea-49ee-8438-8798f24539f3-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 13:11:12 crc kubenswrapper[4672]: I0930 13:11:12.665835 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f1ae903-3cea-49ee-8438-8798f24539f3-kube-api-access-zqhtg" (OuterVolumeSpecName: "kube-api-access-zqhtg") pod "1f1ae903-3cea-49ee-8438-8798f24539f3" (UID: "1f1ae903-3cea-49ee-8438-8798f24539f3"). InnerVolumeSpecName "kube-api-access-zqhtg". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:11:12 crc kubenswrapper[4672]: I0930 13:11:12.708941 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1f1ae903-3cea-49ee-8438-8798f24539f3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1f1ae903-3cea-49ee-8438-8798f24539f3" (UID: "1f1ae903-3cea-49ee-8438-8798f24539f3"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 13:11:12 crc kubenswrapper[4672]: I0930 13:11:12.764244 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zqhtg\" (UniqueName: \"kubernetes.io/projected/1f1ae903-3cea-49ee-8438-8798f24539f3-kube-api-access-zqhtg\") on node \"crc\" DevicePath \"\"" Sep 30 13:11:12 crc kubenswrapper[4672]: I0930 13:11:12.764299 4672 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f1ae903-3cea-49ee-8438-8798f24539f3-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 13:11:13 crc kubenswrapper[4672]: I0930 13:11:13.024244 4672 generic.go:334] "Generic (PLEG): container finished" podID="1f1ae903-3cea-49ee-8438-8798f24539f3" containerID="aeef6233a785366b837d28bf8684e8a23dccae517af4c146d84a16fe5cc81a6a" exitCode=0 Sep 30 13:11:13 crc kubenswrapper[4672]: I0930 13:11:13.024290 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pgz7m" event={"ID":"1f1ae903-3cea-49ee-8438-8798f24539f3","Type":"ContainerDied","Data":"aeef6233a785366b837d28bf8684e8a23dccae517af4c146d84a16fe5cc81a6a"} Sep 30 13:11:13 crc kubenswrapper[4672]: I0930 13:11:13.024339 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pgz7m" event={"ID":"1f1ae903-3cea-49ee-8438-8798f24539f3","Type":"ContainerDied","Data":"32b50510b7dc90a9f159bb23b426da73d49db58b869fc86071b9863fdde913b5"} Sep 30 13:11:13 crc kubenswrapper[4672]: I0930 13:11:13.024358 4672 scope.go:117] "RemoveContainer" containerID="aeef6233a785366b837d28bf8684e8a23dccae517af4c146d84a16fe5cc81a6a" Sep 30 13:11:13 crc kubenswrapper[4672]: I0930 13:11:13.024365 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-pgz7m" Sep 30 13:11:13 crc kubenswrapper[4672]: I0930 13:11:13.062469 4672 scope.go:117] "RemoveContainer" containerID="f31d483688906933204f1e30bb2f6e1b515f83c36089a615814d74eb2da753c6" Sep 30 13:11:13 crc kubenswrapper[4672]: I0930 13:11:13.090316 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-pgz7m"] Sep 30 13:11:13 crc kubenswrapper[4672]: I0930 13:11:13.101219 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-pgz7m"] Sep 30 13:11:13 crc kubenswrapper[4672]: I0930 13:11:13.144895 4672 scope.go:117] "RemoveContainer" containerID="03fd8b8a228b51caac5a97d3a579676bed19508922d01a7695e690683720a12d" Sep 30 13:11:13 crc kubenswrapper[4672]: I0930 13:11:13.170788 4672 scope.go:117] "RemoveContainer" containerID="aeef6233a785366b837d28bf8684e8a23dccae517af4c146d84a16fe5cc81a6a" Sep 30 13:11:13 crc kubenswrapper[4672]: E0930 13:11:13.171536 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aeef6233a785366b837d28bf8684e8a23dccae517af4c146d84a16fe5cc81a6a\": container with ID starting with aeef6233a785366b837d28bf8684e8a23dccae517af4c146d84a16fe5cc81a6a not found: ID does not exist" containerID="aeef6233a785366b837d28bf8684e8a23dccae517af4c146d84a16fe5cc81a6a" Sep 30 13:11:13 crc kubenswrapper[4672]: I0930 13:11:13.171683 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aeef6233a785366b837d28bf8684e8a23dccae517af4c146d84a16fe5cc81a6a"} err="failed to get container status \"aeef6233a785366b837d28bf8684e8a23dccae517af4c146d84a16fe5cc81a6a\": rpc error: code = NotFound desc = could not find container \"aeef6233a785366b837d28bf8684e8a23dccae517af4c146d84a16fe5cc81a6a\": container with ID starting with aeef6233a785366b837d28bf8684e8a23dccae517af4c146d84a16fe5cc81a6a not found: ID does not exist" Sep 30 13:11:13 crc kubenswrapper[4672]: I0930 13:11:13.171797 4672 scope.go:117] "RemoveContainer" containerID="f31d483688906933204f1e30bb2f6e1b515f83c36089a615814d74eb2da753c6" Sep 30 13:11:13 crc kubenswrapper[4672]: E0930 13:11:13.172243 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f31d483688906933204f1e30bb2f6e1b515f83c36089a615814d74eb2da753c6\": container with ID starting with f31d483688906933204f1e30bb2f6e1b515f83c36089a615814d74eb2da753c6 not found: ID does not exist" containerID="f31d483688906933204f1e30bb2f6e1b515f83c36089a615814d74eb2da753c6" Sep 30 13:11:13 crc kubenswrapper[4672]: I0930 13:11:13.172301 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f31d483688906933204f1e30bb2f6e1b515f83c36089a615814d74eb2da753c6"} err="failed to get container status \"f31d483688906933204f1e30bb2f6e1b515f83c36089a615814d74eb2da753c6\": rpc error: code = NotFound desc = could not find container \"f31d483688906933204f1e30bb2f6e1b515f83c36089a615814d74eb2da753c6\": container with ID starting with f31d483688906933204f1e30bb2f6e1b515f83c36089a615814d74eb2da753c6 not found: ID does not exist" Sep 30 13:11:13 crc kubenswrapper[4672]: I0930 13:11:13.172328 4672 scope.go:117] "RemoveContainer" containerID="03fd8b8a228b51caac5a97d3a579676bed19508922d01a7695e690683720a12d" Sep 30 13:11:13 crc kubenswrapper[4672]: E0930 13:11:13.172577 4672 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"03fd8b8a228b51caac5a97d3a579676bed19508922d01a7695e690683720a12d\": container with ID starting with 03fd8b8a228b51caac5a97d3a579676bed19508922d01a7695e690683720a12d not found: ID does not exist" containerID="03fd8b8a228b51caac5a97d3a579676bed19508922d01a7695e690683720a12d" Sep 30 13:11:13 crc kubenswrapper[4672]: I0930 13:11:13.172620 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"03fd8b8a228b51caac5a97d3a579676bed19508922d01a7695e690683720a12d"} err="failed to get container status \"03fd8b8a228b51caac5a97d3a579676bed19508922d01a7695e690683720a12d\": rpc error: code = NotFound desc = could not find container \"03fd8b8a228b51caac5a97d3a579676bed19508922d01a7695e690683720a12d\": container with ID starting with 03fd8b8a228b51caac5a97d3a579676bed19508922d01a7695e690683720a12d not found: ID does not exist" Sep 30 13:11:13 crc kubenswrapper[4672]: I0930 13:11:13.434421 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1f1ae903-3cea-49ee-8438-8798f24539f3" path="/var/lib/kubelet/pods/1f1ae903-3cea-49ee-8438-8798f24539f3/volumes" Sep 30 13:12:24 crc kubenswrapper[4672]: I0930 13:12:24.739602 4672 patch_prober.go:28] interesting pod/machine-config-daemon-dpqrd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 13:12:24 crc kubenswrapper[4672]: I0930 13:12:24.740138 4672 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 13:12:54 crc kubenswrapper[4672]: I0930 13:12:54.740138 4672 patch_prober.go:28] interesting pod/machine-config-daemon-dpqrd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 13:12:54 crc kubenswrapper[4672]: I0930 13:12:54.741526 4672 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 13:13:24 crc kubenswrapper[4672]: I0930 13:13:24.740106 4672 patch_prober.go:28] interesting pod/machine-config-daemon-dpqrd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 13:13:24 crc kubenswrapper[4672]: I0930 13:13:24.740811 4672 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 13:13:24 crc kubenswrapper[4672]: I0930 13:13:24.740872 4672 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" 
status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" Sep 30 13:13:24 crc kubenswrapper[4672]: I0930 13:13:24.743108 4672 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"04c1596c42881000cc14ffb9f09c27a0764c7f36327532142c7052e23cf117ba"} pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 30 13:13:24 crc kubenswrapper[4672]: I0930 13:13:24.743196 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" containerName="machine-config-daemon" containerID="cri-o://04c1596c42881000cc14ffb9f09c27a0764c7f36327532142c7052e23cf117ba" gracePeriod=600 Sep 30 13:13:24 crc kubenswrapper[4672]: E0930 13:13:24.878484 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dpqrd_openshift-machine-config-operator(95794952-d817-48f2-8956-f7a310f8d1d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" Sep 30 13:13:25 crc kubenswrapper[4672]: I0930 13:13:25.523841 4672 generic.go:334] "Generic (PLEG): container finished" podID="95794952-d817-48f2-8956-f7a310f8d1d9" containerID="04c1596c42881000cc14ffb9f09c27a0764c7f36327532142c7052e23cf117ba" exitCode=0 Sep 30 13:13:25 crc kubenswrapper[4672]: I0930 13:13:25.524018 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" event={"ID":"95794952-d817-48f2-8956-f7a310f8d1d9","Type":"ContainerDied","Data":"04c1596c42881000cc14ffb9f09c27a0764c7f36327532142c7052e23cf117ba"} Sep 30 13:13:25 crc kubenswrapper[4672]: I0930 13:13:25.524148 4672 scope.go:117] "RemoveContainer" containerID="f8b15102ab66fbc60ee39b82c175461202eddaf181c58d28542d8bf751915c67" Sep 30 13:13:25 crc kubenswrapper[4672]: I0930 13:13:25.527872 4672 scope.go:117] "RemoveContainer" containerID="04c1596c42881000cc14ffb9f09c27a0764c7f36327532142c7052e23cf117ba" Sep 30 13:13:25 crc kubenswrapper[4672]: E0930 13:13:25.528522 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dpqrd_openshift-machine-config-operator(95794952-d817-48f2-8956-f7a310f8d1d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" Sep 30 13:13:39 crc kubenswrapper[4672]: I0930 13:13:39.424920 4672 scope.go:117] "RemoveContainer" containerID="04c1596c42881000cc14ffb9f09c27a0764c7f36327532142c7052e23cf117ba" Sep 30 13:13:39 crc kubenswrapper[4672]: E0930 13:13:39.425745 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dpqrd_openshift-machine-config-operator(95794952-d817-48f2-8956-f7a310f8d1d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" Sep 30 13:13:54 crc 
kubenswrapper[4672]: I0930 13:13:54.416942 4672 scope.go:117] "RemoveContainer" containerID="04c1596c42881000cc14ffb9f09c27a0764c7f36327532142c7052e23cf117ba" Sep 30 13:13:54 crc kubenswrapper[4672]: E0930 13:13:54.417830 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dpqrd_openshift-machine-config-operator(95794952-d817-48f2-8956-f7a310f8d1d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" Sep 30 13:14:08 crc kubenswrapper[4672]: I0930 13:14:08.417356 4672 scope.go:117] "RemoveContainer" containerID="04c1596c42881000cc14ffb9f09c27a0764c7f36327532142c7052e23cf117ba" Sep 30 13:14:08 crc kubenswrapper[4672]: E0930 13:14:08.418313 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dpqrd_openshift-machine-config-operator(95794952-d817-48f2-8956-f7a310f8d1d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" Sep 30 13:14:20 crc kubenswrapper[4672]: I0930 13:14:20.417450 4672 scope.go:117] "RemoveContainer" containerID="04c1596c42881000cc14ffb9f09c27a0764c7f36327532142c7052e23cf117ba" Sep 30 13:14:20 crc kubenswrapper[4672]: E0930 13:14:20.418470 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dpqrd_openshift-machine-config-operator(95794952-d817-48f2-8956-f7a310f8d1d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" Sep 30 13:14:33 crc kubenswrapper[4672]: I0930 13:14:33.417852 4672 scope.go:117] "RemoveContainer" containerID="04c1596c42881000cc14ffb9f09c27a0764c7f36327532142c7052e23cf117ba" Sep 30 13:14:33 crc kubenswrapper[4672]: E0930 13:14:33.418920 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dpqrd_openshift-machine-config-operator(95794952-d817-48f2-8956-f7a310f8d1d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" Sep 30 13:14:46 crc kubenswrapper[4672]: I0930 13:14:46.417512 4672 scope.go:117] "RemoveContainer" containerID="04c1596c42881000cc14ffb9f09c27a0764c7f36327532142c7052e23cf117ba" Sep 30 13:14:46 crc kubenswrapper[4672]: E0930 13:14:46.418218 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dpqrd_openshift-machine-config-operator(95794952-d817-48f2-8956-f7a310f8d1d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" Sep 30 13:14:57 crc kubenswrapper[4672]: I0930 13:14:57.417979 4672 scope.go:117] "RemoveContainer" containerID="04c1596c42881000cc14ffb9f09c27a0764c7f36327532142c7052e23cf117ba" Sep 30 13:14:57 crc 
kubenswrapper[4672]: E0930 13:14:57.418729 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dpqrd_openshift-machine-config-operator(95794952-d817-48f2-8956-f7a310f8d1d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" Sep 30 13:15:00 crc kubenswrapper[4672]: I0930 13:15:00.199133 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320635-2hctc"] Sep 30 13:15:00 crc kubenswrapper[4672]: E0930 13:15:00.199890 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f1ae903-3cea-49ee-8438-8798f24539f3" containerName="registry-server" Sep 30 13:15:00 crc kubenswrapper[4672]: I0930 13:15:00.199903 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f1ae903-3cea-49ee-8438-8798f24539f3" containerName="registry-server" Sep 30 13:15:00 crc kubenswrapper[4672]: E0930 13:15:00.199921 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f1ae903-3cea-49ee-8438-8798f24539f3" containerName="extract-content" Sep 30 13:15:00 crc kubenswrapper[4672]: I0930 13:15:00.199927 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f1ae903-3cea-49ee-8438-8798f24539f3" containerName="extract-content" Sep 30 13:15:00 crc kubenswrapper[4672]: E0930 13:15:00.199962 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f1ae903-3cea-49ee-8438-8798f24539f3" containerName="extract-utilities" Sep 30 13:15:00 crc kubenswrapper[4672]: I0930 13:15:00.199969 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f1ae903-3cea-49ee-8438-8798f24539f3" containerName="extract-utilities" Sep 30 13:15:00 crc kubenswrapper[4672]: I0930 13:15:00.200163 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f1ae903-3cea-49ee-8438-8798f24539f3" containerName="registry-server" Sep 30 13:15:00 crc kubenswrapper[4672]: I0930 13:15:00.200851 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320635-2hctc" Sep 30 13:15:00 crc kubenswrapper[4672]: I0930 13:15:00.203139 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Sep 30 13:15:00 crc kubenswrapper[4672]: I0930 13:15:00.203137 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Sep 30 13:15:00 crc kubenswrapper[4672]: I0930 13:15:00.209813 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320635-2hctc"] Sep 30 13:15:00 crc kubenswrapper[4672]: I0930 13:15:00.243745 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j2xcw\" (UniqueName: \"kubernetes.io/projected/ca3926b3-5caf-4f28-90c4-6f0f00210b19-kube-api-access-j2xcw\") pod \"collect-profiles-29320635-2hctc\" (UID: \"ca3926b3-5caf-4f28-90c4-6f0f00210b19\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320635-2hctc" Sep 30 13:15:00 crc kubenswrapper[4672]: I0930 13:15:00.244056 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ca3926b3-5caf-4f28-90c4-6f0f00210b19-config-volume\") pod \"collect-profiles-29320635-2hctc\" (UID: \"ca3926b3-5caf-4f28-90c4-6f0f00210b19\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320635-2hctc" Sep 30 13:15:00 crc kubenswrapper[4672]: I0930 13:15:00.244199 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ca3926b3-5caf-4f28-90c4-6f0f00210b19-secret-volume\") pod \"collect-profiles-29320635-2hctc\" (UID: \"ca3926b3-5caf-4f28-90c4-6f0f00210b19\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320635-2hctc" Sep 30 13:15:00 crc kubenswrapper[4672]: I0930 13:15:00.346669 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j2xcw\" (UniqueName: \"kubernetes.io/projected/ca3926b3-5caf-4f28-90c4-6f0f00210b19-kube-api-access-j2xcw\") pod \"collect-profiles-29320635-2hctc\" (UID: \"ca3926b3-5caf-4f28-90c4-6f0f00210b19\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320635-2hctc" Sep 30 13:15:00 crc kubenswrapper[4672]: I0930 13:15:00.346808 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ca3926b3-5caf-4f28-90c4-6f0f00210b19-config-volume\") pod \"collect-profiles-29320635-2hctc\" (UID: \"ca3926b3-5caf-4f28-90c4-6f0f00210b19\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320635-2hctc" Sep 30 13:15:00 crc kubenswrapper[4672]: I0930 13:15:00.346888 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ca3926b3-5caf-4f28-90c4-6f0f00210b19-secret-volume\") pod \"collect-profiles-29320635-2hctc\" (UID: \"ca3926b3-5caf-4f28-90c4-6f0f00210b19\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320635-2hctc" Sep 30 13:15:00 crc kubenswrapper[4672]: I0930 13:15:00.348081 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ca3926b3-5caf-4f28-90c4-6f0f00210b19-config-volume\") pod 
\"collect-profiles-29320635-2hctc\" (UID: \"ca3926b3-5caf-4f28-90c4-6f0f00210b19\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320635-2hctc" Sep 30 13:15:00 crc kubenswrapper[4672]: I0930 13:15:00.361001 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ca3926b3-5caf-4f28-90c4-6f0f00210b19-secret-volume\") pod \"collect-profiles-29320635-2hctc\" (UID: \"ca3926b3-5caf-4f28-90c4-6f0f00210b19\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320635-2hctc" Sep 30 13:15:00 crc kubenswrapper[4672]: I0930 13:15:00.364155 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j2xcw\" (UniqueName: \"kubernetes.io/projected/ca3926b3-5caf-4f28-90c4-6f0f00210b19-kube-api-access-j2xcw\") pod \"collect-profiles-29320635-2hctc\" (UID: \"ca3926b3-5caf-4f28-90c4-6f0f00210b19\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320635-2hctc" Sep 30 13:15:00 crc kubenswrapper[4672]: I0930 13:15:00.523980 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320635-2hctc" Sep 30 13:15:01 crc kubenswrapper[4672]: I0930 13:15:01.046901 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320635-2hctc"] Sep 30 13:15:01 crc kubenswrapper[4672]: I0930 13:15:01.547680 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320635-2hctc" event={"ID":"ca3926b3-5caf-4f28-90c4-6f0f00210b19","Type":"ContainerStarted","Data":"9dad776a5c2f5b8a01afd8d87ed1e1d9d22f35d0eb853ff6adb4c92b7541abcd"} Sep 30 13:15:01 crc kubenswrapper[4672]: I0930 13:15:01.547972 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320635-2hctc" event={"ID":"ca3926b3-5caf-4f28-90c4-6f0f00210b19","Type":"ContainerStarted","Data":"916b9c6746499e5fab6599eb835428921d18f3639cb3557c86c80d96b3a78f28"} Sep 30 13:15:01 crc kubenswrapper[4672]: I0930 13:15:01.563693 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29320635-2hctc" podStartSLOduration=1.563671975 podStartE2EDuration="1.563671975s" podCreationTimestamp="2025-09-30 13:15:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:15:01.559682544 +0000 UTC m=+3192.828920190" watchObservedRunningTime="2025-09-30 13:15:01.563671975 +0000 UTC m=+3192.832909621" Sep 30 13:15:02 crc kubenswrapper[4672]: I0930 13:15:02.561959 4672 generic.go:334] "Generic (PLEG): container finished" podID="ca3926b3-5caf-4f28-90c4-6f0f00210b19" containerID="9dad776a5c2f5b8a01afd8d87ed1e1d9d22f35d0eb853ff6adb4c92b7541abcd" exitCode=0 Sep 30 13:15:02 crc kubenswrapper[4672]: I0930 13:15:02.562020 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320635-2hctc" event={"ID":"ca3926b3-5caf-4f28-90c4-6f0f00210b19","Type":"ContainerDied","Data":"9dad776a5c2f5b8a01afd8d87ed1e1d9d22f35d0eb853ff6adb4c92b7541abcd"} Sep 30 13:15:03 crc kubenswrapper[4672]: I0930 13:15:03.956174 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320635-2hctc" Sep 30 13:15:04 crc kubenswrapper[4672]: I0930 13:15:04.055829 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ca3926b3-5caf-4f28-90c4-6f0f00210b19-secret-volume\") pod \"ca3926b3-5caf-4f28-90c4-6f0f00210b19\" (UID: \"ca3926b3-5caf-4f28-90c4-6f0f00210b19\") " Sep 30 13:15:04 crc kubenswrapper[4672]: I0930 13:15:04.055923 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ca3926b3-5caf-4f28-90c4-6f0f00210b19-config-volume\") pod \"ca3926b3-5caf-4f28-90c4-6f0f00210b19\" (UID: \"ca3926b3-5caf-4f28-90c4-6f0f00210b19\") " Sep 30 13:15:04 crc kubenswrapper[4672]: I0930 13:15:04.056036 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j2xcw\" (UniqueName: \"kubernetes.io/projected/ca3926b3-5caf-4f28-90c4-6f0f00210b19-kube-api-access-j2xcw\") pod \"ca3926b3-5caf-4f28-90c4-6f0f00210b19\" (UID: \"ca3926b3-5caf-4f28-90c4-6f0f00210b19\") " Sep 30 13:15:04 crc kubenswrapper[4672]: I0930 13:15:04.056679 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ca3926b3-5caf-4f28-90c4-6f0f00210b19-config-volume" (OuterVolumeSpecName: "config-volume") pod "ca3926b3-5caf-4f28-90c4-6f0f00210b19" (UID: "ca3926b3-5caf-4f28-90c4-6f0f00210b19"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:15:04 crc kubenswrapper[4672]: I0930 13:15:04.061915 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca3926b3-5caf-4f28-90c4-6f0f00210b19-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "ca3926b3-5caf-4f28-90c4-6f0f00210b19" (UID: "ca3926b3-5caf-4f28-90c4-6f0f00210b19"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:15:04 crc kubenswrapper[4672]: I0930 13:15:04.062069 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca3926b3-5caf-4f28-90c4-6f0f00210b19-kube-api-access-j2xcw" (OuterVolumeSpecName: "kube-api-access-j2xcw") pod "ca3926b3-5caf-4f28-90c4-6f0f00210b19" (UID: "ca3926b3-5caf-4f28-90c4-6f0f00210b19"). InnerVolumeSpecName "kube-api-access-j2xcw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:15:04 crc kubenswrapper[4672]: I0930 13:15:04.158058 4672 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ca3926b3-5caf-4f28-90c4-6f0f00210b19-secret-volume\") on node \"crc\" DevicePath \"\"" Sep 30 13:15:04 crc kubenswrapper[4672]: I0930 13:15:04.158095 4672 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ca3926b3-5caf-4f28-90c4-6f0f00210b19-config-volume\") on node \"crc\" DevicePath \"\"" Sep 30 13:15:04 crc kubenswrapper[4672]: I0930 13:15:04.158105 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j2xcw\" (UniqueName: \"kubernetes.io/projected/ca3926b3-5caf-4f28-90c4-6f0f00210b19-kube-api-access-j2xcw\") on node \"crc\" DevicePath \"\"" Sep 30 13:15:04 crc kubenswrapper[4672]: I0930 13:15:04.586044 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320635-2hctc" event={"ID":"ca3926b3-5caf-4f28-90c4-6f0f00210b19","Type":"ContainerDied","Data":"916b9c6746499e5fab6599eb835428921d18f3639cb3557c86c80d96b3a78f28"} Sep 30 13:15:04 crc kubenswrapper[4672]: I0930 13:15:04.586405 4672 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="916b9c6746499e5fab6599eb835428921d18f3639cb3557c86c80d96b3a78f28" Sep 30 13:15:04 crc kubenswrapper[4672]: I0930 13:15:04.586314 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320635-2hctc" Sep 30 13:15:04 crc kubenswrapper[4672]: I0930 13:15:04.645221 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320590-97krm"] Sep 30 13:15:04 crc kubenswrapper[4672]: I0930 13:15:04.657324 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320590-97krm"] Sep 30 13:15:05 crc kubenswrapper[4672]: I0930 13:15:05.434403 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1ab57cdf-bd45-48f0-97e9-e6cad9bb6554" path="/var/lib/kubelet/pods/1ab57cdf-bd45-48f0-97e9-e6cad9bb6554/volumes" Sep 30 13:15:11 crc kubenswrapper[4672]: I0930 13:15:11.418120 4672 scope.go:117] "RemoveContainer" containerID="04c1596c42881000cc14ffb9f09c27a0764c7f36327532142c7052e23cf117ba" Sep 30 13:15:11 crc kubenswrapper[4672]: E0930 13:15:11.419216 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dpqrd_openshift-machine-config-operator(95794952-d817-48f2-8956-f7a310f8d1d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" Sep 30 13:15:24 crc kubenswrapper[4672]: I0930 13:15:24.418093 4672 scope.go:117] "RemoveContainer" containerID="04c1596c42881000cc14ffb9f09c27a0764c7f36327532142c7052e23cf117ba" Sep 30 13:15:24 crc kubenswrapper[4672]: E0930 13:15:24.419164 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dpqrd_openshift-machine-config-operator(95794952-d817-48f2-8956-f7a310f8d1d9)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" Sep 30 13:15:38 crc kubenswrapper[4672]: I0930 13:15:38.417289 4672 scope.go:117] "RemoveContainer" containerID="04c1596c42881000cc14ffb9f09c27a0764c7f36327532142c7052e23cf117ba" Sep 30 13:15:38 crc kubenswrapper[4672]: E0930 13:15:38.418287 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dpqrd_openshift-machine-config-operator(95794952-d817-48f2-8956-f7a310f8d1d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" Sep 30 13:15:52 crc kubenswrapper[4672]: I0930 13:15:52.417319 4672 scope.go:117] "RemoveContainer" containerID="04c1596c42881000cc14ffb9f09c27a0764c7f36327532142c7052e23cf117ba" Sep 30 13:15:52 crc kubenswrapper[4672]: E0930 13:15:52.418426 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dpqrd_openshift-machine-config-operator(95794952-d817-48f2-8956-f7a310f8d1d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" Sep 30 13:16:02 crc kubenswrapper[4672]: I0930 13:16:02.108482 4672 scope.go:117] "RemoveContainer" containerID="a8e7b7f94558a2836baa398db8c9e6afc179de0765c52d96e5a6e7a0d16e67d5" Sep 30 13:16:05 crc kubenswrapper[4672]: I0930 13:16:05.417196 4672 scope.go:117] "RemoveContainer" containerID="04c1596c42881000cc14ffb9f09c27a0764c7f36327532142c7052e23cf117ba" Sep 30 13:16:05 crc kubenswrapper[4672]: E0930 13:16:05.418352 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dpqrd_openshift-machine-config-operator(95794952-d817-48f2-8956-f7a310f8d1d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" Sep 30 13:16:18 crc kubenswrapper[4672]: I0930 13:16:18.417417 4672 scope.go:117] "RemoveContainer" containerID="04c1596c42881000cc14ffb9f09c27a0764c7f36327532142c7052e23cf117ba" Sep 30 13:16:18 crc kubenswrapper[4672]: E0930 13:16:18.418698 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dpqrd_openshift-machine-config-operator(95794952-d817-48f2-8956-f7a310f8d1d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" Sep 30 13:16:32 crc kubenswrapper[4672]: I0930 13:16:32.417501 4672 scope.go:117] "RemoveContainer" containerID="04c1596c42881000cc14ffb9f09c27a0764c7f36327532142c7052e23cf117ba" Sep 30 13:16:32 crc kubenswrapper[4672]: E0930 13:16:32.418443 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dpqrd_openshift-machine-config-operator(95794952-d817-48f2-8956-f7a310f8d1d9)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" Sep 30 13:16:45 crc kubenswrapper[4672]: I0930 13:16:45.416885 4672 scope.go:117] "RemoveContainer" containerID="04c1596c42881000cc14ffb9f09c27a0764c7f36327532142c7052e23cf117ba" Sep 30 13:16:45 crc kubenswrapper[4672]: E0930 13:16:45.417738 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dpqrd_openshift-machine-config-operator(95794952-d817-48f2-8956-f7a310f8d1d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" Sep 30 13:16:57 crc kubenswrapper[4672]: I0930 13:16:57.417348 4672 scope.go:117] "RemoveContainer" containerID="04c1596c42881000cc14ffb9f09c27a0764c7f36327532142c7052e23cf117ba" Sep 30 13:16:57 crc kubenswrapper[4672]: E0930 13:16:57.418434 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dpqrd_openshift-machine-config-operator(95794952-d817-48f2-8956-f7a310f8d1d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" Sep 30 13:17:09 crc kubenswrapper[4672]: I0930 13:17:09.424811 4672 scope.go:117] "RemoveContainer" containerID="04c1596c42881000cc14ffb9f09c27a0764c7f36327532142c7052e23cf117ba" Sep 30 13:17:09 crc kubenswrapper[4672]: E0930 13:17:09.425915 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dpqrd_openshift-machine-config-operator(95794952-d817-48f2-8956-f7a310f8d1d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" Sep 30 13:17:21 crc kubenswrapper[4672]: I0930 13:17:21.417047 4672 scope.go:117] "RemoveContainer" containerID="04c1596c42881000cc14ffb9f09c27a0764c7f36327532142c7052e23cf117ba" Sep 30 13:17:21 crc kubenswrapper[4672]: E0930 13:17:21.418425 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dpqrd_openshift-machine-config-operator(95794952-d817-48f2-8956-f7a310f8d1d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" Sep 30 13:17:32 crc kubenswrapper[4672]: I0930 13:17:32.418017 4672 scope.go:117] "RemoveContainer" containerID="04c1596c42881000cc14ffb9f09c27a0764c7f36327532142c7052e23cf117ba" Sep 30 13:17:32 crc kubenswrapper[4672]: E0930 13:17:32.419094 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dpqrd_openshift-machine-config-operator(95794952-d817-48f2-8956-f7a310f8d1d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" Sep 30 13:17:46 crc kubenswrapper[4672]: I0930 13:17:46.417695 4672 
scope.go:117] "RemoveContainer" containerID="04c1596c42881000cc14ffb9f09c27a0764c7f36327532142c7052e23cf117ba" Sep 30 13:17:46 crc kubenswrapper[4672]: E0930 13:17:46.418658 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dpqrd_openshift-machine-config-operator(95794952-d817-48f2-8956-f7a310f8d1d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" Sep 30 13:17:50 crc kubenswrapper[4672]: I0930 13:17:50.327391 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-5d9n6"] Sep 30 13:17:50 crc kubenswrapper[4672]: E0930 13:17:50.328860 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca3926b3-5caf-4f28-90c4-6f0f00210b19" containerName="collect-profiles" Sep 30 13:17:50 crc kubenswrapper[4672]: I0930 13:17:50.328880 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca3926b3-5caf-4f28-90c4-6f0f00210b19" containerName="collect-profiles" Sep 30 13:17:50 crc kubenswrapper[4672]: I0930 13:17:50.329127 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca3926b3-5caf-4f28-90c4-6f0f00210b19" containerName="collect-profiles" Sep 30 13:17:50 crc kubenswrapper[4672]: I0930 13:17:50.331032 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5d9n6" Sep 30 13:17:50 crc kubenswrapper[4672]: I0930 13:17:50.364178 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5d9n6"] Sep 30 13:17:50 crc kubenswrapper[4672]: I0930 13:17:50.483548 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sgbxq\" (UniqueName: \"kubernetes.io/projected/9b87efb8-9e79-41f1-98f3-2d1d464cd758-kube-api-access-sgbxq\") pod \"certified-operators-5d9n6\" (UID: \"9b87efb8-9e79-41f1-98f3-2d1d464cd758\") " pod="openshift-marketplace/certified-operators-5d9n6" Sep 30 13:17:50 crc kubenswrapper[4672]: I0930 13:17:50.483661 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9b87efb8-9e79-41f1-98f3-2d1d464cd758-utilities\") pod \"certified-operators-5d9n6\" (UID: \"9b87efb8-9e79-41f1-98f3-2d1d464cd758\") " pod="openshift-marketplace/certified-operators-5d9n6" Sep 30 13:17:50 crc kubenswrapper[4672]: I0930 13:17:50.483706 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9b87efb8-9e79-41f1-98f3-2d1d464cd758-catalog-content\") pod \"certified-operators-5d9n6\" (UID: \"9b87efb8-9e79-41f1-98f3-2d1d464cd758\") " pod="openshift-marketplace/certified-operators-5d9n6" Sep 30 13:17:50 crc kubenswrapper[4672]: I0930 13:17:50.585208 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sgbxq\" (UniqueName: \"kubernetes.io/projected/9b87efb8-9e79-41f1-98f3-2d1d464cd758-kube-api-access-sgbxq\") pod \"certified-operators-5d9n6\" (UID: \"9b87efb8-9e79-41f1-98f3-2d1d464cd758\") " pod="openshift-marketplace/certified-operators-5d9n6" Sep 30 13:17:50 crc kubenswrapper[4672]: I0930 13:17:50.585320 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9b87efb8-9e79-41f1-98f3-2d1d464cd758-utilities\") pod \"certified-operators-5d9n6\" (UID: \"9b87efb8-9e79-41f1-98f3-2d1d464cd758\") " pod="openshift-marketplace/certified-operators-5d9n6" Sep 30 13:17:50 crc kubenswrapper[4672]: I0930 13:17:50.585356 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9b87efb8-9e79-41f1-98f3-2d1d464cd758-catalog-content\") pod \"certified-operators-5d9n6\" (UID: \"9b87efb8-9e79-41f1-98f3-2d1d464cd758\") " pod="openshift-marketplace/certified-operators-5d9n6" Sep 30 13:17:50 crc kubenswrapper[4672]: I0930 13:17:50.585788 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9b87efb8-9e79-41f1-98f3-2d1d464cd758-catalog-content\") pod \"certified-operators-5d9n6\" (UID: \"9b87efb8-9e79-41f1-98f3-2d1d464cd758\") " pod="openshift-marketplace/certified-operators-5d9n6" Sep 30 13:17:50 crc kubenswrapper[4672]: I0930 13:17:50.585976 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9b87efb8-9e79-41f1-98f3-2d1d464cd758-utilities\") pod \"certified-operators-5d9n6\" (UID: \"9b87efb8-9e79-41f1-98f3-2d1d464cd758\") " pod="openshift-marketplace/certified-operators-5d9n6" Sep 30 13:17:50 crc kubenswrapper[4672]: I0930 13:17:50.624281 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sgbxq\" (UniqueName: \"kubernetes.io/projected/9b87efb8-9e79-41f1-98f3-2d1d464cd758-kube-api-access-sgbxq\") pod \"certified-operators-5d9n6\" (UID: \"9b87efb8-9e79-41f1-98f3-2d1d464cd758\") " pod="openshift-marketplace/certified-operators-5d9n6" Sep 30 13:17:50 crc kubenswrapper[4672]: I0930 13:17:50.660801 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-5d9n6" Sep 30 13:17:51 crc kubenswrapper[4672]: I0930 13:17:51.165701 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5d9n6"] Sep 30 13:17:51 crc kubenswrapper[4672]: I0930 13:17:51.363566 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5d9n6" event={"ID":"9b87efb8-9e79-41f1-98f3-2d1d464cd758","Type":"ContainerStarted","Data":"4fa395f88dc4ad87c1eb7d0cfbd6a2294f581202ded8a6179b39282be4600b83"} Sep 30 13:17:52 crc kubenswrapper[4672]: I0930 13:17:52.377418 4672 generic.go:334] "Generic (PLEG): container finished" podID="9b87efb8-9e79-41f1-98f3-2d1d464cd758" containerID="f4b1fb1a4c7888d1f4dc11fb3074b8fb1f7f0c6a079bb6622e7ec284297d0804" exitCode=0 Sep 30 13:17:52 crc kubenswrapper[4672]: I0930 13:17:52.377489 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5d9n6" event={"ID":"9b87efb8-9e79-41f1-98f3-2d1d464cd758","Type":"ContainerDied","Data":"f4b1fb1a4c7888d1f4dc11fb3074b8fb1f7f0c6a079bb6622e7ec284297d0804"} Sep 30 13:17:52 crc kubenswrapper[4672]: I0930 13:17:52.382763 4672 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Sep 30 13:17:54 crc kubenswrapper[4672]: I0930 13:17:54.408476 4672 generic.go:334] "Generic (PLEG): container finished" podID="9b87efb8-9e79-41f1-98f3-2d1d464cd758" containerID="1d927bf714aab2ab6e49f67defc6c292419d3db8136165b1bdc509ede8e43d0e" exitCode=0 Sep 30 13:17:54 crc kubenswrapper[4672]: I0930 13:17:54.408606 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5d9n6" event={"ID":"9b87efb8-9e79-41f1-98f3-2d1d464cd758","Type":"ContainerDied","Data":"1d927bf714aab2ab6e49f67defc6c292419d3db8136165b1bdc509ede8e43d0e"} Sep 30 13:17:55 crc kubenswrapper[4672]: I0930 13:17:55.433757 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5d9n6" event={"ID":"9b87efb8-9e79-41f1-98f3-2d1d464cd758","Type":"ContainerStarted","Data":"5b619834662317eae8220e64fc9dd137ef4aa21b67900c08f1d25bc8c34c6f52"} Sep 30 13:17:55 crc kubenswrapper[4672]: I0930 13:17:55.443685 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-5d9n6" podStartSLOduration=2.989886621 podStartE2EDuration="5.443660964s" podCreationTimestamp="2025-09-30 13:17:50 +0000 UTC" firstStartedPulling="2025-09-30 13:17:52.382446532 +0000 UTC m=+3363.651684178" lastFinishedPulling="2025-09-30 13:17:54.836220835 +0000 UTC m=+3366.105458521" observedRunningTime="2025-09-30 13:17:55.442987056 +0000 UTC m=+3366.712224702" watchObservedRunningTime="2025-09-30 13:17:55.443660964 +0000 UTC m=+3366.712898620" Sep 30 13:18:00 crc kubenswrapper[4672]: I0930 13:18:00.661574 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-5d9n6" Sep 30 13:18:00 crc kubenswrapper[4672]: I0930 13:18:00.662209 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-5d9n6" Sep 30 13:18:00 crc kubenswrapper[4672]: I0930 13:18:00.726567 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-5d9n6" Sep 30 13:18:01 crc kubenswrapper[4672]: I0930 13:18:01.417408 4672 scope.go:117] "RemoveContainer" 
containerID="04c1596c42881000cc14ffb9f09c27a0764c7f36327532142c7052e23cf117ba" Sep 30 13:18:01 crc kubenswrapper[4672]: E0930 13:18:01.418471 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dpqrd_openshift-machine-config-operator(95794952-d817-48f2-8956-f7a310f8d1d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" Sep 30 13:18:01 crc kubenswrapper[4672]: I0930 13:18:01.540216 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-5d9n6" Sep 30 13:18:01 crc kubenswrapper[4672]: I0930 13:18:01.598461 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-5d9n6"] Sep 30 13:18:03 crc kubenswrapper[4672]: I0930 13:18:03.382057 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-469pv"] Sep 30 13:18:03 crc kubenswrapper[4672]: I0930 13:18:03.385126 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-469pv" Sep 30 13:18:03 crc kubenswrapper[4672]: I0930 13:18:03.405106 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-469pv"] Sep 30 13:18:03 crc kubenswrapper[4672]: I0930 13:18:03.459023 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4f83d063-1c3e-4db8-aaab-2a159cf19e70-utilities\") pod \"redhat-marketplace-469pv\" (UID: \"4f83d063-1c3e-4db8-aaab-2a159cf19e70\") " pod="openshift-marketplace/redhat-marketplace-469pv" Sep 30 13:18:03 crc kubenswrapper[4672]: I0930 13:18:03.459096 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4f83d063-1c3e-4db8-aaab-2a159cf19e70-catalog-content\") pod \"redhat-marketplace-469pv\" (UID: \"4f83d063-1c3e-4db8-aaab-2a159cf19e70\") " pod="openshift-marketplace/redhat-marketplace-469pv" Sep 30 13:18:03 crc kubenswrapper[4672]: I0930 13:18:03.459449 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wvmlx\" (UniqueName: \"kubernetes.io/projected/4f83d063-1c3e-4db8-aaab-2a159cf19e70-kube-api-access-wvmlx\") pod \"redhat-marketplace-469pv\" (UID: \"4f83d063-1c3e-4db8-aaab-2a159cf19e70\") " pod="openshift-marketplace/redhat-marketplace-469pv" Sep 30 13:18:03 crc kubenswrapper[4672]: I0930 13:18:03.507277 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-5d9n6" podUID="9b87efb8-9e79-41f1-98f3-2d1d464cd758" containerName="registry-server" containerID="cri-o://5b619834662317eae8220e64fc9dd137ef4aa21b67900c08f1d25bc8c34c6f52" gracePeriod=2 Sep 30 13:18:03 crc kubenswrapper[4672]: I0930 13:18:03.561911 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wvmlx\" (UniqueName: \"kubernetes.io/projected/4f83d063-1c3e-4db8-aaab-2a159cf19e70-kube-api-access-wvmlx\") pod \"redhat-marketplace-469pv\" (UID: \"4f83d063-1c3e-4db8-aaab-2a159cf19e70\") " pod="openshift-marketplace/redhat-marketplace-469pv" Sep 30 13:18:03 crc kubenswrapper[4672]: I0930 13:18:03.562091 
Sep 30 13:18:01 crc kubenswrapper[4672]: I0930 13:18:01.540216 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-5d9n6"
Sep 30 13:18:01 crc kubenswrapper[4672]: I0930 13:18:01.598461 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-5d9n6"]
Sep 30 13:18:03 crc kubenswrapper[4672]: I0930 13:18:03.382057 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-469pv"]
Sep 30 13:18:03 crc kubenswrapper[4672]: I0930 13:18:03.385126 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-469pv"
Sep 30 13:18:03 crc kubenswrapper[4672]: I0930 13:18:03.405106 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-469pv"]
Sep 30 13:18:03 crc kubenswrapper[4672]: I0930 13:18:03.459023 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4f83d063-1c3e-4db8-aaab-2a159cf19e70-utilities\") pod \"redhat-marketplace-469pv\" (UID: \"4f83d063-1c3e-4db8-aaab-2a159cf19e70\") " pod="openshift-marketplace/redhat-marketplace-469pv"
Sep 30 13:18:03 crc kubenswrapper[4672]: I0930 13:18:03.459096 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4f83d063-1c3e-4db8-aaab-2a159cf19e70-catalog-content\") pod \"redhat-marketplace-469pv\" (UID: \"4f83d063-1c3e-4db8-aaab-2a159cf19e70\") " pod="openshift-marketplace/redhat-marketplace-469pv"
Sep 30 13:18:03 crc kubenswrapper[4672]: I0930 13:18:03.459449 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wvmlx\" (UniqueName: \"kubernetes.io/projected/4f83d063-1c3e-4db8-aaab-2a159cf19e70-kube-api-access-wvmlx\") pod \"redhat-marketplace-469pv\" (UID: \"4f83d063-1c3e-4db8-aaab-2a159cf19e70\") " pod="openshift-marketplace/redhat-marketplace-469pv"
Sep 30 13:18:03 crc kubenswrapper[4672]: I0930 13:18:03.507277 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-5d9n6" podUID="9b87efb8-9e79-41f1-98f3-2d1d464cd758" containerName="registry-server" containerID="cri-o://5b619834662317eae8220e64fc9dd137ef4aa21b67900c08f1d25bc8c34c6f52" gracePeriod=2
Sep 30 13:18:03 crc kubenswrapper[4672]: I0930 13:18:03.561911 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wvmlx\" (UniqueName: \"kubernetes.io/projected/4f83d063-1c3e-4db8-aaab-2a159cf19e70-kube-api-access-wvmlx\") pod \"redhat-marketplace-469pv\" (UID: \"4f83d063-1c3e-4db8-aaab-2a159cf19e70\") " pod="openshift-marketplace/redhat-marketplace-469pv"
Sep 30 13:18:03 crc kubenswrapper[4672]: I0930 13:18:03.562091 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4f83d063-1c3e-4db8-aaab-2a159cf19e70-utilities\") pod \"redhat-marketplace-469pv\" (UID: \"4f83d063-1c3e-4db8-aaab-2a159cf19e70\") " pod="openshift-marketplace/redhat-marketplace-469pv"
Sep 30 13:18:03 crc kubenswrapper[4672]: I0930 13:18:03.562130 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4f83d063-1c3e-4db8-aaab-2a159cf19e70-catalog-content\") pod \"redhat-marketplace-469pv\" (UID: \"4f83d063-1c3e-4db8-aaab-2a159cf19e70\") " pod="openshift-marketplace/redhat-marketplace-469pv"
Sep 30 13:18:03 crc kubenswrapper[4672]: I0930 13:18:03.562627 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4f83d063-1c3e-4db8-aaab-2a159cf19e70-catalog-content\") pod \"redhat-marketplace-469pv\" (UID: \"4f83d063-1c3e-4db8-aaab-2a159cf19e70\") " pod="openshift-marketplace/redhat-marketplace-469pv"
Sep 30 13:18:03 crc kubenswrapper[4672]: I0930 13:18:03.562733 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4f83d063-1c3e-4db8-aaab-2a159cf19e70-utilities\") pod \"redhat-marketplace-469pv\" (UID: \"4f83d063-1c3e-4db8-aaab-2a159cf19e70\") " pod="openshift-marketplace/redhat-marketplace-469pv"
Sep 30 13:18:03 crc kubenswrapper[4672]: I0930 13:18:03.584113 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wvmlx\" (UniqueName: \"kubernetes.io/projected/4f83d063-1c3e-4db8-aaab-2a159cf19e70-kube-api-access-wvmlx\") pod \"redhat-marketplace-469pv\" (UID: \"4f83d063-1c3e-4db8-aaab-2a159cf19e70\") " pod="openshift-marketplace/redhat-marketplace-469pv"
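
All three catalog-pod volumes here are node-local: utilities and catalog-content are emptyDirs, and kube-api-access-wvmlx is the projected service-account token volume, which is why attach verification and SetUp complete within milliseconds and no external attach is involved. Roughly the volume stanza these records imply, written with client-go types; the real pod spec is generated by the marketplace operator, so treat this as a reader's reconstruction:

    package sketch

    import corev1 "k8s.io/api/core/v1"

    // Approximate volume list behind the Mount/SetUp records above: two
    // emptyDirs plus the projected token volume that backs
    // kube-api-access-wvmlx. Field values are inferred, not copied from
    // the generated spec.
    var volumes = []corev1.Volume{
        {Name: "utilities", VolumeSource: corev1.VolumeSource{EmptyDir: &corev1.EmptyDirVolumeSource{}}},
        {Name: "catalog-content", VolumeSource: corev1.VolumeSource{EmptyDir: &corev1.EmptyDirVolumeSource{}}},
        {Name: "kube-api-access-wvmlx", VolumeSource: corev1.VolumeSource{
            Projected: &corev1.ProjectedVolumeSource{
                Sources: []corev1.VolumeProjection{
                    {ServiceAccountToken: &corev1.ServiceAccountTokenProjection{Path: "token"}},
                },
            },
        }},
    }
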
Sep 30 13:18:03 crc kubenswrapper[4672]: I0930 13:18:03.728239 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-469pv"
Sep 30 13:18:04 crc kubenswrapper[4672]: I0930 13:18:04.031534 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5d9n6"
Sep 30 13:18:04 crc kubenswrapper[4672]: I0930 13:18:04.176068 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sgbxq\" (UniqueName: \"kubernetes.io/projected/9b87efb8-9e79-41f1-98f3-2d1d464cd758-kube-api-access-sgbxq\") pod \"9b87efb8-9e79-41f1-98f3-2d1d464cd758\" (UID: \"9b87efb8-9e79-41f1-98f3-2d1d464cd758\") "
Sep 30 13:18:04 crc kubenswrapper[4672]: I0930 13:18:04.176155 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9b87efb8-9e79-41f1-98f3-2d1d464cd758-utilities\") pod \"9b87efb8-9e79-41f1-98f3-2d1d464cd758\" (UID: \"9b87efb8-9e79-41f1-98f3-2d1d464cd758\") "
Sep 30 13:18:04 crc kubenswrapper[4672]: I0930 13:18:04.176227 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9b87efb8-9e79-41f1-98f3-2d1d464cd758-catalog-content\") pod \"9b87efb8-9e79-41f1-98f3-2d1d464cd758\" (UID: \"9b87efb8-9e79-41f1-98f3-2d1d464cd758\") "
Sep 30 13:18:04 crc kubenswrapper[4672]: I0930 13:18:04.177022 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9b87efb8-9e79-41f1-98f3-2d1d464cd758-utilities" (OuterVolumeSpecName: "utilities") pod "9b87efb8-9e79-41f1-98f3-2d1d464cd758" (UID: "9b87efb8-9e79-41f1-98f3-2d1d464cd758"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 13:18:04 crc kubenswrapper[4672]: I0930 13:18:04.185579 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b87efb8-9e79-41f1-98f3-2d1d464cd758-kube-api-access-sgbxq" (OuterVolumeSpecName: "kube-api-access-sgbxq") pod "9b87efb8-9e79-41f1-98f3-2d1d464cd758" (UID: "9b87efb8-9e79-41f1-98f3-2d1d464cd758"). InnerVolumeSpecName "kube-api-access-sgbxq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 13:18:04 crc kubenswrapper[4672]: I0930 13:18:04.222970 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9b87efb8-9e79-41f1-98f3-2d1d464cd758-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9b87efb8-9e79-41f1-98f3-2d1d464cd758" (UID: "9b87efb8-9e79-41f1-98f3-2d1d464cd758"). InnerVolumeSpecName "catalog-content".
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 13:18:04 crc kubenswrapper[4672]: I0930 13:18:04.255101 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-469pv"] Sep 30 13:18:04 crc kubenswrapper[4672]: W0930 13:18:04.255104 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4f83d063_1c3e_4db8_aaab_2a159cf19e70.slice/crio-99dad93260007b604b810ff25fb2a2967d15353e46f212bc52e268052f4d8b84 WatchSource:0}: Error finding container 99dad93260007b604b810ff25fb2a2967d15353e46f212bc52e268052f4d8b84: Status 404 returned error can't find the container with id 99dad93260007b604b810ff25fb2a2967d15353e46f212bc52e268052f4d8b84 Sep 30 13:18:04 crc kubenswrapper[4672]: I0930 13:18:04.278917 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sgbxq\" (UniqueName: \"kubernetes.io/projected/9b87efb8-9e79-41f1-98f3-2d1d464cd758-kube-api-access-sgbxq\") on node \"crc\" DevicePath \"\"" Sep 30 13:18:04 crc kubenswrapper[4672]: I0930 13:18:04.278955 4672 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9b87efb8-9e79-41f1-98f3-2d1d464cd758-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 13:18:04 crc kubenswrapper[4672]: I0930 13:18:04.278967 4672 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9b87efb8-9e79-41f1-98f3-2d1d464cd758-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 13:18:04 crc kubenswrapper[4672]: I0930 13:18:04.517731 4672 generic.go:334] "Generic (PLEG): container finished" podID="9b87efb8-9e79-41f1-98f3-2d1d464cd758" containerID="5b619834662317eae8220e64fc9dd137ef4aa21b67900c08f1d25bc8c34c6f52" exitCode=0 Sep 30 13:18:04 crc kubenswrapper[4672]: I0930 13:18:04.517809 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5d9n6" event={"ID":"9b87efb8-9e79-41f1-98f3-2d1d464cd758","Type":"ContainerDied","Data":"5b619834662317eae8220e64fc9dd137ef4aa21b67900c08f1d25bc8c34c6f52"} Sep 30 13:18:04 crc kubenswrapper[4672]: I0930 13:18:04.517808 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-5d9n6"
Sep 30 13:18:04 crc kubenswrapper[4672]: I0930 13:18:04.517846 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5d9n6" event={"ID":"9b87efb8-9e79-41f1-98f3-2d1d464cd758","Type":"ContainerDied","Data":"4fa395f88dc4ad87c1eb7d0cfbd6a2294f581202ded8a6179b39282be4600b83"}
Sep 30 13:18:04 crc kubenswrapper[4672]: I0930 13:18:04.517868 4672 scope.go:117] "RemoveContainer" containerID="5b619834662317eae8220e64fc9dd137ef4aa21b67900c08f1d25bc8c34c6f52"
Sep 30 13:18:04 crc kubenswrapper[4672]: I0930 13:18:04.519710 4672 generic.go:334] "Generic (PLEG): container finished" podID="4f83d063-1c3e-4db8-aaab-2a159cf19e70" containerID="5271131dadff54eca6ac6e6046466b679e6c0a4a4ee1440263c5706e753f69c3" exitCode=0
Sep 30 13:18:04 crc kubenswrapper[4672]: I0930 13:18:04.519741 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-469pv" event={"ID":"4f83d063-1c3e-4db8-aaab-2a159cf19e70","Type":"ContainerDied","Data":"5271131dadff54eca6ac6e6046466b679e6c0a4a4ee1440263c5706e753f69c3"}
Sep 30 13:18:04 crc kubenswrapper[4672]: I0930 13:18:04.519763 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-469pv" event={"ID":"4f83d063-1c3e-4db8-aaab-2a159cf19e70","Type":"ContainerStarted","Data":"99dad93260007b604b810ff25fb2a2967d15353e46f212bc52e268052f4d8b84"}
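
The "Generic (PLEG)" / "SyncLoop (PLEG)" pairs throughout this log are the pod lifecycle event generator at work: a relist notices a container state change in the runtime, and a typed event is queued into the kubelet sync loop, which is exactly what the event={...} payload prints. A paraphrased shape of those events, with field types simplified from kubelet's pleg package:

    package sketch

    // Shape of the events PLEG feeds into the sync loop, mirroring the
    // event={"ID":...,"Type":...,"Data":...} fields printed above.
    // Simplified: the real type uses types.UID for ID.
    type PodLifecycleEventType string

    const (
        ContainerStarted PodLifecycleEventType = "ContainerStarted"
        ContainerDied    PodLifecycleEventType = "ContainerDied"
        ContainerRemoved PodLifecycleEventType = "ContainerRemoved"
    )

    type PodLifecycleEvent struct {
        ID   string                // pod UID, e.g. 9b87efb8-9e79-41f1-98f3-2d1d464cd758
        Type PodLifecycleEventType // what the relist observed
        Data interface{}           // container ID for the Container* events
    }
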
containerID="1d927bf714aab2ab6e49f67defc6c292419d3db8136165b1bdc509ede8e43d0e" Sep 30 13:18:04 crc kubenswrapper[4672]: E0930 13:18:04.604770 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1d927bf714aab2ab6e49f67defc6c292419d3db8136165b1bdc509ede8e43d0e\": container with ID starting with 1d927bf714aab2ab6e49f67defc6c292419d3db8136165b1bdc509ede8e43d0e not found: ID does not exist" containerID="1d927bf714aab2ab6e49f67defc6c292419d3db8136165b1bdc509ede8e43d0e" Sep 30 13:18:04 crc kubenswrapper[4672]: I0930 13:18:04.604791 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d927bf714aab2ab6e49f67defc6c292419d3db8136165b1bdc509ede8e43d0e"} err="failed to get container status \"1d927bf714aab2ab6e49f67defc6c292419d3db8136165b1bdc509ede8e43d0e\": rpc error: code = NotFound desc = could not find container \"1d927bf714aab2ab6e49f67defc6c292419d3db8136165b1bdc509ede8e43d0e\": container with ID starting with 1d927bf714aab2ab6e49f67defc6c292419d3db8136165b1bdc509ede8e43d0e not found: ID does not exist" Sep 30 13:18:04 crc kubenswrapper[4672]: I0930 13:18:04.604805 4672 scope.go:117] "RemoveContainer" containerID="f4b1fb1a4c7888d1f4dc11fb3074b8fb1f7f0c6a079bb6622e7ec284297d0804" Sep 30 13:18:04 crc kubenswrapper[4672]: E0930 13:18:04.605043 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f4b1fb1a4c7888d1f4dc11fb3074b8fb1f7f0c6a079bb6622e7ec284297d0804\": container with ID starting with f4b1fb1a4c7888d1f4dc11fb3074b8fb1f7f0c6a079bb6622e7ec284297d0804 not found: ID does not exist" containerID="f4b1fb1a4c7888d1f4dc11fb3074b8fb1f7f0c6a079bb6622e7ec284297d0804" Sep 30 13:18:04 crc kubenswrapper[4672]: I0930 13:18:04.605104 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f4b1fb1a4c7888d1f4dc11fb3074b8fb1f7f0c6a079bb6622e7ec284297d0804"} err="failed to get container status \"f4b1fb1a4c7888d1f4dc11fb3074b8fb1f7f0c6a079bb6622e7ec284297d0804\": rpc error: code = NotFound desc = could not find container \"f4b1fb1a4c7888d1f4dc11fb3074b8fb1f7f0c6a079bb6622e7ec284297d0804\": container with ID starting with f4b1fb1a4c7888d1f4dc11fb3074b8fb1f7f0c6a079bb6622e7ec284297d0804 not found: ID does not exist" Sep 30 13:18:05 crc kubenswrapper[4672]: I0930 13:18:05.431802 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9b87efb8-9e79-41f1-98f3-2d1d464cd758" path="/var/lib/kubelet/pods/9b87efb8-9e79-41f1-98f3-2d1d464cd758/volumes" Sep 30 13:18:07 crc kubenswrapper[4672]: I0930 13:18:07.565720 4672 generic.go:334] "Generic (PLEG): container finished" podID="4f83d063-1c3e-4db8-aaab-2a159cf19e70" containerID="0b5d2dcf8f4e60b23ee096f41cd70ff5d1ff2a64a440043930c699ede2a5166b" exitCode=0 Sep 30 13:18:07 crc kubenswrapper[4672]: I0930 13:18:07.565808 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-469pv" event={"ID":"4f83d063-1c3e-4db8-aaab-2a159cf19e70","Type":"ContainerDied","Data":"0b5d2dcf8f4e60b23ee096f41cd70ff5d1ff2a64a440043930c699ede2a5166b"} Sep 30 13:18:08 crc kubenswrapper[4672]: I0930 13:18:08.579438 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-469pv" event={"ID":"4f83d063-1c3e-4db8-aaab-2a159cf19e70","Type":"ContainerStarted","Data":"a702e79d3a268ae989f08394f90f8ba2b1fe4acd1c53bd87939601b8f589c8a3"} Sep 30 13:18:08 crc 
Sep 30 13:18:05 crc kubenswrapper[4672]: I0930 13:18:05.431802 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9b87efb8-9e79-41f1-98f3-2d1d464cd758" path="/var/lib/kubelet/pods/9b87efb8-9e79-41f1-98f3-2d1d464cd758/volumes"
Sep 30 13:18:07 crc kubenswrapper[4672]: I0930 13:18:07.565720 4672 generic.go:334] "Generic (PLEG): container finished" podID="4f83d063-1c3e-4db8-aaab-2a159cf19e70" containerID="0b5d2dcf8f4e60b23ee096f41cd70ff5d1ff2a64a440043930c699ede2a5166b" exitCode=0
Sep 30 13:18:07 crc kubenswrapper[4672]: I0930 13:18:07.565808 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-469pv" event={"ID":"4f83d063-1c3e-4db8-aaab-2a159cf19e70","Type":"ContainerDied","Data":"0b5d2dcf8f4e60b23ee096f41cd70ff5d1ff2a64a440043930c699ede2a5166b"}
Sep 30 13:18:08 crc kubenswrapper[4672]: I0930 13:18:08.579438 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-469pv" event={"ID":"4f83d063-1c3e-4db8-aaab-2a159cf19e70","Type":"ContainerStarted","Data":"a702e79d3a268ae989f08394f90f8ba2b1fe4acd1c53bd87939601b8f589c8a3"}
Sep 30 13:18:08 crc kubenswrapper[4672]: I0930 13:18:08.615552 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-469pv" podStartSLOduration=1.971907593 podStartE2EDuration="5.615529958s" podCreationTimestamp="2025-09-30 13:18:03 +0000 UTC" firstStartedPulling="2025-09-30 13:18:04.521355989 +0000 UTC m=+3375.790593655" lastFinishedPulling="2025-09-30 13:18:08.164978374 +0000 UTC m=+3379.434216020" observedRunningTime="2025-09-30 13:18:08.603643016 +0000 UTC m=+3379.872880662" watchObservedRunningTime="2025-09-30 13:18:08.615529958 +0000 UTC m=+3379.884767604"
Sep 30 13:18:12 crc kubenswrapper[4672]: I0930 13:18:12.417865 4672 scope.go:117] "RemoveContainer" containerID="04c1596c42881000cc14ffb9f09c27a0764c7f36327532142c7052e23cf117ba"
Sep 30 13:18:12 crc kubenswrapper[4672]: E0930 13:18:12.419169 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dpqrd_openshift-machine-config-operator(95794952-d817-48f2-8956-f7a310f8d1d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9"
Sep 30 13:18:13 crc kubenswrapper[4672]: I0930 13:18:13.728787 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-469pv"
Sep 30 13:18:13 crc kubenswrapper[4672]: I0930 13:18:13.729204 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-469pv"
Sep 30 13:18:13 crc kubenswrapper[4672]: I0930 13:18:13.814295 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-469pv"
Sep 30 13:18:14 crc kubenswrapper[4672]: I0930 13:18:14.721483 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-469pv"
Sep 30 13:18:14 crc kubenswrapper[4672]: I0930 13:18:14.792678 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-469pv"]
Sep 30 13:18:16 crc kubenswrapper[4672]: I0930 13:18:16.659109 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-469pv" podUID="4f83d063-1c3e-4db8-aaab-2a159cf19e70" containerName="registry-server" containerID="cri-o://a702e79d3a268ae989f08394f90f8ba2b1fe4acd1c53bd87939601b8f589c8a3" gracePeriod=2
Sep 30 13:18:17 crc kubenswrapper[4672]: I0930 13:18:17.134405 4672 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-469pv" Sep 30 13:18:17 crc kubenswrapper[4672]: I0930 13:18:17.254351 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4f83d063-1c3e-4db8-aaab-2a159cf19e70-utilities\") pod \"4f83d063-1c3e-4db8-aaab-2a159cf19e70\" (UID: \"4f83d063-1c3e-4db8-aaab-2a159cf19e70\") " Sep 30 13:18:17 crc kubenswrapper[4672]: I0930 13:18:17.254463 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4f83d063-1c3e-4db8-aaab-2a159cf19e70-catalog-content\") pod \"4f83d063-1c3e-4db8-aaab-2a159cf19e70\" (UID: \"4f83d063-1c3e-4db8-aaab-2a159cf19e70\") " Sep 30 13:18:17 crc kubenswrapper[4672]: I0930 13:18:17.254490 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wvmlx\" (UniqueName: \"kubernetes.io/projected/4f83d063-1c3e-4db8-aaab-2a159cf19e70-kube-api-access-wvmlx\") pod \"4f83d063-1c3e-4db8-aaab-2a159cf19e70\" (UID: \"4f83d063-1c3e-4db8-aaab-2a159cf19e70\") " Sep 30 13:18:17 crc kubenswrapper[4672]: I0930 13:18:17.255329 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4f83d063-1c3e-4db8-aaab-2a159cf19e70-utilities" (OuterVolumeSpecName: "utilities") pod "4f83d063-1c3e-4db8-aaab-2a159cf19e70" (UID: "4f83d063-1c3e-4db8-aaab-2a159cf19e70"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 13:18:17 crc kubenswrapper[4672]: I0930 13:18:17.259795 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f83d063-1c3e-4db8-aaab-2a159cf19e70-kube-api-access-wvmlx" (OuterVolumeSpecName: "kube-api-access-wvmlx") pod "4f83d063-1c3e-4db8-aaab-2a159cf19e70" (UID: "4f83d063-1c3e-4db8-aaab-2a159cf19e70"). InnerVolumeSpecName "kube-api-access-wvmlx". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:18:17 crc kubenswrapper[4672]: I0930 13:18:17.269028 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4f83d063-1c3e-4db8-aaab-2a159cf19e70-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4f83d063-1c3e-4db8-aaab-2a159cf19e70" (UID: "4f83d063-1c3e-4db8-aaab-2a159cf19e70"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 13:18:17 crc kubenswrapper[4672]: I0930 13:18:17.356524 4672 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4f83d063-1c3e-4db8-aaab-2a159cf19e70-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 13:18:17 crc kubenswrapper[4672]: I0930 13:18:17.356558 4672 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4f83d063-1c3e-4db8-aaab-2a159cf19e70-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 13:18:17 crc kubenswrapper[4672]: I0930 13:18:17.356571 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wvmlx\" (UniqueName: \"kubernetes.io/projected/4f83d063-1c3e-4db8-aaab-2a159cf19e70-kube-api-access-wvmlx\") on node \"crc\" DevicePath \"\"" Sep 30 13:18:17 crc kubenswrapper[4672]: I0930 13:18:17.662106 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-vsv6v"] Sep 30 13:18:17 crc kubenswrapper[4672]: E0930 13:18:17.662951 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b87efb8-9e79-41f1-98f3-2d1d464cd758" containerName="extract-utilities" Sep 30 13:18:17 crc kubenswrapper[4672]: I0930 13:18:17.662969 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b87efb8-9e79-41f1-98f3-2d1d464cd758" containerName="extract-utilities" Sep 30 13:18:17 crc kubenswrapper[4672]: E0930 13:18:17.662998 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f83d063-1c3e-4db8-aaab-2a159cf19e70" containerName="extract-utilities" Sep 30 13:18:17 crc kubenswrapper[4672]: I0930 13:18:17.663006 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f83d063-1c3e-4db8-aaab-2a159cf19e70" containerName="extract-utilities" Sep 30 13:18:17 crc kubenswrapper[4672]: E0930 13:18:17.663023 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b87efb8-9e79-41f1-98f3-2d1d464cd758" containerName="extract-content" Sep 30 13:18:17 crc kubenswrapper[4672]: I0930 13:18:17.663031 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b87efb8-9e79-41f1-98f3-2d1d464cd758" containerName="extract-content" Sep 30 13:18:17 crc kubenswrapper[4672]: E0930 13:18:17.663059 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b87efb8-9e79-41f1-98f3-2d1d464cd758" containerName="registry-server" Sep 30 13:18:17 crc kubenswrapper[4672]: I0930 13:18:17.663067 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b87efb8-9e79-41f1-98f3-2d1d464cd758" containerName="registry-server" Sep 30 13:18:17 crc kubenswrapper[4672]: E0930 13:18:17.663082 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f83d063-1c3e-4db8-aaab-2a159cf19e70" containerName="registry-server" Sep 30 13:18:17 crc kubenswrapper[4672]: I0930 13:18:17.663090 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f83d063-1c3e-4db8-aaab-2a159cf19e70" containerName="registry-server" Sep 30 13:18:17 crc kubenswrapper[4672]: E0930 13:18:17.663109 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f83d063-1c3e-4db8-aaab-2a159cf19e70" containerName="extract-content" Sep 30 13:18:17 crc kubenswrapper[4672]: I0930 13:18:17.663117 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f83d063-1c3e-4db8-aaab-2a159cf19e70" containerName="extract-content" Sep 30 13:18:17 crc kubenswrapper[4672]: I0930 13:18:17.663361 4672 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="4f83d063-1c3e-4db8-aaab-2a159cf19e70" containerName="registry-server" Sep 30 13:18:17 crc kubenswrapper[4672]: I0930 13:18:17.663373 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b87efb8-9e79-41f1-98f3-2d1d464cd758" containerName="registry-server" Sep 30 13:18:17 crc kubenswrapper[4672]: I0930 13:18:17.665128 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vsv6v" Sep 30 13:18:17 crc kubenswrapper[4672]: I0930 13:18:17.670631 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d92501c4-c764-4460-9d79-533412011446-catalog-content\") pod \"redhat-operators-vsv6v\" (UID: \"d92501c4-c764-4460-9d79-533412011446\") " pod="openshift-marketplace/redhat-operators-vsv6v" Sep 30 13:18:17 crc kubenswrapper[4672]: I0930 13:18:17.670728 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d92501c4-c764-4460-9d79-533412011446-utilities\") pod \"redhat-operators-vsv6v\" (UID: \"d92501c4-c764-4460-9d79-533412011446\") " pod="openshift-marketplace/redhat-operators-vsv6v" Sep 30 13:18:17 crc kubenswrapper[4672]: I0930 13:18:17.670772 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jmfgp\" (UniqueName: \"kubernetes.io/projected/d92501c4-c764-4460-9d79-533412011446-kube-api-access-jmfgp\") pod \"redhat-operators-vsv6v\" (UID: \"d92501c4-c764-4460-9d79-533412011446\") " pod="openshift-marketplace/redhat-operators-vsv6v" Sep 30 13:18:17 crc kubenswrapper[4672]: I0930 13:18:17.678407 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vsv6v"] Sep 30 13:18:17 crc kubenswrapper[4672]: I0930 13:18:17.692128 4672 generic.go:334] "Generic (PLEG): container finished" podID="4f83d063-1c3e-4db8-aaab-2a159cf19e70" containerID="a702e79d3a268ae989f08394f90f8ba2b1fe4acd1c53bd87939601b8f589c8a3" exitCode=0 Sep 30 13:18:17 crc kubenswrapper[4672]: I0930 13:18:17.692169 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-469pv" event={"ID":"4f83d063-1c3e-4db8-aaab-2a159cf19e70","Type":"ContainerDied","Data":"a702e79d3a268ae989f08394f90f8ba2b1fe4acd1c53bd87939601b8f589c8a3"} Sep 30 13:18:17 crc kubenswrapper[4672]: I0930 13:18:17.692194 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-469pv" event={"ID":"4f83d063-1c3e-4db8-aaab-2a159cf19e70","Type":"ContainerDied","Data":"99dad93260007b604b810ff25fb2a2967d15353e46f212bc52e268052f4d8b84"} Sep 30 13:18:17 crc kubenswrapper[4672]: I0930 13:18:17.692210 4672 scope.go:117] "RemoveContainer" containerID="a702e79d3a268ae989f08394f90f8ba2b1fe4acd1c53bd87939601b8f589c8a3" Sep 30 13:18:17 crc kubenswrapper[4672]: I0930 13:18:17.692366 4672 util.go:48] "No ready sandbox for pod can be found. 
Sep 30 13:18:17 crc kubenswrapper[4672]: I0930 13:18:17.665128 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vsv6v"
Sep 30 13:18:17 crc kubenswrapper[4672]: I0930 13:18:17.670631 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d92501c4-c764-4460-9d79-533412011446-catalog-content\") pod \"redhat-operators-vsv6v\" (UID: \"d92501c4-c764-4460-9d79-533412011446\") " pod="openshift-marketplace/redhat-operators-vsv6v"
Sep 30 13:18:17 crc kubenswrapper[4672]: I0930 13:18:17.670728 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d92501c4-c764-4460-9d79-533412011446-utilities\") pod \"redhat-operators-vsv6v\" (UID: \"d92501c4-c764-4460-9d79-533412011446\") " pod="openshift-marketplace/redhat-operators-vsv6v"
Sep 30 13:18:17 crc kubenswrapper[4672]: I0930 13:18:17.670772 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jmfgp\" (UniqueName: \"kubernetes.io/projected/d92501c4-c764-4460-9d79-533412011446-kube-api-access-jmfgp\") pod \"redhat-operators-vsv6v\" (UID: \"d92501c4-c764-4460-9d79-533412011446\") " pod="openshift-marketplace/redhat-operators-vsv6v"
Sep 30 13:18:17 crc kubenswrapper[4672]: I0930 13:18:17.678407 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vsv6v"]
Sep 30 13:18:17 crc kubenswrapper[4672]: I0930 13:18:17.692128 4672 generic.go:334] "Generic (PLEG): container finished" podID="4f83d063-1c3e-4db8-aaab-2a159cf19e70" containerID="a702e79d3a268ae989f08394f90f8ba2b1fe4acd1c53bd87939601b8f589c8a3" exitCode=0
Sep 30 13:18:17 crc kubenswrapper[4672]: I0930 13:18:17.692169 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-469pv" event={"ID":"4f83d063-1c3e-4db8-aaab-2a159cf19e70","Type":"ContainerDied","Data":"a702e79d3a268ae989f08394f90f8ba2b1fe4acd1c53bd87939601b8f589c8a3"}
Sep 30 13:18:17 crc kubenswrapper[4672]: I0930 13:18:17.692194 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-469pv" event={"ID":"4f83d063-1c3e-4db8-aaab-2a159cf19e70","Type":"ContainerDied","Data":"99dad93260007b604b810ff25fb2a2967d15353e46f212bc52e268052f4d8b84"}
Sep 30 13:18:17 crc kubenswrapper[4672]: I0930 13:18:17.692210 4672 scope.go:117] "RemoveContainer" containerID="a702e79d3a268ae989f08394f90f8ba2b1fe4acd1c53bd87939601b8f589c8a3"
Sep 30 13:18:17 crc kubenswrapper[4672]: I0930 13:18:17.692366 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-469pv"
Sep 30 13:18:17 crc kubenswrapper[4672]: I0930 13:18:17.739030 4672 scope.go:117] "RemoveContainer" containerID="0b5d2dcf8f4e60b23ee096f41cd70ff5d1ff2a64a440043930c699ede2a5166b"
Sep 30 13:18:17 crc kubenswrapper[4672]: I0930 13:18:17.743245 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-469pv"]
Sep 30 13:18:17 crc kubenswrapper[4672]: I0930 13:18:17.751831 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-469pv"]
Sep 30 13:18:17 crc kubenswrapper[4672]: I0930 13:18:17.767312 4672 scope.go:117] "RemoveContainer" containerID="5271131dadff54eca6ac6e6046466b679e6c0a4a4ee1440263c5706e753f69c3"
Sep 30 13:18:17 crc kubenswrapper[4672]: I0930 13:18:17.783077 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jmfgp\" (UniqueName: \"kubernetes.io/projected/d92501c4-c764-4460-9d79-533412011446-kube-api-access-jmfgp\") pod \"redhat-operators-vsv6v\" (UID: \"d92501c4-c764-4460-9d79-533412011446\") " pod="openshift-marketplace/redhat-operators-vsv6v"
Sep 30 13:18:17 crc kubenswrapper[4672]: I0930 13:18:17.783744 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d92501c4-c764-4460-9d79-533412011446-catalog-content\") pod \"redhat-operators-vsv6v\" (UID: \"d92501c4-c764-4460-9d79-533412011446\") " pod="openshift-marketplace/redhat-operators-vsv6v"
Sep 30 13:18:17 crc kubenswrapper[4672]: I0930 13:18:17.784050 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d92501c4-c764-4460-9d79-533412011446-utilities\") pod \"redhat-operators-vsv6v\" (UID: \"d92501c4-c764-4460-9d79-533412011446\") " pod="openshift-marketplace/redhat-operators-vsv6v"
Sep 30 13:18:17 crc kubenswrapper[4672]: I0930 13:18:17.784947 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d92501c4-c764-4460-9d79-533412011446-utilities\") pod \"redhat-operators-vsv6v\" (UID: \"d92501c4-c764-4460-9d79-533412011446\") " pod="openshift-marketplace/redhat-operators-vsv6v"
Sep 30 13:18:17 crc kubenswrapper[4672]: I0930 13:18:17.785630 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d92501c4-c764-4460-9d79-533412011446-catalog-content\") pod \"redhat-operators-vsv6v\" (UID: \"d92501c4-c764-4460-9d79-533412011446\") " pod="openshift-marketplace/redhat-operators-vsv6v"
Sep 30 13:18:17 crc kubenswrapper[4672]: I0930 13:18:17.815623 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jmfgp\" (UniqueName: \"kubernetes.io/projected/d92501c4-c764-4460-9d79-533412011446-kube-api-access-jmfgp\") pod \"redhat-operators-vsv6v\" (UID: \"d92501c4-c764-4460-9d79-533412011446\") " pod="openshift-marketplace/redhat-operators-vsv6v"
Sep 30 13:18:17 crc kubenswrapper[4672]: I0930 13:18:17.897733 4672 scope.go:117] "RemoveContainer" containerID="a702e79d3a268ae989f08394f90f8ba2b1fe4acd1c53bd87939601b8f589c8a3"
Sep 30 13:18:17 crc kubenswrapper[4672]: E0930 13:18:17.899216 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a702e79d3a268ae989f08394f90f8ba2b1fe4acd1c53bd87939601b8f589c8a3\": container with
ID starting with a702e79d3a268ae989f08394f90f8ba2b1fe4acd1c53bd87939601b8f589c8a3 not found: ID does not exist" containerID="a702e79d3a268ae989f08394f90f8ba2b1fe4acd1c53bd87939601b8f589c8a3" Sep 30 13:18:17 crc kubenswrapper[4672]: I0930 13:18:17.899288 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a702e79d3a268ae989f08394f90f8ba2b1fe4acd1c53bd87939601b8f589c8a3"} err="failed to get container status \"a702e79d3a268ae989f08394f90f8ba2b1fe4acd1c53bd87939601b8f589c8a3\": rpc error: code = NotFound desc = could not find container \"a702e79d3a268ae989f08394f90f8ba2b1fe4acd1c53bd87939601b8f589c8a3\": container with ID starting with a702e79d3a268ae989f08394f90f8ba2b1fe4acd1c53bd87939601b8f589c8a3 not found: ID does not exist" Sep 30 13:18:17 crc kubenswrapper[4672]: I0930 13:18:17.899322 4672 scope.go:117] "RemoveContainer" containerID="0b5d2dcf8f4e60b23ee096f41cd70ff5d1ff2a64a440043930c699ede2a5166b" Sep 30 13:18:17 crc kubenswrapper[4672]: E0930 13:18:17.899780 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0b5d2dcf8f4e60b23ee096f41cd70ff5d1ff2a64a440043930c699ede2a5166b\": container with ID starting with 0b5d2dcf8f4e60b23ee096f41cd70ff5d1ff2a64a440043930c699ede2a5166b not found: ID does not exist" containerID="0b5d2dcf8f4e60b23ee096f41cd70ff5d1ff2a64a440043930c699ede2a5166b" Sep 30 13:18:17 crc kubenswrapper[4672]: I0930 13:18:17.899847 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b5d2dcf8f4e60b23ee096f41cd70ff5d1ff2a64a440043930c699ede2a5166b"} err="failed to get container status \"0b5d2dcf8f4e60b23ee096f41cd70ff5d1ff2a64a440043930c699ede2a5166b\": rpc error: code = NotFound desc = could not find container \"0b5d2dcf8f4e60b23ee096f41cd70ff5d1ff2a64a440043930c699ede2a5166b\": container with ID starting with 0b5d2dcf8f4e60b23ee096f41cd70ff5d1ff2a64a440043930c699ede2a5166b not found: ID does not exist" Sep 30 13:18:17 crc kubenswrapper[4672]: I0930 13:18:17.899886 4672 scope.go:117] "RemoveContainer" containerID="5271131dadff54eca6ac6e6046466b679e6c0a4a4ee1440263c5706e753f69c3" Sep 30 13:18:17 crc kubenswrapper[4672]: E0930 13:18:17.900423 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5271131dadff54eca6ac6e6046466b679e6c0a4a4ee1440263c5706e753f69c3\": container with ID starting with 5271131dadff54eca6ac6e6046466b679e6c0a4a4ee1440263c5706e753f69c3 not found: ID does not exist" containerID="5271131dadff54eca6ac6e6046466b679e6c0a4a4ee1440263c5706e753f69c3" Sep 30 13:18:17 crc kubenswrapper[4672]: I0930 13:18:17.900486 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5271131dadff54eca6ac6e6046466b679e6c0a4a4ee1440263c5706e753f69c3"} err="failed to get container status \"5271131dadff54eca6ac6e6046466b679e6c0a4a4ee1440263c5706e753f69c3\": rpc error: code = NotFound desc = could not find container \"5271131dadff54eca6ac6e6046466b679e6c0a4a4ee1440263c5706e753f69c3\": container with ID starting with 5271131dadff54eca6ac6e6046466b679e6c0a4a4ee1440263c5706e753f69c3 not found: ID does not exist" Sep 30 13:18:17 crc kubenswrapper[4672]: I0930 13:18:17.998070 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-vsv6v" Sep 30 13:18:18 crc kubenswrapper[4672]: I0930 13:18:18.466081 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vsv6v"] Sep 30 13:18:18 crc kubenswrapper[4672]: I0930 13:18:18.702927 4672 generic.go:334] "Generic (PLEG): container finished" podID="d92501c4-c764-4460-9d79-533412011446" containerID="a6d7be69ad8123b981291e35ba71b88356f2dd6613b6ec97c8d9e03c9f10e52e" exitCode=0 Sep 30 13:18:18 crc kubenswrapper[4672]: I0930 13:18:18.703046 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vsv6v" event={"ID":"d92501c4-c764-4460-9d79-533412011446","Type":"ContainerDied","Data":"a6d7be69ad8123b981291e35ba71b88356f2dd6613b6ec97c8d9e03c9f10e52e"} Sep 30 13:18:18 crc kubenswrapper[4672]: I0930 13:18:18.703289 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vsv6v" event={"ID":"d92501c4-c764-4460-9d79-533412011446","Type":"ContainerStarted","Data":"3caff2f66115fc1d72611b60b66282dd223375a3b70a12fe4b090c879ede9d5e"} Sep 30 13:18:19 crc kubenswrapper[4672]: I0930 13:18:19.434847 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4f83d063-1c3e-4db8-aaab-2a159cf19e70" path="/var/lib/kubelet/pods/4f83d063-1c3e-4db8-aaab-2a159cf19e70/volumes" Sep 30 13:18:19 crc kubenswrapper[4672]: I0930 13:18:19.722868 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vsv6v" event={"ID":"d92501c4-c764-4460-9d79-533412011446","Type":"ContainerStarted","Data":"8d9123bd4934ea45cba861e5fc938e1ee34dfb3fb3554b60ecc317a29d76b1a2"} Sep 30 13:18:21 crc kubenswrapper[4672]: I0930 13:18:21.756590 4672 generic.go:334] "Generic (PLEG): container finished" podID="d92501c4-c764-4460-9d79-533412011446" containerID="8d9123bd4934ea45cba861e5fc938e1ee34dfb3fb3554b60ecc317a29d76b1a2" exitCode=0 Sep 30 13:18:21 crc kubenswrapper[4672]: I0930 13:18:21.756621 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vsv6v" event={"ID":"d92501c4-c764-4460-9d79-533412011446","Type":"ContainerDied","Data":"8d9123bd4934ea45cba861e5fc938e1ee34dfb3fb3554b60ecc317a29d76b1a2"} Sep 30 13:18:22 crc kubenswrapper[4672]: I0930 13:18:22.768867 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vsv6v" event={"ID":"d92501c4-c764-4460-9d79-533412011446","Type":"ContainerStarted","Data":"c275fc061ac7b3394984f7a233aef39cd986f80dec1b1bcd197bea4bd658ff15"} Sep 30 13:18:22 crc kubenswrapper[4672]: I0930 13:18:22.794029 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-vsv6v" podStartSLOduration=2.235925676 podStartE2EDuration="5.794011689s" podCreationTimestamp="2025-09-30 13:18:17 +0000 UTC" firstStartedPulling="2025-09-30 13:18:18.70454461 +0000 UTC m=+3389.973782246" lastFinishedPulling="2025-09-30 13:18:22.262630593 +0000 UTC m=+3393.531868259" observedRunningTime="2025-09-30 13:18:22.789406472 +0000 UTC m=+3394.058644118" watchObservedRunningTime="2025-09-30 13:18:22.794011689 +0000 UTC m=+3394.063249335" Sep 30 13:18:26 crc kubenswrapper[4672]: I0930 13:18:26.417940 4672 scope.go:117] "RemoveContainer" containerID="04c1596c42881000cc14ffb9f09c27a0764c7f36327532142c7052e23cf117ba" Sep 30 13:18:26 crc kubenswrapper[4672]: I0930 13:18:26.816078 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" event={"ID":"95794952-d817-48f2-8956-f7a310f8d1d9","Type":"ContainerStarted","Data":"6b2b209d5c83cc2f1bb5c85320d1859fc6933d14d7e8096d1dfad91648be07d1"} Sep 30 13:18:28 crc kubenswrapper[4672]: I0930 13:18:27.999299 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-vsv6v" Sep 30 13:18:28 crc kubenswrapper[4672]: I0930 13:18:27.999344 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-vsv6v" Sep 30 13:18:28 crc kubenswrapper[4672]: I0930 13:18:28.090040 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-vsv6v" Sep 30 13:18:28 crc kubenswrapper[4672]: I0930 13:18:28.883642 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-vsv6v" Sep 30 13:18:28 crc kubenswrapper[4672]: I0930 13:18:28.951943 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-vsv6v"] Sep 30 13:18:30 crc kubenswrapper[4672]: I0930 13:18:30.860979 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-vsv6v" podUID="d92501c4-c764-4460-9d79-533412011446" containerName="registry-server" containerID="cri-o://c275fc061ac7b3394984f7a233aef39cd986f80dec1b1bcd197bea4bd658ff15" gracePeriod=2 Sep 30 13:18:31 crc kubenswrapper[4672]: I0930 13:18:31.423721 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vsv6v" Sep 30 13:18:31 crc kubenswrapper[4672]: I0930 13:18:31.466452 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d92501c4-c764-4460-9d79-533412011446-catalog-content\") pod \"d92501c4-c764-4460-9d79-533412011446\" (UID: \"d92501c4-c764-4460-9d79-533412011446\") " Sep 30 13:18:31 crc kubenswrapper[4672]: I0930 13:18:31.466572 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d92501c4-c764-4460-9d79-533412011446-utilities\") pod \"d92501c4-c764-4460-9d79-533412011446\" (UID: \"d92501c4-c764-4460-9d79-533412011446\") " Sep 30 13:18:31 crc kubenswrapper[4672]: I0930 13:18:31.466853 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jmfgp\" (UniqueName: \"kubernetes.io/projected/d92501c4-c764-4460-9d79-533412011446-kube-api-access-jmfgp\") pod \"d92501c4-c764-4460-9d79-533412011446\" (UID: \"d92501c4-c764-4460-9d79-533412011446\") " Sep 30 13:18:31 crc kubenswrapper[4672]: I0930 13:18:31.468196 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d92501c4-c764-4460-9d79-533412011446-utilities" (OuterVolumeSpecName: "utilities") pod "d92501c4-c764-4460-9d79-533412011446" (UID: "d92501c4-c764-4460-9d79-533412011446"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 13:18:31 crc kubenswrapper[4672]: I0930 13:18:31.475207 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d92501c4-c764-4460-9d79-533412011446-kube-api-access-jmfgp" (OuterVolumeSpecName: "kube-api-access-jmfgp") pod "d92501c4-c764-4460-9d79-533412011446" (UID: "d92501c4-c764-4460-9d79-533412011446"). InnerVolumeSpecName "kube-api-access-jmfgp". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:18:31 crc kubenswrapper[4672]: I0930 13:18:31.542552 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d92501c4-c764-4460-9d79-533412011446-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d92501c4-c764-4460-9d79-533412011446" (UID: "d92501c4-c764-4460-9d79-533412011446"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 13:18:31 crc kubenswrapper[4672]: I0930 13:18:31.568916 4672 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d92501c4-c764-4460-9d79-533412011446-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 13:18:31 crc kubenswrapper[4672]: I0930 13:18:31.568958 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jmfgp\" (UniqueName: \"kubernetes.io/projected/d92501c4-c764-4460-9d79-533412011446-kube-api-access-jmfgp\") on node \"crc\" DevicePath \"\"" Sep 30 13:18:31 crc kubenswrapper[4672]: I0930 13:18:31.568968 4672 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d92501c4-c764-4460-9d79-533412011446-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 13:18:31 crc kubenswrapper[4672]: I0930 13:18:31.871882 4672 generic.go:334] "Generic (PLEG): container finished" podID="d92501c4-c764-4460-9d79-533412011446" containerID="c275fc061ac7b3394984f7a233aef39cd986f80dec1b1bcd197bea4bd658ff15" exitCode=0 Sep 30 13:18:31 crc kubenswrapper[4672]: I0930 13:18:31.871988 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-vsv6v" Sep 30 13:18:31 crc kubenswrapper[4672]: I0930 13:18:31.872719 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vsv6v" event={"ID":"d92501c4-c764-4460-9d79-533412011446","Type":"ContainerDied","Data":"c275fc061ac7b3394984f7a233aef39cd986f80dec1b1bcd197bea4bd658ff15"} Sep 30 13:18:31 crc kubenswrapper[4672]: I0930 13:18:31.872778 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vsv6v" event={"ID":"d92501c4-c764-4460-9d79-533412011446","Type":"ContainerDied","Data":"3caff2f66115fc1d72611b60b66282dd223375a3b70a12fe4b090c879ede9d5e"} Sep 30 13:18:31 crc kubenswrapper[4672]: I0930 13:18:31.872797 4672 scope.go:117] "RemoveContainer" containerID="c275fc061ac7b3394984f7a233aef39cd986f80dec1b1bcd197bea4bd658ff15" Sep 30 13:18:31 crc kubenswrapper[4672]: I0930 13:18:31.904492 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-vsv6v"] Sep 30 13:18:31 crc kubenswrapper[4672]: I0930 13:18:31.911403 4672 scope.go:117] "RemoveContainer" containerID="8d9123bd4934ea45cba861e5fc938e1ee34dfb3fb3554b60ecc317a29d76b1a2" Sep 30 13:18:31 crc kubenswrapper[4672]: I0930 13:18:31.914012 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-vsv6v"] Sep 30 13:18:31 crc kubenswrapper[4672]: I0930 13:18:31.933746 4672 scope.go:117] "RemoveContainer" containerID="a6d7be69ad8123b981291e35ba71b88356f2dd6613b6ec97c8d9e03c9f10e52e" Sep 30 13:18:31 crc kubenswrapper[4672]: I0930 13:18:31.977745 4672 scope.go:117] "RemoveContainer" containerID="c275fc061ac7b3394984f7a233aef39cd986f80dec1b1bcd197bea4bd658ff15" Sep 30 13:18:31 crc kubenswrapper[4672]: E0930 13:18:31.978249 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c275fc061ac7b3394984f7a233aef39cd986f80dec1b1bcd197bea4bd658ff15\": container with ID starting with c275fc061ac7b3394984f7a233aef39cd986f80dec1b1bcd197bea4bd658ff15 not found: ID does not exist" containerID="c275fc061ac7b3394984f7a233aef39cd986f80dec1b1bcd197bea4bd658ff15" Sep 30 13:18:31 crc kubenswrapper[4672]: I0930 13:18:31.978426 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c275fc061ac7b3394984f7a233aef39cd986f80dec1b1bcd197bea4bd658ff15"} err="failed to get container status \"c275fc061ac7b3394984f7a233aef39cd986f80dec1b1bcd197bea4bd658ff15\": rpc error: code = NotFound desc = could not find container \"c275fc061ac7b3394984f7a233aef39cd986f80dec1b1bcd197bea4bd658ff15\": container with ID starting with c275fc061ac7b3394984f7a233aef39cd986f80dec1b1bcd197bea4bd658ff15 not found: ID does not exist" Sep 30 13:18:31 crc kubenswrapper[4672]: I0930 13:18:31.978552 4672 scope.go:117] "RemoveContainer" containerID="8d9123bd4934ea45cba861e5fc938e1ee34dfb3fb3554b60ecc317a29d76b1a2" Sep 30 13:18:31 crc kubenswrapper[4672]: E0930 13:18:31.979089 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8d9123bd4934ea45cba861e5fc938e1ee34dfb3fb3554b60ecc317a29d76b1a2\": container with ID starting with 8d9123bd4934ea45cba861e5fc938e1ee34dfb3fb3554b60ecc317a29d76b1a2 not found: ID does not exist" containerID="8d9123bd4934ea45cba861e5fc938e1ee34dfb3fb3554b60ecc317a29d76b1a2" Sep 30 13:18:31 crc kubenswrapper[4672]: I0930 13:18:31.979138 4672 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8d9123bd4934ea45cba861e5fc938e1ee34dfb3fb3554b60ecc317a29d76b1a2"} err="failed to get container status \"8d9123bd4934ea45cba861e5fc938e1ee34dfb3fb3554b60ecc317a29d76b1a2\": rpc error: code = NotFound desc = could not find container \"8d9123bd4934ea45cba861e5fc938e1ee34dfb3fb3554b60ecc317a29d76b1a2\": container with ID starting with 8d9123bd4934ea45cba861e5fc938e1ee34dfb3fb3554b60ecc317a29d76b1a2 not found: ID does not exist" Sep 30 13:18:31 crc kubenswrapper[4672]: I0930 13:18:31.979175 4672 scope.go:117] "RemoveContainer" containerID="a6d7be69ad8123b981291e35ba71b88356f2dd6613b6ec97c8d9e03c9f10e52e" Sep 30 13:18:31 crc kubenswrapper[4672]: E0930 13:18:31.979706 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a6d7be69ad8123b981291e35ba71b88356f2dd6613b6ec97c8d9e03c9f10e52e\": container with ID starting with a6d7be69ad8123b981291e35ba71b88356f2dd6613b6ec97c8d9e03c9f10e52e not found: ID does not exist" containerID="a6d7be69ad8123b981291e35ba71b88356f2dd6613b6ec97c8d9e03c9f10e52e" Sep 30 13:18:31 crc kubenswrapper[4672]: I0930 13:18:31.979738 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a6d7be69ad8123b981291e35ba71b88356f2dd6613b6ec97c8d9e03c9f10e52e"} err="failed to get container status \"a6d7be69ad8123b981291e35ba71b88356f2dd6613b6ec97c8d9e03c9f10e52e\": rpc error: code = NotFound desc = could not find container \"a6d7be69ad8123b981291e35ba71b88356f2dd6613b6ec97c8d9e03c9f10e52e\": container with ID starting with a6d7be69ad8123b981291e35ba71b88356f2dd6613b6ec97c8d9e03c9f10e52e not found: ID does not exist" Sep 30 13:18:33 crc kubenswrapper[4672]: I0930 13:18:33.437872 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d92501c4-c764-4460-9d79-533412011446" path="/var/lib/kubelet/pods/d92501c4-c764-4460-9d79-533412011446/volumes" Sep 30 13:20:54 crc kubenswrapper[4672]: I0930 13:20:54.739772 4672 patch_prober.go:28] interesting pod/machine-config-daemon-dpqrd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 13:20:54 crc kubenswrapper[4672]: I0930 13:20:54.740619 4672 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 13:21:24 crc kubenswrapper[4672]: E0930 13:21:24.569666 4672 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.241:38248->38.102.83.241:40853: write tcp 38.102.83.241:38248->38.102.83.241:40853: write: connection reset by peer Sep 30 13:21:24 crc kubenswrapper[4672]: I0930 13:21:24.739474 4672 patch_prober.go:28] interesting pod/machine-config-daemon-dpqrd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 13:21:24 crc kubenswrapper[4672]: I0930 13:21:24.740127 4672 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 13:21:54 crc kubenswrapper[4672]: I0930 13:21:54.739925 4672 patch_prober.go:28] interesting pod/machine-config-daemon-dpqrd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 13:21:54 crc kubenswrapper[4672]: I0930 13:21:54.740373 4672 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 13:21:54 crc kubenswrapper[4672]: I0930 13:21:54.740421 4672 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" Sep 30 13:21:54 crc kubenswrapper[4672]: I0930 13:21:54.741173 4672 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6b2b209d5c83cc2f1bb5c85320d1859fc6933d14d7e8096d1dfad91648be07d1"} pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 30 13:21:54 crc kubenswrapper[4672]: I0930 13:21:54.741219 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" containerName="machine-config-daemon" containerID="cri-o://6b2b209d5c83cc2f1bb5c85320d1859fc6933d14d7e8096d1dfad91648be07d1" gracePeriod=600 Sep 30 13:21:54 crc kubenswrapper[4672]: I0930 13:21:54.913119 4672 generic.go:334] "Generic (PLEG): container finished" podID="95794952-d817-48f2-8956-f7a310f8d1d9" containerID="6b2b209d5c83cc2f1bb5c85320d1859fc6933d14d7e8096d1dfad91648be07d1" exitCode=0 Sep 30 13:21:54 crc kubenswrapper[4672]: I0930 13:21:54.913192 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" event={"ID":"95794952-d817-48f2-8956-f7a310f8d1d9","Type":"ContainerDied","Data":"6b2b209d5c83cc2f1bb5c85320d1859fc6933d14d7e8096d1dfad91648be07d1"} Sep 30 13:21:54 crc kubenswrapper[4672]: I0930 13:21:54.913445 4672 scope.go:117] "RemoveContainer" containerID="04c1596c42881000cc14ffb9f09c27a0764c7f36327532142c7052e23cf117ba" Sep 30 13:21:55 crc kubenswrapper[4672]: I0930 13:21:55.924050 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" event={"ID":"95794952-d817-48f2-8956-f7a310f8d1d9","Type":"ContainerStarted","Data":"e31569925dbee8a158ee7d2ef1cfb2bb848061058b8b7657cbbf2455b1d08de2"} Sep 30 13:22:53 crc kubenswrapper[4672]: I0930 13:22:53.446856 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-786wl"] Sep 30 13:22:53 crc kubenswrapper[4672]: E0930 13:22:53.448216 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d92501c4-c764-4460-9d79-533412011446" containerName="extract-content" Sep 30 13:22:53 
Sep 30 13:21:54 crc kubenswrapper[4672]: I0930 13:21:54.913119 4672 generic.go:334] "Generic (PLEG): container finished" podID="95794952-d817-48f2-8956-f7a310f8d1d9" containerID="6b2b209d5c83cc2f1bb5c85320d1859fc6933d14d7e8096d1dfad91648be07d1" exitCode=0
Sep 30 13:21:54 crc kubenswrapper[4672]: I0930 13:21:54.913192 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" event={"ID":"95794952-d817-48f2-8956-f7a310f8d1d9","Type":"ContainerDied","Data":"6b2b209d5c83cc2f1bb5c85320d1859fc6933d14d7e8096d1dfad91648be07d1"}
Sep 30 13:21:54 crc kubenswrapper[4672]: I0930 13:21:54.913445 4672 scope.go:117] "RemoveContainer" containerID="04c1596c42881000cc14ffb9f09c27a0764c7f36327532142c7052e23cf117ba"
Sep 30 13:21:55 crc kubenswrapper[4672]: I0930 13:21:55.924050 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" event={"ID":"95794952-d817-48f2-8956-f7a310f8d1d9","Type":"ContainerStarted","Data":"e31569925dbee8a158ee7d2ef1cfb2bb848061058b8b7657cbbf2455b1d08de2"}
Sep 30 13:22:53 crc kubenswrapper[4672]: I0930 13:22:53.446856 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-786wl"]
Sep 30 13:22:53 crc kubenswrapper[4672]: E0930 13:22:53.448216 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d92501c4-c764-4460-9d79-533412011446" containerName="extract-content"
Sep 30 13:22:53 crc kubenswrapper[4672]: I0930 13:22:53.448242 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="d92501c4-c764-4460-9d79-533412011446" containerName="extract-content"
Sep 30 13:22:53 crc kubenswrapper[4672]: E0930 13:22:53.448434 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d92501c4-c764-4460-9d79-533412011446" containerName="registry-server"
Sep 30 13:22:53 crc kubenswrapper[4672]: I0930 13:22:53.448483 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="d92501c4-c764-4460-9d79-533412011446" containerName="registry-server"
Sep 30 13:22:53 crc kubenswrapper[4672]: E0930 13:22:53.448751 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d92501c4-c764-4460-9d79-533412011446" containerName="extract-utilities"
Sep 30 13:22:53 crc kubenswrapper[4672]: I0930 13:22:53.448777 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="d92501c4-c764-4460-9d79-533412011446" containerName="extract-utilities"
Sep 30 13:22:53 crc kubenswrapper[4672]: I0930 13:22:53.450000 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="d92501c4-c764-4460-9d79-533412011446" containerName="registry-server"
Sep 30 13:22:53 crc kubenswrapper[4672]: I0930 13:22:53.455348 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-786wl"]
Sep 30 13:22:53 crc kubenswrapper[4672]: I0930 13:22:53.455496 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-786wl"
Sep 30 13:22:53 crc kubenswrapper[4672]: I0930 13:22:53.577936 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9b440f96-68df-4f8c-9cc2-3375f85f3af8-catalog-content\") pod \"community-operators-786wl\" (UID: \"9b440f96-68df-4f8c-9cc2-3375f85f3af8\") " pod="openshift-marketplace/community-operators-786wl"
Sep 30 13:22:53 crc kubenswrapper[4672]: I0930 13:22:53.578150 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qgbww\" (UniqueName: \"kubernetes.io/projected/9b440f96-68df-4f8c-9cc2-3375f85f3af8-kube-api-access-qgbww\") pod \"community-operators-786wl\" (UID: \"9b440f96-68df-4f8c-9cc2-3375f85f3af8\") " pod="openshift-marketplace/community-operators-786wl"
Sep 30 13:22:53 crc kubenswrapper[4672]: I0930 13:22:53.578321 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9b440f96-68df-4f8c-9cc2-3375f85f3af8-utilities\") pod \"community-operators-786wl\" (UID: \"9b440f96-68df-4f8c-9cc2-3375f85f3af8\") " pod="openshift-marketplace/community-operators-786wl"
Sep 30 13:22:53 crc kubenswrapper[4672]: I0930 13:22:53.681390 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9b440f96-68df-4f8c-9cc2-3375f85f3af8-catalog-content\") pod \"community-operators-786wl\" (UID: \"9b440f96-68df-4f8c-9cc2-3375f85f3af8\") " pod="openshift-marketplace/community-operators-786wl"
Sep 30 13:22:53 crc kubenswrapper[4672]: I0930 13:22:53.681492 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qgbww\" (UniqueName: \"kubernetes.io/projected/9b440f96-68df-4f8c-9cc2-3375f85f3af8-kube-api-access-qgbww\") pod \"community-operators-786wl\" (UID: \"9b440f96-68df-4f8c-9cc2-3375f85f3af8\") "
pod="openshift-marketplace/community-operators-786wl" Sep 30 13:22:53 crc kubenswrapper[4672]: I0930 13:22:53.681524 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9b440f96-68df-4f8c-9cc2-3375f85f3af8-utilities\") pod \"community-operators-786wl\" (UID: \"9b440f96-68df-4f8c-9cc2-3375f85f3af8\") " pod="openshift-marketplace/community-operators-786wl" Sep 30 13:22:53 crc kubenswrapper[4672]: I0930 13:22:53.681865 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9b440f96-68df-4f8c-9cc2-3375f85f3af8-catalog-content\") pod \"community-operators-786wl\" (UID: \"9b440f96-68df-4f8c-9cc2-3375f85f3af8\") " pod="openshift-marketplace/community-operators-786wl" Sep 30 13:22:53 crc kubenswrapper[4672]: I0930 13:22:53.682249 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9b440f96-68df-4f8c-9cc2-3375f85f3af8-utilities\") pod \"community-operators-786wl\" (UID: \"9b440f96-68df-4f8c-9cc2-3375f85f3af8\") " pod="openshift-marketplace/community-operators-786wl" Sep 30 13:22:53 crc kubenswrapper[4672]: I0930 13:22:53.705654 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qgbww\" (UniqueName: \"kubernetes.io/projected/9b440f96-68df-4f8c-9cc2-3375f85f3af8-kube-api-access-qgbww\") pod \"community-operators-786wl\" (UID: \"9b440f96-68df-4f8c-9cc2-3375f85f3af8\") " pod="openshift-marketplace/community-operators-786wl" Sep 30 13:22:53 crc kubenswrapper[4672]: I0930 13:22:53.780518 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-786wl" Sep 30 13:22:54 crc kubenswrapper[4672]: I0930 13:22:54.296137 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-786wl"] Sep 30 13:22:54 crc kubenswrapper[4672]: I0930 13:22:54.549028 4672 generic.go:334] "Generic (PLEG): container finished" podID="9b440f96-68df-4f8c-9cc2-3375f85f3af8" containerID="b51611ce3f50e3726d4d0740762ee108e3bbb4927d70f9464703a1c37ff6e904" exitCode=0 Sep 30 13:22:54 crc kubenswrapper[4672]: I0930 13:22:54.549069 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-786wl" event={"ID":"9b440f96-68df-4f8c-9cc2-3375f85f3af8","Type":"ContainerDied","Data":"b51611ce3f50e3726d4d0740762ee108e3bbb4927d70f9464703a1c37ff6e904"} Sep 30 13:22:54 crc kubenswrapper[4672]: I0930 13:22:54.549106 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-786wl" event={"ID":"9b440f96-68df-4f8c-9cc2-3375f85f3af8","Type":"ContainerStarted","Data":"fd83fef22c6c63babfcd1b68b10419ff68b2a3985c3a80dd7b17385238fb3f61"} Sep 30 13:22:54 crc kubenswrapper[4672]: I0930 13:22:54.551566 4672 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Sep 30 13:22:56 crc kubenswrapper[4672]: I0930 13:22:56.569663 4672 generic.go:334] "Generic (PLEG): container finished" podID="9b440f96-68df-4f8c-9cc2-3375f85f3af8" containerID="6ddd72a7149b55f648062adbfb492c0409d4d79edcdcbea1d6e281c8a09aff9c" exitCode=0 Sep 30 13:22:56 crc kubenswrapper[4672]: I0930 13:22:56.569745 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-786wl" 
event={"ID":"9b440f96-68df-4f8c-9cc2-3375f85f3af8","Type":"ContainerDied","Data":"6ddd72a7149b55f648062adbfb492c0409d4d79edcdcbea1d6e281c8a09aff9c"} Sep 30 13:22:58 crc kubenswrapper[4672]: I0930 13:22:58.590522 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-786wl" event={"ID":"9b440f96-68df-4f8c-9cc2-3375f85f3af8","Type":"ContainerStarted","Data":"54de9d0e213cfaa75aaabddaa66a62e7a15690146aa964833f8ddf9e5cd7e2a2"} Sep 30 13:22:58 crc kubenswrapper[4672]: I0930 13:22:58.612313 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-786wl" podStartSLOduration=2.35431516 podStartE2EDuration="5.612291366s" podCreationTimestamp="2025-09-30 13:22:53 +0000 UTC" firstStartedPulling="2025-09-30 13:22:54.55117357 +0000 UTC m=+3665.820411216" lastFinishedPulling="2025-09-30 13:22:57.809149776 +0000 UTC m=+3669.078387422" observedRunningTime="2025-09-30 13:22:58.605850213 +0000 UTC m=+3669.875087869" watchObservedRunningTime="2025-09-30 13:22:58.612291366 +0000 UTC m=+3669.881529012" Sep 30 13:23:03 crc kubenswrapper[4672]: I0930 13:23:03.781453 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-786wl" Sep 30 13:23:03 crc kubenswrapper[4672]: I0930 13:23:03.783231 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-786wl" Sep 30 13:23:03 crc kubenswrapper[4672]: I0930 13:23:03.848981 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-786wl" Sep 30 13:23:04 crc kubenswrapper[4672]: I0930 13:23:04.720197 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-786wl" Sep 30 13:23:04 crc kubenswrapper[4672]: I0930 13:23:04.782242 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-786wl"] Sep 30 13:23:06 crc kubenswrapper[4672]: I0930 13:23:06.671320 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-786wl" podUID="9b440f96-68df-4f8c-9cc2-3375f85f3af8" containerName="registry-server" containerID="cri-o://54de9d0e213cfaa75aaabddaa66a62e7a15690146aa964833f8ddf9e5cd7e2a2" gracePeriod=2 Sep 30 13:23:07 crc kubenswrapper[4672]: I0930 13:23:07.154807 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-786wl" Sep 30 13:23:07 crc kubenswrapper[4672]: I0930 13:23:07.281754 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9b440f96-68df-4f8c-9cc2-3375f85f3af8-catalog-content\") pod \"9b440f96-68df-4f8c-9cc2-3375f85f3af8\" (UID: \"9b440f96-68df-4f8c-9cc2-3375f85f3af8\") " Sep 30 13:23:07 crc kubenswrapper[4672]: I0930 13:23:07.281962 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qgbww\" (UniqueName: \"kubernetes.io/projected/9b440f96-68df-4f8c-9cc2-3375f85f3af8-kube-api-access-qgbww\") pod \"9b440f96-68df-4f8c-9cc2-3375f85f3af8\" (UID: \"9b440f96-68df-4f8c-9cc2-3375f85f3af8\") " Sep 30 13:23:07 crc kubenswrapper[4672]: I0930 13:23:07.282246 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9b440f96-68df-4f8c-9cc2-3375f85f3af8-utilities\") pod \"9b440f96-68df-4f8c-9cc2-3375f85f3af8\" (UID: \"9b440f96-68df-4f8c-9cc2-3375f85f3af8\") " Sep 30 13:23:07 crc kubenswrapper[4672]: I0930 13:23:07.283377 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9b440f96-68df-4f8c-9cc2-3375f85f3af8-utilities" (OuterVolumeSpecName: "utilities") pod "9b440f96-68df-4f8c-9cc2-3375f85f3af8" (UID: "9b440f96-68df-4f8c-9cc2-3375f85f3af8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 13:23:07 crc kubenswrapper[4672]: I0930 13:23:07.291391 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b440f96-68df-4f8c-9cc2-3375f85f3af8-kube-api-access-qgbww" (OuterVolumeSpecName: "kube-api-access-qgbww") pod "9b440f96-68df-4f8c-9cc2-3375f85f3af8" (UID: "9b440f96-68df-4f8c-9cc2-3375f85f3af8"). InnerVolumeSpecName "kube-api-access-qgbww". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:23:07 crc kubenswrapper[4672]: I0930 13:23:07.349246 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9b440f96-68df-4f8c-9cc2-3375f85f3af8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9b440f96-68df-4f8c-9cc2-3375f85f3af8" (UID: "9b440f96-68df-4f8c-9cc2-3375f85f3af8"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 13:23:07 crc kubenswrapper[4672]: I0930 13:23:07.383792 4672 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9b440f96-68df-4f8c-9cc2-3375f85f3af8-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 13:23:07 crc kubenswrapper[4672]: I0930 13:23:07.383843 4672 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9b440f96-68df-4f8c-9cc2-3375f85f3af8-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 13:23:07 crc kubenswrapper[4672]: I0930 13:23:07.383856 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qgbww\" (UniqueName: \"kubernetes.io/projected/9b440f96-68df-4f8c-9cc2-3375f85f3af8-kube-api-access-qgbww\") on node \"crc\" DevicePath \"\"" Sep 30 13:23:07 crc kubenswrapper[4672]: I0930 13:23:07.692298 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-786wl" event={"ID":"9b440f96-68df-4f8c-9cc2-3375f85f3af8","Type":"ContainerDied","Data":"54de9d0e213cfaa75aaabddaa66a62e7a15690146aa964833f8ddf9e5cd7e2a2"} Sep 30 13:23:07 crc kubenswrapper[4672]: I0930 13:23:07.692322 4672 generic.go:334] "Generic (PLEG): container finished" podID="9b440f96-68df-4f8c-9cc2-3375f85f3af8" containerID="54de9d0e213cfaa75aaabddaa66a62e7a15690146aa964833f8ddf9e5cd7e2a2" exitCode=0 Sep 30 13:23:07 crc kubenswrapper[4672]: I0930 13:23:07.692361 4672 scope.go:117] "RemoveContainer" containerID="54de9d0e213cfaa75aaabddaa66a62e7a15690146aa964833f8ddf9e5cd7e2a2" Sep 30 13:23:07 crc kubenswrapper[4672]: I0930 13:23:07.692373 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-786wl" event={"ID":"9b440f96-68df-4f8c-9cc2-3375f85f3af8","Type":"ContainerDied","Data":"fd83fef22c6c63babfcd1b68b10419ff68b2a3985c3a80dd7b17385238fb3f61"} Sep 30 13:23:07 crc kubenswrapper[4672]: I0930 13:23:07.692400 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-786wl" Sep 30 13:23:07 crc kubenswrapper[4672]: I0930 13:23:07.726440 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-786wl"] Sep 30 13:23:07 crc kubenswrapper[4672]: I0930 13:23:07.732562 4672 scope.go:117] "RemoveContainer" containerID="6ddd72a7149b55f648062adbfb492c0409d4d79edcdcbea1d6e281c8a09aff9c" Sep 30 13:23:07 crc kubenswrapper[4672]: I0930 13:23:07.734460 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-786wl"] Sep 30 13:23:07 crc kubenswrapper[4672]: I0930 13:23:07.757598 4672 scope.go:117] "RemoveContainer" containerID="b51611ce3f50e3726d4d0740762ee108e3bbb4927d70f9464703a1c37ff6e904" Sep 30 13:23:07 crc kubenswrapper[4672]: I0930 13:23:07.802056 4672 scope.go:117] "RemoveContainer" containerID="54de9d0e213cfaa75aaabddaa66a62e7a15690146aa964833f8ddf9e5cd7e2a2" Sep 30 13:23:07 crc kubenswrapper[4672]: E0930 13:23:07.802558 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"54de9d0e213cfaa75aaabddaa66a62e7a15690146aa964833f8ddf9e5cd7e2a2\": container with ID starting with 54de9d0e213cfaa75aaabddaa66a62e7a15690146aa964833f8ddf9e5cd7e2a2 not found: ID does not exist" containerID="54de9d0e213cfaa75aaabddaa66a62e7a15690146aa964833f8ddf9e5cd7e2a2" Sep 30 13:23:07 crc kubenswrapper[4672]: I0930 13:23:07.802601 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"54de9d0e213cfaa75aaabddaa66a62e7a15690146aa964833f8ddf9e5cd7e2a2"} err="failed to get container status \"54de9d0e213cfaa75aaabddaa66a62e7a15690146aa964833f8ddf9e5cd7e2a2\": rpc error: code = NotFound desc = could not find container \"54de9d0e213cfaa75aaabddaa66a62e7a15690146aa964833f8ddf9e5cd7e2a2\": container with ID starting with 54de9d0e213cfaa75aaabddaa66a62e7a15690146aa964833f8ddf9e5cd7e2a2 not found: ID does not exist" Sep 30 13:23:07 crc kubenswrapper[4672]: I0930 13:23:07.802627 4672 scope.go:117] "RemoveContainer" containerID="6ddd72a7149b55f648062adbfb492c0409d4d79edcdcbea1d6e281c8a09aff9c" Sep 30 13:23:07 crc kubenswrapper[4672]: E0930 13:23:07.802888 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6ddd72a7149b55f648062adbfb492c0409d4d79edcdcbea1d6e281c8a09aff9c\": container with ID starting with 6ddd72a7149b55f648062adbfb492c0409d4d79edcdcbea1d6e281c8a09aff9c not found: ID does not exist" containerID="6ddd72a7149b55f648062adbfb492c0409d4d79edcdcbea1d6e281c8a09aff9c" Sep 30 13:23:07 crc kubenswrapper[4672]: I0930 13:23:07.802917 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ddd72a7149b55f648062adbfb492c0409d4d79edcdcbea1d6e281c8a09aff9c"} err="failed to get container status \"6ddd72a7149b55f648062adbfb492c0409d4d79edcdcbea1d6e281c8a09aff9c\": rpc error: code = NotFound desc = could not find container \"6ddd72a7149b55f648062adbfb492c0409d4d79edcdcbea1d6e281c8a09aff9c\": container with ID starting with 6ddd72a7149b55f648062adbfb492c0409d4d79edcdcbea1d6e281c8a09aff9c not found: ID does not exist" Sep 30 13:23:07 crc kubenswrapper[4672]: I0930 13:23:07.802930 4672 scope.go:117] "RemoveContainer" containerID="b51611ce3f50e3726d4d0740762ee108e3bbb4927d70f9464703a1c37ff6e904" Sep 30 13:23:07 crc kubenswrapper[4672]: E0930 13:23:07.803159 4672 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"b51611ce3f50e3726d4d0740762ee108e3bbb4927d70f9464703a1c37ff6e904\": container with ID starting with b51611ce3f50e3726d4d0740762ee108e3bbb4927d70f9464703a1c37ff6e904 not found: ID does not exist" containerID="b51611ce3f50e3726d4d0740762ee108e3bbb4927d70f9464703a1c37ff6e904" Sep 30 13:23:07 crc kubenswrapper[4672]: I0930 13:23:07.803189 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b51611ce3f50e3726d4d0740762ee108e3bbb4927d70f9464703a1c37ff6e904"} err="failed to get container status \"b51611ce3f50e3726d4d0740762ee108e3bbb4927d70f9464703a1c37ff6e904\": rpc error: code = NotFound desc = could not find container \"b51611ce3f50e3726d4d0740762ee108e3bbb4927d70f9464703a1c37ff6e904\": container with ID starting with b51611ce3f50e3726d4d0740762ee108e3bbb4927d70f9464703a1c37ff6e904 not found: ID does not exist" Sep 30 13:23:09 crc kubenswrapper[4672]: I0930 13:23:09.432032 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9b440f96-68df-4f8c-9cc2-3375f85f3af8" path="/var/lib/kubelet/pods/9b440f96-68df-4f8c-9cc2-3375f85f3af8/volumes" Sep 30 13:24:24 crc kubenswrapper[4672]: I0930 13:24:24.739833 4672 patch_prober.go:28] interesting pod/machine-config-daemon-dpqrd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 13:24:24 crc kubenswrapper[4672]: I0930 13:24:24.740373 4672 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 13:24:54 crc kubenswrapper[4672]: I0930 13:24:54.739599 4672 patch_prober.go:28] interesting pod/machine-config-daemon-dpqrd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 13:24:54 crc kubenswrapper[4672]: I0930 13:24:54.741046 4672 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 13:25:24 crc kubenswrapper[4672]: I0930 13:25:24.740135 4672 patch_prober.go:28] interesting pod/machine-config-daemon-dpqrd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 13:25:24 crc kubenswrapper[4672]: I0930 13:25:24.740911 4672 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 13:25:24 crc kubenswrapper[4672]: I0930 13:25:24.740991 4672 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" 
status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" Sep 30 13:25:25 crc kubenswrapper[4672]: I0930 13:25:25.012093 4672 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e31569925dbee8a158ee7d2ef1cfb2bb848061058b8b7657cbbf2455b1d08de2"} pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 30 13:25:25 crc kubenswrapper[4672]: I0930 13:25:25.012230 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" containerName="machine-config-daemon" containerID="cri-o://e31569925dbee8a158ee7d2ef1cfb2bb848061058b8b7657cbbf2455b1d08de2" gracePeriod=600 Sep 30 13:25:25 crc kubenswrapper[4672]: E0930 13:25:25.146901 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dpqrd_openshift-machine-config-operator(95794952-d817-48f2-8956-f7a310f8d1d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" Sep 30 13:25:26 crc kubenswrapper[4672]: I0930 13:25:26.023848 4672 generic.go:334] "Generic (PLEG): container finished" podID="95794952-d817-48f2-8956-f7a310f8d1d9" containerID="e31569925dbee8a158ee7d2ef1cfb2bb848061058b8b7657cbbf2455b1d08de2" exitCode=0 Sep 30 13:25:26 crc kubenswrapper[4672]: I0930 13:25:26.023922 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" event={"ID":"95794952-d817-48f2-8956-f7a310f8d1d9","Type":"ContainerDied","Data":"e31569925dbee8a158ee7d2ef1cfb2bb848061058b8b7657cbbf2455b1d08de2"} Sep 30 13:25:26 crc kubenswrapper[4672]: I0930 13:25:26.024240 4672 scope.go:117] "RemoveContainer" containerID="6b2b209d5c83cc2f1bb5c85320d1859fc6933d14d7e8096d1dfad91648be07d1" Sep 30 13:25:26 crc kubenswrapper[4672]: I0930 13:25:26.025204 4672 scope.go:117] "RemoveContainer" containerID="e31569925dbee8a158ee7d2ef1cfb2bb848061058b8b7657cbbf2455b1d08de2" Sep 30 13:25:26 crc kubenswrapper[4672]: E0930 13:25:26.025628 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dpqrd_openshift-machine-config-operator(95794952-d817-48f2-8956-f7a310f8d1d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" Sep 30 13:25:40 crc kubenswrapper[4672]: I0930 13:25:40.416937 4672 scope.go:117] "RemoveContainer" containerID="e31569925dbee8a158ee7d2ef1cfb2bb848061058b8b7657cbbf2455b1d08de2" Sep 30 13:25:40 crc kubenswrapper[4672]: E0930 13:25:40.417826 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dpqrd_openshift-machine-config-operator(95794952-d817-48f2-8956-f7a310f8d1d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" Sep 30 13:25:54 crc 
kubenswrapper[4672]: I0930 13:25:54.427035 4672 scope.go:117] "RemoveContainer" containerID="e31569925dbee8a158ee7d2ef1cfb2bb848061058b8b7657cbbf2455b1d08de2" Sep 30 13:25:54 crc kubenswrapper[4672]: E0930 13:25:54.428992 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dpqrd_openshift-machine-config-operator(95794952-d817-48f2-8956-f7a310f8d1d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" Sep 30 13:26:06 crc kubenswrapper[4672]: I0930 13:26:06.417607 4672 scope.go:117] "RemoveContainer" containerID="e31569925dbee8a158ee7d2ef1cfb2bb848061058b8b7657cbbf2455b1d08de2" Sep 30 13:26:06 crc kubenswrapper[4672]: E0930 13:26:06.418867 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dpqrd_openshift-machine-config-operator(95794952-d817-48f2-8956-f7a310f8d1d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" Sep 30 13:26:19 crc kubenswrapper[4672]: I0930 13:26:19.426018 4672 scope.go:117] "RemoveContainer" containerID="e31569925dbee8a158ee7d2ef1cfb2bb848061058b8b7657cbbf2455b1d08de2" Sep 30 13:26:19 crc kubenswrapper[4672]: E0930 13:26:19.427009 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dpqrd_openshift-machine-config-operator(95794952-d817-48f2-8956-f7a310f8d1d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" Sep 30 13:26:33 crc kubenswrapper[4672]: I0930 13:26:33.417518 4672 scope.go:117] "RemoveContainer" containerID="e31569925dbee8a158ee7d2ef1cfb2bb848061058b8b7657cbbf2455b1d08de2" Sep 30 13:26:33 crc kubenswrapper[4672]: E0930 13:26:33.418362 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dpqrd_openshift-machine-config-operator(95794952-d817-48f2-8956-f7a310f8d1d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" Sep 30 13:26:46 crc kubenswrapper[4672]: I0930 13:26:46.417793 4672 scope.go:117] "RemoveContainer" containerID="e31569925dbee8a158ee7d2ef1cfb2bb848061058b8b7657cbbf2455b1d08de2" Sep 30 13:26:46 crc kubenswrapper[4672]: E0930 13:26:46.419787 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dpqrd_openshift-machine-config-operator(95794952-d817-48f2-8956-f7a310f8d1d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" Sep 30 13:27:00 crc kubenswrapper[4672]: I0930 13:27:00.417060 4672 scope.go:117] "RemoveContainer" containerID="e31569925dbee8a158ee7d2ef1cfb2bb848061058b8b7657cbbf2455b1d08de2" Sep 30 13:27:00 crc 
kubenswrapper[4672]: E0930 13:27:00.417877 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dpqrd_openshift-machine-config-operator(95794952-d817-48f2-8956-f7a310f8d1d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" Sep 30 13:27:12 crc kubenswrapper[4672]: I0930 13:27:12.418190 4672 scope.go:117] "RemoveContainer" containerID="e31569925dbee8a158ee7d2ef1cfb2bb848061058b8b7657cbbf2455b1d08de2" Sep 30 13:27:12 crc kubenswrapper[4672]: E0930 13:27:12.419101 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dpqrd_openshift-machine-config-operator(95794952-d817-48f2-8956-f7a310f8d1d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" Sep 30 13:27:24 crc kubenswrapper[4672]: I0930 13:27:24.417112 4672 scope.go:117] "RemoveContainer" containerID="e31569925dbee8a158ee7d2ef1cfb2bb848061058b8b7657cbbf2455b1d08de2" Sep 30 13:27:24 crc kubenswrapper[4672]: E0930 13:27:24.418144 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dpqrd_openshift-machine-config-operator(95794952-d817-48f2-8956-f7a310f8d1d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" Sep 30 13:27:37 crc kubenswrapper[4672]: I0930 13:27:37.417410 4672 scope.go:117] "RemoveContainer" containerID="e31569925dbee8a158ee7d2ef1cfb2bb848061058b8b7657cbbf2455b1d08de2" Sep 30 13:27:37 crc kubenswrapper[4672]: E0930 13:27:37.418248 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dpqrd_openshift-machine-config-operator(95794952-d817-48f2-8956-f7a310f8d1d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" Sep 30 13:27:51 crc kubenswrapper[4672]: I0930 13:27:51.417500 4672 scope.go:117] "RemoveContainer" containerID="e31569925dbee8a158ee7d2ef1cfb2bb848061058b8b7657cbbf2455b1d08de2" Sep 30 13:27:51 crc kubenswrapper[4672]: E0930 13:27:51.418669 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dpqrd_openshift-machine-config-operator(95794952-d817-48f2-8956-f7a310f8d1d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" Sep 30 13:28:05 crc kubenswrapper[4672]: I0930 13:28:05.417189 4672 scope.go:117] "RemoveContainer" containerID="e31569925dbee8a158ee7d2ef1cfb2bb848061058b8b7657cbbf2455b1d08de2" Sep 30 13:28:05 crc kubenswrapper[4672]: E0930 13:28:05.418114 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dpqrd_openshift-machine-config-operator(95794952-d817-48f2-8956-f7a310f8d1d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" Sep 30 13:28:19 crc kubenswrapper[4672]: I0930 13:28:19.422053 4672 scope.go:117] "RemoveContainer" containerID="e31569925dbee8a158ee7d2ef1cfb2bb848061058b8b7657cbbf2455b1d08de2" Sep 30 13:28:19 crc kubenswrapper[4672]: E0930 13:28:19.422985 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dpqrd_openshift-machine-config-operator(95794952-d817-48f2-8956-f7a310f8d1d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" Sep 30 13:28:32 crc kubenswrapper[4672]: I0930 13:28:32.417694 4672 scope.go:117] "RemoveContainer" containerID="e31569925dbee8a158ee7d2ef1cfb2bb848061058b8b7657cbbf2455b1d08de2" Sep 30 13:28:32 crc kubenswrapper[4672]: E0930 13:28:32.418614 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dpqrd_openshift-machine-config-operator(95794952-d817-48f2-8956-f7a310f8d1d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" Sep 30 13:28:43 crc kubenswrapper[4672]: I0930 13:28:43.417689 4672 scope.go:117] "RemoveContainer" containerID="e31569925dbee8a158ee7d2ef1cfb2bb848061058b8b7657cbbf2455b1d08de2" Sep 30 13:28:43 crc kubenswrapper[4672]: E0930 13:28:43.418485 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dpqrd_openshift-machine-config-operator(95794952-d817-48f2-8956-f7a310f8d1d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" Sep 30 13:28:57 crc kubenswrapper[4672]: I0930 13:28:57.417469 4672 scope.go:117] "RemoveContainer" containerID="e31569925dbee8a158ee7d2ef1cfb2bb848061058b8b7657cbbf2455b1d08de2" Sep 30 13:28:57 crc kubenswrapper[4672]: E0930 13:28:57.418583 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dpqrd_openshift-machine-config-operator(95794952-d817-48f2-8956-f7a310f8d1d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" Sep 30 13:29:10 crc kubenswrapper[4672]: I0930 13:29:10.431626 4672 scope.go:117] "RemoveContainer" containerID="e31569925dbee8a158ee7d2ef1cfb2bb848061058b8b7657cbbf2455b1d08de2" Sep 30 13:29:10 crc kubenswrapper[4672]: E0930 13:29:10.434246 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dpqrd_openshift-machine-config-operator(95794952-d817-48f2-8956-f7a310f8d1d9)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" Sep 30 13:29:13 crc kubenswrapper[4672]: I0930 13:29:13.621830 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-x2h9w"] Sep 30 13:29:13 crc kubenswrapper[4672]: E0930 13:29:13.623024 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b440f96-68df-4f8c-9cc2-3375f85f3af8" containerName="extract-content" Sep 30 13:29:13 crc kubenswrapper[4672]: I0930 13:29:13.623056 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b440f96-68df-4f8c-9cc2-3375f85f3af8" containerName="extract-content" Sep 30 13:29:13 crc kubenswrapper[4672]: E0930 13:29:13.623174 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b440f96-68df-4f8c-9cc2-3375f85f3af8" containerName="registry-server" Sep 30 13:29:13 crc kubenswrapper[4672]: I0930 13:29:13.623192 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b440f96-68df-4f8c-9cc2-3375f85f3af8" containerName="registry-server" Sep 30 13:29:13 crc kubenswrapper[4672]: E0930 13:29:13.623220 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b440f96-68df-4f8c-9cc2-3375f85f3af8" containerName="extract-utilities" Sep 30 13:29:13 crc kubenswrapper[4672]: I0930 13:29:13.623238 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b440f96-68df-4f8c-9cc2-3375f85f3af8" containerName="extract-utilities" Sep 30 13:29:13 crc kubenswrapper[4672]: I0930 13:29:13.623718 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b440f96-68df-4f8c-9cc2-3375f85f3af8" containerName="registry-server" Sep 30 13:29:13 crc kubenswrapper[4672]: I0930 13:29:13.626770 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-x2h9w" Sep 30 13:29:13 crc kubenswrapper[4672]: I0930 13:29:13.636134 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-x2h9w"] Sep 30 13:29:13 crc kubenswrapper[4672]: I0930 13:29:13.747413 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/83f72f3e-5f6b-4798-82d0-b294a5d19f78-utilities\") pod \"redhat-marketplace-x2h9w\" (UID: \"83f72f3e-5f6b-4798-82d0-b294a5d19f78\") " pod="openshift-marketplace/redhat-marketplace-x2h9w" Sep 30 13:29:13 crc kubenswrapper[4672]: I0930 13:29:13.747769 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-np9kd\" (UniqueName: \"kubernetes.io/projected/83f72f3e-5f6b-4798-82d0-b294a5d19f78-kube-api-access-np9kd\") pod \"redhat-marketplace-x2h9w\" (UID: \"83f72f3e-5f6b-4798-82d0-b294a5d19f78\") " pod="openshift-marketplace/redhat-marketplace-x2h9w" Sep 30 13:29:13 crc kubenswrapper[4672]: I0930 13:29:13.748012 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/83f72f3e-5f6b-4798-82d0-b294a5d19f78-catalog-content\") pod \"redhat-marketplace-x2h9w\" (UID: \"83f72f3e-5f6b-4798-82d0-b294a5d19f78\") " pod="openshift-marketplace/redhat-marketplace-x2h9w" Sep 30 13:29:13 crc kubenswrapper[4672]: I0930 13:29:13.850073 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/83f72f3e-5f6b-4798-82d0-b294a5d19f78-utilities\") pod \"redhat-marketplace-x2h9w\" (UID: \"83f72f3e-5f6b-4798-82d0-b294a5d19f78\") " pod="openshift-marketplace/redhat-marketplace-x2h9w" Sep 30 13:29:13 crc kubenswrapper[4672]: I0930 13:29:13.850468 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-np9kd\" (UniqueName: \"kubernetes.io/projected/83f72f3e-5f6b-4798-82d0-b294a5d19f78-kube-api-access-np9kd\") pod \"redhat-marketplace-x2h9w\" (UID: \"83f72f3e-5f6b-4798-82d0-b294a5d19f78\") " pod="openshift-marketplace/redhat-marketplace-x2h9w" Sep 30 13:29:13 crc kubenswrapper[4672]: I0930 13:29:13.850543 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/83f72f3e-5f6b-4798-82d0-b294a5d19f78-catalog-content\") pod \"redhat-marketplace-x2h9w\" (UID: \"83f72f3e-5f6b-4798-82d0-b294a5d19f78\") " pod="openshift-marketplace/redhat-marketplace-x2h9w" Sep 30 13:29:13 crc kubenswrapper[4672]: I0930 13:29:13.851176 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/83f72f3e-5f6b-4798-82d0-b294a5d19f78-utilities\") pod \"redhat-marketplace-x2h9w\" (UID: \"83f72f3e-5f6b-4798-82d0-b294a5d19f78\") " pod="openshift-marketplace/redhat-marketplace-x2h9w" Sep 30 13:29:13 crc kubenswrapper[4672]: I0930 13:29:13.851191 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/83f72f3e-5f6b-4798-82d0-b294a5d19f78-catalog-content\") pod \"redhat-marketplace-x2h9w\" (UID: \"83f72f3e-5f6b-4798-82d0-b294a5d19f78\") " pod="openshift-marketplace/redhat-marketplace-x2h9w" Sep 30 13:29:14 crc kubenswrapper[4672]: I0930 13:29:14.247567 4672 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-np9kd\" (UniqueName: \"kubernetes.io/projected/83f72f3e-5f6b-4798-82d0-b294a5d19f78-kube-api-access-np9kd\") pod \"redhat-marketplace-x2h9w\" (UID: \"83f72f3e-5f6b-4798-82d0-b294a5d19f78\") " pod="openshift-marketplace/redhat-marketplace-x2h9w" Sep 30 13:29:14 crc kubenswrapper[4672]: I0930 13:29:14.275706 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-x2h9w" Sep 30 13:29:14 crc kubenswrapper[4672]: I0930 13:29:14.785659 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-x2h9w"] Sep 30 13:29:15 crc kubenswrapper[4672]: I0930 13:29:15.423944 4672 generic.go:334] "Generic (PLEG): container finished" podID="83f72f3e-5f6b-4798-82d0-b294a5d19f78" containerID="858ece026be1d9d7fe68aaa3909e6b34f2ea29da0c4b4c6683493d6ac8478c0c" exitCode=0 Sep 30 13:29:15 crc kubenswrapper[4672]: I0930 13:29:15.426126 4672 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Sep 30 13:29:15 crc kubenswrapper[4672]: I0930 13:29:15.429057 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x2h9w" event={"ID":"83f72f3e-5f6b-4798-82d0-b294a5d19f78","Type":"ContainerDied","Data":"858ece026be1d9d7fe68aaa3909e6b34f2ea29da0c4b4c6683493d6ac8478c0c"} Sep 30 13:29:15 crc kubenswrapper[4672]: I0930 13:29:15.429097 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x2h9w" event={"ID":"83f72f3e-5f6b-4798-82d0-b294a5d19f78","Type":"ContainerStarted","Data":"88f1d45e954e0c5decfa0034e7544d54034e299658b157b567a5c409a0dbdf17"} Sep 30 13:29:17 crc kubenswrapper[4672]: I0930 13:29:17.445830 4672 generic.go:334] "Generic (PLEG): container finished" podID="83f72f3e-5f6b-4798-82d0-b294a5d19f78" containerID="8bd4d0ea590c5b600dba3e575c9d1af18960bf1b3e07e0deaceed9b81b5ff8d4" exitCode=0 Sep 30 13:29:17 crc kubenswrapper[4672]: I0930 13:29:17.445974 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x2h9w" event={"ID":"83f72f3e-5f6b-4798-82d0-b294a5d19f78","Type":"ContainerDied","Data":"8bd4d0ea590c5b600dba3e575c9d1af18960bf1b3e07e0deaceed9b81b5ff8d4"} Sep 30 13:29:18 crc kubenswrapper[4672]: I0930 13:29:18.459699 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x2h9w" event={"ID":"83f72f3e-5f6b-4798-82d0-b294a5d19f78","Type":"ContainerStarted","Data":"b874774d70f32c003f6eddb712790424cd4fcf979a348a7dab5b072673b4734c"} Sep 30 13:29:18 crc kubenswrapper[4672]: I0930 13:29:18.483772 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-x2h9w" podStartSLOduration=2.98701328 podStartE2EDuration="5.483747675s" podCreationTimestamp="2025-09-30 13:29:13 +0000 UTC" firstStartedPulling="2025-09-30 13:29:15.425744589 +0000 UTC m=+4046.694982275" lastFinishedPulling="2025-09-30 13:29:17.922479004 +0000 UTC m=+4049.191716670" observedRunningTime="2025-09-30 13:29:18.478307788 +0000 UTC m=+4049.747545474" watchObservedRunningTime="2025-09-30 13:29:18.483747675 +0000 UTC m=+4049.752985321" Sep 30 13:29:22 crc kubenswrapper[4672]: I0930 13:29:22.417696 4672 scope.go:117] "RemoveContainer" containerID="e31569925dbee8a158ee7d2ef1cfb2bb848061058b8b7657cbbf2455b1d08de2" Sep 30 13:29:22 crc kubenswrapper[4672]: E0930 13:29:22.418536 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed 
to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dpqrd_openshift-machine-config-operator(95794952-d817-48f2-8956-f7a310f8d1d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" Sep 30 13:29:24 crc kubenswrapper[4672]: I0930 13:29:24.276922 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-x2h9w" Sep 30 13:29:24 crc kubenswrapper[4672]: I0930 13:29:24.277714 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-x2h9w" Sep 30 13:29:24 crc kubenswrapper[4672]: I0930 13:29:24.351206 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-x2h9w" Sep 30 13:29:24 crc kubenswrapper[4672]: I0930 13:29:24.571134 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-x2h9w" Sep 30 13:29:24 crc kubenswrapper[4672]: I0930 13:29:24.642170 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-x2h9w"] Sep 30 13:29:26 crc kubenswrapper[4672]: I0930 13:29:26.545080 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-x2h9w" podUID="83f72f3e-5f6b-4798-82d0-b294a5d19f78" containerName="registry-server" containerID="cri-o://b874774d70f32c003f6eddb712790424cd4fcf979a348a7dab5b072673b4734c" gracePeriod=2 Sep 30 13:29:27 crc kubenswrapper[4672]: I0930 13:29:27.064557 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-x2h9w" Sep 30 13:29:27 crc kubenswrapper[4672]: I0930 13:29:27.173325 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/83f72f3e-5f6b-4798-82d0-b294a5d19f78-catalog-content\") pod \"83f72f3e-5f6b-4798-82d0-b294a5d19f78\" (UID: \"83f72f3e-5f6b-4798-82d0-b294a5d19f78\") " Sep 30 13:29:27 crc kubenswrapper[4672]: I0930 13:29:27.173716 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/83f72f3e-5f6b-4798-82d0-b294a5d19f78-utilities\") pod \"83f72f3e-5f6b-4798-82d0-b294a5d19f78\" (UID: \"83f72f3e-5f6b-4798-82d0-b294a5d19f78\") " Sep 30 13:29:27 crc kubenswrapper[4672]: I0930 13:29:27.173777 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-np9kd\" (UniqueName: \"kubernetes.io/projected/83f72f3e-5f6b-4798-82d0-b294a5d19f78-kube-api-access-np9kd\") pod \"83f72f3e-5f6b-4798-82d0-b294a5d19f78\" (UID: \"83f72f3e-5f6b-4798-82d0-b294a5d19f78\") " Sep 30 13:29:27 crc kubenswrapper[4672]: I0930 13:29:27.174695 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/83f72f3e-5f6b-4798-82d0-b294a5d19f78-utilities" (OuterVolumeSpecName: "utilities") pod "83f72f3e-5f6b-4798-82d0-b294a5d19f78" (UID: "83f72f3e-5f6b-4798-82d0-b294a5d19f78"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 13:29:27 crc kubenswrapper[4672]: I0930 13:29:27.174976 4672 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/83f72f3e-5f6b-4798-82d0-b294a5d19f78-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 13:29:27 crc kubenswrapper[4672]: I0930 13:29:27.181105 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/83f72f3e-5f6b-4798-82d0-b294a5d19f78-kube-api-access-np9kd" (OuterVolumeSpecName: "kube-api-access-np9kd") pod "83f72f3e-5f6b-4798-82d0-b294a5d19f78" (UID: "83f72f3e-5f6b-4798-82d0-b294a5d19f78"). InnerVolumeSpecName "kube-api-access-np9kd". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:29:27 crc kubenswrapper[4672]: I0930 13:29:27.187961 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/83f72f3e-5f6b-4798-82d0-b294a5d19f78-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "83f72f3e-5f6b-4798-82d0-b294a5d19f78" (UID: "83f72f3e-5f6b-4798-82d0-b294a5d19f78"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 13:29:27 crc kubenswrapper[4672]: I0930 13:29:27.276983 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-np9kd\" (UniqueName: \"kubernetes.io/projected/83f72f3e-5f6b-4798-82d0-b294a5d19f78-kube-api-access-np9kd\") on node \"crc\" DevicePath \"\"" Sep 30 13:29:27 crc kubenswrapper[4672]: I0930 13:29:27.277100 4672 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/83f72f3e-5f6b-4798-82d0-b294a5d19f78-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 13:29:27 crc kubenswrapper[4672]: I0930 13:29:27.562537 4672 generic.go:334] "Generic (PLEG): container finished" podID="83f72f3e-5f6b-4798-82d0-b294a5d19f78" containerID="b874774d70f32c003f6eddb712790424cd4fcf979a348a7dab5b072673b4734c" exitCode=0 Sep 30 13:29:27 crc kubenswrapper[4672]: I0930 13:29:27.564101 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x2h9w" event={"ID":"83f72f3e-5f6b-4798-82d0-b294a5d19f78","Type":"ContainerDied","Data":"b874774d70f32c003f6eddb712790424cd4fcf979a348a7dab5b072673b4734c"} Sep 30 13:29:27 crc kubenswrapper[4672]: I0930 13:29:27.564300 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x2h9w" event={"ID":"83f72f3e-5f6b-4798-82d0-b294a5d19f78","Type":"ContainerDied","Data":"88f1d45e954e0c5decfa0034e7544d54034e299658b157b567a5c409a0dbdf17"} Sep 30 13:29:27 crc kubenswrapper[4672]: I0930 13:29:27.564126 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-x2h9w" Sep 30 13:29:27 crc kubenswrapper[4672]: I0930 13:29:27.564356 4672 scope.go:117] "RemoveContainer" containerID="b874774d70f32c003f6eddb712790424cd4fcf979a348a7dab5b072673b4734c" Sep 30 13:29:27 crc kubenswrapper[4672]: I0930 13:29:27.590167 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-x2h9w"] Sep 30 13:29:27 crc kubenswrapper[4672]: I0930 13:29:27.600738 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-x2h9w"] Sep 30 13:29:27 crc kubenswrapper[4672]: I0930 13:29:27.601545 4672 scope.go:117] "RemoveContainer" containerID="8bd4d0ea590c5b600dba3e575c9d1af18960bf1b3e07e0deaceed9b81b5ff8d4" Sep 30 13:29:27 crc kubenswrapper[4672]: I0930 13:29:27.633174 4672 scope.go:117] "RemoveContainer" containerID="858ece026be1d9d7fe68aaa3909e6b34f2ea29da0c4b4c6683493d6ac8478c0c" Sep 30 13:29:27 crc kubenswrapper[4672]: I0930 13:29:27.689020 4672 scope.go:117] "RemoveContainer" containerID="b874774d70f32c003f6eddb712790424cd4fcf979a348a7dab5b072673b4734c" Sep 30 13:29:27 crc kubenswrapper[4672]: E0930 13:29:27.689781 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b874774d70f32c003f6eddb712790424cd4fcf979a348a7dab5b072673b4734c\": container with ID starting with b874774d70f32c003f6eddb712790424cd4fcf979a348a7dab5b072673b4734c not found: ID does not exist" containerID="b874774d70f32c003f6eddb712790424cd4fcf979a348a7dab5b072673b4734c" Sep 30 13:29:27 crc kubenswrapper[4672]: I0930 13:29:27.689840 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b874774d70f32c003f6eddb712790424cd4fcf979a348a7dab5b072673b4734c"} err="failed to get container status \"b874774d70f32c003f6eddb712790424cd4fcf979a348a7dab5b072673b4734c\": rpc error: code = NotFound desc = could not find container \"b874774d70f32c003f6eddb712790424cd4fcf979a348a7dab5b072673b4734c\": container with ID starting with b874774d70f32c003f6eddb712790424cd4fcf979a348a7dab5b072673b4734c not found: ID does not exist" Sep 30 13:29:27 crc kubenswrapper[4672]: I0930 13:29:27.689872 4672 scope.go:117] "RemoveContainer" containerID="8bd4d0ea590c5b600dba3e575c9d1af18960bf1b3e07e0deaceed9b81b5ff8d4" Sep 30 13:29:27 crc kubenswrapper[4672]: E0930 13:29:27.690303 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8bd4d0ea590c5b600dba3e575c9d1af18960bf1b3e07e0deaceed9b81b5ff8d4\": container with ID starting with 8bd4d0ea590c5b600dba3e575c9d1af18960bf1b3e07e0deaceed9b81b5ff8d4 not found: ID does not exist" containerID="8bd4d0ea590c5b600dba3e575c9d1af18960bf1b3e07e0deaceed9b81b5ff8d4" Sep 30 13:29:27 crc kubenswrapper[4672]: I0930 13:29:27.690358 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8bd4d0ea590c5b600dba3e575c9d1af18960bf1b3e07e0deaceed9b81b5ff8d4"} err="failed to get container status \"8bd4d0ea590c5b600dba3e575c9d1af18960bf1b3e07e0deaceed9b81b5ff8d4\": rpc error: code = NotFound desc = could not find container \"8bd4d0ea590c5b600dba3e575c9d1af18960bf1b3e07e0deaceed9b81b5ff8d4\": container with ID starting with 8bd4d0ea590c5b600dba3e575c9d1af18960bf1b3e07e0deaceed9b81b5ff8d4 not found: ID does not exist" Sep 30 13:29:27 crc kubenswrapper[4672]: I0930 13:29:27.690393 4672 scope.go:117] "RemoveContainer" 
containerID="858ece026be1d9d7fe68aaa3909e6b34f2ea29da0c4b4c6683493d6ac8478c0c" Sep 30 13:29:27 crc kubenswrapper[4672]: E0930 13:29:27.690687 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"858ece026be1d9d7fe68aaa3909e6b34f2ea29da0c4b4c6683493d6ac8478c0c\": container with ID starting with 858ece026be1d9d7fe68aaa3909e6b34f2ea29da0c4b4c6683493d6ac8478c0c not found: ID does not exist" containerID="858ece026be1d9d7fe68aaa3909e6b34f2ea29da0c4b4c6683493d6ac8478c0c" Sep 30 13:29:27 crc kubenswrapper[4672]: I0930 13:29:27.690721 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"858ece026be1d9d7fe68aaa3909e6b34f2ea29da0c4b4c6683493d6ac8478c0c"} err="failed to get container status \"858ece026be1d9d7fe68aaa3909e6b34f2ea29da0c4b4c6683493d6ac8478c0c\": rpc error: code = NotFound desc = could not find container \"858ece026be1d9d7fe68aaa3909e6b34f2ea29da0c4b4c6683493d6ac8478c0c\": container with ID starting with 858ece026be1d9d7fe68aaa3909e6b34f2ea29da0c4b4c6683493d6ac8478c0c not found: ID does not exist" Sep 30 13:29:29 crc kubenswrapper[4672]: I0930 13:29:29.428249 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="83f72f3e-5f6b-4798-82d0-b294a5d19f78" path="/var/lib/kubelet/pods/83f72f3e-5f6b-4798-82d0-b294a5d19f78/volumes" Sep 30 13:29:37 crc kubenswrapper[4672]: I0930 13:29:37.416703 4672 scope.go:117] "RemoveContainer" containerID="e31569925dbee8a158ee7d2ef1cfb2bb848061058b8b7657cbbf2455b1d08de2" Sep 30 13:29:37 crc kubenswrapper[4672]: E0930 13:29:37.417520 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dpqrd_openshift-machine-config-operator(95794952-d817-48f2-8956-f7a310f8d1d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" Sep 30 13:29:51 crc kubenswrapper[4672]: I0930 13:29:51.417708 4672 scope.go:117] "RemoveContainer" containerID="e31569925dbee8a158ee7d2ef1cfb2bb848061058b8b7657cbbf2455b1d08de2" Sep 30 13:29:51 crc kubenswrapper[4672]: E0930 13:29:51.418452 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dpqrd_openshift-machine-config-operator(95794952-d817-48f2-8956-f7a310f8d1d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" Sep 30 13:30:00 crc kubenswrapper[4672]: I0930 13:30:00.168908 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320650-9c78c"] Sep 30 13:30:00 crc kubenswrapper[4672]: E0930 13:30:00.169997 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83f72f3e-5f6b-4798-82d0-b294a5d19f78" containerName="extract-content" Sep 30 13:30:00 crc kubenswrapper[4672]: I0930 13:30:00.170121 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="83f72f3e-5f6b-4798-82d0-b294a5d19f78" containerName="extract-content" Sep 30 13:30:00 crc kubenswrapper[4672]: E0930 13:30:00.170141 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83f72f3e-5f6b-4798-82d0-b294a5d19f78" containerName="extract-utilities" Sep 30 13:30:00 crc 
kubenswrapper[4672]: I0930 13:30:00.170149 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="83f72f3e-5f6b-4798-82d0-b294a5d19f78" containerName="extract-utilities"
Sep 30 13:30:00 crc kubenswrapper[4672]: E0930 13:30:00.170188 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83f72f3e-5f6b-4798-82d0-b294a5d19f78" containerName="registry-server"
Sep 30 13:30:00 crc kubenswrapper[4672]: I0930 13:30:00.170196 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="83f72f3e-5f6b-4798-82d0-b294a5d19f78" containerName="registry-server"
Sep 30 13:30:00 crc kubenswrapper[4672]: I0930 13:30:00.170474 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="83f72f3e-5f6b-4798-82d0-b294a5d19f78" containerName="registry-server"
Sep 30 13:30:00 crc kubenswrapper[4672]: I0930 13:30:00.171354 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320650-9c78c"
Sep 30 13:30:00 crc kubenswrapper[4672]: I0930 13:30:00.173200 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Sep 30 13:30:00 crc kubenswrapper[4672]: I0930 13:30:00.173312 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Sep 30 13:30:00 crc kubenswrapper[4672]: I0930 13:30:00.194064 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320650-9c78c"]
Sep 30 13:30:00 crc kubenswrapper[4672]: I0930 13:30:00.285498 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pqz8b\" (UniqueName: \"kubernetes.io/projected/f7fa99b2-4365-47ce-bc55-187c5385dde1-kube-api-access-pqz8b\") pod \"collect-profiles-29320650-9c78c\" (UID: \"f7fa99b2-4365-47ce-bc55-187c5385dde1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320650-9c78c"
Sep 30 13:30:00 crc kubenswrapper[4672]: I0930 13:30:00.285830 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f7fa99b2-4365-47ce-bc55-187c5385dde1-config-volume\") pod \"collect-profiles-29320650-9c78c\" (UID: \"f7fa99b2-4365-47ce-bc55-187c5385dde1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320650-9c78c"
Sep 30 13:30:00 crc kubenswrapper[4672]: I0930 13:30:00.285961 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f7fa99b2-4365-47ce-bc55-187c5385dde1-secret-volume\") pod \"collect-profiles-29320650-9c78c\" (UID: \"f7fa99b2-4365-47ce-bc55-187c5385dde1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320650-9c78c"
Sep 30 13:30:00 crc kubenswrapper[4672]: I0930 13:30:00.387756 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f7fa99b2-4365-47ce-bc55-187c5385dde1-secret-volume\") pod \"collect-profiles-29320650-9c78c\" (UID: \"f7fa99b2-4365-47ce-bc55-187c5385dde1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320650-9c78c"
Sep 30 13:30:00 crc kubenswrapper[4672]: I0930 13:30:00.387901 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pqz8b\" (UniqueName: \"kubernetes.io/projected/f7fa99b2-4365-47ce-bc55-187c5385dde1-kube-api-access-pqz8b\") pod \"collect-profiles-29320650-9c78c\" (UID: \"f7fa99b2-4365-47ce-bc55-187c5385dde1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320650-9c78c"
Sep 30 13:30:00 crc kubenswrapper[4672]: I0930 13:30:00.388138 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f7fa99b2-4365-47ce-bc55-187c5385dde1-config-volume\") pod \"collect-profiles-29320650-9c78c\" (UID: \"f7fa99b2-4365-47ce-bc55-187c5385dde1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320650-9c78c"
Sep 30 13:30:00 crc kubenswrapper[4672]: I0930 13:30:00.390365 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f7fa99b2-4365-47ce-bc55-187c5385dde1-config-volume\") pod \"collect-profiles-29320650-9c78c\" (UID: \"f7fa99b2-4365-47ce-bc55-187c5385dde1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320650-9c78c"
Sep 30 13:30:00 crc kubenswrapper[4672]: I0930 13:30:00.394913 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f7fa99b2-4365-47ce-bc55-187c5385dde1-secret-volume\") pod \"collect-profiles-29320650-9c78c\" (UID: \"f7fa99b2-4365-47ce-bc55-187c5385dde1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320650-9c78c"
Sep 30 13:30:00 crc kubenswrapper[4672]: I0930 13:30:00.411485 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pqz8b\" (UniqueName: \"kubernetes.io/projected/f7fa99b2-4365-47ce-bc55-187c5385dde1-kube-api-access-pqz8b\") pod \"collect-profiles-29320650-9c78c\" (UID: \"f7fa99b2-4365-47ce-bc55-187c5385dde1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320650-9c78c"
Sep 30 13:30:00 crc kubenswrapper[4672]: I0930 13:30:00.495447 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320650-9c78c"
Sep 30 13:30:00 crc kubenswrapper[4672]: I0930 13:30:00.983895 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320650-9c78c"]
Sep 30 13:30:01 crc kubenswrapper[4672]: I0930 13:30:01.883351 4672 generic.go:334] "Generic (PLEG): container finished" podID="f7fa99b2-4365-47ce-bc55-187c5385dde1" containerID="59dc8d94013d0e3da1d4f12943b77d4155eb0e0a63d531eb54bc9aaef3792cce" exitCode=0
Sep 30 13:30:01 crc kubenswrapper[4672]: I0930 13:30:01.883399 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320650-9c78c" event={"ID":"f7fa99b2-4365-47ce-bc55-187c5385dde1","Type":"ContainerDied","Data":"59dc8d94013d0e3da1d4f12943b77d4155eb0e0a63d531eb54bc9aaef3792cce"}
Sep 30 13:30:01 crc kubenswrapper[4672]: I0930 13:30:01.883674 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320650-9c78c" event={"ID":"f7fa99b2-4365-47ce-bc55-187c5385dde1","Type":"ContainerStarted","Data":"d73f7fc93b8447ba99709f1faf8a9f180a8c37aee31ee9eef7932f55eafcc0ae"}
Sep 30 13:30:03 crc kubenswrapper[4672]: I0930 13:30:03.322302 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320650-9c78c"
Sep 30 13:30:03 crc kubenswrapper[4672]: I0930 13:30:03.360620 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f7fa99b2-4365-47ce-bc55-187c5385dde1-secret-volume\") pod \"f7fa99b2-4365-47ce-bc55-187c5385dde1\" (UID: \"f7fa99b2-4365-47ce-bc55-187c5385dde1\") "
Sep 30 13:30:03 crc kubenswrapper[4672]: I0930 13:30:03.360720 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f7fa99b2-4365-47ce-bc55-187c5385dde1-config-volume\") pod \"f7fa99b2-4365-47ce-bc55-187c5385dde1\" (UID: \"f7fa99b2-4365-47ce-bc55-187c5385dde1\") "
Sep 30 13:30:03 crc kubenswrapper[4672]: I0930 13:30:03.360793 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pqz8b\" (UniqueName: \"kubernetes.io/projected/f7fa99b2-4365-47ce-bc55-187c5385dde1-kube-api-access-pqz8b\") pod \"f7fa99b2-4365-47ce-bc55-187c5385dde1\" (UID: \"f7fa99b2-4365-47ce-bc55-187c5385dde1\") "
Sep 30 13:30:03 crc kubenswrapper[4672]: I0930 13:30:03.362804 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f7fa99b2-4365-47ce-bc55-187c5385dde1-config-volume" (OuterVolumeSpecName: "config-volume") pod "f7fa99b2-4365-47ce-bc55-187c5385dde1" (UID: "f7fa99b2-4365-47ce-bc55-187c5385dde1"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 13:30:03 crc kubenswrapper[4672]: I0930 13:30:03.367243 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f7fa99b2-4365-47ce-bc55-187c5385dde1-kube-api-access-pqz8b" (OuterVolumeSpecName: "kube-api-access-pqz8b") pod "f7fa99b2-4365-47ce-bc55-187c5385dde1" (UID: "f7fa99b2-4365-47ce-bc55-187c5385dde1"). InnerVolumeSpecName "kube-api-access-pqz8b". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 13:30:03 crc kubenswrapper[4672]: I0930 13:30:03.370237 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7fa99b2-4365-47ce-bc55-187c5385dde1-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "f7fa99b2-4365-47ce-bc55-187c5385dde1" (UID: "f7fa99b2-4365-47ce-bc55-187c5385dde1"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:30:03 crc kubenswrapper[4672]: I0930 13:30:03.464207 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pqz8b\" (UniqueName: \"kubernetes.io/projected/f7fa99b2-4365-47ce-bc55-187c5385dde1-kube-api-access-pqz8b\") on node \"crc\" DevicePath \"\"" Sep 30 13:30:03 crc kubenswrapper[4672]: I0930 13:30:03.464255 4672 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f7fa99b2-4365-47ce-bc55-187c5385dde1-secret-volume\") on node \"crc\" DevicePath \"\"" Sep 30 13:30:03 crc kubenswrapper[4672]: I0930 13:30:03.464314 4672 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f7fa99b2-4365-47ce-bc55-187c5385dde1-config-volume\") on node \"crc\" DevicePath \"\"" Sep 30 13:30:03 crc kubenswrapper[4672]: I0930 13:30:03.911528 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320650-9c78c" event={"ID":"f7fa99b2-4365-47ce-bc55-187c5385dde1","Type":"ContainerDied","Data":"d73f7fc93b8447ba99709f1faf8a9f180a8c37aee31ee9eef7932f55eafcc0ae"} Sep 30 13:30:03 crc kubenswrapper[4672]: I0930 13:30:03.911589 4672 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d73f7fc93b8447ba99709f1faf8a9f180a8c37aee31ee9eef7932f55eafcc0ae" Sep 30 13:30:03 crc kubenswrapper[4672]: I0930 13:30:03.911663 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320650-9c78c" Sep 30 13:30:04 crc kubenswrapper[4672]: I0930 13:30:04.397771 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320605-4j4hd"] Sep 30 13:30:04 crc kubenswrapper[4672]: I0930 13:30:04.406066 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320605-4j4hd"] Sep 30 13:30:05 crc kubenswrapper[4672]: I0930 13:30:05.417217 4672 scope.go:117] "RemoveContainer" containerID="e31569925dbee8a158ee7d2ef1cfb2bb848061058b8b7657cbbf2455b1d08de2" Sep 30 13:30:05 crc kubenswrapper[4672]: E0930 13:30:05.417864 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dpqrd_openshift-machine-config-operator(95794952-d817-48f2-8956-f7a310f8d1d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" Sep 30 13:30:05 crc kubenswrapper[4672]: I0930 13:30:05.429575 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="854490c3-8e67-4668-a14f-8af3d1b0a8f5" path="/var/lib/kubelet/pods/854490c3-8e67-4668-a14f-8af3d1b0a8f5/volumes" Sep 30 13:30:20 crc kubenswrapper[4672]: I0930 13:30:20.417190 4672 scope.go:117] "RemoveContainer" containerID="e31569925dbee8a158ee7d2ef1cfb2bb848061058b8b7657cbbf2455b1d08de2" Sep 30 13:30:20 crc kubenswrapper[4672]: E0930 13:30:20.418190 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dpqrd_openshift-machine-config-operator(95794952-d817-48f2-8956-f7a310f8d1d9)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" Sep 30 13:30:31 crc kubenswrapper[4672]: I0930 13:30:31.418004 4672 scope.go:117] "RemoveContainer" containerID="e31569925dbee8a158ee7d2ef1cfb2bb848061058b8b7657cbbf2455b1d08de2" Sep 30 13:30:32 crc kubenswrapper[4672]: I0930 13:30:32.188547 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" event={"ID":"95794952-d817-48f2-8956-f7a310f8d1d9","Type":"ContainerStarted","Data":"caf17acdef1b279f2d0357a079f11b366c5613a1cdad739f51759198b31aa0ff"} Sep 30 13:31:02 crc kubenswrapper[4672]: I0930 13:31:02.594458 4672 scope.go:117] "RemoveContainer" containerID="bb76c886b5e0e0e89d1deed8a5bcbbdcf270e0c6c575c973f3aa57489e481abc" Sep 30 13:31:45 crc kubenswrapper[4672]: I0930 13:31:45.909512 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-x7dcm"] Sep 30 13:31:45 crc kubenswrapper[4672]: E0930 13:31:45.910433 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7fa99b2-4365-47ce-bc55-187c5385dde1" containerName="collect-profiles" Sep 30 13:31:45 crc kubenswrapper[4672]: I0930 13:31:45.910453 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7fa99b2-4365-47ce-bc55-187c5385dde1" containerName="collect-profiles" Sep 30 13:31:45 crc kubenswrapper[4672]: I0930 13:31:45.910685 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7fa99b2-4365-47ce-bc55-187c5385dde1" containerName="collect-profiles" Sep 30 13:31:45 crc kubenswrapper[4672]: I0930 13:31:45.912331 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-x7dcm" Sep 30 13:31:45 crc kubenswrapper[4672]: I0930 13:31:45.933624 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-x7dcm"] Sep 30 13:31:46 crc kubenswrapper[4672]: I0930 13:31:46.018760 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pmkc8\" (UniqueName: \"kubernetes.io/projected/4a72dc34-6098-4bff-83d2-fb4a8fbdeda2-kube-api-access-pmkc8\") pod \"redhat-operators-x7dcm\" (UID: \"4a72dc34-6098-4bff-83d2-fb4a8fbdeda2\") " pod="openshift-marketplace/redhat-operators-x7dcm" Sep 30 13:31:46 crc kubenswrapper[4672]: I0930 13:31:46.019097 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a72dc34-6098-4bff-83d2-fb4a8fbdeda2-utilities\") pod \"redhat-operators-x7dcm\" (UID: \"4a72dc34-6098-4bff-83d2-fb4a8fbdeda2\") " pod="openshift-marketplace/redhat-operators-x7dcm" Sep 30 13:31:46 crc kubenswrapper[4672]: I0930 13:31:46.019511 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a72dc34-6098-4bff-83d2-fb4a8fbdeda2-catalog-content\") pod \"redhat-operators-x7dcm\" (UID: \"4a72dc34-6098-4bff-83d2-fb4a8fbdeda2\") " pod="openshift-marketplace/redhat-operators-x7dcm" Sep 30 13:31:46 crc kubenswrapper[4672]: I0930 13:31:46.122465 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a72dc34-6098-4bff-83d2-fb4a8fbdeda2-utilities\") pod \"redhat-operators-x7dcm\" (UID: \"4a72dc34-6098-4bff-83d2-fb4a8fbdeda2\") " pod="openshift-marketplace/redhat-operators-x7dcm" Sep 
30 13:31:46 crc kubenswrapper[4672]: I0930 13:31:46.122620 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a72dc34-6098-4bff-83d2-fb4a8fbdeda2-catalog-content\") pod \"redhat-operators-x7dcm\" (UID: \"4a72dc34-6098-4bff-83d2-fb4a8fbdeda2\") " pod="openshift-marketplace/redhat-operators-x7dcm" Sep 30 13:31:46 crc kubenswrapper[4672]: I0930 13:31:46.122669 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pmkc8\" (UniqueName: \"kubernetes.io/projected/4a72dc34-6098-4bff-83d2-fb4a8fbdeda2-kube-api-access-pmkc8\") pod \"redhat-operators-x7dcm\" (UID: \"4a72dc34-6098-4bff-83d2-fb4a8fbdeda2\") " pod="openshift-marketplace/redhat-operators-x7dcm" Sep 30 13:31:46 crc kubenswrapper[4672]: I0930 13:31:46.123244 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a72dc34-6098-4bff-83d2-fb4a8fbdeda2-utilities\") pod \"redhat-operators-x7dcm\" (UID: \"4a72dc34-6098-4bff-83d2-fb4a8fbdeda2\") " pod="openshift-marketplace/redhat-operators-x7dcm" Sep 30 13:31:46 crc kubenswrapper[4672]: I0930 13:31:46.123312 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a72dc34-6098-4bff-83d2-fb4a8fbdeda2-catalog-content\") pod \"redhat-operators-x7dcm\" (UID: \"4a72dc34-6098-4bff-83d2-fb4a8fbdeda2\") " pod="openshift-marketplace/redhat-operators-x7dcm" Sep 30 13:31:46 crc kubenswrapper[4672]: I0930 13:31:46.445033 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pmkc8\" (UniqueName: \"kubernetes.io/projected/4a72dc34-6098-4bff-83d2-fb4a8fbdeda2-kube-api-access-pmkc8\") pod \"redhat-operators-x7dcm\" (UID: \"4a72dc34-6098-4bff-83d2-fb4a8fbdeda2\") " pod="openshift-marketplace/redhat-operators-x7dcm" Sep 30 13:31:46 crc kubenswrapper[4672]: I0930 13:31:46.542797 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-x7dcm" Sep 30 13:31:47 crc kubenswrapper[4672]: I0930 13:31:47.066487 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-x7dcm"] Sep 30 13:31:47 crc kubenswrapper[4672]: I0930 13:31:47.951797 4672 generic.go:334] "Generic (PLEG): container finished" podID="4a72dc34-6098-4bff-83d2-fb4a8fbdeda2" containerID="a2e010afa4f26a3cb4c96c8f2a3a2aa3aff65af576e14acde8c621796729947f" exitCode=0 Sep 30 13:31:47 crc kubenswrapper[4672]: I0930 13:31:47.951863 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-x7dcm" event={"ID":"4a72dc34-6098-4bff-83d2-fb4a8fbdeda2","Type":"ContainerDied","Data":"a2e010afa4f26a3cb4c96c8f2a3a2aa3aff65af576e14acde8c621796729947f"} Sep 30 13:31:47 crc kubenswrapper[4672]: I0930 13:31:47.952296 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-x7dcm" event={"ID":"4a72dc34-6098-4bff-83d2-fb4a8fbdeda2","Type":"ContainerStarted","Data":"f8be84fd34e421d801dd1b1c68aa6abd542ab4b62e8ca6d6fb2620682e3b7a9e"} Sep 30 13:31:48 crc kubenswrapper[4672]: I0930 13:31:48.310220 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-xjrwz"] Sep 30 13:31:48 crc kubenswrapper[4672]: I0930 13:31:48.312660 4672 util.go:30] "No sandbox for pod can be found. 
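
[Annotation] Every entry in this log has the same shape: a journald prefix (timestamp, host crc, unit kubenswrapper[4672]) followed by a klog header (severity letter plus date, time, PID, file:line) and the structured message. A small Go parser for that layout; the regular expression is a sketch and may need loosening (klog sometimes pads the PID field with extra spaces, which the `+` already tolerates):

    package main

    import (
        "fmt"
        "regexp"
    )

    // Matches "Sep 30 13:31:48 crc kubenswrapper[4672]: I0930 13:31:48.312660 4672 util.go:30] <msg>".
    var line = regexp.MustCompile(
        `^(\w+ +\d+ [\d:]+) (\S+) (\S+)\[(\d+)\]: ([IWEF])(\d{4}) ([\d:.]+) +(\d+) (\w+\.go:\d+)\] (.*)$`)

    func main() {
        s := `Sep 30 13:31:48 crc kubenswrapper[4672]: I0930 13:31:48.312660 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xjrwz"`
        m := line.FindStringSubmatch(s)
        if m == nil {
            fmt.Println("no match")
            return
        }
        // m[5]=severity, m[9]=source file:line, m[10]=structured message
        fmt.Printf("severity=%s source=%s msg=%s\n", m[5], m[9], m[10])
    }
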
Need to start a new one" pod="openshift-marketplace/certified-operators-xjrwz" Sep 30 13:31:48 crc kubenswrapper[4672]: I0930 13:31:48.324688 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xjrwz"] Sep 30 13:31:48 crc kubenswrapper[4672]: I0930 13:31:48.476217 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de7eb386-d6a3-4cf0-9038-7cdafd4ada30-utilities\") pod \"certified-operators-xjrwz\" (UID: \"de7eb386-d6a3-4cf0-9038-7cdafd4ada30\") " pod="openshift-marketplace/certified-operators-xjrwz" Sep 30 13:31:48 crc kubenswrapper[4672]: I0930 13:31:48.476309 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de7eb386-d6a3-4cf0-9038-7cdafd4ada30-catalog-content\") pod \"certified-operators-xjrwz\" (UID: \"de7eb386-d6a3-4cf0-9038-7cdafd4ada30\") " pod="openshift-marketplace/certified-operators-xjrwz" Sep 30 13:31:48 crc kubenswrapper[4672]: I0930 13:31:48.476425 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4m4x5\" (UniqueName: \"kubernetes.io/projected/de7eb386-d6a3-4cf0-9038-7cdafd4ada30-kube-api-access-4m4x5\") pod \"certified-operators-xjrwz\" (UID: \"de7eb386-d6a3-4cf0-9038-7cdafd4ada30\") " pod="openshift-marketplace/certified-operators-xjrwz" Sep 30 13:31:48 crc kubenswrapper[4672]: I0930 13:31:48.578332 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4m4x5\" (UniqueName: \"kubernetes.io/projected/de7eb386-d6a3-4cf0-9038-7cdafd4ada30-kube-api-access-4m4x5\") pod \"certified-operators-xjrwz\" (UID: \"de7eb386-d6a3-4cf0-9038-7cdafd4ada30\") " pod="openshift-marketplace/certified-operators-xjrwz" Sep 30 13:31:48 crc kubenswrapper[4672]: I0930 13:31:48.579926 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de7eb386-d6a3-4cf0-9038-7cdafd4ada30-utilities\") pod \"certified-operators-xjrwz\" (UID: \"de7eb386-d6a3-4cf0-9038-7cdafd4ada30\") " pod="openshift-marketplace/certified-operators-xjrwz" Sep 30 13:31:48 crc kubenswrapper[4672]: I0930 13:31:48.579972 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de7eb386-d6a3-4cf0-9038-7cdafd4ada30-utilities\") pod \"certified-operators-xjrwz\" (UID: \"de7eb386-d6a3-4cf0-9038-7cdafd4ada30\") " pod="openshift-marketplace/certified-operators-xjrwz" Sep 30 13:31:48 crc kubenswrapper[4672]: I0930 13:31:48.580094 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de7eb386-d6a3-4cf0-9038-7cdafd4ada30-catalog-content\") pod \"certified-operators-xjrwz\" (UID: \"de7eb386-d6a3-4cf0-9038-7cdafd4ada30\") " pod="openshift-marketplace/certified-operators-xjrwz" Sep 30 13:31:48 crc kubenswrapper[4672]: I0930 13:31:48.580398 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de7eb386-d6a3-4cf0-9038-7cdafd4ada30-catalog-content\") pod \"certified-operators-xjrwz\" (UID: \"de7eb386-d6a3-4cf0-9038-7cdafd4ada30\") " pod="openshift-marketplace/certified-operators-xjrwz" Sep 30 13:31:48 crc kubenswrapper[4672]: I0930 13:31:48.615181 4672 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-4m4x5\" (UniqueName: \"kubernetes.io/projected/de7eb386-d6a3-4cf0-9038-7cdafd4ada30-kube-api-access-4m4x5\") pod \"certified-operators-xjrwz\" (UID: \"de7eb386-d6a3-4cf0-9038-7cdafd4ada30\") " pod="openshift-marketplace/certified-operators-xjrwz" Sep 30 13:31:48 crc kubenswrapper[4672]: I0930 13:31:48.632722 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xjrwz" Sep 30 13:31:49 crc kubenswrapper[4672]: I0930 13:31:49.205407 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xjrwz"] Sep 30 13:31:49 crc kubenswrapper[4672]: I0930 13:31:49.998647 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xjrwz" event={"ID":"de7eb386-d6a3-4cf0-9038-7cdafd4ada30","Type":"ContainerStarted","Data":"d5d11fd8b3eb60da8024c50a00917f89c56fc1cc6dc6953877e6789aa35c49f9"} Sep 30 13:31:50 crc kubenswrapper[4672]: I0930 13:31:50.001113 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xjrwz" event={"ID":"de7eb386-d6a3-4cf0-9038-7cdafd4ada30","Type":"ContainerStarted","Data":"13784ea6745d6eee42110737bb1ed54350cade301a433e584ebf5d655bff1ea6"} Sep 30 13:31:50 crc kubenswrapper[4672]: I0930 13:31:50.003619 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-x7dcm" event={"ID":"4a72dc34-6098-4bff-83d2-fb4a8fbdeda2","Type":"ContainerStarted","Data":"3a2857b51f0349e9f214781c321adeb7ee396831f9d9c59f3879a9c40f0f338b"} Sep 30 13:31:51 crc kubenswrapper[4672]: I0930 13:31:51.018981 4672 generic.go:334] "Generic (PLEG): container finished" podID="4a72dc34-6098-4bff-83d2-fb4a8fbdeda2" containerID="3a2857b51f0349e9f214781c321adeb7ee396831f9d9c59f3879a9c40f0f338b" exitCode=0 Sep 30 13:31:51 crc kubenswrapper[4672]: I0930 13:31:51.019082 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-x7dcm" event={"ID":"4a72dc34-6098-4bff-83d2-fb4a8fbdeda2","Type":"ContainerDied","Data":"3a2857b51f0349e9f214781c321adeb7ee396831f9d9c59f3879a9c40f0f338b"} Sep 30 13:31:51 crc kubenswrapper[4672]: I0930 13:31:51.020577 4672 generic.go:334] "Generic (PLEG): container finished" podID="de7eb386-d6a3-4cf0-9038-7cdafd4ada30" containerID="d5d11fd8b3eb60da8024c50a00917f89c56fc1cc6dc6953877e6789aa35c49f9" exitCode=0 Sep 30 13:31:51 crc kubenswrapper[4672]: I0930 13:31:51.020619 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xjrwz" event={"ID":"de7eb386-d6a3-4cf0-9038-7cdafd4ada30","Type":"ContainerDied","Data":"d5d11fd8b3eb60da8024c50a00917f89c56fc1cc6dc6953877e6789aa35c49f9"} Sep 30 13:31:51 crc kubenswrapper[4672]: E0930 13:31:51.146060 4672 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podde7eb386_d6a3_4cf0_9038_7cdafd4ada30.slice/crio-d5d11fd8b3eb60da8024c50a00917f89c56fc1cc6dc6953877e6789aa35c49f9.scope\": RecentStats: unable to find data in memory cache]" Sep 30 13:31:52 crc kubenswrapper[4672]: I0930 13:31:52.032982 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-x7dcm" event={"ID":"4a72dc34-6098-4bff-83d2-fb4a8fbdeda2","Type":"ContainerStarted","Data":"620027637a771f2ac3f53c12efd5cfa7dd690f3302da83aa370734037f2d6545"} Sep 30 13:31:52 crc 
kubenswrapper[4672]: I0930 13:31:52.037779 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xjrwz" event={"ID":"de7eb386-d6a3-4cf0-9038-7cdafd4ada30","Type":"ContainerStarted","Data":"317c2cd3b35b99c750adad78c129c5edde48efcdb3c802ede0f29f68da14e0bc"} Sep 30 13:31:52 crc kubenswrapper[4672]: I0930 13:31:52.074993 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-x7dcm" podStartSLOduration=3.507085815 podStartE2EDuration="7.07497574s" podCreationTimestamp="2025-09-30 13:31:45 +0000 UTC" firstStartedPulling="2025-09-30 13:31:47.953828836 +0000 UTC m=+4199.223066482" lastFinishedPulling="2025-09-30 13:31:51.521718771 +0000 UTC m=+4202.790956407" observedRunningTime="2025-09-30 13:31:52.062705671 +0000 UTC m=+4203.331943327" watchObservedRunningTime="2025-09-30 13:31:52.07497574 +0000 UTC m=+4203.344213386" Sep 30 13:31:53 crc kubenswrapper[4672]: I0930 13:31:53.049206 4672 generic.go:334] "Generic (PLEG): container finished" podID="de7eb386-d6a3-4cf0-9038-7cdafd4ada30" containerID="317c2cd3b35b99c750adad78c129c5edde48efcdb3c802ede0f29f68da14e0bc" exitCode=0 Sep 30 13:31:53 crc kubenswrapper[4672]: I0930 13:31:53.050563 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xjrwz" event={"ID":"de7eb386-d6a3-4cf0-9038-7cdafd4ada30","Type":"ContainerDied","Data":"317c2cd3b35b99c750adad78c129c5edde48efcdb3c802ede0f29f68da14e0bc"} Sep 30 13:31:54 crc kubenswrapper[4672]: I0930 13:31:54.059060 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xjrwz" event={"ID":"de7eb386-d6a3-4cf0-9038-7cdafd4ada30","Type":"ContainerStarted","Data":"d84f4449cf2ecb6358c3c05727700b46fecc1aad03485c68d162d8f2c2d449e8"} Sep 30 13:31:54 crc kubenswrapper[4672]: I0930 13:31:54.076905 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-xjrwz" podStartSLOduration=3.590632854 podStartE2EDuration="6.076888557s" podCreationTimestamp="2025-09-30 13:31:48 +0000 UTC" firstStartedPulling="2025-09-30 13:31:51.022999513 +0000 UTC m=+4202.292237189" lastFinishedPulling="2025-09-30 13:31:53.509255236 +0000 UTC m=+4204.778492892" observedRunningTime="2025-09-30 13:31:54.072796314 +0000 UTC m=+4205.342033980" watchObservedRunningTime="2025-09-30 13:31:54.076888557 +0000 UTC m=+4205.346126203" Sep 30 13:31:56 crc kubenswrapper[4672]: I0930 13:31:56.543045 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-x7dcm" Sep 30 13:31:56 crc kubenswrapper[4672]: I0930 13:31:56.543398 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-x7dcm" Sep 30 13:31:56 crc kubenswrapper[4672]: I0930 13:31:56.594724 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-x7dcm" Sep 30 13:31:57 crc kubenswrapper[4672]: I0930 13:31:57.476661 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-x7dcm" Sep 30 13:31:57 crc kubenswrapper[4672]: I0930 13:31:57.900161 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-x7dcm"] Sep 30 13:31:58 crc kubenswrapper[4672]: I0930 13:31:58.633562 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
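
[Annotation] The two pod_startup_latency_tracker entries above are internally consistent: podStartE2EDuration is watchObservedRunningTime minus podCreationTimestamp, and podStartSLOduration is that minus the image-pull window (firstStartedPulling to lastFinishedPulling). Reproducing the certified-operators-xjrwz numbers in Go; the last couple of digits of the SLO value differ because the tracker appears to subtract using the monotonic m= offsets rather than the wall-clock values:

    package main

    import (
        "fmt"
        "time"
    )

    const layout = "2006-01-02 15:04:05.999999999 -0700 MST"

    func mustParse(s string) time.Time {
        t, err := time.Parse(layout, s)
        if err != nil {
            panic(err)
        }
        return t
    }

    func main() {
        // Values copied from the certified-operators-xjrwz entry above.
        created := mustParse("2025-09-30 13:31:48 +0000 UTC")
        running := mustParse("2025-09-30 13:31:54.076888557 +0000 UTC") // watchObservedRunningTime
        pullStart := mustParse("2025-09-30 13:31:51.022999513 +0000 UTC")
        pullEnd := mustParse("2025-09-30 13:31:53.509255236 +0000 UTC")

        e2e := running.Sub(created)
        slo := e2e - pullEnd.Sub(pullStart)
        fmt.Println("podStartE2EDuration:", e2e) // 6.076888557s, matching the log
        fmt.Println("podStartSLOduration:", slo) // ~3.59063283s vs logged 3.590632854
    }
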
pod="openshift-marketplace/certified-operators-xjrwz" Sep 30 13:31:58 crc kubenswrapper[4672]: I0930 13:31:58.633920 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-xjrwz" Sep 30 13:31:58 crc kubenswrapper[4672]: I0930 13:31:58.694320 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-xjrwz" Sep 30 13:31:59 crc kubenswrapper[4672]: I0930 13:31:59.105433 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-x7dcm" podUID="4a72dc34-6098-4bff-83d2-fb4a8fbdeda2" containerName="registry-server" containerID="cri-o://620027637a771f2ac3f53c12efd5cfa7dd690f3302da83aa370734037f2d6545" gracePeriod=2 Sep 30 13:31:59 crc kubenswrapper[4672]: I0930 13:31:59.381002 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-xjrwz" Sep 30 13:31:59 crc kubenswrapper[4672]: I0930 13:31:59.707561 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-x7dcm" Sep 30 13:31:59 crc kubenswrapper[4672]: I0930 13:31:59.805828 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a72dc34-6098-4bff-83d2-fb4a8fbdeda2-utilities\") pod \"4a72dc34-6098-4bff-83d2-fb4a8fbdeda2\" (UID: \"4a72dc34-6098-4bff-83d2-fb4a8fbdeda2\") " Sep 30 13:31:59 crc kubenswrapper[4672]: I0930 13:31:59.806023 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pmkc8\" (UniqueName: \"kubernetes.io/projected/4a72dc34-6098-4bff-83d2-fb4a8fbdeda2-kube-api-access-pmkc8\") pod \"4a72dc34-6098-4bff-83d2-fb4a8fbdeda2\" (UID: \"4a72dc34-6098-4bff-83d2-fb4a8fbdeda2\") " Sep 30 13:31:59 crc kubenswrapper[4672]: I0930 13:31:59.806068 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a72dc34-6098-4bff-83d2-fb4a8fbdeda2-catalog-content\") pod \"4a72dc34-6098-4bff-83d2-fb4a8fbdeda2\" (UID: \"4a72dc34-6098-4bff-83d2-fb4a8fbdeda2\") " Sep 30 13:31:59 crc kubenswrapper[4672]: I0930 13:31:59.807409 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4a72dc34-6098-4bff-83d2-fb4a8fbdeda2-utilities" (OuterVolumeSpecName: "utilities") pod "4a72dc34-6098-4bff-83d2-fb4a8fbdeda2" (UID: "4a72dc34-6098-4bff-83d2-fb4a8fbdeda2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 13:31:59 crc kubenswrapper[4672]: I0930 13:31:59.818120 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a72dc34-6098-4bff-83d2-fb4a8fbdeda2-kube-api-access-pmkc8" (OuterVolumeSpecName: "kube-api-access-pmkc8") pod "4a72dc34-6098-4bff-83d2-fb4a8fbdeda2" (UID: "4a72dc34-6098-4bff-83d2-fb4a8fbdeda2"). InnerVolumeSpecName "kube-api-access-pmkc8". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:31:59 crc kubenswrapper[4672]: I0930 13:31:59.898417 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4a72dc34-6098-4bff-83d2-fb4a8fbdeda2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4a72dc34-6098-4bff-83d2-fb4a8fbdeda2" (UID: "4a72dc34-6098-4bff-83d2-fb4a8fbdeda2"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 13:31:59 crc kubenswrapper[4672]: I0930 13:31:59.908996 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pmkc8\" (UniqueName: \"kubernetes.io/projected/4a72dc34-6098-4bff-83d2-fb4a8fbdeda2-kube-api-access-pmkc8\") on node \"crc\" DevicePath \"\"" Sep 30 13:31:59 crc kubenswrapper[4672]: I0930 13:31:59.909032 4672 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a72dc34-6098-4bff-83d2-fb4a8fbdeda2-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 13:31:59 crc kubenswrapper[4672]: I0930 13:31:59.909044 4672 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a72dc34-6098-4bff-83d2-fb4a8fbdeda2-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 13:32:00 crc kubenswrapper[4672]: I0930 13:32:00.115153 4672 generic.go:334] "Generic (PLEG): container finished" podID="4a72dc34-6098-4bff-83d2-fb4a8fbdeda2" containerID="620027637a771f2ac3f53c12efd5cfa7dd690f3302da83aa370734037f2d6545" exitCode=0 Sep 30 13:32:00 crc kubenswrapper[4672]: I0930 13:32:00.115220 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-x7dcm" event={"ID":"4a72dc34-6098-4bff-83d2-fb4a8fbdeda2","Type":"ContainerDied","Data":"620027637a771f2ac3f53c12efd5cfa7dd690f3302da83aa370734037f2d6545"} Sep 30 13:32:00 crc kubenswrapper[4672]: I0930 13:32:00.115301 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-x7dcm" event={"ID":"4a72dc34-6098-4bff-83d2-fb4a8fbdeda2","Type":"ContainerDied","Data":"f8be84fd34e421d801dd1b1c68aa6abd542ab4b62e8ca6d6fb2620682e3b7a9e"} Sep 30 13:32:00 crc kubenswrapper[4672]: I0930 13:32:00.115327 4672 scope.go:117] "RemoveContainer" containerID="620027637a771f2ac3f53c12efd5cfa7dd690f3302da83aa370734037f2d6545" Sep 30 13:32:00 crc kubenswrapper[4672]: I0930 13:32:00.116456 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-x7dcm" Sep 30 13:32:00 crc kubenswrapper[4672]: I0930 13:32:00.140072 4672 scope.go:117] "RemoveContainer" containerID="3a2857b51f0349e9f214781c321adeb7ee396831f9d9c59f3879a9c40f0f338b" Sep 30 13:32:00 crc kubenswrapper[4672]: I0930 13:32:00.150660 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-x7dcm"] Sep 30 13:32:00 crc kubenswrapper[4672]: I0930 13:32:00.159994 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-x7dcm"] Sep 30 13:32:00 crc kubenswrapper[4672]: I0930 13:32:00.175937 4672 scope.go:117] "RemoveContainer" containerID="a2e010afa4f26a3cb4c96c8f2a3a2aa3aff65af576e14acde8c621796729947f" Sep 30 13:32:00 crc kubenswrapper[4672]: I0930 13:32:00.216040 4672 scope.go:117] "RemoveContainer" containerID="620027637a771f2ac3f53c12efd5cfa7dd690f3302da83aa370734037f2d6545" Sep 30 13:32:00 crc kubenswrapper[4672]: E0930 13:32:00.216830 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"620027637a771f2ac3f53c12efd5cfa7dd690f3302da83aa370734037f2d6545\": container with ID starting with 620027637a771f2ac3f53c12efd5cfa7dd690f3302da83aa370734037f2d6545 not found: ID does not exist" containerID="620027637a771f2ac3f53c12efd5cfa7dd690f3302da83aa370734037f2d6545" Sep 30 13:32:00 crc kubenswrapper[4672]: I0930 13:32:00.216870 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"620027637a771f2ac3f53c12efd5cfa7dd690f3302da83aa370734037f2d6545"} err="failed to get container status \"620027637a771f2ac3f53c12efd5cfa7dd690f3302da83aa370734037f2d6545\": rpc error: code = NotFound desc = could not find container \"620027637a771f2ac3f53c12efd5cfa7dd690f3302da83aa370734037f2d6545\": container with ID starting with 620027637a771f2ac3f53c12efd5cfa7dd690f3302da83aa370734037f2d6545 not found: ID does not exist" Sep 30 13:32:00 crc kubenswrapper[4672]: I0930 13:32:00.216899 4672 scope.go:117] "RemoveContainer" containerID="3a2857b51f0349e9f214781c321adeb7ee396831f9d9c59f3879a9c40f0f338b" Sep 30 13:32:00 crc kubenswrapper[4672]: E0930 13:32:00.217302 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3a2857b51f0349e9f214781c321adeb7ee396831f9d9c59f3879a9c40f0f338b\": container with ID starting with 3a2857b51f0349e9f214781c321adeb7ee396831f9d9c59f3879a9c40f0f338b not found: ID does not exist" containerID="3a2857b51f0349e9f214781c321adeb7ee396831f9d9c59f3879a9c40f0f338b" Sep 30 13:32:00 crc kubenswrapper[4672]: I0930 13:32:00.217319 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a2857b51f0349e9f214781c321adeb7ee396831f9d9c59f3879a9c40f0f338b"} err="failed to get container status \"3a2857b51f0349e9f214781c321adeb7ee396831f9d9c59f3879a9c40f0f338b\": rpc error: code = NotFound desc = could not find container \"3a2857b51f0349e9f214781c321adeb7ee396831f9d9c59f3879a9c40f0f338b\": container with ID starting with 3a2857b51f0349e9f214781c321adeb7ee396831f9d9c59f3879a9c40f0f338b not found: ID does not exist" Sep 30 13:32:00 crc kubenswrapper[4672]: I0930 13:32:00.217330 4672 scope.go:117] "RemoveContainer" containerID="a2e010afa4f26a3cb4c96c8f2a3a2aa3aff65af576e14acde8c621796729947f" Sep 30 13:32:00 crc kubenswrapper[4672]: E0930 13:32:00.217550 4672 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"a2e010afa4f26a3cb4c96c8f2a3a2aa3aff65af576e14acde8c621796729947f\": container with ID starting with a2e010afa4f26a3cb4c96c8f2a3a2aa3aff65af576e14acde8c621796729947f not found: ID does not exist" containerID="a2e010afa4f26a3cb4c96c8f2a3a2aa3aff65af576e14acde8c621796729947f" Sep 30 13:32:00 crc kubenswrapper[4672]: I0930 13:32:00.217568 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a2e010afa4f26a3cb4c96c8f2a3a2aa3aff65af576e14acde8c621796729947f"} err="failed to get container status \"a2e010afa4f26a3cb4c96c8f2a3a2aa3aff65af576e14acde8c621796729947f\": rpc error: code = NotFound desc = could not find container \"a2e010afa4f26a3cb4c96c8f2a3a2aa3aff65af576e14acde8c621796729947f\": container with ID starting with a2e010afa4f26a3cb4c96c8f2a3a2aa3aff65af576e14acde8c621796729947f not found: ID does not exist" Sep 30 13:32:00 crc kubenswrapper[4672]: I0930 13:32:00.705618 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-xjrwz"] Sep 30 13:32:01 crc kubenswrapper[4672]: E0930 13:32:01.418388 4672 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podde7eb386_d6a3_4cf0_9038_7cdafd4ada30.slice/crio-d5d11fd8b3eb60da8024c50a00917f89c56fc1cc6dc6953877e6789aa35c49f9.scope\": RecentStats: unable to find data in memory cache]" Sep 30 13:32:01 crc kubenswrapper[4672]: I0930 13:32:01.430582 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4a72dc34-6098-4bff-83d2-fb4a8fbdeda2" path="/var/lib/kubelet/pods/4a72dc34-6098-4bff-83d2-fb4a8fbdeda2/volumes" Sep 30 13:32:02 crc kubenswrapper[4672]: I0930 13:32:02.138815 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-xjrwz" podUID="de7eb386-d6a3-4cf0-9038-7cdafd4ada30" containerName="registry-server" containerID="cri-o://d84f4449cf2ecb6358c3c05727700b46fecc1aad03485c68d162d8f2c2d449e8" gracePeriod=2 Sep 30 13:32:02 crc kubenswrapper[4672]: I0930 13:32:02.640913 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-xjrwz" Sep 30 13:32:02 crc kubenswrapper[4672]: I0930 13:32:02.665052 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de7eb386-d6a3-4cf0-9038-7cdafd4ada30-catalog-content\") pod \"de7eb386-d6a3-4cf0-9038-7cdafd4ada30\" (UID: \"de7eb386-d6a3-4cf0-9038-7cdafd4ada30\") " Sep 30 13:32:02 crc kubenswrapper[4672]: I0930 13:32:02.665325 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4m4x5\" (UniqueName: \"kubernetes.io/projected/de7eb386-d6a3-4cf0-9038-7cdafd4ada30-kube-api-access-4m4x5\") pod \"de7eb386-d6a3-4cf0-9038-7cdafd4ada30\" (UID: \"de7eb386-d6a3-4cf0-9038-7cdafd4ada30\") " Sep 30 13:32:02 crc kubenswrapper[4672]: I0930 13:32:02.665480 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de7eb386-d6a3-4cf0-9038-7cdafd4ada30-utilities\") pod \"de7eb386-d6a3-4cf0-9038-7cdafd4ada30\" (UID: \"de7eb386-d6a3-4cf0-9038-7cdafd4ada30\") " Sep 30 13:32:02 crc kubenswrapper[4672]: I0930 13:32:02.666190 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/de7eb386-d6a3-4cf0-9038-7cdafd4ada30-utilities" (OuterVolumeSpecName: "utilities") pod "de7eb386-d6a3-4cf0-9038-7cdafd4ada30" (UID: "de7eb386-d6a3-4cf0-9038-7cdafd4ada30"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 13:32:02 crc kubenswrapper[4672]: I0930 13:32:02.676547 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de7eb386-d6a3-4cf0-9038-7cdafd4ada30-kube-api-access-4m4x5" (OuterVolumeSpecName: "kube-api-access-4m4x5") pod "de7eb386-d6a3-4cf0-9038-7cdafd4ada30" (UID: "de7eb386-d6a3-4cf0-9038-7cdafd4ada30"). InnerVolumeSpecName "kube-api-access-4m4x5". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:32:02 crc kubenswrapper[4672]: I0930 13:32:02.715737 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/de7eb386-d6a3-4cf0-9038-7cdafd4ada30-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "de7eb386-d6a3-4cf0-9038-7cdafd4ada30" (UID: "de7eb386-d6a3-4cf0-9038-7cdafd4ada30"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 13:32:02 crc kubenswrapper[4672]: I0930 13:32:02.768100 4672 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de7eb386-d6a3-4cf0-9038-7cdafd4ada30-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 13:32:02 crc kubenswrapper[4672]: I0930 13:32:02.768139 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4m4x5\" (UniqueName: \"kubernetes.io/projected/de7eb386-d6a3-4cf0-9038-7cdafd4ada30-kube-api-access-4m4x5\") on node \"crc\" DevicePath \"\"" Sep 30 13:32:02 crc kubenswrapper[4672]: I0930 13:32:02.768152 4672 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de7eb386-d6a3-4cf0-9038-7cdafd4ada30-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 13:32:03 crc kubenswrapper[4672]: I0930 13:32:03.152934 4672 generic.go:334] "Generic (PLEG): container finished" podID="de7eb386-d6a3-4cf0-9038-7cdafd4ada30" containerID="d84f4449cf2ecb6358c3c05727700b46fecc1aad03485c68d162d8f2c2d449e8" exitCode=0 Sep 30 13:32:03 crc kubenswrapper[4672]: I0930 13:32:03.152992 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xjrwz" event={"ID":"de7eb386-d6a3-4cf0-9038-7cdafd4ada30","Type":"ContainerDied","Data":"d84f4449cf2ecb6358c3c05727700b46fecc1aad03485c68d162d8f2c2d449e8"} Sep 30 13:32:03 crc kubenswrapper[4672]: I0930 13:32:03.153030 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xjrwz" event={"ID":"de7eb386-d6a3-4cf0-9038-7cdafd4ada30","Type":"ContainerDied","Data":"13784ea6745d6eee42110737bb1ed54350cade301a433e584ebf5d655bff1ea6"} Sep 30 13:32:03 crc kubenswrapper[4672]: I0930 13:32:03.153057 4672 scope.go:117] "RemoveContainer" containerID="d84f4449cf2ecb6358c3c05727700b46fecc1aad03485c68d162d8f2c2d449e8" Sep 30 13:32:03 crc kubenswrapper[4672]: I0930 13:32:03.153212 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-xjrwz" Sep 30 13:32:03 crc kubenswrapper[4672]: I0930 13:32:03.178974 4672 scope.go:117] "RemoveContainer" containerID="317c2cd3b35b99c750adad78c129c5edde48efcdb3c802ede0f29f68da14e0bc" Sep 30 13:32:03 crc kubenswrapper[4672]: I0930 13:32:03.193317 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-xjrwz"] Sep 30 13:32:03 crc kubenswrapper[4672]: I0930 13:32:03.203088 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-xjrwz"] Sep 30 13:32:03 crc kubenswrapper[4672]: I0930 13:32:03.222277 4672 scope.go:117] "RemoveContainer" containerID="d5d11fd8b3eb60da8024c50a00917f89c56fc1cc6dc6953877e6789aa35c49f9" Sep 30 13:32:03 crc kubenswrapper[4672]: I0930 13:32:03.278160 4672 scope.go:117] "RemoveContainer" containerID="d84f4449cf2ecb6358c3c05727700b46fecc1aad03485c68d162d8f2c2d449e8" Sep 30 13:32:03 crc kubenswrapper[4672]: E0930 13:32:03.279460 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d84f4449cf2ecb6358c3c05727700b46fecc1aad03485c68d162d8f2c2d449e8\": container with ID starting with d84f4449cf2ecb6358c3c05727700b46fecc1aad03485c68d162d8f2c2d449e8 not found: ID does not exist" containerID="d84f4449cf2ecb6358c3c05727700b46fecc1aad03485c68d162d8f2c2d449e8" Sep 30 13:32:03 crc kubenswrapper[4672]: I0930 13:32:03.279503 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d84f4449cf2ecb6358c3c05727700b46fecc1aad03485c68d162d8f2c2d449e8"} err="failed to get container status \"d84f4449cf2ecb6358c3c05727700b46fecc1aad03485c68d162d8f2c2d449e8\": rpc error: code = NotFound desc = could not find container \"d84f4449cf2ecb6358c3c05727700b46fecc1aad03485c68d162d8f2c2d449e8\": container with ID starting with d84f4449cf2ecb6358c3c05727700b46fecc1aad03485c68d162d8f2c2d449e8 not found: ID does not exist" Sep 30 13:32:03 crc kubenswrapper[4672]: I0930 13:32:03.279525 4672 scope.go:117] "RemoveContainer" containerID="317c2cd3b35b99c750adad78c129c5edde48efcdb3c802ede0f29f68da14e0bc" Sep 30 13:32:03 crc kubenswrapper[4672]: E0930 13:32:03.279818 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"317c2cd3b35b99c750adad78c129c5edde48efcdb3c802ede0f29f68da14e0bc\": container with ID starting with 317c2cd3b35b99c750adad78c129c5edde48efcdb3c802ede0f29f68da14e0bc not found: ID does not exist" containerID="317c2cd3b35b99c750adad78c129c5edde48efcdb3c802ede0f29f68da14e0bc" Sep 30 13:32:03 crc kubenswrapper[4672]: I0930 13:32:03.279844 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"317c2cd3b35b99c750adad78c129c5edde48efcdb3c802ede0f29f68da14e0bc"} err="failed to get container status \"317c2cd3b35b99c750adad78c129c5edde48efcdb3c802ede0f29f68da14e0bc\": rpc error: code = NotFound desc = could not find container \"317c2cd3b35b99c750adad78c129c5edde48efcdb3c802ede0f29f68da14e0bc\": container with ID starting with 317c2cd3b35b99c750adad78c129c5edde48efcdb3c802ede0f29f68da14e0bc not found: ID does not exist" Sep 30 13:32:03 crc kubenswrapper[4672]: I0930 13:32:03.279859 4672 scope.go:117] "RemoveContainer" containerID="d5d11fd8b3eb60da8024c50a00917f89c56fc1cc6dc6953877e6789aa35c49f9" Sep 30 13:32:03 crc kubenswrapper[4672]: E0930 13:32:03.280110 4672 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"d5d11fd8b3eb60da8024c50a00917f89c56fc1cc6dc6953877e6789aa35c49f9\": container with ID starting with d5d11fd8b3eb60da8024c50a00917f89c56fc1cc6dc6953877e6789aa35c49f9 not found: ID does not exist" containerID="d5d11fd8b3eb60da8024c50a00917f89c56fc1cc6dc6953877e6789aa35c49f9" Sep 30 13:32:03 crc kubenswrapper[4672]: I0930 13:32:03.280153 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d5d11fd8b3eb60da8024c50a00917f89c56fc1cc6dc6953877e6789aa35c49f9"} err="failed to get container status \"d5d11fd8b3eb60da8024c50a00917f89c56fc1cc6dc6953877e6789aa35c49f9\": rpc error: code = NotFound desc = could not find container \"d5d11fd8b3eb60da8024c50a00917f89c56fc1cc6dc6953877e6789aa35c49f9\": container with ID starting with d5d11fd8b3eb60da8024c50a00917f89c56fc1cc6dc6953877e6789aa35c49f9 not found: ID does not exist" Sep 30 13:32:03 crc kubenswrapper[4672]: I0930 13:32:03.430240 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de7eb386-d6a3-4cf0-9038-7cdafd4ada30" path="/var/lib/kubelet/pods/de7eb386-d6a3-4cf0-9038-7cdafd4ada30/volumes" Sep 30 13:32:11 crc kubenswrapper[4672]: E0930 13:32:11.726380 4672 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podde7eb386_d6a3_4cf0_9038_7cdafd4ada30.slice/crio-d5d11fd8b3eb60da8024c50a00917f89c56fc1cc6dc6953877e6789aa35c49f9.scope\": RecentStats: unable to find data in memory cache]" Sep 30 13:32:21 crc kubenswrapper[4672]: E0930 13:32:21.995753 4672 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podde7eb386_d6a3_4cf0_9038_7cdafd4ada30.slice/crio-d5d11fd8b3eb60da8024c50a00917f89c56fc1cc6dc6953877e6789aa35c49f9.scope\": RecentStats: unable to find data in memory cache]" Sep 30 13:32:32 crc kubenswrapper[4672]: E0930 13:32:32.268643 4672 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podde7eb386_d6a3_4cf0_9038_7cdafd4ada30.slice/crio-d5d11fd8b3eb60da8024c50a00917f89c56fc1cc6dc6953877e6789aa35c49f9.scope\": RecentStats: unable to find data in memory cache]" Sep 30 13:32:42 crc kubenswrapper[4672]: E0930 13:32:42.531583 4672 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podde7eb386_d6a3_4cf0_9038_7cdafd4ada30.slice/crio-d5d11fd8b3eb60da8024c50a00917f89c56fc1cc6dc6953877e6789aa35c49f9.scope\": RecentStats: unable to find data in memory cache]" Sep 30 13:32:54 crc kubenswrapper[4672]: I0930 13:32:54.739822 4672 patch_prober.go:28] interesting pod/machine-config-daemon-dpqrd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 13:32:54 crc kubenswrapper[4672]: I0930 13:32:54.740682 4672 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 
Sep 30 13:32:58 crc kubenswrapper[4672]: E0930 13:32:58.845735 4672 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.241:51010->38.102.83.241:40853: write tcp 38.102.83.241:51010->38.102.83.241:40853: write: broken pipe
Sep 30 13:33:02 crc kubenswrapper[4672]: E0930 13:33:02.249030 4672 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.241:51108->38.102.83.241:40853: write tcp 38.102.83.241:51108->38.102.83.241:40853: write: connection reset by peer
Sep 30 13:33:24 crc kubenswrapper[4672]: I0930 13:33:24.739760 4672 patch_prober.go:28] interesting pod/machine-config-daemon-dpqrd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Sep 30 13:33:24 crc kubenswrapper[4672]: I0930 13:33:24.740357 4672 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Sep 30 13:33:54 crc kubenswrapper[4672]: I0930 13:33:54.739921 4672 patch_prober.go:28] interesting pod/machine-config-daemon-dpqrd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Sep 30 13:33:54 crc kubenswrapper[4672]: I0930 13:33:54.740350 4672 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Sep 30 13:33:54 crc kubenswrapper[4672]: I0930 13:33:54.740388 4672 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd"
Sep 30 13:33:54 crc kubenswrapper[4672]: I0930 13:33:54.741058 4672 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"caf17acdef1b279f2d0357a079f11b366c5613a1cdad739f51759198b31aa0ff"} pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Sep 30 13:33:54 crc kubenswrapper[4672]: I0930 13:33:54.741129 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" containerName="machine-config-daemon" containerID="cri-o://caf17acdef1b279f2d0357a079f11b366c5613a1cdad739f51759198b31aa0ff" gracePeriod=600
Sep 30 13:33:55 crc kubenswrapper[4672]: I0930 13:33:55.294670 4672 generic.go:334] "Generic (PLEG): container finished" podID="95794952-d817-48f2-8956-f7a310f8d1d9" containerID="caf17acdef1b279f2d0357a079f11b366c5613a1cdad739f51759198b31aa0ff" exitCode=0
Sep 30 13:33:55 crc kubenswrapper[4672]: I0930 13:33:55.295040 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" event={"ID":"95794952-d817-48f2-8956-f7a310f8d1d9","Type":"ContainerDied","Data":"caf17acdef1b279f2d0357a079f11b366c5613a1cdad739f51759198b31aa0ff"}
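
[Annotation] The liveness failures above are plain HTTP GETs against the machine-config-daemon health endpoint that are refused because nothing is listening; once the failure threshold is reached, kubelet kills and restarts the container (the "failed liveness probe, will be restarted" entry). The probe can be reproduced by hand; the endpoint and path are taken from the log, the 1s timeout is an assumption:

    package main

    import (
        "fmt"
        "net/http"
        "time"
    )

    func main() {
        client := &http.Client{Timeout: time.Second}
        resp, err := client.Get("http://127.0.0.1:8798/health")
        if err != nil {
            fmt.Println("probe failure:", err) // e.g. "connect: connection refused"
            return
        }
        defer resp.Body.Close()
        fmt.Println("probe result:", resp.Status)
    }
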
event={"ID":"95794952-d817-48f2-8956-f7a310f8d1d9","Type":"ContainerDied","Data":"caf17acdef1b279f2d0357a079f11b366c5613a1cdad739f51759198b31aa0ff"} Sep 30 13:33:55 crc kubenswrapper[4672]: I0930 13:33:55.295072 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" event={"ID":"95794952-d817-48f2-8956-f7a310f8d1d9","Type":"ContainerStarted","Data":"823b2a4e038cbf7ef9e9b3b9d0fd9de4f8d57f966df9a5a9a851e18516deb7c5"} Sep 30 13:33:55 crc kubenswrapper[4672]: I0930 13:33:55.295092 4672 scope.go:117] "RemoveContainer" containerID="e31569925dbee8a158ee7d2ef1cfb2bb848061058b8b7657cbbf2455b1d08de2" Sep 30 13:36:24 crc kubenswrapper[4672]: I0930 13:36:24.740121 4672 patch_prober.go:28] interesting pod/machine-config-daemon-dpqrd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 13:36:24 crc kubenswrapper[4672]: I0930 13:36:24.740748 4672 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 13:36:54 crc kubenswrapper[4672]: I0930 13:36:54.739518 4672 patch_prober.go:28] interesting pod/machine-config-daemon-dpqrd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 13:36:54 crc kubenswrapper[4672]: I0930 13:36:54.740099 4672 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 13:37:23 crc kubenswrapper[4672]: I0930 13:37:23.659869 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-fdqnv"] Sep 30 13:37:23 crc kubenswrapper[4672]: E0930 13:37:23.661168 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de7eb386-d6a3-4cf0-9038-7cdafd4ada30" containerName="extract-content" Sep 30 13:37:23 crc kubenswrapper[4672]: I0930 13:37:23.661188 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="de7eb386-d6a3-4cf0-9038-7cdafd4ada30" containerName="extract-content" Sep 30 13:37:23 crc kubenswrapper[4672]: E0930 13:37:23.661213 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de7eb386-d6a3-4cf0-9038-7cdafd4ada30" containerName="registry-server" Sep 30 13:37:23 crc kubenswrapper[4672]: I0930 13:37:23.661221 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="de7eb386-d6a3-4cf0-9038-7cdafd4ada30" containerName="registry-server" Sep 30 13:37:23 crc kubenswrapper[4672]: E0930 13:37:23.661237 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a72dc34-6098-4bff-83d2-fb4a8fbdeda2" containerName="extract-content" Sep 30 13:37:23 crc kubenswrapper[4672]: I0930 13:37:23.661246 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a72dc34-6098-4bff-83d2-fb4a8fbdeda2" containerName="extract-content" Sep 30 13:37:23 crc 
kubenswrapper[4672]: E0930 13:37:23.661277 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de7eb386-d6a3-4cf0-9038-7cdafd4ada30" containerName="extract-utilities" Sep 30 13:37:23 crc kubenswrapper[4672]: I0930 13:37:23.661286 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="de7eb386-d6a3-4cf0-9038-7cdafd4ada30" containerName="extract-utilities" Sep 30 13:37:23 crc kubenswrapper[4672]: E0930 13:37:23.661309 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a72dc34-6098-4bff-83d2-fb4a8fbdeda2" containerName="registry-server" Sep 30 13:37:23 crc kubenswrapper[4672]: I0930 13:37:23.661316 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a72dc34-6098-4bff-83d2-fb4a8fbdeda2" containerName="registry-server" Sep 30 13:37:23 crc kubenswrapper[4672]: E0930 13:37:23.661566 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a72dc34-6098-4bff-83d2-fb4a8fbdeda2" containerName="extract-utilities" Sep 30 13:37:23 crc kubenswrapper[4672]: I0930 13:37:23.661575 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a72dc34-6098-4bff-83d2-fb4a8fbdeda2" containerName="extract-utilities" Sep 30 13:37:23 crc kubenswrapper[4672]: I0930 13:37:23.661790 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a72dc34-6098-4bff-83d2-fb4a8fbdeda2" containerName="registry-server" Sep 30 13:37:23 crc kubenswrapper[4672]: I0930 13:37:23.661809 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="de7eb386-d6a3-4cf0-9038-7cdafd4ada30" containerName="registry-server" Sep 30 13:37:23 crc kubenswrapper[4672]: I0930 13:37:23.663397 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fdqnv" Sep 30 13:37:23 crc kubenswrapper[4672]: I0930 13:37:23.688996 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fdqnv"] Sep 30 13:37:23 crc kubenswrapper[4672]: I0930 13:37:23.736944 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jdxmh\" (UniqueName: \"kubernetes.io/projected/d11b3b8f-291c-46af-9bb0-0644630d8c10-kube-api-access-jdxmh\") pod \"community-operators-fdqnv\" (UID: \"d11b3b8f-291c-46af-9bb0-0644630d8c10\") " pod="openshift-marketplace/community-operators-fdqnv" Sep 30 13:37:23 crc kubenswrapper[4672]: I0930 13:37:23.736994 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d11b3b8f-291c-46af-9bb0-0644630d8c10-utilities\") pod \"community-operators-fdqnv\" (UID: \"d11b3b8f-291c-46af-9bb0-0644630d8c10\") " pod="openshift-marketplace/community-operators-fdqnv" Sep 30 13:37:23 crc kubenswrapper[4672]: I0930 13:37:23.737052 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d11b3b8f-291c-46af-9bb0-0644630d8c10-catalog-content\") pod \"community-operators-fdqnv\" (UID: \"d11b3b8f-291c-46af-9bb0-0644630d8c10\") " pod="openshift-marketplace/community-operators-fdqnv" Sep 30 13:37:23 crc kubenswrapper[4672]: I0930 13:37:23.839062 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d11b3b8f-291c-46af-9bb0-0644630d8c10-catalog-content\") pod \"community-operators-fdqnv\" (UID: \"d11b3b8f-291c-46af-9bb0-0644630d8c10\") " 
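
[Annotation] The burst of RemoveStaleState / "Deleted CPUSet assignment" lines fires when community-operators-fdqnv is admitted: the CPU and memory managers drop bookkeeping for containers of pods that no longer exist (here the two marketplace pods torn down earlier), and the cleanup is logged at error severity even though it is routine. A minimal sketch of that purge; plain maps, not kubelet's actual state types:

    package main

    import "fmt"

    // removeStaleState drops resource-manager entries whose pod is no longer
    // active, mirroring the cpu_manager/state_mem lines above.
    func removeStaleState(assignments map[string][]string, active map[string]bool) {
        for podUID, containers := range assignments {
            if active[podUID] {
                continue
            }
            for _, c := range containers {
                fmt.Printf("RemoveStaleState: removing container podUID=%q containerName=%q\n", podUID, c)
            }
            delete(assignments, podUID) // the "Deleted CPUSet assignment" step
        }
    }

    func main() {
        assignments := map[string][]string{
            "4a72dc34-6098-4bff-83d2-fb4a8fbdeda2": {"extract-utilities", "extract-content", "registry-server"},
        }
        // Only the newly admitted pod is active; the old catalog pod is stale.
        removeStaleState(assignments, map[string]bool{"d11b3b8f-291c-46af-9bb0-0644630d8c10": true})
    }
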
pod="openshift-marketplace/community-operators-fdqnv" Sep 30 13:37:23 crc kubenswrapper[4672]: I0930 13:37:23.839250 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jdxmh\" (UniqueName: \"kubernetes.io/projected/d11b3b8f-291c-46af-9bb0-0644630d8c10-kube-api-access-jdxmh\") pod \"community-operators-fdqnv\" (UID: \"d11b3b8f-291c-46af-9bb0-0644630d8c10\") " pod="openshift-marketplace/community-operators-fdqnv" Sep 30 13:37:23 crc kubenswrapper[4672]: I0930 13:37:23.839300 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d11b3b8f-291c-46af-9bb0-0644630d8c10-utilities\") pod \"community-operators-fdqnv\" (UID: \"d11b3b8f-291c-46af-9bb0-0644630d8c10\") " pod="openshift-marketplace/community-operators-fdqnv" Sep 30 13:37:23 crc kubenswrapper[4672]: I0930 13:37:23.839699 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d11b3b8f-291c-46af-9bb0-0644630d8c10-catalog-content\") pod \"community-operators-fdqnv\" (UID: \"d11b3b8f-291c-46af-9bb0-0644630d8c10\") " pod="openshift-marketplace/community-operators-fdqnv" Sep 30 13:37:23 crc kubenswrapper[4672]: I0930 13:37:23.839764 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d11b3b8f-291c-46af-9bb0-0644630d8c10-utilities\") pod \"community-operators-fdqnv\" (UID: \"d11b3b8f-291c-46af-9bb0-0644630d8c10\") " pod="openshift-marketplace/community-operators-fdqnv" Sep 30 13:37:23 crc kubenswrapper[4672]: I0930 13:37:23.857308 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jdxmh\" (UniqueName: \"kubernetes.io/projected/d11b3b8f-291c-46af-9bb0-0644630d8c10-kube-api-access-jdxmh\") pod \"community-operators-fdqnv\" (UID: \"d11b3b8f-291c-46af-9bb0-0644630d8c10\") " pod="openshift-marketplace/community-operators-fdqnv" Sep 30 13:37:24 crc kubenswrapper[4672]: I0930 13:37:24.012344 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-fdqnv" Sep 30 13:37:24 crc kubenswrapper[4672]: I0930 13:37:24.528726 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fdqnv"] Sep 30 13:37:24 crc kubenswrapper[4672]: I0930 13:37:24.740198 4672 patch_prober.go:28] interesting pod/machine-config-daemon-dpqrd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 13:37:24 crc kubenswrapper[4672]: I0930 13:37:24.740695 4672 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 13:37:24 crc kubenswrapper[4672]: I0930 13:37:24.740766 4672 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" Sep 30 13:37:24 crc kubenswrapper[4672]: I0930 13:37:24.741873 4672 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"823b2a4e038cbf7ef9e9b3b9d0fd9de4f8d57f966df9a5a9a851e18516deb7c5"} pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 30 13:37:24 crc kubenswrapper[4672]: I0930 13:37:24.741954 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" containerName="machine-config-daemon" containerID="cri-o://823b2a4e038cbf7ef9e9b3b9d0fd9de4f8d57f966df9a5a9a851e18516deb7c5" gracePeriod=600 Sep 30 13:37:24 crc kubenswrapper[4672]: E0930 13:37:24.874836 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dpqrd_openshift-machine-config-operator(95794952-d817-48f2-8956-f7a310f8d1d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" Sep 30 13:37:25 crc kubenswrapper[4672]: I0930 13:37:25.511639 4672 generic.go:334] "Generic (PLEG): container finished" podID="95794952-d817-48f2-8956-f7a310f8d1d9" containerID="823b2a4e038cbf7ef9e9b3b9d0fd9de4f8d57f966df9a5a9a851e18516deb7c5" exitCode=0 Sep 30 13:37:25 crc kubenswrapper[4672]: I0930 13:37:25.511721 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" event={"ID":"95794952-d817-48f2-8956-f7a310f8d1d9","Type":"ContainerDied","Data":"823b2a4e038cbf7ef9e9b3b9d0fd9de4f8d57f966df9a5a9a851e18516deb7c5"} Sep 30 13:37:25 crc kubenswrapper[4672]: I0930 13:37:25.512063 4672 scope.go:117] "RemoveContainer" containerID="caf17acdef1b279f2d0357a079f11b366c5613a1cdad739f51759198b31aa0ff" Sep 30 13:37:25 crc kubenswrapper[4672]: I0930 13:37:25.512827 4672 scope.go:117] "RemoveContainer" containerID="823b2a4e038cbf7ef9e9b3b9d0fd9de4f8d57f966df9a5a9a851e18516deb7c5" Sep 30 13:37:25 crc kubenswrapper[4672]: E0930 13:37:25.513169 4672 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dpqrd_openshift-machine-config-operator(95794952-d817-48f2-8956-f7a310f8d1d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" Sep 30 13:37:25 crc kubenswrapper[4672]: I0930 13:37:25.513976 4672 generic.go:334] "Generic (PLEG): container finished" podID="d11b3b8f-291c-46af-9bb0-0644630d8c10" containerID="91a8a37d82b72676d4e241bdf70fcf4c21278f1a8b7cb9bd715f06a4ad74c94a" exitCode=0 Sep 30 13:37:25 crc kubenswrapper[4672]: I0930 13:37:25.514010 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fdqnv" event={"ID":"d11b3b8f-291c-46af-9bb0-0644630d8c10","Type":"ContainerDied","Data":"91a8a37d82b72676d4e241bdf70fcf4c21278f1a8b7cb9bd715f06a4ad74c94a"} Sep 30 13:37:25 crc kubenswrapper[4672]: I0930 13:37:25.514035 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fdqnv" event={"ID":"d11b3b8f-291c-46af-9bb0-0644630d8c10","Type":"ContainerStarted","Data":"b3d4dc2db8ed12a7a2bdc6aeda4619fffa96dc41ff4d7b41ebf107297c582a23"} Sep 30 13:37:25 crc kubenswrapper[4672]: I0930 13:37:25.517700 4672 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Sep 30 13:37:26 crc kubenswrapper[4672]: I0930 13:37:26.534287 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fdqnv" event={"ID":"d11b3b8f-291c-46af-9bb0-0644630d8c10","Type":"ContainerStarted","Data":"2f7b2ad97d000dcd490ce3a5ca02c39fffbc80fee0690f899d889dcaf7712e08"} Sep 30 13:37:27 crc kubenswrapper[4672]: I0930 13:37:27.546428 4672 generic.go:334] "Generic (PLEG): container finished" podID="d11b3b8f-291c-46af-9bb0-0644630d8c10" containerID="2f7b2ad97d000dcd490ce3a5ca02c39fffbc80fee0690f899d889dcaf7712e08" exitCode=0 Sep 30 13:37:27 crc kubenswrapper[4672]: I0930 13:37:27.546476 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fdqnv" event={"ID":"d11b3b8f-291c-46af-9bb0-0644630d8c10","Type":"ContainerDied","Data":"2f7b2ad97d000dcd490ce3a5ca02c39fffbc80fee0690f899d889dcaf7712e08"} Sep 30 13:37:28 crc kubenswrapper[4672]: I0930 13:37:28.562854 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fdqnv" event={"ID":"d11b3b8f-291c-46af-9bb0-0644630d8c10","Type":"ContainerStarted","Data":"57a874573b7f74c05f575f744d11d95418d8439f6af6de50811ce4b4dce23852"} Sep 30 13:37:28 crc kubenswrapper[4672]: I0930 13:37:28.587139 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-fdqnv" podStartSLOduration=3.16279637 podStartE2EDuration="5.587113133s" podCreationTimestamp="2025-09-30 13:37:23 +0000 UTC" firstStartedPulling="2025-09-30 13:37:25.517413313 +0000 UTC m=+4536.786650959" lastFinishedPulling="2025-09-30 13:37:27.941730056 +0000 UTC m=+4539.210967722" observedRunningTime="2025-09-30 13:37:28.5829962 +0000 UTC m=+4539.852233886" watchObservedRunningTime="2025-09-30 13:37:28.587113133 +0000 UTC m=+4539.856350809" Sep 30 13:37:34 crc kubenswrapper[4672]: I0930 13:37:34.013410 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-fdqnv" Sep 30 
13:37:34 crc kubenswrapper[4672]: I0930 13:37:34.013909 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-fdqnv" Sep 30 13:37:34 crc kubenswrapper[4672]: I0930 13:37:34.198221 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-fdqnv" Sep 30 13:37:34 crc kubenswrapper[4672]: I0930 13:37:34.712909 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-fdqnv" Sep 30 13:37:38 crc kubenswrapper[4672]: I0930 13:37:38.068509 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-fdqnv"] Sep 30 13:37:38 crc kubenswrapper[4672]: I0930 13:37:38.069399 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-fdqnv" podUID="d11b3b8f-291c-46af-9bb0-0644630d8c10" containerName="registry-server" containerID="cri-o://57a874573b7f74c05f575f744d11d95418d8439f6af6de50811ce4b4dce23852" gracePeriod=2 Sep 30 13:37:38 crc kubenswrapper[4672]: I0930 13:37:38.572968 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fdqnv" Sep 30 13:37:38 crc kubenswrapper[4672]: I0930 13:37:38.679941 4672 generic.go:334] "Generic (PLEG): container finished" podID="d11b3b8f-291c-46af-9bb0-0644630d8c10" containerID="57a874573b7f74c05f575f744d11d95418d8439f6af6de50811ce4b4dce23852" exitCode=0 Sep 30 13:37:38 crc kubenswrapper[4672]: I0930 13:37:38.680013 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fdqnv" Sep 30 13:37:38 crc kubenswrapper[4672]: I0930 13:37:38.680036 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fdqnv" event={"ID":"d11b3b8f-291c-46af-9bb0-0644630d8c10","Type":"ContainerDied","Data":"57a874573b7f74c05f575f744d11d95418d8439f6af6de50811ce4b4dce23852"} Sep 30 13:37:38 crc kubenswrapper[4672]: I0930 13:37:38.681395 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fdqnv" event={"ID":"d11b3b8f-291c-46af-9bb0-0644630d8c10","Type":"ContainerDied","Data":"b3d4dc2db8ed12a7a2bdc6aeda4619fffa96dc41ff4d7b41ebf107297c582a23"} Sep 30 13:37:38 crc kubenswrapper[4672]: I0930 13:37:38.681479 4672 scope.go:117] "RemoveContainer" containerID="57a874573b7f74c05f575f744d11d95418d8439f6af6de50811ce4b4dce23852" Sep 30 13:37:38 crc kubenswrapper[4672]: I0930 13:37:38.691745 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jdxmh\" (UniqueName: \"kubernetes.io/projected/d11b3b8f-291c-46af-9bb0-0644630d8c10-kube-api-access-jdxmh\") pod \"d11b3b8f-291c-46af-9bb0-0644630d8c10\" (UID: \"d11b3b8f-291c-46af-9bb0-0644630d8c10\") " Sep 30 13:37:38 crc kubenswrapper[4672]: I0930 13:37:38.691837 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d11b3b8f-291c-46af-9bb0-0644630d8c10-utilities\") pod \"d11b3b8f-291c-46af-9bb0-0644630d8c10\" (UID: \"d11b3b8f-291c-46af-9bb0-0644630d8c10\") " Sep 30 13:37:38 crc kubenswrapper[4672]: I0930 13:37:38.692022 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d11b3b8f-291c-46af-9bb0-0644630d8c10-catalog-content\") 
pod \"d11b3b8f-291c-46af-9bb0-0644630d8c10\" (UID: \"d11b3b8f-291c-46af-9bb0-0644630d8c10\") " Sep 30 13:37:38 crc kubenswrapper[4672]: I0930 13:37:38.692633 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d11b3b8f-291c-46af-9bb0-0644630d8c10-utilities" (OuterVolumeSpecName: "utilities") pod "d11b3b8f-291c-46af-9bb0-0644630d8c10" (UID: "d11b3b8f-291c-46af-9bb0-0644630d8c10"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 13:37:38 crc kubenswrapper[4672]: I0930 13:37:38.694408 4672 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d11b3b8f-291c-46af-9bb0-0644630d8c10-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 13:37:38 crc kubenswrapper[4672]: I0930 13:37:38.698220 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d11b3b8f-291c-46af-9bb0-0644630d8c10-kube-api-access-jdxmh" (OuterVolumeSpecName: "kube-api-access-jdxmh") pod "d11b3b8f-291c-46af-9bb0-0644630d8c10" (UID: "d11b3b8f-291c-46af-9bb0-0644630d8c10"). InnerVolumeSpecName "kube-api-access-jdxmh". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:37:38 crc kubenswrapper[4672]: I0930 13:37:38.706935 4672 scope.go:117] "RemoveContainer" containerID="2f7b2ad97d000dcd490ce3a5ca02c39fffbc80fee0690f899d889dcaf7712e08" Sep 30 13:37:38 crc kubenswrapper[4672]: I0930 13:37:38.762555 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d11b3b8f-291c-46af-9bb0-0644630d8c10-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d11b3b8f-291c-46af-9bb0-0644630d8c10" (UID: "d11b3b8f-291c-46af-9bb0-0644630d8c10"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 13:37:38 crc kubenswrapper[4672]: I0930 13:37:38.772320 4672 scope.go:117] "RemoveContainer" containerID="91a8a37d82b72676d4e241bdf70fcf4c21278f1a8b7cb9bd715f06a4ad74c94a" Sep 30 13:37:38 crc kubenswrapper[4672]: I0930 13:37:38.796770 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jdxmh\" (UniqueName: \"kubernetes.io/projected/d11b3b8f-291c-46af-9bb0-0644630d8c10-kube-api-access-jdxmh\") on node \"crc\" DevicePath \"\"" Sep 30 13:37:38 crc kubenswrapper[4672]: I0930 13:37:38.796806 4672 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d11b3b8f-291c-46af-9bb0-0644630d8c10-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 13:37:38 crc kubenswrapper[4672]: I0930 13:37:38.811060 4672 scope.go:117] "RemoveContainer" containerID="57a874573b7f74c05f575f744d11d95418d8439f6af6de50811ce4b4dce23852" Sep 30 13:37:38 crc kubenswrapper[4672]: E0930 13:37:38.811532 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"57a874573b7f74c05f575f744d11d95418d8439f6af6de50811ce4b4dce23852\": container with ID starting with 57a874573b7f74c05f575f744d11d95418d8439f6af6de50811ce4b4dce23852 not found: ID does not exist" containerID="57a874573b7f74c05f575f744d11d95418d8439f6af6de50811ce4b4dce23852" Sep 30 13:37:38 crc kubenswrapper[4672]: I0930 13:37:38.811574 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"57a874573b7f74c05f575f744d11d95418d8439f6af6de50811ce4b4dce23852"} err="failed to get container status \"57a874573b7f74c05f575f744d11d95418d8439f6af6de50811ce4b4dce23852\": rpc error: code = NotFound desc = could not find container \"57a874573b7f74c05f575f744d11d95418d8439f6af6de50811ce4b4dce23852\": container with ID starting with 57a874573b7f74c05f575f744d11d95418d8439f6af6de50811ce4b4dce23852 not found: ID does not exist" Sep 30 13:37:38 crc kubenswrapper[4672]: I0930 13:37:38.811600 4672 scope.go:117] "RemoveContainer" containerID="2f7b2ad97d000dcd490ce3a5ca02c39fffbc80fee0690f899d889dcaf7712e08" Sep 30 13:37:38 crc kubenswrapper[4672]: E0930 13:37:38.811912 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2f7b2ad97d000dcd490ce3a5ca02c39fffbc80fee0690f899d889dcaf7712e08\": container with ID starting with 2f7b2ad97d000dcd490ce3a5ca02c39fffbc80fee0690f899d889dcaf7712e08 not found: ID does not exist" containerID="2f7b2ad97d000dcd490ce3a5ca02c39fffbc80fee0690f899d889dcaf7712e08" Sep 30 13:37:38 crc kubenswrapper[4672]: I0930 13:37:38.811943 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f7b2ad97d000dcd490ce3a5ca02c39fffbc80fee0690f899d889dcaf7712e08"} err="failed to get container status \"2f7b2ad97d000dcd490ce3a5ca02c39fffbc80fee0690f899d889dcaf7712e08\": rpc error: code = NotFound desc = could not find container \"2f7b2ad97d000dcd490ce3a5ca02c39fffbc80fee0690f899d889dcaf7712e08\": container with ID starting with 2f7b2ad97d000dcd490ce3a5ca02c39fffbc80fee0690f899d889dcaf7712e08 not found: ID does not exist" Sep 30 13:37:38 crc kubenswrapper[4672]: I0930 13:37:38.811960 4672 scope.go:117] "RemoveContainer" containerID="91a8a37d82b72676d4e241bdf70fcf4c21278f1a8b7cb9bd715f06a4ad74c94a" Sep 30 13:37:38 crc kubenswrapper[4672]: E0930 13:37:38.812421 4672 log.go:32] "ContainerStatus from runtime 
service failed" err="rpc error: code = NotFound desc = could not find container \"91a8a37d82b72676d4e241bdf70fcf4c21278f1a8b7cb9bd715f06a4ad74c94a\": container with ID starting with 91a8a37d82b72676d4e241bdf70fcf4c21278f1a8b7cb9bd715f06a4ad74c94a not found: ID does not exist" containerID="91a8a37d82b72676d4e241bdf70fcf4c21278f1a8b7cb9bd715f06a4ad74c94a" Sep 30 13:37:38 crc kubenswrapper[4672]: I0930 13:37:38.812451 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"91a8a37d82b72676d4e241bdf70fcf4c21278f1a8b7cb9bd715f06a4ad74c94a"} err="failed to get container status \"91a8a37d82b72676d4e241bdf70fcf4c21278f1a8b7cb9bd715f06a4ad74c94a\": rpc error: code = NotFound desc = could not find container \"91a8a37d82b72676d4e241bdf70fcf4c21278f1a8b7cb9bd715f06a4ad74c94a\": container with ID starting with 91a8a37d82b72676d4e241bdf70fcf4c21278f1a8b7cb9bd715f06a4ad74c94a not found: ID does not exist" Sep 30 13:37:39 crc kubenswrapper[4672]: I0930 13:37:39.047067 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-fdqnv"] Sep 30 13:37:39 crc kubenswrapper[4672]: I0930 13:37:39.065533 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-fdqnv"] Sep 30 13:37:39 crc kubenswrapper[4672]: I0930 13:37:39.431586 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d11b3b8f-291c-46af-9bb0-0644630d8c10" path="/var/lib/kubelet/pods/d11b3b8f-291c-46af-9bb0-0644630d8c10/volumes" Sep 30 13:37:40 crc kubenswrapper[4672]: I0930 13:37:40.416563 4672 scope.go:117] "RemoveContainer" containerID="823b2a4e038cbf7ef9e9b3b9d0fd9de4f8d57f966df9a5a9a851e18516deb7c5" Sep 30 13:37:40 crc kubenswrapper[4672]: E0930 13:37:40.416955 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dpqrd_openshift-machine-config-operator(95794952-d817-48f2-8956-f7a310f8d1d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" Sep 30 13:37:52 crc kubenswrapper[4672]: I0930 13:37:52.417888 4672 scope.go:117] "RemoveContainer" containerID="823b2a4e038cbf7ef9e9b3b9d0fd9de4f8d57f966df9a5a9a851e18516deb7c5" Sep 30 13:37:52 crc kubenswrapper[4672]: E0930 13:37:52.419050 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dpqrd_openshift-machine-config-operator(95794952-d817-48f2-8956-f7a310f8d1d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" Sep 30 13:38:04 crc kubenswrapper[4672]: I0930 13:38:04.417599 4672 scope.go:117] "RemoveContainer" containerID="823b2a4e038cbf7ef9e9b3b9d0fd9de4f8d57f966df9a5a9a851e18516deb7c5" Sep 30 13:38:04 crc kubenswrapper[4672]: E0930 13:38:04.418369 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dpqrd_openshift-machine-config-operator(95794952-d817-48f2-8956-f7a310f8d1d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" 
podUID="95794952-d817-48f2-8956-f7a310f8d1d9" Sep 30 13:38:16 crc kubenswrapper[4672]: I0930 13:38:16.417112 4672 scope.go:117] "RemoveContainer" containerID="823b2a4e038cbf7ef9e9b3b9d0fd9de4f8d57f966df9a5a9a851e18516deb7c5" Sep 30 13:38:16 crc kubenswrapper[4672]: E0930 13:38:16.418310 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dpqrd_openshift-machine-config-operator(95794952-d817-48f2-8956-f7a310f8d1d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" Sep 30 13:38:30 crc kubenswrapper[4672]: I0930 13:38:30.417233 4672 scope.go:117] "RemoveContainer" containerID="823b2a4e038cbf7ef9e9b3b9d0fd9de4f8d57f966df9a5a9a851e18516deb7c5" Sep 30 13:38:30 crc kubenswrapper[4672]: E0930 13:38:30.418413 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dpqrd_openshift-machine-config-operator(95794952-d817-48f2-8956-f7a310f8d1d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" Sep 30 13:38:41 crc kubenswrapper[4672]: I0930 13:38:41.417678 4672 scope.go:117] "RemoveContainer" containerID="823b2a4e038cbf7ef9e9b3b9d0fd9de4f8d57f966df9a5a9a851e18516deb7c5" Sep 30 13:38:41 crc kubenswrapper[4672]: E0930 13:38:41.418442 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dpqrd_openshift-machine-config-operator(95794952-d817-48f2-8956-f7a310f8d1d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" Sep 30 13:38:52 crc kubenswrapper[4672]: I0930 13:38:52.417971 4672 scope.go:117] "RemoveContainer" containerID="823b2a4e038cbf7ef9e9b3b9d0fd9de4f8d57f966df9a5a9a851e18516deb7c5" Sep 30 13:38:52 crc kubenswrapper[4672]: E0930 13:38:52.418561 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dpqrd_openshift-machine-config-operator(95794952-d817-48f2-8956-f7a310f8d1d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" Sep 30 13:39:05 crc kubenswrapper[4672]: I0930 13:39:05.416718 4672 scope.go:117] "RemoveContainer" containerID="823b2a4e038cbf7ef9e9b3b9d0fd9de4f8d57f966df9a5a9a851e18516deb7c5" Sep 30 13:39:05 crc kubenswrapper[4672]: E0930 13:39:05.417513 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dpqrd_openshift-machine-config-operator(95794952-d817-48f2-8956-f7a310f8d1d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" Sep 30 13:39:17 crc kubenswrapper[4672]: I0930 13:39:17.419509 4672 scope.go:117] "RemoveContainer" 
containerID="823b2a4e038cbf7ef9e9b3b9d0fd9de4f8d57f966df9a5a9a851e18516deb7c5" Sep 30 13:39:17 crc kubenswrapper[4672]: E0930 13:39:17.420838 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dpqrd_openshift-machine-config-operator(95794952-d817-48f2-8956-f7a310f8d1d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" Sep 30 13:39:28 crc kubenswrapper[4672]: I0930 13:39:28.417396 4672 scope.go:117] "RemoveContainer" containerID="823b2a4e038cbf7ef9e9b3b9d0fd9de4f8d57f966df9a5a9a851e18516deb7c5" Sep 30 13:39:28 crc kubenswrapper[4672]: E0930 13:39:28.418177 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dpqrd_openshift-machine-config-operator(95794952-d817-48f2-8956-f7a310f8d1d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" Sep 30 13:39:43 crc kubenswrapper[4672]: I0930 13:39:43.418171 4672 scope.go:117] "RemoveContainer" containerID="823b2a4e038cbf7ef9e9b3b9d0fd9de4f8d57f966df9a5a9a851e18516deb7c5" Sep 30 13:39:43 crc kubenswrapper[4672]: E0930 13:39:43.419198 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dpqrd_openshift-machine-config-operator(95794952-d817-48f2-8956-f7a310f8d1d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" Sep 30 13:39:54 crc kubenswrapper[4672]: I0930 13:39:54.417040 4672 scope.go:117] "RemoveContainer" containerID="823b2a4e038cbf7ef9e9b3b9d0fd9de4f8d57f966df9a5a9a851e18516deb7c5" Sep 30 13:39:54 crc kubenswrapper[4672]: E0930 13:39:54.417758 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dpqrd_openshift-machine-config-operator(95794952-d817-48f2-8956-f7a310f8d1d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" Sep 30 13:40:06 crc kubenswrapper[4672]: I0930 13:40:06.416828 4672 scope.go:117] "RemoveContainer" containerID="823b2a4e038cbf7ef9e9b3b9d0fd9de4f8d57f966df9a5a9a851e18516deb7c5" Sep 30 13:40:06 crc kubenswrapper[4672]: E0930 13:40:06.417647 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dpqrd_openshift-machine-config-operator(95794952-d817-48f2-8956-f7a310f8d1d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" Sep 30 13:40:20 crc kubenswrapper[4672]: I0930 13:40:20.417375 4672 scope.go:117] "RemoveContainer" containerID="823b2a4e038cbf7ef9e9b3b9d0fd9de4f8d57f966df9a5a9a851e18516deb7c5" Sep 30 13:40:20 crc kubenswrapper[4672]: E0930 13:40:20.418237 4672 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dpqrd_openshift-machine-config-operator(95794952-d817-48f2-8956-f7a310f8d1d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" Sep 30 13:40:31 crc kubenswrapper[4672]: I0930 13:40:31.417169 4672 scope.go:117] "RemoveContainer" containerID="823b2a4e038cbf7ef9e9b3b9d0fd9de4f8d57f966df9a5a9a851e18516deb7c5" Sep 30 13:40:31 crc kubenswrapper[4672]: E0930 13:40:31.419417 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dpqrd_openshift-machine-config-operator(95794952-d817-48f2-8956-f7a310f8d1d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" Sep 30 13:40:45 crc kubenswrapper[4672]: I0930 13:40:45.416824 4672 scope.go:117] "RemoveContainer" containerID="823b2a4e038cbf7ef9e9b3b9d0fd9de4f8d57f966df9a5a9a851e18516deb7c5" Sep 30 13:40:45 crc kubenswrapper[4672]: E0930 13:40:45.417968 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dpqrd_openshift-machine-config-operator(95794952-d817-48f2-8956-f7a310f8d1d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" Sep 30 13:41:00 crc kubenswrapper[4672]: I0930 13:41:00.417666 4672 scope.go:117] "RemoveContainer" containerID="823b2a4e038cbf7ef9e9b3b9d0fd9de4f8d57f966df9a5a9a851e18516deb7c5" Sep 30 13:41:00 crc kubenswrapper[4672]: E0930 13:41:00.418525 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dpqrd_openshift-machine-config-operator(95794952-d817-48f2-8956-f7a310f8d1d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" Sep 30 13:41:12 crc kubenswrapper[4672]: I0930 13:41:12.417395 4672 scope.go:117] "RemoveContainer" containerID="823b2a4e038cbf7ef9e9b3b9d0fd9de4f8d57f966df9a5a9a851e18516deb7c5" Sep 30 13:41:12 crc kubenswrapper[4672]: E0930 13:41:12.418081 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dpqrd_openshift-machine-config-operator(95794952-d817-48f2-8956-f7a310f8d1d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" Sep 30 13:41:26 crc kubenswrapper[4672]: I0930 13:41:26.417246 4672 scope.go:117] "RemoveContainer" containerID="823b2a4e038cbf7ef9e9b3b9d0fd9de4f8d57f966df9a5a9a851e18516deb7c5" Sep 30 13:41:26 crc kubenswrapper[4672]: E0930 13:41:26.417985 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-dpqrd_openshift-machine-config-operator(95794952-d817-48f2-8956-f7a310f8d1d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" Sep 30 13:41:40 crc kubenswrapper[4672]: I0930 13:41:40.418294 4672 scope.go:117] "RemoveContainer" containerID="823b2a4e038cbf7ef9e9b3b9d0fd9de4f8d57f966df9a5a9a851e18516deb7c5" Sep 30 13:41:40 crc kubenswrapper[4672]: E0930 13:41:40.419524 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dpqrd_openshift-machine-config-operator(95794952-d817-48f2-8956-f7a310f8d1d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" Sep 30 13:41:55 crc kubenswrapper[4672]: I0930 13:41:55.417299 4672 scope.go:117] "RemoveContainer" containerID="823b2a4e038cbf7ef9e9b3b9d0fd9de4f8d57f966df9a5a9a851e18516deb7c5" Sep 30 13:41:55 crc kubenswrapper[4672]: E0930 13:41:55.418094 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dpqrd_openshift-machine-config-operator(95794952-d817-48f2-8956-f7a310f8d1d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" Sep 30 13:42:08 crc kubenswrapper[4672]: I0930 13:42:08.417606 4672 scope.go:117] "RemoveContainer" containerID="823b2a4e038cbf7ef9e9b3b9d0fd9de4f8d57f966df9a5a9a851e18516deb7c5" Sep 30 13:42:08 crc kubenswrapper[4672]: E0930 13:42:08.418431 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dpqrd_openshift-machine-config-operator(95794952-d817-48f2-8956-f7a310f8d1d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" Sep 30 13:42:23 crc kubenswrapper[4672]: I0930 13:42:23.417224 4672 scope.go:117] "RemoveContainer" containerID="823b2a4e038cbf7ef9e9b3b9d0fd9de4f8d57f966df9a5a9a851e18516deb7c5" Sep 30 13:42:23 crc kubenswrapper[4672]: E0930 13:42:23.418392 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dpqrd_openshift-machine-config-operator(95794952-d817-48f2-8956-f7a310f8d1d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" Sep 30 13:42:34 crc kubenswrapper[4672]: I0930 13:42:34.422799 4672 scope.go:117] "RemoveContainer" containerID="823b2a4e038cbf7ef9e9b3b9d0fd9de4f8d57f966df9a5a9a851e18516deb7c5" Sep 30 13:42:35 crc kubenswrapper[4672]: I0930 13:42:35.858222 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" event={"ID":"95794952-d817-48f2-8956-f7a310f8d1d9","Type":"ContainerStarted","Data":"bad48423dd84ef618fb50b14346f14c1ae1019eff77694ab2ec75f0f9f1e7b3f"} Sep 30 13:42:45 crc kubenswrapper[4672]: I0930 13:42:45.742211 4672 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/redhat-operators-s864h"] Sep 30 13:42:45 crc kubenswrapper[4672]: E0930 13:42:45.743360 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d11b3b8f-291c-46af-9bb0-0644630d8c10" containerName="registry-server" Sep 30 13:42:45 crc kubenswrapper[4672]: I0930 13:42:45.743384 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="d11b3b8f-291c-46af-9bb0-0644630d8c10" containerName="registry-server" Sep 30 13:42:45 crc kubenswrapper[4672]: E0930 13:42:45.743412 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d11b3b8f-291c-46af-9bb0-0644630d8c10" containerName="extract-content" Sep 30 13:42:45 crc kubenswrapper[4672]: I0930 13:42:45.743425 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="d11b3b8f-291c-46af-9bb0-0644630d8c10" containerName="extract-content" Sep 30 13:42:45 crc kubenswrapper[4672]: E0930 13:42:45.743443 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d11b3b8f-291c-46af-9bb0-0644630d8c10" containerName="extract-utilities" Sep 30 13:42:45 crc kubenswrapper[4672]: I0930 13:42:45.743450 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="d11b3b8f-291c-46af-9bb0-0644630d8c10" containerName="extract-utilities" Sep 30 13:42:45 crc kubenswrapper[4672]: I0930 13:42:45.743733 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="d11b3b8f-291c-46af-9bb0-0644630d8c10" containerName="registry-server" Sep 30 13:42:45 crc kubenswrapper[4672]: I0930 13:42:45.745361 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-s864h" Sep 30 13:42:45 crc kubenswrapper[4672]: I0930 13:42:45.754275 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-s864h"] Sep 30 13:42:45 crc kubenswrapper[4672]: I0930 13:42:45.793798 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1feb5255-7d94-4656-ae15-ac7334312a86-catalog-content\") pod \"redhat-operators-s864h\" (UID: \"1feb5255-7d94-4656-ae15-ac7334312a86\") " pod="openshift-marketplace/redhat-operators-s864h" Sep 30 13:42:45 crc kubenswrapper[4672]: I0930 13:42:45.793911 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pcvvx\" (UniqueName: \"kubernetes.io/projected/1feb5255-7d94-4656-ae15-ac7334312a86-kube-api-access-pcvvx\") pod \"redhat-operators-s864h\" (UID: \"1feb5255-7d94-4656-ae15-ac7334312a86\") " pod="openshift-marketplace/redhat-operators-s864h" Sep 30 13:42:45 crc kubenswrapper[4672]: I0930 13:42:45.793989 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1feb5255-7d94-4656-ae15-ac7334312a86-utilities\") pod \"redhat-operators-s864h\" (UID: \"1feb5255-7d94-4656-ae15-ac7334312a86\") " pod="openshift-marketplace/redhat-operators-s864h" Sep 30 13:42:45 crc kubenswrapper[4672]: I0930 13:42:45.896193 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1feb5255-7d94-4656-ae15-ac7334312a86-utilities\") pod \"redhat-operators-s864h\" (UID: \"1feb5255-7d94-4656-ae15-ac7334312a86\") " pod="openshift-marketplace/redhat-operators-s864h" Sep 30 13:42:45 crc kubenswrapper[4672]: I0930 13:42:45.896320 4672 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1feb5255-7d94-4656-ae15-ac7334312a86-catalog-content\") pod \"redhat-operators-s864h\" (UID: \"1feb5255-7d94-4656-ae15-ac7334312a86\") " pod="openshift-marketplace/redhat-operators-s864h" Sep 30 13:42:45 crc kubenswrapper[4672]: I0930 13:42:45.896381 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pcvvx\" (UniqueName: \"kubernetes.io/projected/1feb5255-7d94-4656-ae15-ac7334312a86-kube-api-access-pcvvx\") pod \"redhat-operators-s864h\" (UID: \"1feb5255-7d94-4656-ae15-ac7334312a86\") " pod="openshift-marketplace/redhat-operators-s864h" Sep 30 13:42:45 crc kubenswrapper[4672]: I0930 13:42:45.897044 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1feb5255-7d94-4656-ae15-ac7334312a86-utilities\") pod \"redhat-operators-s864h\" (UID: \"1feb5255-7d94-4656-ae15-ac7334312a86\") " pod="openshift-marketplace/redhat-operators-s864h" Sep 30 13:42:45 crc kubenswrapper[4672]: I0930 13:42:45.897244 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1feb5255-7d94-4656-ae15-ac7334312a86-catalog-content\") pod \"redhat-operators-s864h\" (UID: \"1feb5255-7d94-4656-ae15-ac7334312a86\") " pod="openshift-marketplace/redhat-operators-s864h" Sep 30 13:42:45 crc kubenswrapper[4672]: I0930 13:42:45.916363 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pcvvx\" (UniqueName: \"kubernetes.io/projected/1feb5255-7d94-4656-ae15-ac7334312a86-kube-api-access-pcvvx\") pod \"redhat-operators-s864h\" (UID: \"1feb5255-7d94-4656-ae15-ac7334312a86\") " pod="openshift-marketplace/redhat-operators-s864h" Sep 30 13:42:46 crc kubenswrapper[4672]: I0930 13:42:46.066163 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-s864h" Sep 30 13:42:46 crc kubenswrapper[4672]: I0930 13:42:46.555035 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-s864h"] Sep 30 13:42:46 crc kubenswrapper[4672]: W0930 13:42:46.559502 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1feb5255_7d94_4656_ae15_ac7334312a86.slice/crio-5b49fd8750860b55313a350448bab87ebe91300a86423fc1f408a0060a13c4a8 WatchSource:0}: Error finding container 5b49fd8750860b55313a350448bab87ebe91300a86423fc1f408a0060a13c4a8: Status 404 returned error can't find the container with id 5b49fd8750860b55313a350448bab87ebe91300a86423fc1f408a0060a13c4a8 Sep 30 13:42:46 crc kubenswrapper[4672]: I0930 13:42:46.973197 4672 generic.go:334] "Generic (PLEG): container finished" podID="1feb5255-7d94-4656-ae15-ac7334312a86" containerID="5946821ad40ff3b668ba449d456f13205e1f03a3a4d0cec7773a08e0ad8f1be4" exitCode=0 Sep 30 13:42:46 crc kubenswrapper[4672]: I0930 13:42:46.973318 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s864h" event={"ID":"1feb5255-7d94-4656-ae15-ac7334312a86","Type":"ContainerDied","Data":"5946821ad40ff3b668ba449d456f13205e1f03a3a4d0cec7773a08e0ad8f1be4"} Sep 30 13:42:46 crc kubenswrapper[4672]: I0930 13:42:46.974173 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s864h" event={"ID":"1feb5255-7d94-4656-ae15-ac7334312a86","Type":"ContainerStarted","Data":"5b49fd8750860b55313a350448bab87ebe91300a86423fc1f408a0060a13c4a8"} Sep 30 13:42:46 crc kubenswrapper[4672]: I0930 13:42:46.975332 4672 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Sep 30 13:42:48 crc kubenswrapper[4672]: I0930 13:42:48.997745 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s864h" event={"ID":"1feb5255-7d94-4656-ae15-ac7334312a86","Type":"ContainerStarted","Data":"cda7cdb8ba8389d05a21c3901e9e6b9e20d315b2be36826eadc521826d8dc617"} Sep 30 13:42:51 crc kubenswrapper[4672]: I0930 13:42:51.020074 4672 generic.go:334] "Generic (PLEG): container finished" podID="1feb5255-7d94-4656-ae15-ac7334312a86" containerID="cda7cdb8ba8389d05a21c3901e9e6b9e20d315b2be36826eadc521826d8dc617" exitCode=0 Sep 30 13:42:51 crc kubenswrapper[4672]: I0930 13:42:51.020181 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s864h" event={"ID":"1feb5255-7d94-4656-ae15-ac7334312a86","Type":"ContainerDied","Data":"cda7cdb8ba8389d05a21c3901e9e6b9e20d315b2be36826eadc521826d8dc617"} Sep 30 13:42:52 crc kubenswrapper[4672]: I0930 13:42:52.032638 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s864h" event={"ID":"1feb5255-7d94-4656-ae15-ac7334312a86","Type":"ContainerStarted","Data":"778aa7c163abbf1d700dab299c9dbab65e490f2941e90edcde43ebb1f9a5c282"} Sep 30 13:42:52 crc kubenswrapper[4672]: I0930 13:42:52.053890 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-s864h" podStartSLOduration=2.22304635 podStartE2EDuration="7.053875259s" podCreationTimestamp="2025-09-30 13:42:45 +0000 UTC" firstStartedPulling="2025-09-30 13:42:46.975134113 +0000 UTC m=+4858.244371759" lastFinishedPulling="2025-09-30 13:42:51.805963022 +0000 UTC m=+4863.075200668" 
observedRunningTime="2025-09-30 13:42:52.049838578 +0000 UTC m=+4863.319076224" watchObservedRunningTime="2025-09-30 13:42:52.053875259 +0000 UTC m=+4863.323112905" Sep 30 13:42:56 crc kubenswrapper[4672]: I0930 13:42:56.066918 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-s864h" Sep 30 13:42:56 crc kubenswrapper[4672]: I0930 13:42:56.067489 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-s864h" Sep 30 13:42:57 crc kubenswrapper[4672]: I0930 13:42:57.388775 4672 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-s864h" podUID="1feb5255-7d94-4656-ae15-ac7334312a86" containerName="registry-server" probeResult="failure" output=< Sep 30 13:42:57 crc kubenswrapper[4672]: timeout: failed to connect service ":50051" within 1s Sep 30 13:42:57 crc kubenswrapper[4672]: > Sep 30 13:43:06 crc kubenswrapper[4672]: I0930 13:43:06.122527 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-s864h" Sep 30 13:43:06 crc kubenswrapper[4672]: I0930 13:43:06.178879 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-s864h" Sep 30 13:43:06 crc kubenswrapper[4672]: I0930 13:43:06.358372 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-s864h"] Sep 30 13:43:07 crc kubenswrapper[4672]: I0930 13:43:07.184343 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-s864h" podUID="1feb5255-7d94-4656-ae15-ac7334312a86" containerName="registry-server" containerID="cri-o://778aa7c163abbf1d700dab299c9dbab65e490f2941e90edcde43ebb1f9a5c282" gracePeriod=2 Sep 30 13:43:08 crc kubenswrapper[4672]: I0930 13:43:08.159481 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-s864h" Sep 30 13:43:08 crc kubenswrapper[4672]: I0930 13:43:08.199920 4672 generic.go:334] "Generic (PLEG): container finished" podID="1feb5255-7d94-4656-ae15-ac7334312a86" containerID="778aa7c163abbf1d700dab299c9dbab65e490f2941e90edcde43ebb1f9a5c282" exitCode=0 Sep 30 13:43:08 crc kubenswrapper[4672]: I0930 13:43:08.199956 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s864h" event={"ID":"1feb5255-7d94-4656-ae15-ac7334312a86","Type":"ContainerDied","Data":"778aa7c163abbf1d700dab299c9dbab65e490f2941e90edcde43ebb1f9a5c282"} Sep 30 13:43:08 crc kubenswrapper[4672]: I0930 13:43:08.199990 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-s864h" Sep 30 13:43:08 crc kubenswrapper[4672]: I0930 13:43:08.199999 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s864h" event={"ID":"1feb5255-7d94-4656-ae15-ac7334312a86","Type":"ContainerDied","Data":"5b49fd8750860b55313a350448bab87ebe91300a86423fc1f408a0060a13c4a8"} Sep 30 13:43:08 crc kubenswrapper[4672]: I0930 13:43:08.200015 4672 scope.go:117] "RemoveContainer" containerID="778aa7c163abbf1d700dab299c9dbab65e490f2941e90edcde43ebb1f9a5c282" Sep 30 13:43:08 crc kubenswrapper[4672]: I0930 13:43:08.221029 4672 scope.go:117] "RemoveContainer" containerID="cda7cdb8ba8389d05a21c3901e9e6b9e20d315b2be36826eadc521826d8dc617" Sep 30 13:43:08 crc kubenswrapper[4672]: I0930 13:43:08.263560 4672 scope.go:117] "RemoveContainer" containerID="5946821ad40ff3b668ba449d456f13205e1f03a3a4d0cec7773a08e0ad8f1be4" Sep 30 13:43:08 crc kubenswrapper[4672]: I0930 13:43:08.301903 4672 scope.go:117] "RemoveContainer" containerID="778aa7c163abbf1d700dab299c9dbab65e490f2941e90edcde43ebb1f9a5c282" Sep 30 13:43:08 crc kubenswrapper[4672]: E0930 13:43:08.302349 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"778aa7c163abbf1d700dab299c9dbab65e490f2941e90edcde43ebb1f9a5c282\": container with ID starting with 778aa7c163abbf1d700dab299c9dbab65e490f2941e90edcde43ebb1f9a5c282 not found: ID does not exist" containerID="778aa7c163abbf1d700dab299c9dbab65e490f2941e90edcde43ebb1f9a5c282" Sep 30 13:43:08 crc kubenswrapper[4672]: I0930 13:43:08.302386 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"778aa7c163abbf1d700dab299c9dbab65e490f2941e90edcde43ebb1f9a5c282"} err="failed to get container status \"778aa7c163abbf1d700dab299c9dbab65e490f2941e90edcde43ebb1f9a5c282\": rpc error: code = NotFound desc = could not find container \"778aa7c163abbf1d700dab299c9dbab65e490f2941e90edcde43ebb1f9a5c282\": container with ID starting with 778aa7c163abbf1d700dab299c9dbab65e490f2941e90edcde43ebb1f9a5c282 not found: ID does not exist" Sep 30 13:43:08 crc kubenswrapper[4672]: I0930 13:43:08.302411 4672 scope.go:117] "RemoveContainer" containerID="cda7cdb8ba8389d05a21c3901e9e6b9e20d315b2be36826eadc521826d8dc617" Sep 30 13:43:08 crc kubenswrapper[4672]: E0930 13:43:08.302849 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cda7cdb8ba8389d05a21c3901e9e6b9e20d315b2be36826eadc521826d8dc617\": container with ID starting with cda7cdb8ba8389d05a21c3901e9e6b9e20d315b2be36826eadc521826d8dc617 not found: ID does not exist" containerID="cda7cdb8ba8389d05a21c3901e9e6b9e20d315b2be36826eadc521826d8dc617" Sep 30 13:43:08 crc kubenswrapper[4672]: I0930 13:43:08.302892 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cda7cdb8ba8389d05a21c3901e9e6b9e20d315b2be36826eadc521826d8dc617"} err="failed to get container status \"cda7cdb8ba8389d05a21c3901e9e6b9e20d315b2be36826eadc521826d8dc617\": rpc error: code = NotFound desc = could not find container \"cda7cdb8ba8389d05a21c3901e9e6b9e20d315b2be36826eadc521826d8dc617\": container with ID starting with cda7cdb8ba8389d05a21c3901e9e6b9e20d315b2be36826eadc521826d8dc617 not found: ID does not exist" Sep 30 13:43:08 crc kubenswrapper[4672]: I0930 13:43:08.302919 4672 scope.go:117] "RemoveContainer" 
containerID="5946821ad40ff3b668ba449d456f13205e1f03a3a4d0cec7773a08e0ad8f1be4" Sep 30 13:43:08 crc kubenswrapper[4672]: E0930 13:43:08.303219 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5946821ad40ff3b668ba449d456f13205e1f03a3a4d0cec7773a08e0ad8f1be4\": container with ID starting with 5946821ad40ff3b668ba449d456f13205e1f03a3a4d0cec7773a08e0ad8f1be4 not found: ID does not exist" containerID="5946821ad40ff3b668ba449d456f13205e1f03a3a4d0cec7773a08e0ad8f1be4" Sep 30 13:43:08 crc kubenswrapper[4672]: I0930 13:43:08.303244 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5946821ad40ff3b668ba449d456f13205e1f03a3a4d0cec7773a08e0ad8f1be4"} err="failed to get container status \"5946821ad40ff3b668ba449d456f13205e1f03a3a4d0cec7773a08e0ad8f1be4\": rpc error: code = NotFound desc = could not find container \"5946821ad40ff3b668ba449d456f13205e1f03a3a4d0cec7773a08e0ad8f1be4\": container with ID starting with 5946821ad40ff3b668ba449d456f13205e1f03a3a4d0cec7773a08e0ad8f1be4 not found: ID does not exist" Sep 30 13:43:08 crc kubenswrapper[4672]: I0930 13:43:08.336724 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1feb5255-7d94-4656-ae15-ac7334312a86-catalog-content\") pod \"1feb5255-7d94-4656-ae15-ac7334312a86\" (UID: \"1feb5255-7d94-4656-ae15-ac7334312a86\") " Sep 30 13:43:08 crc kubenswrapper[4672]: I0930 13:43:08.336953 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1feb5255-7d94-4656-ae15-ac7334312a86-utilities\") pod \"1feb5255-7d94-4656-ae15-ac7334312a86\" (UID: \"1feb5255-7d94-4656-ae15-ac7334312a86\") " Sep 30 13:43:08 crc kubenswrapper[4672]: I0930 13:43:08.337093 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcvvx\" (UniqueName: \"kubernetes.io/projected/1feb5255-7d94-4656-ae15-ac7334312a86-kube-api-access-pcvvx\") pod \"1feb5255-7d94-4656-ae15-ac7334312a86\" (UID: \"1feb5255-7d94-4656-ae15-ac7334312a86\") " Sep 30 13:43:08 crc kubenswrapper[4672]: I0930 13:43:08.338942 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1feb5255-7d94-4656-ae15-ac7334312a86-utilities" (OuterVolumeSpecName: "utilities") pod "1feb5255-7d94-4656-ae15-ac7334312a86" (UID: "1feb5255-7d94-4656-ae15-ac7334312a86"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 13:43:08 crc kubenswrapper[4672]: I0930 13:43:08.342922 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1feb5255-7d94-4656-ae15-ac7334312a86-kube-api-access-pcvvx" (OuterVolumeSpecName: "kube-api-access-pcvvx") pod "1feb5255-7d94-4656-ae15-ac7334312a86" (UID: "1feb5255-7d94-4656-ae15-ac7334312a86"). InnerVolumeSpecName "kube-api-access-pcvvx". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:43:08 crc kubenswrapper[4672]: I0930 13:43:08.421089 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1feb5255-7d94-4656-ae15-ac7334312a86-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1feb5255-7d94-4656-ae15-ac7334312a86" (UID: "1feb5255-7d94-4656-ae15-ac7334312a86"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 13:43:08 crc kubenswrapper[4672]: I0930 13:43:08.439367 4672 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1feb5255-7d94-4656-ae15-ac7334312a86-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 13:43:08 crc kubenswrapper[4672]: I0930 13:43:08.439408 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcvvx\" (UniqueName: \"kubernetes.io/projected/1feb5255-7d94-4656-ae15-ac7334312a86-kube-api-access-pcvvx\") on node \"crc\" DevicePath \"\"" Sep 30 13:43:08 crc kubenswrapper[4672]: I0930 13:43:08.439422 4672 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1feb5255-7d94-4656-ae15-ac7334312a86-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 13:43:08 crc kubenswrapper[4672]: I0930 13:43:08.534192 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-s864h"] Sep 30 13:43:08 crc kubenswrapper[4672]: I0930 13:43:08.545454 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-s864h"] Sep 30 13:43:09 crc kubenswrapper[4672]: I0930 13:43:09.427847 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1feb5255-7d94-4656-ae15-ac7334312a86" path="/var/lib/kubelet/pods/1feb5255-7d94-4656-ae15-ac7334312a86/volumes" Sep 30 13:44:54 crc kubenswrapper[4672]: I0930 13:44:54.740082 4672 patch_prober.go:28] interesting pod/machine-config-daemon-dpqrd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 13:44:54 crc kubenswrapper[4672]: I0930 13:44:54.740612 4672 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 13:45:00 crc kubenswrapper[4672]: I0930 13:45:00.151947 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320665-c4h5d"] Sep 30 13:45:00 crc kubenswrapper[4672]: E0930 13:45:00.152792 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1feb5255-7d94-4656-ae15-ac7334312a86" containerName="extract-utilities" Sep 30 13:45:00 crc kubenswrapper[4672]: I0930 13:45:00.152804 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="1feb5255-7d94-4656-ae15-ac7334312a86" containerName="extract-utilities" Sep 30 13:45:00 crc kubenswrapper[4672]: E0930 13:45:00.152837 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1feb5255-7d94-4656-ae15-ac7334312a86" containerName="extract-content" Sep 30 13:45:00 crc kubenswrapper[4672]: I0930 13:45:00.152843 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="1feb5255-7d94-4656-ae15-ac7334312a86" containerName="extract-content" Sep 30 13:45:00 crc kubenswrapper[4672]: E0930 13:45:00.152853 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1feb5255-7d94-4656-ae15-ac7334312a86" containerName="registry-server" Sep 30 13:45:00 crc kubenswrapper[4672]: I0930 13:45:00.152859 4672 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="1feb5255-7d94-4656-ae15-ac7334312a86" containerName="registry-server" Sep 30 13:45:00 crc kubenswrapper[4672]: I0930 13:45:00.153035 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="1feb5255-7d94-4656-ae15-ac7334312a86" containerName="registry-server" Sep 30 13:45:00 crc kubenswrapper[4672]: I0930 13:45:00.153733 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320665-c4h5d" Sep 30 13:45:00 crc kubenswrapper[4672]: I0930 13:45:00.156279 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Sep 30 13:45:00 crc kubenswrapper[4672]: I0930 13:45:00.157136 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Sep 30 13:45:00 crc kubenswrapper[4672]: I0930 13:45:00.169075 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320665-c4h5d"] Sep 30 13:45:00 crc kubenswrapper[4672]: I0930 13:45:00.274933 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jm25l\" (UniqueName: \"kubernetes.io/projected/d6bf08bb-c279-4ce2-ac7a-005534ffb69d-kube-api-access-jm25l\") pod \"collect-profiles-29320665-c4h5d\" (UID: \"d6bf08bb-c279-4ce2-ac7a-005534ffb69d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320665-c4h5d" Sep 30 13:45:00 crc kubenswrapper[4672]: I0930 13:45:00.275111 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d6bf08bb-c279-4ce2-ac7a-005534ffb69d-config-volume\") pod \"collect-profiles-29320665-c4h5d\" (UID: \"d6bf08bb-c279-4ce2-ac7a-005534ffb69d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320665-c4h5d" Sep 30 13:45:00 crc kubenswrapper[4672]: I0930 13:45:00.275281 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d6bf08bb-c279-4ce2-ac7a-005534ffb69d-secret-volume\") pod \"collect-profiles-29320665-c4h5d\" (UID: \"d6bf08bb-c279-4ce2-ac7a-005534ffb69d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320665-c4h5d" Sep 30 13:45:00 crc kubenswrapper[4672]: I0930 13:45:00.376761 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d6bf08bb-c279-4ce2-ac7a-005534ffb69d-config-volume\") pod \"collect-profiles-29320665-c4h5d\" (UID: \"d6bf08bb-c279-4ce2-ac7a-005534ffb69d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320665-c4h5d" Sep 30 13:45:00 crc kubenswrapper[4672]: I0930 13:45:00.376897 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d6bf08bb-c279-4ce2-ac7a-005534ffb69d-secret-volume\") pod \"collect-profiles-29320665-c4h5d\" (UID: \"d6bf08bb-c279-4ce2-ac7a-005534ffb69d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320665-c4h5d" Sep 30 13:45:00 crc kubenswrapper[4672]: I0930 13:45:00.376992 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jm25l\" (UniqueName: \"kubernetes.io/projected/d6bf08bb-c279-4ce2-ac7a-005534ffb69d-kube-api-access-jm25l\") pod 
\"collect-profiles-29320665-c4h5d\" (UID: \"d6bf08bb-c279-4ce2-ac7a-005534ffb69d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320665-c4h5d" Sep 30 13:45:00 crc kubenswrapper[4672]: I0930 13:45:00.377759 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d6bf08bb-c279-4ce2-ac7a-005534ffb69d-config-volume\") pod \"collect-profiles-29320665-c4h5d\" (UID: \"d6bf08bb-c279-4ce2-ac7a-005534ffb69d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320665-c4h5d" Sep 30 13:45:00 crc kubenswrapper[4672]: I0930 13:45:00.386763 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d6bf08bb-c279-4ce2-ac7a-005534ffb69d-secret-volume\") pod \"collect-profiles-29320665-c4h5d\" (UID: \"d6bf08bb-c279-4ce2-ac7a-005534ffb69d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320665-c4h5d" Sep 30 13:45:00 crc kubenswrapper[4672]: I0930 13:45:00.437651 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jm25l\" (UniqueName: \"kubernetes.io/projected/d6bf08bb-c279-4ce2-ac7a-005534ffb69d-kube-api-access-jm25l\") pod \"collect-profiles-29320665-c4h5d\" (UID: \"d6bf08bb-c279-4ce2-ac7a-005534ffb69d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320665-c4h5d" Sep 30 13:45:00 crc kubenswrapper[4672]: I0930 13:45:00.486570 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320665-c4h5d" Sep 30 13:45:00 crc kubenswrapper[4672]: I0930 13:45:00.984383 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320665-c4h5d"] Sep 30 13:45:01 crc kubenswrapper[4672]: I0930 13:45:01.303828 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320665-c4h5d" event={"ID":"d6bf08bb-c279-4ce2-ac7a-005534ffb69d","Type":"ContainerStarted","Data":"a48c74a4b000dd79822bb5607efee7c7e8fa3bf7d507176c6f45b65b5261a4bc"} Sep 30 13:45:01 crc kubenswrapper[4672]: I0930 13:45:01.303888 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320665-c4h5d" event={"ID":"d6bf08bb-c279-4ce2-ac7a-005534ffb69d","Type":"ContainerStarted","Data":"6aee0be229bd1624435c9fa86cc2d9b424bf9d33d618da7e73c0793066b70a86"} Sep 30 13:45:01 crc kubenswrapper[4672]: I0930 13:45:01.325990 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29320665-c4h5d" podStartSLOduration=1.325972161 podStartE2EDuration="1.325972161s" podCreationTimestamp="2025-09-30 13:45:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:45:01.32002926 +0000 UTC m=+4992.589266896" watchObservedRunningTime="2025-09-30 13:45:01.325972161 +0000 UTC m=+4992.595209807" Sep 30 13:45:02 crc kubenswrapper[4672]: I0930 13:45:02.324016 4672 generic.go:334] "Generic (PLEG): container finished" podID="d6bf08bb-c279-4ce2-ac7a-005534ffb69d" containerID="a48c74a4b000dd79822bb5607efee7c7e8fa3bf7d507176c6f45b65b5261a4bc" exitCode=0 Sep 30 13:45:02 crc kubenswrapper[4672]: I0930 13:45:02.324395 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29320665-c4h5d" event={"ID":"d6bf08bb-c279-4ce2-ac7a-005534ffb69d","Type":"ContainerDied","Data":"a48c74a4b000dd79822bb5607efee7c7e8fa3bf7d507176c6f45b65b5261a4bc"} Sep 30 13:45:03 crc kubenswrapper[4672]: I0930 13:45:03.691630 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320665-c4h5d" Sep 30 13:45:03 crc kubenswrapper[4672]: I0930 13:45:03.861853 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jm25l\" (UniqueName: \"kubernetes.io/projected/d6bf08bb-c279-4ce2-ac7a-005534ffb69d-kube-api-access-jm25l\") pod \"d6bf08bb-c279-4ce2-ac7a-005534ffb69d\" (UID: \"d6bf08bb-c279-4ce2-ac7a-005534ffb69d\") " Sep 30 13:45:03 crc kubenswrapper[4672]: I0930 13:45:03.861912 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d6bf08bb-c279-4ce2-ac7a-005534ffb69d-secret-volume\") pod \"d6bf08bb-c279-4ce2-ac7a-005534ffb69d\" (UID: \"d6bf08bb-c279-4ce2-ac7a-005534ffb69d\") " Sep 30 13:45:03 crc kubenswrapper[4672]: I0930 13:45:03.862119 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d6bf08bb-c279-4ce2-ac7a-005534ffb69d-config-volume\") pod \"d6bf08bb-c279-4ce2-ac7a-005534ffb69d\" (UID: \"d6bf08bb-c279-4ce2-ac7a-005534ffb69d\") " Sep 30 13:45:03 crc kubenswrapper[4672]: I0930 13:45:03.862697 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d6bf08bb-c279-4ce2-ac7a-005534ffb69d-config-volume" (OuterVolumeSpecName: "config-volume") pod "d6bf08bb-c279-4ce2-ac7a-005534ffb69d" (UID: "d6bf08bb-c279-4ce2-ac7a-005534ffb69d"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:45:03 crc kubenswrapper[4672]: I0930 13:45:03.862832 4672 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d6bf08bb-c279-4ce2-ac7a-005534ffb69d-config-volume\") on node \"crc\" DevicePath \"\"" Sep 30 13:45:03 crc kubenswrapper[4672]: I0930 13:45:03.869104 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6bf08bb-c279-4ce2-ac7a-005534ffb69d-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "d6bf08bb-c279-4ce2-ac7a-005534ffb69d" (UID: "d6bf08bb-c279-4ce2-ac7a-005534ffb69d"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:45:03 crc kubenswrapper[4672]: I0930 13:45:03.869181 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d6bf08bb-c279-4ce2-ac7a-005534ffb69d-kube-api-access-jm25l" (OuterVolumeSpecName: "kube-api-access-jm25l") pod "d6bf08bb-c279-4ce2-ac7a-005534ffb69d" (UID: "d6bf08bb-c279-4ce2-ac7a-005534ffb69d"). InnerVolumeSpecName "kube-api-access-jm25l". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:45:03 crc kubenswrapper[4672]: I0930 13:45:03.964355 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jm25l\" (UniqueName: \"kubernetes.io/projected/d6bf08bb-c279-4ce2-ac7a-005534ffb69d-kube-api-access-jm25l\") on node \"crc\" DevicePath \"\"" Sep 30 13:45:03 crc kubenswrapper[4672]: I0930 13:45:03.964402 4672 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d6bf08bb-c279-4ce2-ac7a-005534ffb69d-secret-volume\") on node \"crc\" DevicePath \"\"" Sep 30 13:45:04 crc kubenswrapper[4672]: I0930 13:45:04.353896 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320665-c4h5d" event={"ID":"d6bf08bb-c279-4ce2-ac7a-005534ffb69d","Type":"ContainerDied","Data":"6aee0be229bd1624435c9fa86cc2d9b424bf9d33d618da7e73c0793066b70a86"} Sep 30 13:45:04 crc kubenswrapper[4672]: I0930 13:45:04.353921 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320665-c4h5d" Sep 30 13:45:04 crc kubenswrapper[4672]: I0930 13:45:04.353934 4672 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6aee0be229bd1624435c9fa86cc2d9b424bf9d33d618da7e73c0793066b70a86" Sep 30 13:45:04 crc kubenswrapper[4672]: I0930 13:45:04.395945 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320620-lknc2"] Sep 30 13:45:04 crc kubenswrapper[4672]: I0930 13:45:04.405868 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320620-lknc2"] Sep 30 13:45:05 crc kubenswrapper[4672]: I0930 13:45:05.428959 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="81f77c7a-8774-40ea-87ad-faefa20f03b8" path="/var/lib/kubelet/pods/81f77c7a-8774-40ea-87ad-faefa20f03b8/volumes" Sep 30 13:45:15 crc kubenswrapper[4672]: I0930 13:45:15.782519 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-5wgr9"] Sep 30 13:45:15 crc kubenswrapper[4672]: E0930 13:45:15.783474 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6bf08bb-c279-4ce2-ac7a-005534ffb69d" containerName="collect-profiles" Sep 30 13:45:15 crc kubenswrapper[4672]: I0930 13:45:15.783487 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6bf08bb-c279-4ce2-ac7a-005534ffb69d" containerName="collect-profiles" Sep 30 13:45:15 crc kubenswrapper[4672]: I0930 13:45:15.783719 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6bf08bb-c279-4ce2-ac7a-005534ffb69d" containerName="collect-profiles" Sep 30 13:45:15 crc kubenswrapper[4672]: I0930 13:45:15.785243 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5wgr9" Sep 30 13:45:15 crc kubenswrapper[4672]: I0930 13:45:15.795223 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5wgr9"] Sep 30 13:45:15 crc kubenswrapper[4672]: I0930 13:45:15.910530 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/83617f4e-336d-408f-b3e5-8b7eb08ae7a5-utilities\") pod \"redhat-marketplace-5wgr9\" (UID: \"83617f4e-336d-408f-b3e5-8b7eb08ae7a5\") " pod="openshift-marketplace/redhat-marketplace-5wgr9" Sep 30 13:45:15 crc kubenswrapper[4672]: I0930 13:45:15.910594 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-clx9w\" (UniqueName: \"kubernetes.io/projected/83617f4e-336d-408f-b3e5-8b7eb08ae7a5-kube-api-access-clx9w\") pod \"redhat-marketplace-5wgr9\" (UID: \"83617f4e-336d-408f-b3e5-8b7eb08ae7a5\") " pod="openshift-marketplace/redhat-marketplace-5wgr9" Sep 30 13:45:15 crc kubenswrapper[4672]: I0930 13:45:15.910982 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/83617f4e-336d-408f-b3e5-8b7eb08ae7a5-catalog-content\") pod \"redhat-marketplace-5wgr9\" (UID: \"83617f4e-336d-408f-b3e5-8b7eb08ae7a5\") " pod="openshift-marketplace/redhat-marketplace-5wgr9" Sep 30 13:45:16 crc kubenswrapper[4672]: I0930 13:45:16.012680 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/83617f4e-336d-408f-b3e5-8b7eb08ae7a5-catalog-content\") pod \"redhat-marketplace-5wgr9\" (UID: \"83617f4e-336d-408f-b3e5-8b7eb08ae7a5\") " pod="openshift-marketplace/redhat-marketplace-5wgr9" Sep 30 13:45:16 crc kubenswrapper[4672]: I0930 13:45:16.012849 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/83617f4e-336d-408f-b3e5-8b7eb08ae7a5-utilities\") pod \"redhat-marketplace-5wgr9\" (UID: \"83617f4e-336d-408f-b3e5-8b7eb08ae7a5\") " pod="openshift-marketplace/redhat-marketplace-5wgr9" Sep 30 13:45:16 crc kubenswrapper[4672]: I0930 13:45:16.012909 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-clx9w\" (UniqueName: \"kubernetes.io/projected/83617f4e-336d-408f-b3e5-8b7eb08ae7a5-kube-api-access-clx9w\") pod \"redhat-marketplace-5wgr9\" (UID: \"83617f4e-336d-408f-b3e5-8b7eb08ae7a5\") " pod="openshift-marketplace/redhat-marketplace-5wgr9" Sep 30 13:45:16 crc kubenswrapper[4672]: I0930 13:45:16.013316 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/83617f4e-336d-408f-b3e5-8b7eb08ae7a5-catalog-content\") pod \"redhat-marketplace-5wgr9\" (UID: \"83617f4e-336d-408f-b3e5-8b7eb08ae7a5\") " pod="openshift-marketplace/redhat-marketplace-5wgr9" Sep 30 13:45:16 crc kubenswrapper[4672]: I0930 13:45:16.013386 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/83617f4e-336d-408f-b3e5-8b7eb08ae7a5-utilities\") pod \"redhat-marketplace-5wgr9\" (UID: \"83617f4e-336d-408f-b3e5-8b7eb08ae7a5\") " pod="openshift-marketplace/redhat-marketplace-5wgr9" Sep 30 13:45:16 crc kubenswrapper[4672]: I0930 13:45:16.035011 4672 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-clx9w\" (UniqueName: \"kubernetes.io/projected/83617f4e-336d-408f-b3e5-8b7eb08ae7a5-kube-api-access-clx9w\") pod \"redhat-marketplace-5wgr9\" (UID: \"83617f4e-336d-408f-b3e5-8b7eb08ae7a5\") " pod="openshift-marketplace/redhat-marketplace-5wgr9" Sep 30 13:45:16 crc kubenswrapper[4672]: I0930 13:45:16.107772 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5wgr9" Sep 30 13:45:16 crc kubenswrapper[4672]: I0930 13:45:16.581212 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5wgr9"] Sep 30 13:45:17 crc kubenswrapper[4672]: I0930 13:45:17.493206 4672 generic.go:334] "Generic (PLEG): container finished" podID="83617f4e-336d-408f-b3e5-8b7eb08ae7a5" containerID="dc101a54fcf2973e30de8640523d889db127ef30ade2214655c72337ee434c3b" exitCode=0 Sep 30 13:45:17 crc kubenswrapper[4672]: I0930 13:45:17.493282 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5wgr9" event={"ID":"83617f4e-336d-408f-b3e5-8b7eb08ae7a5","Type":"ContainerDied","Data":"dc101a54fcf2973e30de8640523d889db127ef30ade2214655c72337ee434c3b"} Sep 30 13:45:17 crc kubenswrapper[4672]: I0930 13:45:17.493515 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5wgr9" event={"ID":"83617f4e-336d-408f-b3e5-8b7eb08ae7a5","Type":"ContainerStarted","Data":"6502d116a5f920ffa7501f6f96ca5598ace9bf0b260dc86082cb83c0042584dc"} Sep 30 13:45:18 crc kubenswrapper[4672]: I0930 13:45:18.182501 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-mcv6f"] Sep 30 13:45:18 crc kubenswrapper[4672]: I0930 13:45:18.184822 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-mcv6f" Sep 30 13:45:18 crc kubenswrapper[4672]: I0930 13:45:18.200826 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mcv6f"] Sep 30 13:45:18 crc kubenswrapper[4672]: I0930 13:45:18.267404 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/77a4227f-7343-4356-9bec-9a66e7066edb-catalog-content\") pod \"certified-operators-mcv6f\" (UID: \"77a4227f-7343-4356-9bec-9a66e7066edb\") " pod="openshift-marketplace/certified-operators-mcv6f" Sep 30 13:45:18 crc kubenswrapper[4672]: I0930 13:45:18.267489 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/77a4227f-7343-4356-9bec-9a66e7066edb-utilities\") pod \"certified-operators-mcv6f\" (UID: \"77a4227f-7343-4356-9bec-9a66e7066edb\") " pod="openshift-marketplace/certified-operators-mcv6f" Sep 30 13:45:18 crc kubenswrapper[4672]: I0930 13:45:18.267697 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7x2mg\" (UniqueName: \"kubernetes.io/projected/77a4227f-7343-4356-9bec-9a66e7066edb-kube-api-access-7x2mg\") pod \"certified-operators-mcv6f\" (UID: \"77a4227f-7343-4356-9bec-9a66e7066edb\") " pod="openshift-marketplace/certified-operators-mcv6f" Sep 30 13:45:18 crc kubenswrapper[4672]: I0930 13:45:18.369636 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/77a4227f-7343-4356-9bec-9a66e7066edb-catalog-content\") pod \"certified-operators-mcv6f\" (UID: \"77a4227f-7343-4356-9bec-9a66e7066edb\") " pod="openshift-marketplace/certified-operators-mcv6f" Sep 30 13:45:18 crc kubenswrapper[4672]: I0930 13:45:18.369700 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/77a4227f-7343-4356-9bec-9a66e7066edb-utilities\") pod \"certified-operators-mcv6f\" (UID: \"77a4227f-7343-4356-9bec-9a66e7066edb\") " pod="openshift-marketplace/certified-operators-mcv6f" Sep 30 13:45:18 crc kubenswrapper[4672]: I0930 13:45:18.369793 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7x2mg\" (UniqueName: \"kubernetes.io/projected/77a4227f-7343-4356-9bec-9a66e7066edb-kube-api-access-7x2mg\") pod \"certified-operators-mcv6f\" (UID: \"77a4227f-7343-4356-9bec-9a66e7066edb\") " pod="openshift-marketplace/certified-operators-mcv6f" Sep 30 13:45:18 crc kubenswrapper[4672]: I0930 13:45:18.370407 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/77a4227f-7343-4356-9bec-9a66e7066edb-utilities\") pod \"certified-operators-mcv6f\" (UID: \"77a4227f-7343-4356-9bec-9a66e7066edb\") " pod="openshift-marketplace/certified-operators-mcv6f" Sep 30 13:45:18 crc kubenswrapper[4672]: I0930 13:45:18.370482 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/77a4227f-7343-4356-9bec-9a66e7066edb-catalog-content\") pod \"certified-operators-mcv6f\" (UID: \"77a4227f-7343-4356-9bec-9a66e7066edb\") " pod="openshift-marketplace/certified-operators-mcv6f" Sep 30 13:45:18 crc kubenswrapper[4672]: I0930 13:45:18.392421 4672 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-7x2mg\" (UniqueName: \"kubernetes.io/projected/77a4227f-7343-4356-9bec-9a66e7066edb-kube-api-access-7x2mg\") pod \"certified-operators-mcv6f\" (UID: \"77a4227f-7343-4356-9bec-9a66e7066edb\") " pod="openshift-marketplace/certified-operators-mcv6f" Sep 30 13:45:18 crc kubenswrapper[4672]: I0930 13:45:18.535393 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mcv6f" Sep 30 13:45:19 crc kubenswrapper[4672]: I0930 13:45:19.075502 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mcv6f"] Sep 30 13:45:19 crc kubenswrapper[4672]: I0930 13:45:19.512793 4672 generic.go:334] "Generic (PLEG): container finished" podID="83617f4e-336d-408f-b3e5-8b7eb08ae7a5" containerID="93ca526b641e166ebc8589f1af45584497590960f2c20d64415e014b7e31550d" exitCode=0 Sep 30 13:45:19 crc kubenswrapper[4672]: I0930 13:45:19.513210 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5wgr9" event={"ID":"83617f4e-336d-408f-b3e5-8b7eb08ae7a5","Type":"ContainerDied","Data":"93ca526b641e166ebc8589f1af45584497590960f2c20d64415e014b7e31550d"} Sep 30 13:45:19 crc kubenswrapper[4672]: I0930 13:45:19.514750 4672 generic.go:334] "Generic (PLEG): container finished" podID="77a4227f-7343-4356-9bec-9a66e7066edb" containerID="7df0122613a0166034f4f744c813134978ef11cb24233ed3719fef4db8026367" exitCode=0 Sep 30 13:45:19 crc kubenswrapper[4672]: I0930 13:45:19.514793 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mcv6f" event={"ID":"77a4227f-7343-4356-9bec-9a66e7066edb","Type":"ContainerDied","Data":"7df0122613a0166034f4f744c813134978ef11cb24233ed3719fef4db8026367"} Sep 30 13:45:19 crc kubenswrapper[4672]: I0930 13:45:19.514818 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mcv6f" event={"ID":"77a4227f-7343-4356-9bec-9a66e7066edb","Type":"ContainerStarted","Data":"c825d9384f5f541cf314a82d06baa316c1b42664d832061129752ee9861485ef"} Sep 30 13:45:20 crc kubenswrapper[4672]: I0930 13:45:20.525527 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5wgr9" event={"ID":"83617f4e-336d-408f-b3e5-8b7eb08ae7a5","Type":"ContainerStarted","Data":"26b74d607861235d2a4084425f8aa2a1a0c4afb290df266fc7a21ab458f3d4dd"} Sep 30 13:45:20 crc kubenswrapper[4672]: I0930 13:45:20.529615 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mcv6f" event={"ID":"77a4227f-7343-4356-9bec-9a66e7066edb","Type":"ContainerStarted","Data":"28d7dbc5e45f4254683e0e1a4b29333746790bae078def6fb34e267b79276791"} Sep 30 13:45:20 crc kubenswrapper[4672]: I0930 13:45:20.545912 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-5wgr9" podStartSLOduration=2.788634152 podStartE2EDuration="5.545890371s" podCreationTimestamp="2025-09-30 13:45:15 +0000 UTC" firstStartedPulling="2025-09-30 13:45:17.495047286 +0000 UTC m=+5008.764284932" lastFinishedPulling="2025-09-30 13:45:20.252303505 +0000 UTC m=+5011.521541151" observedRunningTime="2025-09-30 13:45:20.542590877 +0000 UTC m=+5011.811828533" watchObservedRunningTime="2025-09-30 13:45:20.545890371 +0000 UTC m=+5011.815128017" Sep 30 13:45:22 crc kubenswrapper[4672]: I0930 13:45:22.553024 4672 generic.go:334] "Generic (PLEG): container finished" 
podID="77a4227f-7343-4356-9bec-9a66e7066edb" containerID="28d7dbc5e45f4254683e0e1a4b29333746790bae078def6fb34e267b79276791" exitCode=0 Sep 30 13:45:22 crc kubenswrapper[4672]: I0930 13:45:22.553395 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mcv6f" event={"ID":"77a4227f-7343-4356-9bec-9a66e7066edb","Type":"ContainerDied","Data":"28d7dbc5e45f4254683e0e1a4b29333746790bae078def6fb34e267b79276791"} Sep 30 13:45:23 crc kubenswrapper[4672]: I0930 13:45:23.566154 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mcv6f" event={"ID":"77a4227f-7343-4356-9bec-9a66e7066edb","Type":"ContainerStarted","Data":"1b7df34c9b3502a43b92a1bffb33c35782e8732dca5a3dc3322f70093161adcf"} Sep 30 13:45:23 crc kubenswrapper[4672]: I0930 13:45:23.594504 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-mcv6f" podStartSLOduration=1.920304322 podStartE2EDuration="5.594482428s" podCreationTimestamp="2025-09-30 13:45:18 +0000 UTC" firstStartedPulling="2025-09-30 13:45:19.515910835 +0000 UTC m=+5010.785148481" lastFinishedPulling="2025-09-30 13:45:23.190088941 +0000 UTC m=+5014.459326587" observedRunningTime="2025-09-30 13:45:23.593836801 +0000 UTC m=+5014.863074457" watchObservedRunningTime="2025-09-30 13:45:23.594482428 +0000 UTC m=+5014.863720074" Sep 30 13:45:24 crc kubenswrapper[4672]: I0930 13:45:24.739455 4672 patch_prober.go:28] interesting pod/machine-config-daemon-dpqrd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 13:45:24 crc kubenswrapper[4672]: I0930 13:45:24.739842 4672 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 13:45:26 crc kubenswrapper[4672]: I0930 13:45:26.108825 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-5wgr9" Sep 30 13:45:26 crc kubenswrapper[4672]: I0930 13:45:26.108876 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-5wgr9" Sep 30 13:45:26 crc kubenswrapper[4672]: I0930 13:45:26.167676 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-5wgr9" Sep 30 13:45:26 crc kubenswrapper[4672]: I0930 13:45:26.635438 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-5wgr9" Sep 30 13:45:27 crc kubenswrapper[4672]: I0930 13:45:27.177240 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-5wgr9"] Sep 30 13:45:28 crc kubenswrapper[4672]: I0930 13:45:28.535963 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-mcv6f" Sep 30 13:45:28 crc kubenswrapper[4672]: I0930 13:45:28.536359 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-mcv6f" Sep 30 13:45:28 crc kubenswrapper[4672]: I0930 13:45:28.617229 4672 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-5wgr9" podUID="83617f4e-336d-408f-b3e5-8b7eb08ae7a5" containerName="registry-server" containerID="cri-o://26b74d607861235d2a4084425f8aa2a1a0c4afb290df266fc7a21ab458f3d4dd" gracePeriod=2 Sep 30 13:45:29 crc kubenswrapper[4672]: I0930 13:45:29.120385 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5wgr9" Sep 30 13:45:29 crc kubenswrapper[4672]: I0930 13:45:29.218355 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-clx9w\" (UniqueName: \"kubernetes.io/projected/83617f4e-336d-408f-b3e5-8b7eb08ae7a5-kube-api-access-clx9w\") pod \"83617f4e-336d-408f-b3e5-8b7eb08ae7a5\" (UID: \"83617f4e-336d-408f-b3e5-8b7eb08ae7a5\") " Sep 30 13:45:29 crc kubenswrapper[4672]: I0930 13:45:29.218493 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/83617f4e-336d-408f-b3e5-8b7eb08ae7a5-utilities\") pod \"83617f4e-336d-408f-b3e5-8b7eb08ae7a5\" (UID: \"83617f4e-336d-408f-b3e5-8b7eb08ae7a5\") " Sep 30 13:45:29 crc kubenswrapper[4672]: I0930 13:45:29.219316 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/83617f4e-336d-408f-b3e5-8b7eb08ae7a5-utilities" (OuterVolumeSpecName: "utilities") pod "83617f4e-336d-408f-b3e5-8b7eb08ae7a5" (UID: "83617f4e-336d-408f-b3e5-8b7eb08ae7a5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 13:45:29 crc kubenswrapper[4672]: I0930 13:45:29.220058 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/83617f4e-336d-408f-b3e5-8b7eb08ae7a5-catalog-content\") pod \"83617f4e-336d-408f-b3e5-8b7eb08ae7a5\" (UID: \"83617f4e-336d-408f-b3e5-8b7eb08ae7a5\") " Sep 30 13:45:29 crc kubenswrapper[4672]: I0930 13:45:29.220982 4672 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/83617f4e-336d-408f-b3e5-8b7eb08ae7a5-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 13:45:29 crc kubenswrapper[4672]: I0930 13:45:29.231624 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/83617f4e-336d-408f-b3e5-8b7eb08ae7a5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "83617f4e-336d-408f-b3e5-8b7eb08ae7a5" (UID: "83617f4e-336d-408f-b3e5-8b7eb08ae7a5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 13:45:29 crc kubenswrapper[4672]: I0930 13:45:29.233525 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/83617f4e-336d-408f-b3e5-8b7eb08ae7a5-kube-api-access-clx9w" (OuterVolumeSpecName: "kube-api-access-clx9w") pod "83617f4e-336d-408f-b3e5-8b7eb08ae7a5" (UID: "83617f4e-336d-408f-b3e5-8b7eb08ae7a5"). InnerVolumeSpecName "kube-api-access-clx9w". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:45:29 crc kubenswrapper[4672]: I0930 13:45:29.346633 4672 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/83617f4e-336d-408f-b3e5-8b7eb08ae7a5-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 13:45:29 crc kubenswrapper[4672]: I0930 13:45:29.346687 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-clx9w\" (UniqueName: \"kubernetes.io/projected/83617f4e-336d-408f-b3e5-8b7eb08ae7a5-kube-api-access-clx9w\") on node \"crc\" DevicePath \"\"" Sep 30 13:45:29 crc kubenswrapper[4672]: I0930 13:45:29.583443 4672 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-mcv6f" podUID="77a4227f-7343-4356-9bec-9a66e7066edb" containerName="registry-server" probeResult="failure" output=< Sep 30 13:45:29 crc kubenswrapper[4672]: timeout: failed to connect service ":50051" within 1s Sep 30 13:45:29 crc kubenswrapper[4672]: > Sep 30 13:45:29 crc kubenswrapper[4672]: I0930 13:45:29.628327 4672 generic.go:334] "Generic (PLEG): container finished" podID="83617f4e-336d-408f-b3e5-8b7eb08ae7a5" containerID="26b74d607861235d2a4084425f8aa2a1a0c4afb290df266fc7a21ab458f3d4dd" exitCode=0 Sep 30 13:45:29 crc kubenswrapper[4672]: I0930 13:45:29.628407 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5wgr9" Sep 30 13:45:29 crc kubenswrapper[4672]: I0930 13:45:29.628379 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5wgr9" event={"ID":"83617f4e-336d-408f-b3e5-8b7eb08ae7a5","Type":"ContainerDied","Data":"26b74d607861235d2a4084425f8aa2a1a0c4afb290df266fc7a21ab458f3d4dd"} Sep 30 13:45:29 crc kubenswrapper[4672]: I0930 13:45:29.628556 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5wgr9" event={"ID":"83617f4e-336d-408f-b3e5-8b7eb08ae7a5","Type":"ContainerDied","Data":"6502d116a5f920ffa7501f6f96ca5598ace9bf0b260dc86082cb83c0042584dc"} Sep 30 13:45:29 crc kubenswrapper[4672]: I0930 13:45:29.628590 4672 scope.go:117] "RemoveContainer" containerID="26b74d607861235d2a4084425f8aa2a1a0c4afb290df266fc7a21ab458f3d4dd" Sep 30 13:45:29 crc kubenswrapper[4672]: I0930 13:45:29.661521 4672 scope.go:117] "RemoveContainer" containerID="93ca526b641e166ebc8589f1af45584497590960f2c20d64415e014b7e31550d" Sep 30 13:45:29 crc kubenswrapper[4672]: I0930 13:45:29.662186 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-5wgr9"] Sep 30 13:45:29 crc kubenswrapper[4672]: I0930 13:45:29.672769 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-5wgr9"] Sep 30 13:45:29 crc kubenswrapper[4672]: I0930 13:45:29.687781 4672 scope.go:117] "RemoveContainer" containerID="dc101a54fcf2973e30de8640523d889db127ef30ade2214655c72337ee434c3b" Sep 30 13:45:29 crc kubenswrapper[4672]: I0930 13:45:29.734804 4672 scope.go:117] "RemoveContainer" containerID="26b74d607861235d2a4084425f8aa2a1a0c4afb290df266fc7a21ab458f3d4dd" Sep 30 13:45:29 crc kubenswrapper[4672]: E0930 13:45:29.735154 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"26b74d607861235d2a4084425f8aa2a1a0c4afb290df266fc7a21ab458f3d4dd\": container with ID starting with 26b74d607861235d2a4084425f8aa2a1a0c4afb290df266fc7a21ab458f3d4dd not found: ID 
does not exist" containerID="26b74d607861235d2a4084425f8aa2a1a0c4afb290df266fc7a21ab458f3d4dd" Sep 30 13:45:29 crc kubenswrapper[4672]: I0930 13:45:29.735194 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"26b74d607861235d2a4084425f8aa2a1a0c4afb290df266fc7a21ab458f3d4dd"} err="failed to get container status \"26b74d607861235d2a4084425f8aa2a1a0c4afb290df266fc7a21ab458f3d4dd\": rpc error: code = NotFound desc = could not find container \"26b74d607861235d2a4084425f8aa2a1a0c4afb290df266fc7a21ab458f3d4dd\": container with ID starting with 26b74d607861235d2a4084425f8aa2a1a0c4afb290df266fc7a21ab458f3d4dd not found: ID does not exist" Sep 30 13:45:29 crc kubenswrapper[4672]: I0930 13:45:29.735218 4672 scope.go:117] "RemoveContainer" containerID="93ca526b641e166ebc8589f1af45584497590960f2c20d64415e014b7e31550d" Sep 30 13:45:29 crc kubenswrapper[4672]: E0930 13:45:29.735638 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"93ca526b641e166ebc8589f1af45584497590960f2c20d64415e014b7e31550d\": container with ID starting with 93ca526b641e166ebc8589f1af45584497590960f2c20d64415e014b7e31550d not found: ID does not exist" containerID="93ca526b641e166ebc8589f1af45584497590960f2c20d64415e014b7e31550d" Sep 30 13:45:29 crc kubenswrapper[4672]: I0930 13:45:29.735674 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"93ca526b641e166ebc8589f1af45584497590960f2c20d64415e014b7e31550d"} err="failed to get container status \"93ca526b641e166ebc8589f1af45584497590960f2c20d64415e014b7e31550d\": rpc error: code = NotFound desc = could not find container \"93ca526b641e166ebc8589f1af45584497590960f2c20d64415e014b7e31550d\": container with ID starting with 93ca526b641e166ebc8589f1af45584497590960f2c20d64415e014b7e31550d not found: ID does not exist" Sep 30 13:45:29 crc kubenswrapper[4672]: I0930 13:45:29.735693 4672 scope.go:117] "RemoveContainer" containerID="dc101a54fcf2973e30de8640523d889db127ef30ade2214655c72337ee434c3b" Sep 30 13:45:29 crc kubenswrapper[4672]: E0930 13:45:29.735907 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dc101a54fcf2973e30de8640523d889db127ef30ade2214655c72337ee434c3b\": container with ID starting with dc101a54fcf2973e30de8640523d889db127ef30ade2214655c72337ee434c3b not found: ID does not exist" containerID="dc101a54fcf2973e30de8640523d889db127ef30ade2214655c72337ee434c3b" Sep 30 13:45:29 crc kubenswrapper[4672]: I0930 13:45:29.735925 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dc101a54fcf2973e30de8640523d889db127ef30ade2214655c72337ee434c3b"} err="failed to get container status \"dc101a54fcf2973e30de8640523d889db127ef30ade2214655c72337ee434c3b\": rpc error: code = NotFound desc = could not find container \"dc101a54fcf2973e30de8640523d889db127ef30ade2214655c72337ee434c3b\": container with ID starting with dc101a54fcf2973e30de8640523d889db127ef30ade2214655c72337ee434c3b not found: ID does not exist" Sep 30 13:45:31 crc kubenswrapper[4672]: I0930 13:45:31.429357 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="83617f4e-336d-408f-b3e5-8b7eb08ae7a5" path="/var/lib/kubelet/pods/83617f4e-336d-408f-b3e5-8b7eb08ae7a5/volumes" Sep 30 13:45:38 crc kubenswrapper[4672]: I0930 13:45:38.596480 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/certified-operators-mcv6f" Sep 30 13:45:38 crc kubenswrapper[4672]: I0930 13:45:38.649227 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-mcv6f" Sep 30 13:45:38 crc kubenswrapper[4672]: I0930 13:45:38.835463 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-mcv6f"] Sep 30 13:45:39 crc kubenswrapper[4672]: I0930 13:45:39.722225 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-mcv6f" podUID="77a4227f-7343-4356-9bec-9a66e7066edb" containerName="registry-server" containerID="cri-o://1b7df34c9b3502a43b92a1bffb33c35782e8732dca5a3dc3322f70093161adcf" gracePeriod=2 Sep 30 13:45:40 crc kubenswrapper[4672]: I0930 13:45:40.246369 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mcv6f" Sep 30 13:45:40 crc kubenswrapper[4672]: I0930 13:45:40.247937 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/77a4227f-7343-4356-9bec-9a66e7066edb-utilities\") pod \"77a4227f-7343-4356-9bec-9a66e7066edb\" (UID: \"77a4227f-7343-4356-9bec-9a66e7066edb\") " Sep 30 13:45:40 crc kubenswrapper[4672]: I0930 13:45:40.247994 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/77a4227f-7343-4356-9bec-9a66e7066edb-catalog-content\") pod \"77a4227f-7343-4356-9bec-9a66e7066edb\" (UID: \"77a4227f-7343-4356-9bec-9a66e7066edb\") " Sep 30 13:45:40 crc kubenswrapper[4672]: I0930 13:45:40.248146 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7x2mg\" (UniqueName: \"kubernetes.io/projected/77a4227f-7343-4356-9bec-9a66e7066edb-kube-api-access-7x2mg\") pod \"77a4227f-7343-4356-9bec-9a66e7066edb\" (UID: \"77a4227f-7343-4356-9bec-9a66e7066edb\") " Sep 30 13:45:40 crc kubenswrapper[4672]: I0930 13:45:40.249364 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/77a4227f-7343-4356-9bec-9a66e7066edb-utilities" (OuterVolumeSpecName: "utilities") pod "77a4227f-7343-4356-9bec-9a66e7066edb" (UID: "77a4227f-7343-4356-9bec-9a66e7066edb"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 13:45:40 crc kubenswrapper[4672]: I0930 13:45:40.253785 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/77a4227f-7343-4356-9bec-9a66e7066edb-kube-api-access-7x2mg" (OuterVolumeSpecName: "kube-api-access-7x2mg") pod "77a4227f-7343-4356-9bec-9a66e7066edb" (UID: "77a4227f-7343-4356-9bec-9a66e7066edb"). InnerVolumeSpecName "kube-api-access-7x2mg". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:45:40 crc kubenswrapper[4672]: I0930 13:45:40.306452 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/77a4227f-7343-4356-9bec-9a66e7066edb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "77a4227f-7343-4356-9bec-9a66e7066edb" (UID: "77a4227f-7343-4356-9bec-9a66e7066edb"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 13:45:40 crc kubenswrapper[4672]: I0930 13:45:40.350219 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7x2mg\" (UniqueName: \"kubernetes.io/projected/77a4227f-7343-4356-9bec-9a66e7066edb-kube-api-access-7x2mg\") on node \"crc\" DevicePath \"\"" Sep 30 13:45:40 crc kubenswrapper[4672]: I0930 13:45:40.350257 4672 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/77a4227f-7343-4356-9bec-9a66e7066edb-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 13:45:40 crc kubenswrapper[4672]: I0930 13:45:40.350289 4672 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/77a4227f-7343-4356-9bec-9a66e7066edb-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 13:45:40 crc kubenswrapper[4672]: I0930 13:45:40.737052 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mcv6f" Sep 30 13:45:40 crc kubenswrapper[4672]: I0930 13:45:40.737144 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mcv6f" event={"ID":"77a4227f-7343-4356-9bec-9a66e7066edb","Type":"ContainerDied","Data":"1b7df34c9b3502a43b92a1bffb33c35782e8732dca5a3dc3322f70093161adcf"} Sep 30 13:45:40 crc kubenswrapper[4672]: I0930 13:45:40.737218 4672 scope.go:117] "RemoveContainer" containerID="1b7df34c9b3502a43b92a1bffb33c35782e8732dca5a3dc3322f70093161adcf" Sep 30 13:45:40 crc kubenswrapper[4672]: I0930 13:45:40.736930 4672 generic.go:334] "Generic (PLEG): container finished" podID="77a4227f-7343-4356-9bec-9a66e7066edb" containerID="1b7df34c9b3502a43b92a1bffb33c35782e8732dca5a3dc3322f70093161adcf" exitCode=0 Sep 30 13:45:40 crc kubenswrapper[4672]: I0930 13:45:40.737606 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mcv6f" event={"ID":"77a4227f-7343-4356-9bec-9a66e7066edb","Type":"ContainerDied","Data":"c825d9384f5f541cf314a82d06baa316c1b42664d832061129752ee9861485ef"} Sep 30 13:45:40 crc kubenswrapper[4672]: I0930 13:45:40.759533 4672 scope.go:117] "RemoveContainer" containerID="28d7dbc5e45f4254683e0e1a4b29333746790bae078def6fb34e267b79276791" Sep 30 13:45:40 crc kubenswrapper[4672]: I0930 13:45:40.782087 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-mcv6f"] Sep 30 13:45:40 crc kubenswrapper[4672]: I0930 13:45:40.794302 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-mcv6f"] Sep 30 13:45:40 crc kubenswrapper[4672]: I0930 13:45:40.800379 4672 scope.go:117] "RemoveContainer" containerID="7df0122613a0166034f4f744c813134978ef11cb24233ed3719fef4db8026367" Sep 30 13:45:40 crc kubenswrapper[4672]: I0930 13:45:40.843124 4672 scope.go:117] "RemoveContainer" containerID="1b7df34c9b3502a43b92a1bffb33c35782e8732dca5a3dc3322f70093161adcf" Sep 30 13:45:40 crc kubenswrapper[4672]: E0930 13:45:40.844296 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1b7df34c9b3502a43b92a1bffb33c35782e8732dca5a3dc3322f70093161adcf\": container with ID starting with 1b7df34c9b3502a43b92a1bffb33c35782e8732dca5a3dc3322f70093161adcf not found: ID does not exist" containerID="1b7df34c9b3502a43b92a1bffb33c35782e8732dca5a3dc3322f70093161adcf" Sep 30 13:45:40 crc kubenswrapper[4672]: I0930 13:45:40.844351 
4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b7df34c9b3502a43b92a1bffb33c35782e8732dca5a3dc3322f70093161adcf"} err="failed to get container status \"1b7df34c9b3502a43b92a1bffb33c35782e8732dca5a3dc3322f70093161adcf\": rpc error: code = NotFound desc = could not find container \"1b7df34c9b3502a43b92a1bffb33c35782e8732dca5a3dc3322f70093161adcf\": container with ID starting with 1b7df34c9b3502a43b92a1bffb33c35782e8732dca5a3dc3322f70093161adcf not found: ID does not exist" Sep 30 13:45:40 crc kubenswrapper[4672]: I0930 13:45:40.844377 4672 scope.go:117] "RemoveContainer" containerID="28d7dbc5e45f4254683e0e1a4b29333746790bae078def6fb34e267b79276791" Sep 30 13:45:40 crc kubenswrapper[4672]: E0930 13:45:40.844691 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"28d7dbc5e45f4254683e0e1a4b29333746790bae078def6fb34e267b79276791\": container with ID starting with 28d7dbc5e45f4254683e0e1a4b29333746790bae078def6fb34e267b79276791 not found: ID does not exist" containerID="28d7dbc5e45f4254683e0e1a4b29333746790bae078def6fb34e267b79276791" Sep 30 13:45:40 crc kubenswrapper[4672]: I0930 13:45:40.844728 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"28d7dbc5e45f4254683e0e1a4b29333746790bae078def6fb34e267b79276791"} err="failed to get container status \"28d7dbc5e45f4254683e0e1a4b29333746790bae078def6fb34e267b79276791\": rpc error: code = NotFound desc = could not find container \"28d7dbc5e45f4254683e0e1a4b29333746790bae078def6fb34e267b79276791\": container with ID starting with 28d7dbc5e45f4254683e0e1a4b29333746790bae078def6fb34e267b79276791 not found: ID does not exist" Sep 30 13:45:40 crc kubenswrapper[4672]: I0930 13:45:40.844748 4672 scope.go:117] "RemoveContainer" containerID="7df0122613a0166034f4f744c813134978ef11cb24233ed3719fef4db8026367" Sep 30 13:45:40 crc kubenswrapper[4672]: E0930 13:45:40.845328 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7df0122613a0166034f4f744c813134978ef11cb24233ed3719fef4db8026367\": container with ID starting with 7df0122613a0166034f4f744c813134978ef11cb24233ed3719fef4db8026367 not found: ID does not exist" containerID="7df0122613a0166034f4f744c813134978ef11cb24233ed3719fef4db8026367" Sep 30 13:45:40 crc kubenswrapper[4672]: I0930 13:45:40.845382 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7df0122613a0166034f4f744c813134978ef11cb24233ed3719fef4db8026367"} err="failed to get container status \"7df0122613a0166034f4f744c813134978ef11cb24233ed3719fef4db8026367\": rpc error: code = NotFound desc = could not find container \"7df0122613a0166034f4f744c813134978ef11cb24233ed3719fef4db8026367\": container with ID starting with 7df0122613a0166034f4f744c813134978ef11cb24233ed3719fef4db8026367 not found: ID does not exist" Sep 30 13:45:41 crc kubenswrapper[4672]: I0930 13:45:41.433834 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="77a4227f-7343-4356-9bec-9a66e7066edb" path="/var/lib/kubelet/pods/77a4227f-7343-4356-9bec-9a66e7066edb/volumes" Sep 30 13:45:51 crc kubenswrapper[4672]: I0930 13:45:51.868701 4672 generic.go:334] "Generic (PLEG): container finished" podID="42b7a077-06bd-4f39-a1b7-e4692592ae68" containerID="ef890aecfd1d96f3def0725faee9450beb2714b3c6159edf916b6c99964c5054" exitCode=0 Sep 30 13:45:51 crc kubenswrapper[4672]: 
I0930 13:45:51.868800 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"42b7a077-06bd-4f39-a1b7-e4692592ae68","Type":"ContainerDied","Data":"ef890aecfd1d96f3def0725faee9450beb2714b3c6159edf916b6c99964c5054"} Sep 30 13:45:53 crc kubenswrapper[4672]: I0930 13:45:53.260841 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Sep 30 13:45:53 crc kubenswrapper[4672]: I0930 13:45:53.409491 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/42b7a077-06bd-4f39-a1b7-e4692592ae68-openstack-config\") pod \"42b7a077-06bd-4f39-a1b7-e4692592ae68\" (UID: \"42b7a077-06bd-4f39-a1b7-e4692592ae68\") " Sep 30 13:45:53 crc kubenswrapper[4672]: I0930 13:45:53.409763 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6hwks\" (UniqueName: \"kubernetes.io/projected/42b7a077-06bd-4f39-a1b7-e4692592ae68-kube-api-access-6hwks\") pod \"42b7a077-06bd-4f39-a1b7-e4692592ae68\" (UID: \"42b7a077-06bd-4f39-a1b7-e4692592ae68\") " Sep 30 13:45:53 crc kubenswrapper[4672]: I0930 13:45:53.409789 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/42b7a077-06bd-4f39-a1b7-e4692592ae68-ca-certs\") pod \"42b7a077-06bd-4f39-a1b7-e4692592ae68\" (UID: \"42b7a077-06bd-4f39-a1b7-e4692592ae68\") " Sep 30 13:45:53 crc kubenswrapper[4672]: I0930 13:45:53.409807 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/42b7a077-06bd-4f39-a1b7-e4692592ae68-ssh-key\") pod \"42b7a077-06bd-4f39-a1b7-e4692592ae68\" (UID: \"42b7a077-06bd-4f39-a1b7-e4692592ae68\") " Sep 30 13:45:53 crc kubenswrapper[4672]: I0930 13:45:53.409865 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/42b7a077-06bd-4f39-a1b7-e4692592ae68-test-operator-ephemeral-workdir\") pod \"42b7a077-06bd-4f39-a1b7-e4692592ae68\" (UID: \"42b7a077-06bd-4f39-a1b7-e4692592ae68\") " Sep 30 13:45:53 crc kubenswrapper[4672]: I0930 13:45:53.409899 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"42b7a077-06bd-4f39-a1b7-e4692592ae68\" (UID: \"42b7a077-06bd-4f39-a1b7-e4692592ae68\") " Sep 30 13:45:53 crc kubenswrapper[4672]: I0930 13:45:53.410091 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/42b7a077-06bd-4f39-a1b7-e4692592ae68-openstack-config-secret\") pod \"42b7a077-06bd-4f39-a1b7-e4692592ae68\" (UID: \"42b7a077-06bd-4f39-a1b7-e4692592ae68\") " Sep 30 13:45:53 crc kubenswrapper[4672]: I0930 13:45:53.410142 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/42b7a077-06bd-4f39-a1b7-e4692592ae68-test-operator-ephemeral-temporary\") pod \"42b7a077-06bd-4f39-a1b7-e4692592ae68\" (UID: \"42b7a077-06bd-4f39-a1b7-e4692592ae68\") " Sep 30 13:45:53 crc kubenswrapper[4672]: I0930 13:45:53.410164 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/42b7a077-06bd-4f39-a1b7-e4692592ae68-config-data\") pod \"42b7a077-06bd-4f39-a1b7-e4692592ae68\" (UID: \"42b7a077-06bd-4f39-a1b7-e4692592ae68\") " Sep 30 13:45:53 crc kubenswrapper[4672]: I0930 13:45:53.411518 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/42b7a077-06bd-4f39-a1b7-e4692592ae68-config-data" (OuterVolumeSpecName: "config-data") pod "42b7a077-06bd-4f39-a1b7-e4692592ae68" (UID: "42b7a077-06bd-4f39-a1b7-e4692592ae68"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:45:53 crc kubenswrapper[4672]: I0930 13:45:53.411610 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/42b7a077-06bd-4f39-a1b7-e4692592ae68-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "42b7a077-06bd-4f39-a1b7-e4692592ae68" (UID: "42b7a077-06bd-4f39-a1b7-e4692592ae68"). InnerVolumeSpecName "test-operator-ephemeral-temporary". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 13:45:53 crc kubenswrapper[4672]: I0930 13:45:53.415530 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "test-operator-logs") pod "42b7a077-06bd-4f39-a1b7-e4692592ae68" (UID: "42b7a077-06bd-4f39-a1b7-e4692592ae68"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Sep 30 13:45:53 crc kubenswrapper[4672]: I0930 13:45:53.416020 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/42b7a077-06bd-4f39-a1b7-e4692592ae68-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "42b7a077-06bd-4f39-a1b7-e4692592ae68" (UID: "42b7a077-06bd-4f39-a1b7-e4692592ae68"). InnerVolumeSpecName "test-operator-ephemeral-workdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 13:45:53 crc kubenswrapper[4672]: I0930 13:45:53.416078 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42b7a077-06bd-4f39-a1b7-e4692592ae68-kube-api-access-6hwks" (OuterVolumeSpecName: "kube-api-access-6hwks") pod "42b7a077-06bd-4f39-a1b7-e4692592ae68" (UID: "42b7a077-06bd-4f39-a1b7-e4692592ae68"). InnerVolumeSpecName "kube-api-access-6hwks". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:45:53 crc kubenswrapper[4672]: I0930 13:45:53.441874 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42b7a077-06bd-4f39-a1b7-e4692592ae68-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "42b7a077-06bd-4f39-a1b7-e4692592ae68" (UID: "42b7a077-06bd-4f39-a1b7-e4692592ae68"). InnerVolumeSpecName "ca-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:45:53 crc kubenswrapper[4672]: I0930 13:45:53.453838 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42b7a077-06bd-4f39-a1b7-e4692592ae68-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "42b7a077-06bd-4f39-a1b7-e4692592ae68" (UID: "42b7a077-06bd-4f39-a1b7-e4692592ae68"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:45:53 crc kubenswrapper[4672]: I0930 13:45:53.470595 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/42b7a077-06bd-4f39-a1b7-e4692592ae68-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "42b7a077-06bd-4f39-a1b7-e4692592ae68" (UID: "42b7a077-06bd-4f39-a1b7-e4692592ae68"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:45:53 crc kubenswrapper[4672]: I0930 13:45:53.483645 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42b7a077-06bd-4f39-a1b7-e4692592ae68-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "42b7a077-06bd-4f39-a1b7-e4692592ae68" (UID: "42b7a077-06bd-4f39-a1b7-e4692592ae68"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:45:53 crc kubenswrapper[4672]: I0930 13:45:53.512389 4672 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/42b7a077-06bd-4f39-a1b7-e4692592ae68-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Sep 30 13:45:53 crc kubenswrapper[4672]: I0930 13:45:53.512433 4672 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/42b7a077-06bd-4f39-a1b7-e4692592ae68-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Sep 30 13:45:53 crc kubenswrapper[4672]: I0930 13:45:53.512447 4672 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/42b7a077-06bd-4f39-a1b7-e4692592ae68-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 13:45:53 crc kubenswrapper[4672]: I0930 13:45:53.512458 4672 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/42b7a077-06bd-4f39-a1b7-e4692592ae68-openstack-config\") on node \"crc\" DevicePath \"\"" Sep 30 13:45:53 crc kubenswrapper[4672]: I0930 13:45:53.512469 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6hwks\" (UniqueName: \"kubernetes.io/projected/42b7a077-06bd-4f39-a1b7-e4692592ae68-kube-api-access-6hwks\") on node \"crc\" DevicePath \"\"" Sep 30 13:45:53 crc kubenswrapper[4672]: I0930 13:45:53.512480 4672 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/42b7a077-06bd-4f39-a1b7-e4692592ae68-ca-certs\") on node \"crc\" DevicePath \"\"" Sep 30 13:45:53 crc kubenswrapper[4672]: I0930 13:45:53.512490 4672 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/42b7a077-06bd-4f39-a1b7-e4692592ae68-ssh-key\") on node \"crc\" DevicePath \"\"" Sep 30 13:45:53 crc kubenswrapper[4672]: I0930 13:45:53.512501 4672 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/42b7a077-06bd-4f39-a1b7-e4692592ae68-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Sep 30 13:45:53 crc kubenswrapper[4672]: I0930 13:45:53.512558 4672 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Sep 30 13:45:53 crc kubenswrapper[4672]: I0930 13:45:53.533661 4672 operation_generator.go:917] UnmountDevice succeeded for 
volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Sep 30 13:45:53 crc kubenswrapper[4672]: I0930 13:45:53.620337 4672 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Sep 30 13:45:53 crc kubenswrapper[4672]: I0930 13:45:53.889627 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"42b7a077-06bd-4f39-a1b7-e4692592ae68","Type":"ContainerDied","Data":"ff3f30aa39fdbee3a4f0f985bd39ed8017e9e0aaadbdb57d586dbd2b1b99f9ae"} Sep 30 13:45:53 crc kubenswrapper[4672]: I0930 13:45:53.889665 4672 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ff3f30aa39fdbee3a4f0f985bd39ed8017e9e0aaadbdb57d586dbd2b1b99f9ae" Sep 30 13:45:53 crc kubenswrapper[4672]: I0930 13:45:53.889709 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Sep 30 13:45:54 crc kubenswrapper[4672]: I0930 13:45:54.740369 4672 patch_prober.go:28] interesting pod/machine-config-daemon-dpqrd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 13:45:54 crc kubenswrapper[4672]: I0930 13:45:54.740809 4672 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 13:45:54 crc kubenswrapper[4672]: I0930 13:45:54.740874 4672 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" Sep 30 13:45:54 crc kubenswrapper[4672]: I0930 13:45:54.742150 4672 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"bad48423dd84ef618fb50b14346f14c1ae1019eff77694ab2ec75f0f9f1e7b3f"} pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 30 13:45:54 crc kubenswrapper[4672]: I0930 13:45:54.742326 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" containerName="machine-config-daemon" containerID="cri-o://bad48423dd84ef618fb50b14346f14c1ae1019eff77694ab2ec75f0f9f1e7b3f" gracePeriod=600 Sep 30 13:45:54 crc kubenswrapper[4672]: I0930 13:45:54.909714 4672 generic.go:334] "Generic (PLEG): container finished" podID="95794952-d817-48f2-8956-f7a310f8d1d9" containerID="bad48423dd84ef618fb50b14346f14c1ae1019eff77694ab2ec75f0f9f1e7b3f" exitCode=0 Sep 30 13:45:54 crc kubenswrapper[4672]: I0930 13:45:54.909760 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" event={"ID":"95794952-d817-48f2-8956-f7a310f8d1d9","Type":"ContainerDied","Data":"bad48423dd84ef618fb50b14346f14c1ae1019eff77694ab2ec75f0f9f1e7b3f"} Sep 30 13:45:54 crc kubenswrapper[4672]: I0930 13:45:54.909796 4672 scope.go:117] "RemoveContainer" 
containerID="823b2a4e038cbf7ef9e9b3b9d0fd9de4f8d57f966df9a5a9a851e18516deb7c5" Sep 30 13:45:55 crc kubenswrapper[4672]: I0930 13:45:55.925026 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" event={"ID":"95794952-d817-48f2-8956-f7a310f8d1d9","Type":"ContainerStarted","Data":"a0f4e52abdc31f3136a3c09d788135a4e897756a0e739d1de81796de70d2aa59"} Sep 30 13:46:03 crc kubenswrapper[4672]: I0930 13:46:03.053376 4672 scope.go:117] "RemoveContainer" containerID="d80b3db4a416dd0f18afe0d63e2ccfb72e2af28fbf229c1d062e7e22d2091935" Sep 30 13:46:06 crc kubenswrapper[4672]: I0930 13:46:06.453598 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Sep 30 13:46:06 crc kubenswrapper[4672]: E0930 13:46:06.455354 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77a4227f-7343-4356-9bec-9a66e7066edb" containerName="extract-content" Sep 30 13:46:06 crc kubenswrapper[4672]: I0930 13:46:06.455382 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="77a4227f-7343-4356-9bec-9a66e7066edb" containerName="extract-content" Sep 30 13:46:06 crc kubenswrapper[4672]: E0930 13:46:06.455416 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42b7a077-06bd-4f39-a1b7-e4692592ae68" containerName="tempest-tests-tempest-tests-runner" Sep 30 13:46:06 crc kubenswrapper[4672]: I0930 13:46:06.455429 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="42b7a077-06bd-4f39-a1b7-e4692592ae68" containerName="tempest-tests-tempest-tests-runner" Sep 30 13:46:06 crc kubenswrapper[4672]: E0930 13:46:06.455458 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83617f4e-336d-408f-b3e5-8b7eb08ae7a5" containerName="registry-server" Sep 30 13:46:06 crc kubenswrapper[4672]: I0930 13:46:06.455472 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="83617f4e-336d-408f-b3e5-8b7eb08ae7a5" containerName="registry-server" Sep 30 13:46:06 crc kubenswrapper[4672]: E0930 13:46:06.455514 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83617f4e-336d-408f-b3e5-8b7eb08ae7a5" containerName="extract-utilities" Sep 30 13:46:06 crc kubenswrapper[4672]: I0930 13:46:06.455526 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="83617f4e-336d-408f-b3e5-8b7eb08ae7a5" containerName="extract-utilities" Sep 30 13:46:06 crc kubenswrapper[4672]: E0930 13:46:06.455561 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77a4227f-7343-4356-9bec-9a66e7066edb" containerName="extract-utilities" Sep 30 13:46:06 crc kubenswrapper[4672]: I0930 13:46:06.455573 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="77a4227f-7343-4356-9bec-9a66e7066edb" containerName="extract-utilities" Sep 30 13:46:06 crc kubenswrapper[4672]: E0930 13:46:06.455617 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83617f4e-336d-408f-b3e5-8b7eb08ae7a5" containerName="extract-content" Sep 30 13:46:06 crc kubenswrapper[4672]: I0930 13:46:06.455630 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="83617f4e-336d-408f-b3e5-8b7eb08ae7a5" containerName="extract-content" Sep 30 13:46:06 crc kubenswrapper[4672]: E0930 13:46:06.455655 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77a4227f-7343-4356-9bec-9a66e7066edb" containerName="registry-server" Sep 30 13:46:06 crc kubenswrapper[4672]: I0930 13:46:06.455668 4672 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="77a4227f-7343-4356-9bec-9a66e7066edb" containerName="registry-server" Sep 30 13:46:06 crc kubenswrapper[4672]: I0930 13:46:06.456017 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="77a4227f-7343-4356-9bec-9a66e7066edb" containerName="registry-server" Sep 30 13:46:06 crc kubenswrapper[4672]: I0930 13:46:06.456074 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="83617f4e-336d-408f-b3e5-8b7eb08ae7a5" containerName="registry-server" Sep 30 13:46:06 crc kubenswrapper[4672]: I0930 13:46:06.456104 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="42b7a077-06bd-4f39-a1b7-e4692592ae68" containerName="tempest-tests-tempest-tests-runner" Sep 30 13:46:06 crc kubenswrapper[4672]: I0930 13:46:06.457422 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Sep 30 13:46:06 crc kubenswrapper[4672]: I0930 13:46:06.467541 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-l6gc7" Sep 30 13:46:06 crc kubenswrapper[4672]: I0930 13:46:06.468300 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Sep 30 13:46:06 crc kubenswrapper[4672]: I0930 13:46:06.590603 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9n9b2\" (UniqueName: \"kubernetes.io/projected/44cf0b0d-200e-475c-b8df-965a362a13b9-kube-api-access-9n9b2\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"44cf0b0d-200e-475c-b8df-965a362a13b9\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Sep 30 13:46:06 crc kubenswrapper[4672]: I0930 13:46:06.590732 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"44cf0b0d-200e-475c-b8df-965a362a13b9\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Sep 30 13:46:06 crc kubenswrapper[4672]: I0930 13:46:06.707311 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9n9b2\" (UniqueName: \"kubernetes.io/projected/44cf0b0d-200e-475c-b8df-965a362a13b9-kube-api-access-9n9b2\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"44cf0b0d-200e-475c-b8df-965a362a13b9\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Sep 30 13:46:06 crc kubenswrapper[4672]: I0930 13:46:06.707595 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"44cf0b0d-200e-475c-b8df-965a362a13b9\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Sep 30 13:46:06 crc kubenswrapper[4672]: I0930 13:46:06.709068 4672 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"44cf0b0d-200e-475c-b8df-965a362a13b9\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Sep 30 13:46:06 crc kubenswrapper[4672]: I0930 13:46:06.732649 
4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9n9b2\" (UniqueName: \"kubernetes.io/projected/44cf0b0d-200e-475c-b8df-965a362a13b9-kube-api-access-9n9b2\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"44cf0b0d-200e-475c-b8df-965a362a13b9\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Sep 30 13:46:06 crc kubenswrapper[4672]: I0930 13:46:06.741350 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"44cf0b0d-200e-475c-b8df-965a362a13b9\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Sep 30 13:46:06 crc kubenswrapper[4672]: I0930 13:46:06.793051 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Sep 30 13:46:07 crc kubenswrapper[4672]: I0930 13:46:07.346433 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Sep 30 13:46:07 crc kubenswrapper[4672]: W0930 13:46:07.643115 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod44cf0b0d_200e_475c_b8df_965a362a13b9.slice/crio-681bd2ca55d26e3a3994af123a3ea0243e478e04df29b78f38306c9517aebad5 WatchSource:0}: Error finding container 681bd2ca55d26e3a3994af123a3ea0243e478e04df29b78f38306c9517aebad5: Status 404 returned error can't find the container with id 681bd2ca55d26e3a3994af123a3ea0243e478e04df29b78f38306c9517aebad5 Sep 30 13:46:08 crc kubenswrapper[4672]: I0930 13:46:08.053473 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"44cf0b0d-200e-475c-b8df-965a362a13b9","Type":"ContainerStarted","Data":"681bd2ca55d26e3a3994af123a3ea0243e478e04df29b78f38306c9517aebad5"} Sep 30 13:46:09 crc kubenswrapper[4672]: I0930 13:46:09.066318 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"44cf0b0d-200e-475c-b8df-965a362a13b9","Type":"ContainerStarted","Data":"6cbcf545e911320ddd3e25d8564ef2b5857450e2c18b515f6d57b434b207fb51"} Sep 30 13:46:09 crc kubenswrapper[4672]: I0930 13:46:09.086776 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=1.92830989 podStartE2EDuration="3.086756688s" podCreationTimestamp="2025-09-30 13:46:06 +0000 UTC" firstStartedPulling="2025-09-30 13:46:07.646977616 +0000 UTC m=+5058.916215272" lastFinishedPulling="2025-09-30 13:46:08.805424414 +0000 UTC m=+5060.074662070" observedRunningTime="2025-09-30 13:46:09.079468552 +0000 UTC m=+5060.348706198" watchObservedRunningTime="2025-09-30 13:46:09.086756688 +0000 UTC m=+5060.355994334" Sep 30 13:46:27 crc kubenswrapper[4672]: I0930 13:46:27.648110 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-jjqwv/must-gather-g8nrf"] Sep 30 13:46:27 crc kubenswrapper[4672]: I0930 13:46:27.650310 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-jjqwv/must-gather-g8nrf" Sep 30 13:46:27 crc kubenswrapper[4672]: I0930 13:46:27.652615 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-jjqwv"/"default-dockercfg-w7zh6" Sep 30 13:46:27 crc kubenswrapper[4672]: I0930 13:46:27.652807 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-jjqwv"/"kube-root-ca.crt" Sep 30 13:46:27 crc kubenswrapper[4672]: I0930 13:46:27.652927 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-jjqwv"/"openshift-service-ca.crt" Sep 30 13:46:27 crc kubenswrapper[4672]: I0930 13:46:27.659080 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-jjqwv/must-gather-g8nrf"] Sep 30 13:46:27 crc kubenswrapper[4672]: I0930 13:46:27.799344 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/fac7df9f-5203-4273-85cb-fdfdbcb34766-must-gather-output\") pod \"must-gather-g8nrf\" (UID: \"fac7df9f-5203-4273-85cb-fdfdbcb34766\") " pod="openshift-must-gather-jjqwv/must-gather-g8nrf" Sep 30 13:46:27 crc kubenswrapper[4672]: I0930 13:46:27.799476 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cv5c4\" (UniqueName: \"kubernetes.io/projected/fac7df9f-5203-4273-85cb-fdfdbcb34766-kube-api-access-cv5c4\") pod \"must-gather-g8nrf\" (UID: \"fac7df9f-5203-4273-85cb-fdfdbcb34766\") " pod="openshift-must-gather-jjqwv/must-gather-g8nrf" Sep 30 13:46:27 crc kubenswrapper[4672]: I0930 13:46:27.902117 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/fac7df9f-5203-4273-85cb-fdfdbcb34766-must-gather-output\") pod \"must-gather-g8nrf\" (UID: \"fac7df9f-5203-4273-85cb-fdfdbcb34766\") " pod="openshift-must-gather-jjqwv/must-gather-g8nrf" Sep 30 13:46:27 crc kubenswrapper[4672]: I0930 13:46:27.902242 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cv5c4\" (UniqueName: \"kubernetes.io/projected/fac7df9f-5203-4273-85cb-fdfdbcb34766-kube-api-access-cv5c4\") pod \"must-gather-g8nrf\" (UID: \"fac7df9f-5203-4273-85cb-fdfdbcb34766\") " pod="openshift-must-gather-jjqwv/must-gather-g8nrf" Sep 30 13:46:27 crc kubenswrapper[4672]: I0930 13:46:27.902671 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/fac7df9f-5203-4273-85cb-fdfdbcb34766-must-gather-output\") pod \"must-gather-g8nrf\" (UID: \"fac7df9f-5203-4273-85cb-fdfdbcb34766\") " pod="openshift-must-gather-jjqwv/must-gather-g8nrf" Sep 30 13:46:27 crc kubenswrapper[4672]: I0930 13:46:27.922517 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cv5c4\" (UniqueName: \"kubernetes.io/projected/fac7df9f-5203-4273-85cb-fdfdbcb34766-kube-api-access-cv5c4\") pod \"must-gather-g8nrf\" (UID: \"fac7df9f-5203-4273-85cb-fdfdbcb34766\") " pod="openshift-must-gather-jjqwv/must-gather-g8nrf" Sep 30 13:46:27 crc kubenswrapper[4672]: I0930 13:46:27.970455 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-jjqwv/must-gather-g8nrf" Sep 30 13:46:28 crc kubenswrapper[4672]: I0930 13:46:28.528696 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-jjqwv/must-gather-g8nrf"] Sep 30 13:46:29 crc kubenswrapper[4672]: I0930 13:46:29.307238 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jjqwv/must-gather-g8nrf" event={"ID":"fac7df9f-5203-4273-85cb-fdfdbcb34766","Type":"ContainerStarted","Data":"16bebaafc1607c86e3d1ec5d2b3013b7fe0125ab78a7611f95d79684f0ecf6b0"} Sep 30 13:46:35 crc kubenswrapper[4672]: I0930 13:46:35.367571 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jjqwv/must-gather-g8nrf" event={"ID":"fac7df9f-5203-4273-85cb-fdfdbcb34766","Type":"ContainerStarted","Data":"d6896d120aa9f441fb789562392aed1b1c2ada215ee84bd1ccc9fb97781697f4"} Sep 30 13:46:35 crc kubenswrapper[4672]: I0930 13:46:35.368072 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jjqwv/must-gather-g8nrf" event={"ID":"fac7df9f-5203-4273-85cb-fdfdbcb34766","Type":"ContainerStarted","Data":"fb817ff69bc93f7aee4d4cfe9c5f365f57016825dd1dca2b8b59d710518e5992"} Sep 30 13:46:35 crc kubenswrapper[4672]: I0930 13:46:35.389907 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-jjqwv/must-gather-g8nrf" podStartSLOduration=2.480904945 podStartE2EDuration="8.389885608s" podCreationTimestamp="2025-09-30 13:46:27 +0000 UTC" firstStartedPulling="2025-09-30 13:46:28.534030645 +0000 UTC m=+5079.803268301" lastFinishedPulling="2025-09-30 13:46:34.443011318 +0000 UTC m=+5085.712248964" observedRunningTime="2025-09-30 13:46:35.379640717 +0000 UTC m=+5086.648878373" watchObservedRunningTime="2025-09-30 13:46:35.389885608 +0000 UTC m=+5086.659123294" Sep 30 13:46:38 crc kubenswrapper[4672]: I0930 13:46:38.758226 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-jjqwv/crc-debug-b4kkx"] Sep 30 13:46:38 crc kubenswrapper[4672]: I0930 13:46:38.760609 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-jjqwv/crc-debug-b4kkx" Sep 30 13:46:38 crc kubenswrapper[4672]: I0930 13:46:38.866354 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7979b082-f26c-4f54-b383-65d15b641e97-host\") pod \"crc-debug-b4kkx\" (UID: \"7979b082-f26c-4f54-b383-65d15b641e97\") " pod="openshift-must-gather-jjqwv/crc-debug-b4kkx" Sep 30 13:46:38 crc kubenswrapper[4672]: I0930 13:46:38.866487 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dwxj4\" (UniqueName: \"kubernetes.io/projected/7979b082-f26c-4f54-b383-65d15b641e97-kube-api-access-dwxj4\") pod \"crc-debug-b4kkx\" (UID: \"7979b082-f26c-4f54-b383-65d15b641e97\") " pod="openshift-must-gather-jjqwv/crc-debug-b4kkx" Sep 30 13:46:38 crc kubenswrapper[4672]: I0930 13:46:38.967972 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7979b082-f26c-4f54-b383-65d15b641e97-host\") pod \"crc-debug-b4kkx\" (UID: \"7979b082-f26c-4f54-b383-65d15b641e97\") " pod="openshift-must-gather-jjqwv/crc-debug-b4kkx" Sep 30 13:46:38 crc kubenswrapper[4672]: I0930 13:46:38.968079 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dwxj4\" (UniqueName: \"kubernetes.io/projected/7979b082-f26c-4f54-b383-65d15b641e97-kube-api-access-dwxj4\") pod \"crc-debug-b4kkx\" (UID: \"7979b082-f26c-4f54-b383-65d15b641e97\") " pod="openshift-must-gather-jjqwv/crc-debug-b4kkx" Sep 30 13:46:38 crc kubenswrapper[4672]: I0930 13:46:38.968128 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7979b082-f26c-4f54-b383-65d15b641e97-host\") pod \"crc-debug-b4kkx\" (UID: \"7979b082-f26c-4f54-b383-65d15b641e97\") " pod="openshift-must-gather-jjqwv/crc-debug-b4kkx" Sep 30 13:46:38 crc kubenswrapper[4672]: I0930 13:46:38.987024 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dwxj4\" (UniqueName: \"kubernetes.io/projected/7979b082-f26c-4f54-b383-65d15b641e97-kube-api-access-dwxj4\") pod \"crc-debug-b4kkx\" (UID: \"7979b082-f26c-4f54-b383-65d15b641e97\") " pod="openshift-must-gather-jjqwv/crc-debug-b4kkx" Sep 30 13:46:39 crc kubenswrapper[4672]: I0930 13:46:39.098244 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-jjqwv/crc-debug-b4kkx" Sep 30 13:46:39 crc kubenswrapper[4672]: W0930 13:46:39.137072 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7979b082_f26c_4f54_b383_65d15b641e97.slice/crio-ae99f9c6941eaade78e1c99a1e666086ab50470e623187185229f47f8d22b136 WatchSource:0}: Error finding container ae99f9c6941eaade78e1c99a1e666086ab50470e623187185229f47f8d22b136: Status 404 returned error can't find the container with id ae99f9c6941eaade78e1c99a1e666086ab50470e623187185229f47f8d22b136 Sep 30 13:46:39 crc kubenswrapper[4672]: I0930 13:46:39.413342 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jjqwv/crc-debug-b4kkx" event={"ID":"7979b082-f26c-4f54-b383-65d15b641e97","Type":"ContainerStarted","Data":"ae99f9c6941eaade78e1c99a1e666086ab50470e623187185229f47f8d22b136"} Sep 30 13:46:53 crc kubenswrapper[4672]: I0930 13:46:53.555293 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jjqwv/crc-debug-b4kkx" event={"ID":"7979b082-f26c-4f54-b383-65d15b641e97","Type":"ContainerStarted","Data":"ef95dbab5412697860486ecdaa07f1dcdda1d8fe950ad991f020dbeea0535716"} Sep 30 13:46:53 crc kubenswrapper[4672]: I0930 13:46:53.570390 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-jjqwv/crc-debug-b4kkx" podStartSLOduration=2.06181044 podStartE2EDuration="15.570358122s" podCreationTimestamp="2025-09-30 13:46:38 +0000 UTC" firstStartedPulling="2025-09-30 13:46:39.140836479 +0000 UTC m=+5090.410074135" lastFinishedPulling="2025-09-30 13:46:52.649384171 +0000 UTC m=+5103.918621817" observedRunningTime="2025-09-30 13:46:53.569174652 +0000 UTC m=+5104.838412298" watchObservedRunningTime="2025-09-30 13:46:53.570358122 +0000 UTC m=+5104.839595768" Sep 30 13:47:52 crc kubenswrapper[4672]: I0930 13:47:52.367324 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-2w7vc"] Sep 30 13:47:52 crc kubenswrapper[4672]: I0930 13:47:52.371241 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-2w7vc" Sep 30 13:47:52 crc kubenswrapper[4672]: I0930 13:47:52.383976 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-2w7vc"] Sep 30 13:47:52 crc kubenswrapper[4672]: I0930 13:47:52.473006 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5rmtj\" (UniqueName: \"kubernetes.io/projected/4718b940-b99a-42f9-ae3e-4e8c5042f7c8-kube-api-access-5rmtj\") pod \"community-operators-2w7vc\" (UID: \"4718b940-b99a-42f9-ae3e-4e8c5042f7c8\") " pod="openshift-marketplace/community-operators-2w7vc" Sep 30 13:47:52 crc kubenswrapper[4672]: I0930 13:47:52.473392 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4718b940-b99a-42f9-ae3e-4e8c5042f7c8-catalog-content\") pod \"community-operators-2w7vc\" (UID: \"4718b940-b99a-42f9-ae3e-4e8c5042f7c8\") " pod="openshift-marketplace/community-operators-2w7vc" Sep 30 13:47:52 crc kubenswrapper[4672]: I0930 13:47:52.473444 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4718b940-b99a-42f9-ae3e-4e8c5042f7c8-utilities\") pod \"community-operators-2w7vc\" (UID: \"4718b940-b99a-42f9-ae3e-4e8c5042f7c8\") " pod="openshift-marketplace/community-operators-2w7vc" Sep 30 13:47:52 crc kubenswrapper[4672]: I0930 13:47:52.575592 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4718b940-b99a-42f9-ae3e-4e8c5042f7c8-catalog-content\") pod \"community-operators-2w7vc\" (UID: \"4718b940-b99a-42f9-ae3e-4e8c5042f7c8\") " pod="openshift-marketplace/community-operators-2w7vc" Sep 30 13:47:52 crc kubenswrapper[4672]: I0930 13:47:52.575641 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4718b940-b99a-42f9-ae3e-4e8c5042f7c8-utilities\") pod \"community-operators-2w7vc\" (UID: \"4718b940-b99a-42f9-ae3e-4e8c5042f7c8\") " pod="openshift-marketplace/community-operators-2w7vc" Sep 30 13:47:52 crc kubenswrapper[4672]: I0930 13:47:52.575740 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5rmtj\" (UniqueName: \"kubernetes.io/projected/4718b940-b99a-42f9-ae3e-4e8c5042f7c8-kube-api-access-5rmtj\") pod \"community-operators-2w7vc\" (UID: \"4718b940-b99a-42f9-ae3e-4e8c5042f7c8\") " pod="openshift-marketplace/community-operators-2w7vc" Sep 30 13:47:52 crc kubenswrapper[4672]: I0930 13:47:52.576466 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4718b940-b99a-42f9-ae3e-4e8c5042f7c8-catalog-content\") pod \"community-operators-2w7vc\" (UID: \"4718b940-b99a-42f9-ae3e-4e8c5042f7c8\") " pod="openshift-marketplace/community-operators-2w7vc" Sep 30 13:47:52 crc kubenswrapper[4672]: I0930 13:47:52.576671 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4718b940-b99a-42f9-ae3e-4e8c5042f7c8-utilities\") pod \"community-operators-2w7vc\" (UID: \"4718b940-b99a-42f9-ae3e-4e8c5042f7c8\") " pod="openshift-marketplace/community-operators-2w7vc" Sep 30 13:47:52 crc kubenswrapper[4672]: I0930 13:47:52.596976 4672 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-5rmtj\" (UniqueName: \"kubernetes.io/projected/4718b940-b99a-42f9-ae3e-4e8c5042f7c8-kube-api-access-5rmtj\") pod \"community-operators-2w7vc\" (UID: \"4718b940-b99a-42f9-ae3e-4e8c5042f7c8\") " pod="openshift-marketplace/community-operators-2w7vc" Sep 30 13:47:52 crc kubenswrapper[4672]: I0930 13:47:52.700401 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2w7vc" Sep 30 13:47:53 crc kubenswrapper[4672]: I0930 13:47:53.315839 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-2w7vc"] Sep 30 13:47:54 crc kubenswrapper[4672]: I0930 13:47:54.170326 4672 generic.go:334] "Generic (PLEG): container finished" podID="4718b940-b99a-42f9-ae3e-4e8c5042f7c8" containerID="a008c434b0db844765da6dd5f708c8e4318f9bc0b7fc714f7722568d308c9bb2" exitCode=0 Sep 30 13:47:54 crc kubenswrapper[4672]: I0930 13:47:54.170495 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2w7vc" event={"ID":"4718b940-b99a-42f9-ae3e-4e8c5042f7c8","Type":"ContainerDied","Data":"a008c434b0db844765da6dd5f708c8e4318f9bc0b7fc714f7722568d308c9bb2"} Sep 30 13:47:54 crc kubenswrapper[4672]: I0930 13:47:54.170950 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2w7vc" event={"ID":"4718b940-b99a-42f9-ae3e-4e8c5042f7c8","Type":"ContainerStarted","Data":"c4a1dba9a1d1dd06f109a272fd909eac2072092f719c5baf17d26fb53f06e5e8"} Sep 30 13:47:54 crc kubenswrapper[4672]: I0930 13:47:54.173837 4672 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Sep 30 13:47:55 crc kubenswrapper[4672]: I0930 13:47:55.184573 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2w7vc" event={"ID":"4718b940-b99a-42f9-ae3e-4e8c5042f7c8","Type":"ContainerStarted","Data":"0caf9cecc40a5ee69004c59171b755aa3fcc6e2e0f46eaee102692f1a1530d49"} Sep 30 13:47:56 crc kubenswrapper[4672]: I0930 13:47:56.198239 4672 generic.go:334] "Generic (PLEG): container finished" podID="4718b940-b99a-42f9-ae3e-4e8c5042f7c8" containerID="0caf9cecc40a5ee69004c59171b755aa3fcc6e2e0f46eaee102692f1a1530d49" exitCode=0 Sep 30 13:47:56 crc kubenswrapper[4672]: I0930 13:47:56.198455 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2w7vc" event={"ID":"4718b940-b99a-42f9-ae3e-4e8c5042f7c8","Type":"ContainerDied","Data":"0caf9cecc40a5ee69004c59171b755aa3fcc6e2e0f46eaee102692f1a1530d49"} Sep 30 13:47:57 crc kubenswrapper[4672]: I0930 13:47:57.210537 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2w7vc" event={"ID":"4718b940-b99a-42f9-ae3e-4e8c5042f7c8","Type":"ContainerStarted","Data":"09bea694f78a65ce13d2966818bbd175ecadf5228d3a5cbcb70bd938669851e2"} Sep 30 13:47:57 crc kubenswrapper[4672]: I0930 13:47:57.232162 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-2w7vc" podStartSLOduration=2.701518377 podStartE2EDuration="5.232141596s" podCreationTimestamp="2025-09-30 13:47:52 +0000 UTC" firstStartedPulling="2025-09-30 13:47:54.172735802 +0000 UTC m=+5165.441973448" lastFinishedPulling="2025-09-30 13:47:56.703359011 +0000 UTC m=+5167.972596667" observedRunningTime="2025-09-30 13:47:57.231330645 +0000 UTC m=+5168.500568301" watchObservedRunningTime="2025-09-30 
13:47:57.232141596 +0000 UTC m=+5168.501379242" Sep 30 13:48:02 crc kubenswrapper[4672]: I0930 13:48:02.700820 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-2w7vc" Sep 30 13:48:02 crc kubenswrapper[4672]: I0930 13:48:02.701445 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-2w7vc" Sep 30 13:48:02 crc kubenswrapper[4672]: I0930 13:48:02.758729 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-2w7vc" Sep 30 13:48:03 crc kubenswrapper[4672]: I0930 13:48:03.324887 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-2w7vc" Sep 30 13:48:03 crc kubenswrapper[4672]: I0930 13:48:03.375175 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-2w7vc"] Sep 30 13:48:05 crc kubenswrapper[4672]: I0930 13:48:05.292801 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-2w7vc" podUID="4718b940-b99a-42f9-ae3e-4e8c5042f7c8" containerName="registry-server" containerID="cri-o://09bea694f78a65ce13d2966818bbd175ecadf5228d3a5cbcb70bd938669851e2" gracePeriod=2 Sep 30 13:48:05 crc kubenswrapper[4672]: I0930 13:48:05.786826 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2w7vc" Sep 30 13:48:05 crc kubenswrapper[4672]: I0930 13:48:05.881535 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4718b940-b99a-42f9-ae3e-4e8c5042f7c8-catalog-content\") pod \"4718b940-b99a-42f9-ae3e-4e8c5042f7c8\" (UID: \"4718b940-b99a-42f9-ae3e-4e8c5042f7c8\") " Sep 30 13:48:05 crc kubenswrapper[4672]: I0930 13:48:05.881899 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5rmtj\" (UniqueName: \"kubernetes.io/projected/4718b940-b99a-42f9-ae3e-4e8c5042f7c8-kube-api-access-5rmtj\") pod \"4718b940-b99a-42f9-ae3e-4e8c5042f7c8\" (UID: \"4718b940-b99a-42f9-ae3e-4e8c5042f7c8\") " Sep 30 13:48:05 crc kubenswrapper[4672]: I0930 13:48:05.882076 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4718b940-b99a-42f9-ae3e-4e8c5042f7c8-utilities\") pod \"4718b940-b99a-42f9-ae3e-4e8c5042f7c8\" (UID: \"4718b940-b99a-42f9-ae3e-4e8c5042f7c8\") " Sep 30 13:48:05 crc kubenswrapper[4672]: I0930 13:48:05.884887 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4718b940-b99a-42f9-ae3e-4e8c5042f7c8-utilities" (OuterVolumeSpecName: "utilities") pod "4718b940-b99a-42f9-ae3e-4e8c5042f7c8" (UID: "4718b940-b99a-42f9-ae3e-4e8c5042f7c8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 13:48:05 crc kubenswrapper[4672]: I0930 13:48:05.896609 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4718b940-b99a-42f9-ae3e-4e8c5042f7c8-kube-api-access-5rmtj" (OuterVolumeSpecName: "kube-api-access-5rmtj") pod "4718b940-b99a-42f9-ae3e-4e8c5042f7c8" (UID: "4718b940-b99a-42f9-ae3e-4e8c5042f7c8"). InnerVolumeSpecName "kube-api-access-5rmtj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:48:05 crc kubenswrapper[4672]: I0930 13:48:05.943177 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4718b940-b99a-42f9-ae3e-4e8c5042f7c8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4718b940-b99a-42f9-ae3e-4e8c5042f7c8" (UID: "4718b940-b99a-42f9-ae3e-4e8c5042f7c8"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 13:48:05 crc kubenswrapper[4672]: I0930 13:48:05.983711 4672 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4718b940-b99a-42f9-ae3e-4e8c5042f7c8-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 13:48:05 crc kubenswrapper[4672]: I0930 13:48:05.983751 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5rmtj\" (UniqueName: \"kubernetes.io/projected/4718b940-b99a-42f9-ae3e-4e8c5042f7c8-kube-api-access-5rmtj\") on node \"crc\" DevicePath \"\"" Sep 30 13:48:05 crc kubenswrapper[4672]: I0930 13:48:05.983761 4672 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4718b940-b99a-42f9-ae3e-4e8c5042f7c8-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 13:48:06 crc kubenswrapper[4672]: I0930 13:48:06.303501 4672 generic.go:334] "Generic (PLEG): container finished" podID="4718b940-b99a-42f9-ae3e-4e8c5042f7c8" containerID="09bea694f78a65ce13d2966818bbd175ecadf5228d3a5cbcb70bd938669851e2" exitCode=0 Sep 30 13:48:06 crc kubenswrapper[4672]: I0930 13:48:06.303546 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2w7vc" event={"ID":"4718b940-b99a-42f9-ae3e-4e8c5042f7c8","Type":"ContainerDied","Data":"09bea694f78a65ce13d2966818bbd175ecadf5228d3a5cbcb70bd938669851e2"} Sep 30 13:48:06 crc kubenswrapper[4672]: I0930 13:48:06.303562 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-2w7vc" Sep 30 13:48:06 crc kubenswrapper[4672]: I0930 13:48:06.303585 4672 scope.go:117] "RemoveContainer" containerID="09bea694f78a65ce13d2966818bbd175ecadf5228d3a5cbcb70bd938669851e2" Sep 30 13:48:06 crc kubenswrapper[4672]: I0930 13:48:06.303573 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2w7vc" event={"ID":"4718b940-b99a-42f9-ae3e-4e8c5042f7c8","Type":"ContainerDied","Data":"c4a1dba9a1d1dd06f109a272fd909eac2072092f719c5baf17d26fb53f06e5e8"} Sep 30 13:48:06 crc kubenswrapper[4672]: I0930 13:48:06.334320 4672 scope.go:117] "RemoveContainer" containerID="0caf9cecc40a5ee69004c59171b755aa3fcc6e2e0f46eaee102692f1a1530d49" Sep 30 13:48:06 crc kubenswrapper[4672]: I0930 13:48:06.355561 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-2w7vc"] Sep 30 13:48:06 crc kubenswrapper[4672]: I0930 13:48:06.363408 4672 scope.go:117] "RemoveContainer" containerID="a008c434b0db844765da6dd5f708c8e4318f9bc0b7fc714f7722568d308c9bb2" Sep 30 13:48:06 crc kubenswrapper[4672]: I0930 13:48:06.365284 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-2w7vc"] Sep 30 13:48:06 crc kubenswrapper[4672]: I0930 13:48:06.407323 4672 scope.go:117] "RemoveContainer" containerID="09bea694f78a65ce13d2966818bbd175ecadf5228d3a5cbcb70bd938669851e2" Sep 30 13:48:06 crc kubenswrapper[4672]: E0930 13:48:06.407839 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"09bea694f78a65ce13d2966818bbd175ecadf5228d3a5cbcb70bd938669851e2\": container with ID starting with 09bea694f78a65ce13d2966818bbd175ecadf5228d3a5cbcb70bd938669851e2 not found: ID does not exist" containerID="09bea694f78a65ce13d2966818bbd175ecadf5228d3a5cbcb70bd938669851e2" Sep 30 13:48:06 crc kubenswrapper[4672]: I0930 13:48:06.407883 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"09bea694f78a65ce13d2966818bbd175ecadf5228d3a5cbcb70bd938669851e2"} err="failed to get container status \"09bea694f78a65ce13d2966818bbd175ecadf5228d3a5cbcb70bd938669851e2\": rpc error: code = NotFound desc = could not find container \"09bea694f78a65ce13d2966818bbd175ecadf5228d3a5cbcb70bd938669851e2\": container with ID starting with 09bea694f78a65ce13d2966818bbd175ecadf5228d3a5cbcb70bd938669851e2 not found: ID does not exist" Sep 30 13:48:06 crc kubenswrapper[4672]: I0930 13:48:06.407917 4672 scope.go:117] "RemoveContainer" containerID="0caf9cecc40a5ee69004c59171b755aa3fcc6e2e0f46eaee102692f1a1530d49" Sep 30 13:48:06 crc kubenswrapper[4672]: E0930 13:48:06.408426 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0caf9cecc40a5ee69004c59171b755aa3fcc6e2e0f46eaee102692f1a1530d49\": container with ID starting with 0caf9cecc40a5ee69004c59171b755aa3fcc6e2e0f46eaee102692f1a1530d49 not found: ID does not exist" containerID="0caf9cecc40a5ee69004c59171b755aa3fcc6e2e0f46eaee102692f1a1530d49" Sep 30 13:48:06 crc kubenswrapper[4672]: I0930 13:48:06.408470 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0caf9cecc40a5ee69004c59171b755aa3fcc6e2e0f46eaee102692f1a1530d49"} err="failed to get container status \"0caf9cecc40a5ee69004c59171b755aa3fcc6e2e0f46eaee102692f1a1530d49\": rpc error: code = NotFound desc = could not find 
container \"0caf9cecc40a5ee69004c59171b755aa3fcc6e2e0f46eaee102692f1a1530d49\": container with ID starting with 0caf9cecc40a5ee69004c59171b755aa3fcc6e2e0f46eaee102692f1a1530d49 not found: ID does not exist" Sep 30 13:48:06 crc kubenswrapper[4672]: I0930 13:48:06.408499 4672 scope.go:117] "RemoveContainer" containerID="a008c434b0db844765da6dd5f708c8e4318f9bc0b7fc714f7722568d308c9bb2" Sep 30 13:48:06 crc kubenswrapper[4672]: E0930 13:48:06.408897 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a008c434b0db844765da6dd5f708c8e4318f9bc0b7fc714f7722568d308c9bb2\": container with ID starting with a008c434b0db844765da6dd5f708c8e4318f9bc0b7fc714f7722568d308c9bb2 not found: ID does not exist" containerID="a008c434b0db844765da6dd5f708c8e4318f9bc0b7fc714f7722568d308c9bb2" Sep 30 13:48:06 crc kubenswrapper[4672]: I0930 13:48:06.408927 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a008c434b0db844765da6dd5f708c8e4318f9bc0b7fc714f7722568d308c9bb2"} err="failed to get container status \"a008c434b0db844765da6dd5f708c8e4318f9bc0b7fc714f7722568d308c9bb2\": rpc error: code = NotFound desc = could not find container \"a008c434b0db844765da6dd5f708c8e4318f9bc0b7fc714f7722568d308c9bb2\": container with ID starting with a008c434b0db844765da6dd5f708c8e4318f9bc0b7fc714f7722568d308c9bb2 not found: ID does not exist" Sep 30 13:48:06 crc kubenswrapper[4672]: I0930 13:48:06.589153 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-7c888d4d9d-4nr45_7b67355f-8081-4014-ad68-6e31faa794b1/barbican-api/0.log" Sep 30 13:48:06 crc kubenswrapper[4672]: I0930 13:48:06.600848 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-7c888d4d9d-4nr45_7b67355f-8081-4014-ad68-6e31faa794b1/barbican-api-log/0.log" Sep 30 13:48:06 crc kubenswrapper[4672]: I0930 13:48:06.797643 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-654fbcfdf6-vvhwm_7212b048-8992-4678-b985-05a5c1fc8818/barbican-keystone-listener/0.log" Sep 30 13:48:06 crc kubenswrapper[4672]: I0930 13:48:06.874700 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-654fbcfdf6-vvhwm_7212b048-8992-4678-b985-05a5c1fc8818/barbican-keystone-listener-log/0.log" Sep 30 13:48:07 crc kubenswrapper[4672]: I0930 13:48:07.011709 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-6bd7cdf79c-ddf9q_e338321e-04aa-4ed3-8b3c-0baf6888f64f/barbican-worker/0.log" Sep 30 13:48:07 crc kubenswrapper[4672]: I0930 13:48:07.058185 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-6bd7cdf79c-ddf9q_e338321e-04aa-4ed3-8b3c-0baf6888f64f/barbican-worker-log/0.log" Sep 30 13:48:07 crc kubenswrapper[4672]: I0930 13:48:07.298616 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-cksj8_91de1b76-2b84-4d21-9683-d7aee98fb876/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 13:48:07 crc kubenswrapper[4672]: I0930 13:48:07.431211 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4718b940-b99a-42f9-ae3e-4e8c5042f7c8" path="/var/lib/kubelet/pods/4718b940-b99a-42f9-ae3e-4e8c5042f7c8/volumes" Sep 30 13:48:07 crc kubenswrapper[4672]: I0930 13:48:07.540454 4672 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ceilometer-0_daa9e3a4-cb9a-428e-8a81-2ed6d9e1c461/ceilometer-notification-agent/0.log" Sep 30 13:48:07 crc kubenswrapper[4672]: I0930 13:48:07.606982 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_daa9e3a4-cb9a-428e-8a81-2ed6d9e1c461/proxy-httpd/0.log" Sep 30 13:48:07 crc kubenswrapper[4672]: I0930 13:48:07.608797 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_daa9e3a4-cb9a-428e-8a81-2ed6d9e1c461/ceilometer-central-agent/0.log" Sep 30 13:48:07 crc kubenswrapper[4672]: I0930 13:48:07.761461 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_daa9e3a4-cb9a-428e-8a81-2ed6d9e1c461/sg-core/0.log" Sep 30 13:48:07 crc kubenswrapper[4672]: I0930 13:48:07.974995 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_dca6226f-d8f3-4d33-9d30-20bb6cf8a3fd/cinder-api-log/0.log" Sep 30 13:48:08 crc kubenswrapper[4672]: I0930 13:48:08.031352 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_dca6226f-d8f3-4d33-9d30-20bb6cf8a3fd/cinder-api/0.log" Sep 30 13:48:08 crc kubenswrapper[4672]: I0930 13:48:08.267469 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_3140cbea-70fb-4d82-90d3-fa12c43fcf76/cinder-scheduler/0.log" Sep 30 13:48:08 crc kubenswrapper[4672]: I0930 13:48:08.307906 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_3140cbea-70fb-4d82-90d3-fa12c43fcf76/probe/0.log" Sep 30 13:48:08 crc kubenswrapper[4672]: I0930 13:48:08.459101 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-kft7n_6354da04-65da-4562-9e78-563e1fb4f4fe/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 13:48:08 crc kubenswrapper[4672]: I0930 13:48:08.526804 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-29qlm_847b8779-d63c-4bbb-9b51-94a2c102e36d/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 13:48:08 crc kubenswrapper[4672]: I0930 13:48:08.730615 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-kxjhd_29628904-dd3c-4ce7-a114-552159673def/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 13:48:08 crc kubenswrapper[4672]: I0930 13:48:08.832152 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-64bcc76c55-fkj7b_cdfacf2c-b616-4c40-b16e-ec39de0d0e21/init/0.log" Sep 30 13:48:09 crc kubenswrapper[4672]: I0930 13:48:09.043136 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-64bcc76c55-fkj7b_cdfacf2c-b616-4c40-b16e-ec39de0d0e21/init/0.log" Sep 30 13:48:09 crc kubenswrapper[4672]: I0930 13:48:09.217114 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-64bcc76c55-fkj7b_cdfacf2c-b616-4c40-b16e-ec39de0d0e21/dnsmasq-dns/0.log" Sep 30 13:48:09 crc kubenswrapper[4672]: I0930 13:48:09.295641 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-vl4f8_a88c5cde-cba5-457b-8044-77ed9db4c080/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 13:48:09 crc kubenswrapper[4672]: I0930 13:48:09.492256 4672 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_glance-default-external-api-0_d7983763-9bc4-4528-9cf4-f2d693c42c5f/glance-log/0.log" Sep 30 13:48:09 crc kubenswrapper[4672]: I0930 13:48:09.513190 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_d7983763-9bc4-4528-9cf4-f2d693c42c5f/glance-httpd/0.log" Sep 30 13:48:09 crc kubenswrapper[4672]: I0930 13:48:09.731033 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_b221aeb1-8ec1-4805-9f6f-174a3a9ecd8a/glance-log/0.log" Sep 30 13:48:09 crc kubenswrapper[4672]: I0930 13:48:09.778357 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_b221aeb1-8ec1-4805-9f6f-174a3a9ecd8a/glance-httpd/0.log" Sep 30 13:48:09 crc kubenswrapper[4672]: I0930 13:48:09.954641 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-844b6c9474-6tpzt_2659b35e-ecb1-416b-8a94-690759645536/horizon/0.log" Sep 30 13:48:10 crc kubenswrapper[4672]: I0930 13:48:10.106750 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-qjc69_e44e3b27-d209-49da-93b2-ed646da0650e/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 13:48:10 crc kubenswrapper[4672]: I0930 13:48:10.316587 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-669k5_5e4bf526-356e-4b1f-a69e-7da92365808d/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 13:48:10 crc kubenswrapper[4672]: I0930 13:48:10.556901 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-844b6c9474-6tpzt_2659b35e-ecb1-416b-8a94-690759645536/horizon-log/0.log" Sep 30 13:48:10 crc kubenswrapper[4672]: I0930 13:48:10.603038 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29320621-r2bv7_a0837a49-9f57-447b-8da5-feef49bf42f0/keystone-cron/0.log" Sep 30 13:48:10 crc kubenswrapper[4672]: I0930 13:48:10.791952 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_32acef5a-c440-4574-9a53-18754f15acc6/kube-state-metrics/0.log" Sep 30 13:48:10 crc kubenswrapper[4672]: I0930 13:48:10.924413 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-c5bf9886d-9nhb9_1871c14e-9602-478e-888f-31d273376456/keystone-api/0.log" Sep 30 13:48:11 crc kubenswrapper[4672]: I0930 13:48:11.045068 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-zq47z_883bcbaa-0233-4f4d-8463-f451155bc618/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 13:48:11 crc kubenswrapper[4672]: I0930 13:48:11.516488 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-6b7c4888f-v7vn2_f394ad91-f6fb-4d7a-8508-d8fede494686/neutron-httpd/0.log" Sep 30 13:48:11 crc kubenswrapper[4672]: I0930 13:48:11.582807 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-6b7c4888f-v7vn2_f394ad91-f6fb-4d7a-8508-d8fede494686/neutron-api/0.log" Sep 30 13:48:11 crc kubenswrapper[4672]: I0930 13:48:11.731144 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-4fh64_ebb7b5d2-78ee-459d-a8c2-4a5ffb193df6/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 13:48:13 crc kubenswrapper[4672]: I0930 13:48:13.070172 4672 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openstack_nova-cell0-conductor-0_fdadbc89-4050-4b7f-bf2b-70e405b18974/nova-cell0-conductor-conductor/0.log" Sep 30 13:48:13 crc kubenswrapper[4672]: I0930 13:48:13.264961 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_77091ef9-bf9b-4b0b-aacd-c46a576974a8/nova-api-log/0.log" Sep 30 13:48:13 crc kubenswrapper[4672]: I0930 13:48:13.385547 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_77091ef9-bf9b-4b0b-aacd-c46a576974a8/nova-api-api/0.log" Sep 30 13:48:13 crc kubenswrapper[4672]: I0930 13:48:13.613990 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_5017308f-acf6-406c-8c75-3f6b550f8190/nova-cell1-conductor-conductor/0.log" Sep 30 13:48:13 crc kubenswrapper[4672]: I0930 13:48:13.721903 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_92487d22-391c-44e2-8179-1e523ab07026/nova-cell1-novncproxy-novncproxy/0.log" Sep 30 13:48:13 crc kubenswrapper[4672]: I0930 13:48:13.864069 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-qmh4q_727d1f8a-6b85-4184-b669-3fe8b94c608a/nova-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 13:48:14 crc kubenswrapper[4672]: I0930 13:48:14.133860 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_db8c9818-e7bc-471f-b7d6-b097f3657451/nova-metadata-log/0.log" Sep 30 13:48:14 crc kubenswrapper[4672]: I0930 13:48:14.839444 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_9159d76a-52b7-4262-a56a-ed28caec7f97/mysql-bootstrap/0.log" Sep 30 13:48:14 crc kubenswrapper[4672]: I0930 13:48:14.898520 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_e9e19e97-1725-4846-93ef-b00bf092908b/nova-scheduler-scheduler/0.log" Sep 30 13:48:15 crc kubenswrapper[4672]: I0930 13:48:15.100711 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_9159d76a-52b7-4262-a56a-ed28caec7f97/galera/0.log" Sep 30 13:48:15 crc kubenswrapper[4672]: I0930 13:48:15.127653 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_9159d76a-52b7-4262-a56a-ed28caec7f97/mysql-bootstrap/0.log" Sep 30 13:48:15 crc kubenswrapper[4672]: I0930 13:48:15.344225 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_056e0424-1faf-4d5a-8aea-e351214b3394/mysql-bootstrap/0.log" Sep 30 13:48:15 crc kubenswrapper[4672]: I0930 13:48:15.598916 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_056e0424-1faf-4d5a-8aea-e351214b3394/mysql-bootstrap/0.log" Sep 30 13:48:15 crc kubenswrapper[4672]: I0930 13:48:15.641047 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_056e0424-1faf-4d5a-8aea-e351214b3394/galera/0.log" Sep 30 13:48:15 crc kubenswrapper[4672]: I0930 13:48:15.894637 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_28f655e1-08b3-4618-8864-2020e883f99c/openstackclient/0.log" Sep 30 13:48:16 crc kubenswrapper[4672]: I0930 13:48:16.195762 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_db8c9818-e7bc-471f-b7d6-b097f3657451/nova-metadata-metadata/0.log" Sep 30 13:48:16 crc kubenswrapper[4672]: I0930 13:48:16.309693 4672 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openstack_ovn-controller-metrics-gln7l_f11c1379-e576-40be-a37a-1d73f84cab81/openstack-network-exporter/0.log" Sep 30 13:48:16 crc kubenswrapper[4672]: I0930 13:48:16.531684 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-gb7pq_71bceb54-c562-417a-8897-525930836f44/ovsdb-server-init/0.log" Sep 30 13:48:16 crc kubenswrapper[4672]: I0930 13:48:16.752407 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-gb7pq_71bceb54-c562-417a-8897-525930836f44/ovsdb-server-init/0.log" Sep 30 13:48:16 crc kubenswrapper[4672]: I0930 13:48:16.796934 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-gb7pq_71bceb54-c562-417a-8897-525930836f44/ovsdb-server/0.log" Sep 30 13:48:17 crc kubenswrapper[4672]: I0930 13:48:17.020594 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-vhs7r_9f35be26-490e-49db-bd31-32ce35c84fab/ovn-controller/0.log" Sep 30 13:48:17 crc kubenswrapper[4672]: I0930 13:48:17.084635 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-gb7pq_71bceb54-c562-417a-8897-525930836f44/ovs-vswitchd/0.log" Sep 30 13:48:17 crc kubenswrapper[4672]: I0930 13:48:17.243369 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-mfrgw_de8ad7f9-8f0f-46fb-8649-9dd7bb8abc16/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 13:48:17 crc kubenswrapper[4672]: I0930 13:48:17.469620 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_879d3e20-6df1-45be-bebd-b7e990e0aa5f/openstack-network-exporter/0.log" Sep 30 13:48:17 crc kubenswrapper[4672]: I0930 13:48:17.516199 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_879d3e20-6df1-45be-bebd-b7e990e0aa5f/ovn-northd/0.log" Sep 30 13:48:17 crc kubenswrapper[4672]: I0930 13:48:17.657377 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_d69b4d7c-be99-4405-aae9-8a11b85632b8/openstack-network-exporter/0.log" Sep 30 13:48:17 crc kubenswrapper[4672]: I0930 13:48:17.746273 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_d69b4d7c-be99-4405-aae9-8a11b85632b8/ovsdbserver-nb/0.log" Sep 30 13:48:17 crc kubenswrapper[4672]: I0930 13:48:17.868141 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_4e0bc671-11e7-442d-b5f3-4a901b0a0a80/openstack-network-exporter/0.log" Sep 30 13:48:17 crc kubenswrapper[4672]: I0930 13:48:17.951519 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_4e0bc671-11e7-442d-b5f3-4a901b0a0a80/ovsdbserver-sb/0.log" Sep 30 13:48:18 crc kubenswrapper[4672]: I0930 13:48:18.213743 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-6b77669d6-hlcjq_1834109c-113d-4231-94e6-0796ef06015d/placement-api/0.log" Sep 30 13:48:18 crc kubenswrapper[4672]: I0930 13:48:18.393043 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-6b77669d6-hlcjq_1834109c-113d-4231-94e6-0796ef06015d/placement-log/0.log" Sep 30 13:48:18 crc kubenswrapper[4672]: I0930 13:48:18.428604 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_5a41d16c-3326-4d4f-a01a-0f8c436aa9b0/init-config-reloader/0.log" Sep 30 13:48:18 crc kubenswrapper[4672]: I0930 13:48:18.630156 4672 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_5a41d16c-3326-4d4f-a01a-0f8c436aa9b0/init-config-reloader/0.log" Sep 30 13:48:18 crc kubenswrapper[4672]: I0930 13:48:18.665373 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_5a41d16c-3326-4d4f-a01a-0f8c436aa9b0/config-reloader/0.log" Sep 30 13:48:18 crc kubenswrapper[4672]: I0930 13:48:18.672381 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_5a41d16c-3326-4d4f-a01a-0f8c436aa9b0/prometheus/0.log" Sep 30 13:48:18 crc kubenswrapper[4672]: I0930 13:48:18.861348 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_5a41d16c-3326-4d4f-a01a-0f8c436aa9b0/thanos-sidecar/0.log" Sep 30 13:48:18 crc kubenswrapper[4672]: I0930 13:48:18.927192 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_00b691a7-21bd-4661-9b19-cae31a79f18e/setup-container/0.log" Sep 30 13:48:19 crc kubenswrapper[4672]: I0930 13:48:19.153869 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_00b691a7-21bd-4661-9b19-cae31a79f18e/setup-container/0.log" Sep 30 13:48:19 crc kubenswrapper[4672]: I0930 13:48:19.157373 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_00b691a7-21bd-4661-9b19-cae31a79f18e/rabbitmq/0.log" Sep 30 13:48:19 crc kubenswrapper[4672]: I0930 13:48:19.369860 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-notifications-server-0_d165a3a8-6809-46e5-bd35-895200ab5bfc/setup-container/0.log" Sep 30 13:48:19 crc kubenswrapper[4672]: I0930 13:48:19.556755 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-notifications-server-0_d165a3a8-6809-46e5-bd35-895200ab5bfc/setup-container/0.log" Sep 30 13:48:19 crc kubenswrapper[4672]: I0930 13:48:19.562707 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-notifications-server-0_d165a3a8-6809-46e5-bd35-895200ab5bfc/rabbitmq/0.log" Sep 30 13:48:19 crc kubenswrapper[4672]: I0930 13:48:19.806544 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_6cfe6bf3-4d65-49c0-a45b-484e53a12f80/setup-container/0.log" Sep 30 13:48:20 crc kubenswrapper[4672]: I0930 13:48:20.039961 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_6cfe6bf3-4d65-49c0-a45b-484e53a12f80/setup-container/0.log" Sep 30 13:48:20 crc kubenswrapper[4672]: I0930 13:48:20.056666 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_6cfe6bf3-4d65-49c0-a45b-484e53a12f80/rabbitmq/0.log" Sep 30 13:48:20 crc kubenswrapper[4672]: I0930 13:48:20.239546 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-f2mbl_c3a5c5b9-3aa7-4fef-9450-0e82ab08bb3a/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 13:48:20 crc kubenswrapper[4672]: I0930 13:48:20.315501 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-kz9kk_c67e6191-fd96-4caf-a6fd-6d5a7013f069/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 13:48:20 crc kubenswrapper[4672]: I0930 13:48:20.521937 4672 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-bht9w_9ac7c321-e380-4baa-8233-0ec24fa6496f/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 13:48:20 crc kubenswrapper[4672]: I0930 13:48:20.723510 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-bh9j6_754f9f15-2c0e-4279-aec1-589d1b23eb75/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 13:48:20 crc kubenswrapper[4672]: I0930 13:48:20.790607 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-9csw2_c303b53d-3c71-498b-99fb-432610f75b61/ssh-known-hosts-edpm-deployment/0.log" Sep 30 13:48:21 crc kubenswrapper[4672]: I0930 13:48:21.082318 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-76bc57698f-2dqjq_eb422aba-f5c2-4822-bd48-bba56e4dc451/proxy-server/0.log" Sep 30 13:48:21 crc kubenswrapper[4672]: I0930 13:48:21.197199 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-76bc57698f-2dqjq_eb422aba-f5c2-4822-bd48-bba56e4dc451/proxy-httpd/0.log" Sep 30 13:48:21 crc kubenswrapper[4672]: I0930 13:48:21.267543 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-d7dm7_4b567440-2a47-4032-bd02-6d7d53ea35b8/swift-ring-rebalance/0.log" Sep 30 13:48:21 crc kubenswrapper[4672]: I0930 13:48:21.420032 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_3cc34662-d100-4436-9067-c615b7b3f83f/account-auditor/0.log" Sep 30 13:48:21 crc kubenswrapper[4672]: I0930 13:48:21.537893 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_3cc34662-d100-4436-9067-c615b7b3f83f/account-reaper/0.log" Sep 30 13:48:21 crc kubenswrapper[4672]: I0930 13:48:21.659881 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_3cc34662-d100-4436-9067-c615b7b3f83f/account-replicator/0.log" Sep 30 13:48:21 crc kubenswrapper[4672]: I0930 13:48:21.718389 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_3cc34662-d100-4436-9067-c615b7b3f83f/account-server/0.log" Sep 30 13:48:21 crc kubenswrapper[4672]: I0930 13:48:21.786643 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_3cc34662-d100-4436-9067-c615b7b3f83f/container-auditor/0.log" Sep 30 13:48:21 crc kubenswrapper[4672]: I0930 13:48:21.870122 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_3cc34662-d100-4436-9067-c615b7b3f83f/container-server/0.log" Sep 30 13:48:21 crc kubenswrapper[4672]: I0930 13:48:21.899011 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_3cc34662-d100-4436-9067-c615b7b3f83f/container-replicator/0.log" Sep 30 13:48:21 crc kubenswrapper[4672]: I0930 13:48:21.981949 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_3cc34662-d100-4436-9067-c615b7b3f83f/container-updater/0.log" Sep 30 13:48:22 crc kubenswrapper[4672]: I0930 13:48:22.118446 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_3cc34662-d100-4436-9067-c615b7b3f83f/object-auditor/0.log" Sep 30 13:48:22 crc kubenswrapper[4672]: I0930 13:48:22.143837 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_3cc34662-d100-4436-9067-c615b7b3f83f/object-expirer/0.log" Sep 30 13:48:22 crc kubenswrapper[4672]: 
I0930 13:48:22.223678 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_3cc34662-d100-4436-9067-c615b7b3f83f/object-replicator/0.log" Sep 30 13:48:22 crc kubenswrapper[4672]: I0930 13:48:22.947484 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_3cc34662-d100-4436-9067-c615b7b3f83f/rsync/0.log" Sep 30 13:48:22 crc kubenswrapper[4672]: I0930 13:48:22.962474 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_3cc34662-d100-4436-9067-c615b7b3f83f/object-server/0.log" Sep 30 13:48:22 crc kubenswrapper[4672]: I0930 13:48:22.964462 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_3cc34662-d100-4436-9067-c615b7b3f83f/object-updater/0.log" Sep 30 13:48:23 crc kubenswrapper[4672]: I0930 13:48:23.159677 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_3cc34662-d100-4436-9067-c615b7b3f83f/swift-recon-cron/0.log" Sep 30 13:48:23 crc kubenswrapper[4672]: I0930 13:48:23.256926 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-kqdc5_38c0d8da-6872-4108-aaa8-1b8fa2611fe5/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 13:48:23 crc kubenswrapper[4672]: I0930 13:48:23.505097 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_42b7a077-06bd-4f39-a1b7-e4692592ae68/tempest-tests-tempest-tests-runner/0.log" Sep 30 13:48:23 crc kubenswrapper[4672]: I0930 13:48:23.680564 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_44cf0b0d-200e-475c-b8df-965a362a13b9/test-operator-logs-container/0.log" Sep 30 13:48:23 crc kubenswrapper[4672]: I0930 13:48:23.756664 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-sjlq2_beb806bb-fd09-449f-939b-cccb4ffe11de/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 13:48:24 crc kubenswrapper[4672]: I0930 13:48:24.739018 4672 patch_prober.go:28] interesting pod/machine-config-daemon-dpqrd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 13:48:24 crc kubenswrapper[4672]: I0930 13:48:24.739336 4672 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 13:48:25 crc kubenswrapper[4672]: I0930 13:48:25.180814 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-applier-0_0944504c-77dc-42f3-a981-723fea76118c/watcher-applier/0.log" Sep 30 13:48:25 crc kubenswrapper[4672]: I0930 13:48:25.403093 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-api-0_179ce1e6-946c-4e3c-97d6-38764daf4214/watcher-api-log/0.log" Sep 30 13:48:25 crc kubenswrapper[4672]: I0930 13:48:25.689617 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-decision-engine-0_4f1bee84-650b-4f0b-a657-e6701ee51823/watcher-decision-engine/2.log" Sep 30 13:48:28 crc kubenswrapper[4672]: I0930 13:48:28.521219 
4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-decision-engine-0_4f1bee84-650b-4f0b-a657-e6701ee51823/watcher-decision-engine/3.log" Sep 30 13:48:29 crc kubenswrapper[4672]: I0930 13:48:29.218781 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-api-0_179ce1e6-946c-4e3c-97d6-38764daf4214/watcher-api/0.log" Sep 30 13:48:40 crc kubenswrapper[4672]: I0930 13:48:40.885045 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_34e59a30-14b8-4736-87b1-9d9581094598/memcached/0.log" Sep 30 13:48:54 crc kubenswrapper[4672]: I0930 13:48:54.739757 4672 patch_prober.go:28] interesting pod/machine-config-daemon-dpqrd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 13:48:54 crc kubenswrapper[4672]: I0930 13:48:54.740369 4672 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 13:49:01 crc kubenswrapper[4672]: I0930 13:49:01.916977 4672 generic.go:334] "Generic (PLEG): container finished" podID="7979b082-f26c-4f54-b383-65d15b641e97" containerID="ef95dbab5412697860486ecdaa07f1dcdda1d8fe950ad991f020dbeea0535716" exitCode=0 Sep 30 13:49:01 crc kubenswrapper[4672]: I0930 13:49:01.917120 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jjqwv/crc-debug-b4kkx" event={"ID":"7979b082-f26c-4f54-b383-65d15b641e97","Type":"ContainerDied","Data":"ef95dbab5412697860486ecdaa07f1dcdda1d8fe950ad991f020dbeea0535716"} Sep 30 13:49:03 crc kubenswrapper[4672]: I0930 13:49:03.041906 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-jjqwv/crc-debug-b4kkx" Sep 30 13:49:03 crc kubenswrapper[4672]: I0930 13:49:03.070859 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-jjqwv/crc-debug-b4kkx"] Sep 30 13:49:03 crc kubenswrapper[4672]: I0930 13:49:03.079683 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-jjqwv/crc-debug-b4kkx"] Sep 30 13:49:03 crc kubenswrapper[4672]: I0930 13:49:03.141631 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dwxj4\" (UniqueName: \"kubernetes.io/projected/7979b082-f26c-4f54-b383-65d15b641e97-kube-api-access-dwxj4\") pod \"7979b082-f26c-4f54-b383-65d15b641e97\" (UID: \"7979b082-f26c-4f54-b383-65d15b641e97\") " Sep 30 13:49:03 crc kubenswrapper[4672]: I0930 13:49:03.141850 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7979b082-f26c-4f54-b383-65d15b641e97-host\") pod \"7979b082-f26c-4f54-b383-65d15b641e97\" (UID: \"7979b082-f26c-4f54-b383-65d15b641e97\") " Sep 30 13:49:03 crc kubenswrapper[4672]: I0930 13:49:03.142023 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7979b082-f26c-4f54-b383-65d15b641e97-host" (OuterVolumeSpecName: "host") pod "7979b082-f26c-4f54-b383-65d15b641e97" (UID: "7979b082-f26c-4f54-b383-65d15b641e97"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 13:49:03 crc kubenswrapper[4672]: I0930 13:49:03.142493 4672 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7979b082-f26c-4f54-b383-65d15b641e97-host\") on node \"crc\" DevicePath \"\"" Sep 30 13:49:03 crc kubenswrapper[4672]: I0930 13:49:03.148030 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7979b082-f26c-4f54-b383-65d15b641e97-kube-api-access-dwxj4" (OuterVolumeSpecName: "kube-api-access-dwxj4") pod "7979b082-f26c-4f54-b383-65d15b641e97" (UID: "7979b082-f26c-4f54-b383-65d15b641e97"). InnerVolumeSpecName "kube-api-access-dwxj4". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:49:03 crc kubenswrapper[4672]: I0930 13:49:03.244796 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dwxj4\" (UniqueName: \"kubernetes.io/projected/7979b082-f26c-4f54-b383-65d15b641e97-kube-api-access-dwxj4\") on node \"crc\" DevicePath \"\"" Sep 30 13:49:03 crc kubenswrapper[4672]: I0930 13:49:03.429114 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7979b082-f26c-4f54-b383-65d15b641e97" path="/var/lib/kubelet/pods/7979b082-f26c-4f54-b383-65d15b641e97/volumes" Sep 30 13:49:03 crc kubenswrapper[4672]: I0930 13:49:03.938436 4672 scope.go:117] "RemoveContainer" containerID="ef95dbab5412697860486ecdaa07f1dcdda1d8fe950ad991f020dbeea0535716" Sep 30 13:49:03 crc kubenswrapper[4672]: I0930 13:49:03.938532 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-jjqwv/crc-debug-b4kkx" Sep 30 13:49:04 crc kubenswrapper[4672]: I0930 13:49:04.253654 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-jjqwv/crc-debug-t5vwz"] Sep 30 13:49:04 crc kubenswrapper[4672]: E0930 13:49:04.254199 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4718b940-b99a-42f9-ae3e-4e8c5042f7c8" containerName="extract-utilities" Sep 30 13:49:04 crc kubenswrapper[4672]: I0930 13:49:04.254219 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="4718b940-b99a-42f9-ae3e-4e8c5042f7c8" containerName="extract-utilities" Sep 30 13:49:04 crc kubenswrapper[4672]: E0930 13:49:04.254234 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4718b940-b99a-42f9-ae3e-4e8c5042f7c8" containerName="registry-server" Sep 30 13:49:04 crc kubenswrapper[4672]: I0930 13:49:04.254246 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="4718b940-b99a-42f9-ae3e-4e8c5042f7c8" containerName="registry-server" Sep 30 13:49:04 crc kubenswrapper[4672]: E0930 13:49:04.254294 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7979b082-f26c-4f54-b383-65d15b641e97" containerName="container-00" Sep 30 13:49:04 crc kubenswrapper[4672]: I0930 13:49:04.254303 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="7979b082-f26c-4f54-b383-65d15b641e97" containerName="container-00" Sep 30 13:49:04 crc kubenswrapper[4672]: E0930 13:49:04.254320 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4718b940-b99a-42f9-ae3e-4e8c5042f7c8" containerName="extract-content" Sep 30 13:49:04 crc kubenswrapper[4672]: I0930 13:49:04.254329 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="4718b940-b99a-42f9-ae3e-4e8c5042f7c8" containerName="extract-content" Sep 30 13:49:04 crc kubenswrapper[4672]: I0930 13:49:04.254600 4672 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="4718b940-b99a-42f9-ae3e-4e8c5042f7c8" containerName="registry-server" Sep 30 13:49:04 crc kubenswrapper[4672]: I0930 13:49:04.254639 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="7979b082-f26c-4f54-b383-65d15b641e97" containerName="container-00" Sep 30 13:49:04 crc kubenswrapper[4672]: I0930 13:49:04.255467 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-jjqwv/crc-debug-t5vwz" Sep 30 13:49:04 crc kubenswrapper[4672]: I0930 13:49:04.372464 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nbsr2\" (UniqueName: \"kubernetes.io/projected/ea8192f8-33e3-44f4-8089-a94f2886abe6-kube-api-access-nbsr2\") pod \"crc-debug-t5vwz\" (UID: \"ea8192f8-33e3-44f4-8089-a94f2886abe6\") " pod="openshift-must-gather-jjqwv/crc-debug-t5vwz" Sep 30 13:49:04 crc kubenswrapper[4672]: I0930 13:49:04.372846 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ea8192f8-33e3-44f4-8089-a94f2886abe6-host\") pod \"crc-debug-t5vwz\" (UID: \"ea8192f8-33e3-44f4-8089-a94f2886abe6\") " pod="openshift-must-gather-jjqwv/crc-debug-t5vwz" Sep 30 13:49:04 crc kubenswrapper[4672]: I0930 13:49:04.474957 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nbsr2\" (UniqueName: \"kubernetes.io/projected/ea8192f8-33e3-44f4-8089-a94f2886abe6-kube-api-access-nbsr2\") pod \"crc-debug-t5vwz\" (UID: \"ea8192f8-33e3-44f4-8089-a94f2886abe6\") " pod="openshift-must-gather-jjqwv/crc-debug-t5vwz" Sep 30 13:49:04 crc kubenswrapper[4672]: I0930 13:49:04.475153 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ea8192f8-33e3-44f4-8089-a94f2886abe6-host\") pod \"crc-debug-t5vwz\" (UID: \"ea8192f8-33e3-44f4-8089-a94f2886abe6\") " pod="openshift-must-gather-jjqwv/crc-debug-t5vwz" Sep 30 13:49:04 crc kubenswrapper[4672]: I0930 13:49:04.475288 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ea8192f8-33e3-44f4-8089-a94f2886abe6-host\") pod \"crc-debug-t5vwz\" (UID: \"ea8192f8-33e3-44f4-8089-a94f2886abe6\") " pod="openshift-must-gather-jjqwv/crc-debug-t5vwz" Sep 30 13:49:04 crc kubenswrapper[4672]: I0930 13:49:04.492945 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nbsr2\" (UniqueName: \"kubernetes.io/projected/ea8192f8-33e3-44f4-8089-a94f2886abe6-kube-api-access-nbsr2\") pod \"crc-debug-t5vwz\" (UID: \"ea8192f8-33e3-44f4-8089-a94f2886abe6\") " pod="openshift-must-gather-jjqwv/crc-debug-t5vwz" Sep 30 13:49:04 crc kubenswrapper[4672]: I0930 13:49:04.572766 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-jjqwv/crc-debug-t5vwz" Sep 30 13:49:04 crc kubenswrapper[4672]: W0930 13:49:04.623312 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podea8192f8_33e3_44f4_8089_a94f2886abe6.slice/crio-9454b36e2536566c15ab4946151d7f351419fe96735e35b30915ba7df2614d9c WatchSource:0}: Error finding container 9454b36e2536566c15ab4946151d7f351419fe96735e35b30915ba7df2614d9c: Status 404 returned error can't find the container with id 9454b36e2536566c15ab4946151d7f351419fe96735e35b30915ba7df2614d9c Sep 30 13:49:04 crc kubenswrapper[4672]: I0930 13:49:04.951864 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jjqwv/crc-debug-t5vwz" event={"ID":"ea8192f8-33e3-44f4-8089-a94f2886abe6","Type":"ContainerStarted","Data":"77c2527175aa60c1a25804ffda0b881f674398466a5e16a4092b83d5ebe1f6eb"} Sep 30 13:49:04 crc kubenswrapper[4672]: I0930 13:49:04.952328 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jjqwv/crc-debug-t5vwz" event={"ID":"ea8192f8-33e3-44f4-8089-a94f2886abe6","Type":"ContainerStarted","Data":"9454b36e2536566c15ab4946151d7f351419fe96735e35b30915ba7df2614d9c"} Sep 30 13:49:04 crc kubenswrapper[4672]: I0930 13:49:04.979610 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-jjqwv/crc-debug-t5vwz" podStartSLOduration=0.979579395 podStartE2EDuration="979.579395ms" podCreationTimestamp="2025-09-30 13:49:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:49:04.96838721 +0000 UTC m=+5236.237624916" watchObservedRunningTime="2025-09-30 13:49:04.979579395 +0000 UTC m=+5236.248817071" Sep 30 13:49:05 crc kubenswrapper[4672]: I0930 13:49:05.970835 4672 generic.go:334] "Generic (PLEG): container finished" podID="ea8192f8-33e3-44f4-8089-a94f2886abe6" containerID="77c2527175aa60c1a25804ffda0b881f674398466a5e16a4092b83d5ebe1f6eb" exitCode=0 Sep 30 13:49:05 crc kubenswrapper[4672]: I0930 13:49:05.970924 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jjqwv/crc-debug-t5vwz" event={"ID":"ea8192f8-33e3-44f4-8089-a94f2886abe6","Type":"ContainerDied","Data":"77c2527175aa60c1a25804ffda0b881f674398466a5e16a4092b83d5ebe1f6eb"} Sep 30 13:49:07 crc kubenswrapper[4672]: I0930 13:49:07.091173 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-jjqwv/crc-debug-t5vwz" Sep 30 13:49:07 crc kubenswrapper[4672]: I0930 13:49:07.231070 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nbsr2\" (UniqueName: \"kubernetes.io/projected/ea8192f8-33e3-44f4-8089-a94f2886abe6-kube-api-access-nbsr2\") pod \"ea8192f8-33e3-44f4-8089-a94f2886abe6\" (UID: \"ea8192f8-33e3-44f4-8089-a94f2886abe6\") " Sep 30 13:49:07 crc kubenswrapper[4672]: I0930 13:49:07.231814 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ea8192f8-33e3-44f4-8089-a94f2886abe6-host\") pod \"ea8192f8-33e3-44f4-8089-a94f2886abe6\" (UID: \"ea8192f8-33e3-44f4-8089-a94f2886abe6\") " Sep 30 13:49:07 crc kubenswrapper[4672]: I0930 13:49:07.231946 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ea8192f8-33e3-44f4-8089-a94f2886abe6-host" (OuterVolumeSpecName: "host") pod "ea8192f8-33e3-44f4-8089-a94f2886abe6" (UID: "ea8192f8-33e3-44f4-8089-a94f2886abe6"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 13:49:07 crc kubenswrapper[4672]: I0930 13:49:07.232447 4672 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ea8192f8-33e3-44f4-8089-a94f2886abe6-host\") on node \"crc\" DevicePath \"\"" Sep 30 13:49:07 crc kubenswrapper[4672]: I0930 13:49:07.746568 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea8192f8-33e3-44f4-8089-a94f2886abe6-kube-api-access-nbsr2" (OuterVolumeSpecName: "kube-api-access-nbsr2") pod "ea8192f8-33e3-44f4-8089-a94f2886abe6" (UID: "ea8192f8-33e3-44f4-8089-a94f2886abe6"). InnerVolumeSpecName "kube-api-access-nbsr2". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:49:07 crc kubenswrapper[4672]: I0930 13:49:07.843300 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nbsr2\" (UniqueName: \"kubernetes.io/projected/ea8192f8-33e3-44f4-8089-a94f2886abe6-kube-api-access-nbsr2\") on node \"crc\" DevicePath \"\"" Sep 30 13:49:07 crc kubenswrapper[4672]: I0930 13:49:07.995153 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jjqwv/crc-debug-t5vwz" event={"ID":"ea8192f8-33e3-44f4-8089-a94f2886abe6","Type":"ContainerDied","Data":"9454b36e2536566c15ab4946151d7f351419fe96735e35b30915ba7df2614d9c"} Sep 30 13:49:07 crc kubenswrapper[4672]: I0930 13:49:07.995206 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-jjqwv/crc-debug-t5vwz" Sep 30 13:49:07 crc kubenswrapper[4672]: I0930 13:49:07.995283 4672 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9454b36e2536566c15ab4946151d7f351419fe96735e35b30915ba7df2614d9c" Sep 30 13:49:14 crc kubenswrapper[4672]: I0930 13:49:14.416490 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-jjqwv/crc-debug-t5vwz"] Sep 30 13:49:14 crc kubenswrapper[4672]: I0930 13:49:14.430519 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-jjqwv/crc-debug-t5vwz"] Sep 30 13:49:15 crc kubenswrapper[4672]: I0930 13:49:15.431866 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ea8192f8-33e3-44f4-8089-a94f2886abe6" path="/var/lib/kubelet/pods/ea8192f8-33e3-44f4-8089-a94f2886abe6/volumes" Sep 30 13:49:15 crc kubenswrapper[4672]: I0930 13:49:15.615744 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-jjqwv/crc-debug-8q54n"] Sep 30 13:49:15 crc kubenswrapper[4672]: E0930 13:49:15.616232 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea8192f8-33e3-44f4-8089-a94f2886abe6" containerName="container-00" Sep 30 13:49:15 crc kubenswrapper[4672]: I0930 13:49:15.616250 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea8192f8-33e3-44f4-8089-a94f2886abe6" containerName="container-00" Sep 30 13:49:15 crc kubenswrapper[4672]: I0930 13:49:15.616524 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea8192f8-33e3-44f4-8089-a94f2886abe6" containerName="container-00" Sep 30 13:49:15 crc kubenswrapper[4672]: I0930 13:49:15.617326 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-jjqwv/crc-debug-8q54n" Sep 30 13:49:15 crc kubenswrapper[4672]: I0930 13:49:15.680318 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hpfkm\" (UniqueName: \"kubernetes.io/projected/6bda37ea-464e-48cb-98dd-342144aadb43-kube-api-access-hpfkm\") pod \"crc-debug-8q54n\" (UID: \"6bda37ea-464e-48cb-98dd-342144aadb43\") " pod="openshift-must-gather-jjqwv/crc-debug-8q54n" Sep 30 13:49:15 crc kubenswrapper[4672]: I0930 13:49:15.680719 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6bda37ea-464e-48cb-98dd-342144aadb43-host\") pod \"crc-debug-8q54n\" (UID: \"6bda37ea-464e-48cb-98dd-342144aadb43\") " pod="openshift-must-gather-jjqwv/crc-debug-8q54n" Sep 30 13:49:15 crc kubenswrapper[4672]: I0930 13:49:15.789104 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hpfkm\" (UniqueName: \"kubernetes.io/projected/6bda37ea-464e-48cb-98dd-342144aadb43-kube-api-access-hpfkm\") pod \"crc-debug-8q54n\" (UID: \"6bda37ea-464e-48cb-98dd-342144aadb43\") " pod="openshift-must-gather-jjqwv/crc-debug-8q54n" Sep 30 13:49:15 crc kubenswrapper[4672]: I0930 13:49:15.789209 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6bda37ea-464e-48cb-98dd-342144aadb43-host\") pod \"crc-debug-8q54n\" (UID: \"6bda37ea-464e-48cb-98dd-342144aadb43\") " pod="openshift-must-gather-jjqwv/crc-debug-8q54n" Sep 30 13:49:15 crc kubenswrapper[4672]: I0930 13:49:15.789481 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/6bda37ea-464e-48cb-98dd-342144aadb43-host\") pod \"crc-debug-8q54n\" (UID: \"6bda37ea-464e-48cb-98dd-342144aadb43\") " pod="openshift-must-gather-jjqwv/crc-debug-8q54n" Sep 30 13:49:16 crc kubenswrapper[4672]: I0930 13:49:16.140959 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hpfkm\" (UniqueName: \"kubernetes.io/projected/6bda37ea-464e-48cb-98dd-342144aadb43-kube-api-access-hpfkm\") pod \"crc-debug-8q54n\" (UID: \"6bda37ea-464e-48cb-98dd-342144aadb43\") " pod="openshift-must-gather-jjqwv/crc-debug-8q54n" Sep 30 13:49:16 crc kubenswrapper[4672]: I0930 13:49:16.254308 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-jjqwv/crc-debug-8q54n" Sep 30 13:49:17 crc kubenswrapper[4672]: I0930 13:49:17.090974 4672 generic.go:334] "Generic (PLEG): container finished" podID="6bda37ea-464e-48cb-98dd-342144aadb43" containerID="21a532b46f0613cc5b5a54113b60ed9653a7ae00237a94bfdc3e28bd6f4a01db" exitCode=0 Sep 30 13:49:17 crc kubenswrapper[4672]: I0930 13:49:17.091081 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jjqwv/crc-debug-8q54n" event={"ID":"6bda37ea-464e-48cb-98dd-342144aadb43","Type":"ContainerDied","Data":"21a532b46f0613cc5b5a54113b60ed9653a7ae00237a94bfdc3e28bd6f4a01db"} Sep 30 13:49:17 crc kubenswrapper[4672]: I0930 13:49:17.091596 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jjqwv/crc-debug-8q54n" event={"ID":"6bda37ea-464e-48cb-98dd-342144aadb43","Type":"ContainerStarted","Data":"ca89a4d527e3037ec3820106d100795caf5789577f5ae5322eb70c380359ee0d"} Sep 30 13:49:17 crc kubenswrapper[4672]: I0930 13:49:17.137787 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-jjqwv/crc-debug-8q54n"] Sep 30 13:49:17 crc kubenswrapper[4672]: I0930 13:49:17.149501 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-jjqwv/crc-debug-8q54n"] Sep 30 13:49:18 crc kubenswrapper[4672]: I0930 13:49:18.597390 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-jjqwv/crc-debug-8q54n" Sep 30 13:49:18 crc kubenswrapper[4672]: I0930 13:49:18.767156 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6bda37ea-464e-48cb-98dd-342144aadb43-host\") pod \"6bda37ea-464e-48cb-98dd-342144aadb43\" (UID: \"6bda37ea-464e-48cb-98dd-342144aadb43\") " Sep 30 13:49:18 crc kubenswrapper[4672]: I0930 13:49:18.767384 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hpfkm\" (UniqueName: \"kubernetes.io/projected/6bda37ea-464e-48cb-98dd-342144aadb43-kube-api-access-hpfkm\") pod \"6bda37ea-464e-48cb-98dd-342144aadb43\" (UID: \"6bda37ea-464e-48cb-98dd-342144aadb43\") " Sep 30 13:49:18 crc kubenswrapper[4672]: I0930 13:49:18.768707 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6bda37ea-464e-48cb-98dd-342144aadb43-host" (OuterVolumeSpecName: "host") pod "6bda37ea-464e-48cb-98dd-342144aadb43" (UID: "6bda37ea-464e-48cb-98dd-342144aadb43"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 13:49:18 crc kubenswrapper[4672]: I0930 13:49:18.773661 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6bda37ea-464e-48cb-98dd-342144aadb43-kube-api-access-hpfkm" (OuterVolumeSpecName: "kube-api-access-hpfkm") pod "6bda37ea-464e-48cb-98dd-342144aadb43" (UID: "6bda37ea-464e-48cb-98dd-342144aadb43"). InnerVolumeSpecName "kube-api-access-hpfkm". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:49:18 crc kubenswrapper[4672]: I0930 13:49:18.869584 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hpfkm\" (UniqueName: \"kubernetes.io/projected/6bda37ea-464e-48cb-98dd-342144aadb43-kube-api-access-hpfkm\") on node \"crc\" DevicePath \"\"" Sep 30 13:49:18 crc kubenswrapper[4672]: I0930 13:49:18.869626 4672 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6bda37ea-464e-48cb-98dd-342144aadb43-host\") on node \"crc\" DevicePath \"\"" Sep 30 13:49:19 crc kubenswrapper[4672]: I0930 13:49:19.117364 4672 scope.go:117] "RemoveContainer" containerID="21a532b46f0613cc5b5a54113b60ed9653a7ae00237a94bfdc3e28bd6f4a01db" Sep 30 13:49:19 crc kubenswrapper[4672]: I0930 13:49:19.117395 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-jjqwv/crc-debug-8q54n" Sep 30 13:49:19 crc kubenswrapper[4672]: E0930 13:49:19.374794 4672 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6bda37ea_464e_48cb_98dd_342144aadb43.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6bda37ea_464e_48cb_98dd_342144aadb43.slice/crio-ca89a4d527e3037ec3820106d100795caf5789577f5ae5322eb70c380359ee0d\": RecentStats: unable to find data in memory cache]" Sep 30 13:49:19 crc kubenswrapper[4672]: I0930 13:49:19.429740 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6bda37ea-464e-48cb-98dd-342144aadb43" path="/var/lib/kubelet/pods/6bda37ea-464e-48cb-98dd-342144aadb43/volumes" Sep 30 13:49:19 crc kubenswrapper[4672]: I0930 13:49:19.667219 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_3ba63d1583e0abeae0ee70583752a42c45e81151d65ed21cfdbdcd61d5wnrn9_47ed0e84-3ade-4314-b9fc-6e2f3e77c7b9/util/0.log" Sep 30 13:49:19 crc kubenswrapper[4672]: I0930 13:49:19.859496 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_3ba63d1583e0abeae0ee70583752a42c45e81151d65ed21cfdbdcd61d5wnrn9_47ed0e84-3ade-4314-b9fc-6e2f3e77c7b9/util/0.log" Sep 30 13:49:19 crc kubenswrapper[4672]: I0930 13:49:19.865720 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_3ba63d1583e0abeae0ee70583752a42c45e81151d65ed21cfdbdcd61d5wnrn9_47ed0e84-3ade-4314-b9fc-6e2f3e77c7b9/pull/0.log" Sep 30 13:49:19 crc kubenswrapper[4672]: I0930 13:49:19.871483 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_3ba63d1583e0abeae0ee70583752a42c45e81151d65ed21cfdbdcd61d5wnrn9_47ed0e84-3ade-4314-b9fc-6e2f3e77c7b9/pull/0.log" Sep 30 13:49:20 crc kubenswrapper[4672]: I0930 13:49:20.065733 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_3ba63d1583e0abeae0ee70583752a42c45e81151d65ed21cfdbdcd61d5wnrn9_47ed0e84-3ade-4314-b9fc-6e2f3e77c7b9/pull/0.log" Sep 30 
13:49:20 crc kubenswrapper[4672]: I0930 13:49:20.082698 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_3ba63d1583e0abeae0ee70583752a42c45e81151d65ed21cfdbdcd61d5wnrn9_47ed0e84-3ade-4314-b9fc-6e2f3e77c7b9/util/0.log" Sep 30 13:49:20 crc kubenswrapper[4672]: I0930 13:49:20.097451 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_3ba63d1583e0abeae0ee70583752a42c45e81151d65ed21cfdbdcd61d5wnrn9_47ed0e84-3ade-4314-b9fc-6e2f3e77c7b9/extract/0.log" Sep 30 13:49:20 crc kubenswrapper[4672]: I0930 13:49:20.231203 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-6ff8b75857-df89s_c70c8b28-1f45-4c79-af69-3197c7f66fa0/kube-rbac-proxy/0.log" Sep 30 13:49:20 crc kubenswrapper[4672]: I0930 13:49:20.351176 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-644bddb6d8-jmkcr_892875f4-bce3-47cb-8478-9d6bbc819bb1/kube-rbac-proxy/0.log" Sep 30 13:49:20 crc kubenswrapper[4672]: I0930 13:49:20.406705 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-6ff8b75857-df89s_c70c8b28-1f45-4c79-af69-3197c7f66fa0/manager/0.log" Sep 30 13:49:20 crc kubenswrapper[4672]: I0930 13:49:20.476810 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-644bddb6d8-jmkcr_892875f4-bce3-47cb-8478-9d6bbc819bb1/manager/0.log" Sep 30 13:49:20 crc kubenswrapper[4672]: I0930 13:49:20.578852 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-84f4f7b77b-gthb7_6cc4cd4e-abd0-4318-bfd4-e2df45940139/kube-rbac-proxy/0.log" Sep 30 13:49:20 crc kubenswrapper[4672]: I0930 13:49:20.636041 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-84f4f7b77b-gthb7_6cc4cd4e-abd0-4318-bfd4-e2df45940139/manager/0.log" Sep 30 13:49:20 crc kubenswrapper[4672]: I0930 13:49:20.736059 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-84958c4d49-fn2cb_0e2c3398-4a1f-4a82-a95c-89e73d9a4485/kube-rbac-proxy/0.log" Sep 30 13:49:20 crc kubenswrapper[4672]: I0930 13:49:20.869353 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-84958c4d49-fn2cb_0e2c3398-4a1f-4a82-a95c-89e73d9a4485/manager/0.log" Sep 30 13:49:20 crc kubenswrapper[4672]: I0930 13:49:20.966334 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5d889d78cf-vs4mr_b12c1847-2238-4a91-a2a0-4de492556fe7/manager/0.log" Sep 30 13:49:20 crc kubenswrapper[4672]: I0930 13:49:20.977184 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5d889d78cf-vs4mr_b12c1847-2238-4a91-a2a0-4de492556fe7/kube-rbac-proxy/0.log" Sep 30 13:49:21 crc kubenswrapper[4672]: I0930 13:49:21.068943 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-9f4696d94-z9tmm_ed25b409-1ca7-4fc4-95b5-55b4239233f3/kube-rbac-proxy/0.log" Sep 30 13:49:21 crc kubenswrapper[4672]: I0930 13:49:21.164250 4672 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-9f4696d94-z9tmm_ed25b409-1ca7-4fc4-95b5-55b4239233f3/manager/0.log" Sep 30 13:49:21 crc kubenswrapper[4672]: I0930 13:49:21.275741 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-7d857cc749-8qftk_6fa26cab-ae65-4e21-af16-2628c86be254/kube-rbac-proxy/0.log" Sep 30 13:49:21 crc kubenswrapper[4672]: I0930 13:49:21.428358 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-7975b88857-g8lll_904a2d6e-693a-4c5e-926e-2c5fd47d6bea/kube-rbac-proxy/0.log" Sep 30 13:49:21 crc kubenswrapper[4672]: I0930 13:49:21.440562 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-7d857cc749-8qftk_6fa26cab-ae65-4e21-af16-2628c86be254/manager/0.log" Sep 30 13:49:21 crc kubenswrapper[4672]: I0930 13:49:21.522579 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-7975b88857-g8lll_904a2d6e-693a-4c5e-926e-2c5fd47d6bea/manager/0.log" Sep 30 13:49:21 crc kubenswrapper[4672]: I0930 13:49:21.603443 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-5bd55b4bff-vqxhh_ed04bd4c-39dd-45fa-a2e0-bde94ff9deb0/kube-rbac-proxy/0.log" Sep 30 13:49:21 crc kubenswrapper[4672]: I0930 13:49:21.728164 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-5bd55b4bff-vqxhh_ed04bd4c-39dd-45fa-a2e0-bde94ff9deb0/manager/0.log" Sep 30 13:49:21 crc kubenswrapper[4672]: I0930 13:49:21.871082 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-6d68dbc695-2hvxs_32830807-0fb2-4545-a629-af52b20e0b0f/kube-rbac-proxy/0.log" Sep 30 13:49:21 crc kubenswrapper[4672]: I0930 13:49:21.945323 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-6d68dbc695-2hvxs_32830807-0fb2-4545-a629-af52b20e0b0f/manager/0.log" Sep 30 13:49:21 crc kubenswrapper[4672]: I0930 13:49:21.995583 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-88c7-xq89z_3f0b65a1-c4dd-4ca1-a2cf-feea808b1f06/kube-rbac-proxy/0.log" Sep 30 13:49:22 crc kubenswrapper[4672]: I0930 13:49:22.073879 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-88c7-xq89z_3f0b65a1-c4dd-4ca1-a2cf-feea808b1f06/manager/0.log" Sep 30 13:49:22 crc kubenswrapper[4672]: I0930 13:49:22.154018 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-64d7b59854-lpv98_9aef7bd6-dab2-4333-b248-a40c44bc3743/kube-rbac-proxy/0.log" Sep 30 13:49:22 crc kubenswrapper[4672]: I0930 13:49:22.211848 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-64d7b59854-lpv98_9aef7bd6-dab2-4333-b248-a40c44bc3743/manager/0.log" Sep 30 13:49:22 crc kubenswrapper[4672]: I0930 13:49:22.298377 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-c7c776c96-2kvn7_d4c88a65-e12f-4872-baf2-f210ee1b0c9a/kube-rbac-proxy/0.log" Sep 30 13:49:22 crc kubenswrapper[4672]: I0930 13:49:22.399419 4672 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-c7c776c96-2kvn7_d4c88a65-e12f-4872-baf2-f210ee1b0c9a/manager/0.log" Sep 30 13:49:22 crc kubenswrapper[4672]: I0930 13:49:22.406037 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-76fcc6dc7c-hjn2l_601fcd4a-dc2f-468d-9ad6-6b173320c317/kube-rbac-proxy/0.log" Sep 30 13:49:22 crc kubenswrapper[4672]: I0930 13:49:22.500681 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-76fcc6dc7c-hjn2l_601fcd4a-dc2f-468d-9ad6-6b173320c317/manager/0.log" Sep 30 13:49:22 crc kubenswrapper[4672]: I0930 13:49:22.585648 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-6d776955-x6x67_1b6d7bf0-00ff-41df-8873-cab7f6e5eeea/kube-rbac-proxy/0.log" Sep 30 13:49:22 crc kubenswrapper[4672]: I0930 13:49:22.640322 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-6d776955-x6x67_1b6d7bf0-00ff-41df-8873-cab7f6e5eeea/manager/0.log" Sep 30 13:49:22 crc kubenswrapper[4672]: I0930 13:49:22.721141 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-6bd96d9cc5-cf4sl_711b218e-6a25-4d4f-b657-e621e9d1d658/kube-rbac-proxy/0.log" Sep 30 13:49:22 crc kubenswrapper[4672]: I0930 13:49:22.863643 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-b6f46bc96-v9ff6_fc7ec117-7036-452f-9b2d-894e0dd29a8f/kube-rbac-proxy/0.log" Sep 30 13:49:23 crc kubenswrapper[4672]: I0930 13:49:23.189542 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-b6f46bc96-v9ff6_fc7ec117-7036-452f-9b2d-894e0dd29a8f/operator/0.log" Sep 30 13:49:23 crc kubenswrapper[4672]: I0930 13:49:23.216515 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-6rjvm_63d18a29-8d30-437c-af1b-f9cd9fa99b6b/registry-server/0.log" Sep 30 13:49:23 crc kubenswrapper[4672]: I0930 13:49:23.391622 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-9976ff44c-zxb4j_5a756e54-c5bf-480b-aa89-57ca440d1ddc/kube-rbac-proxy/0.log" Sep 30 13:49:23 crc kubenswrapper[4672]: I0930 13:49:23.459628 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-9976ff44c-zxb4j_5a756e54-c5bf-480b-aa89-57ca440d1ddc/manager/0.log" Sep 30 13:49:23 crc kubenswrapper[4672]: I0930 13:49:23.523374 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-589c58c6c-r9sqd_42bf0afd-961a-4353-9499-a185b16b8a02/kube-rbac-proxy/0.log" Sep 30 13:49:23 crc kubenswrapper[4672]: I0930 13:49:23.686530 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-589c58c6c-r9sqd_42bf0afd-961a-4353-9499-a185b16b8a02/manager/0.log" Sep 30 13:49:23 crc kubenswrapper[4672]: I0930 13:49:23.733627 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-79d8469568-qxpkk_4f1099dc-e100-44d5-8d17-255dbe0edf63/operator/0.log" Sep 30 13:49:23 crc kubenswrapper[4672]: I0930 13:49:23.902721 4672 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-bc7dc7bd9-8mf7d_cf21b89f-fcd8-4854-954b-06927bc7c6ea/kube-rbac-proxy/0.log" Sep 30 13:49:23 crc kubenswrapper[4672]: I0930 13:49:23.940042 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-6bd96d9cc5-cf4sl_711b218e-6a25-4d4f-b657-e621e9d1d658/manager/0.log" Sep 30 13:49:23 crc kubenswrapper[4672]: I0930 13:49:23.971290 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-bc7dc7bd9-8mf7d_cf21b89f-fcd8-4854-954b-06927bc7c6ea/manager/0.log" Sep 30 13:49:24 crc kubenswrapper[4672]: I0930 13:49:24.000294 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-b8d54b5d7-qxq7h_e6b8eb11-36d8-45c1-b600-76ffff076b78/kube-rbac-proxy/0.log" Sep 30 13:49:24 crc kubenswrapper[4672]: I0930 13:49:24.193139 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-f66b554c6-d7gbf_74b0a0e9-a7e6-43e6-a15d-86e6a20c5aa7/kube-rbac-proxy/0.log" Sep 30 13:49:24 crc kubenswrapper[4672]: I0930 13:49:24.268304 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-f66b554c6-d7gbf_74b0a0e9-a7e6-43e6-a15d-86e6a20c5aa7/manager/0.log" Sep 30 13:49:24 crc kubenswrapper[4672]: I0930 13:49:24.317153 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-b8d54b5d7-qxq7h_e6b8eb11-36d8-45c1-b600-76ffff076b78/manager/0.log" Sep 30 13:49:24 crc kubenswrapper[4672]: I0930 13:49:24.398324 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-58675bf858-qg9s4_4d10ceb0-c730-4fb8-b81c-a87e33890f84/kube-rbac-proxy/0.log" Sep 30 13:49:24 crc kubenswrapper[4672]: I0930 13:49:24.445304 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-58675bf858-qg9s4_4d10ceb0-c730-4fb8-b81c-a87e33890f84/manager/0.log" Sep 30 13:49:24 crc kubenswrapper[4672]: I0930 13:49:24.739486 4672 patch_prober.go:28] interesting pod/machine-config-daemon-dpqrd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 13:49:24 crc kubenswrapper[4672]: I0930 13:49:24.739548 4672 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 13:49:24 crc kubenswrapper[4672]: I0930 13:49:24.739604 4672 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" Sep 30 13:49:24 crc kubenswrapper[4672]: I0930 13:49:24.740364 4672 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a0f4e52abdc31f3136a3c09d788135a4e897756a0e739d1de81796de70d2aa59"} pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" containerMessage="Container 
machine-config-daemon failed liveness probe, will be restarted" Sep 30 13:49:24 crc kubenswrapper[4672]: I0930 13:49:24.740422 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" containerName="machine-config-daemon" containerID="cri-o://a0f4e52abdc31f3136a3c09d788135a4e897756a0e739d1de81796de70d2aa59" gracePeriod=600 Sep 30 13:49:24 crc kubenswrapper[4672]: E0930 13:49:24.862721 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dpqrd_openshift-machine-config-operator(95794952-d817-48f2-8956-f7a310f8d1d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" Sep 30 13:49:25 crc kubenswrapper[4672]: I0930 13:49:25.182671 4672 generic.go:334] "Generic (PLEG): container finished" podID="95794952-d817-48f2-8956-f7a310f8d1d9" containerID="a0f4e52abdc31f3136a3c09d788135a4e897756a0e739d1de81796de70d2aa59" exitCode=0 Sep 30 13:49:25 crc kubenswrapper[4672]: I0930 13:49:25.182716 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" event={"ID":"95794952-d817-48f2-8956-f7a310f8d1d9","Type":"ContainerDied","Data":"a0f4e52abdc31f3136a3c09d788135a4e897756a0e739d1de81796de70d2aa59"} Sep 30 13:49:25 crc kubenswrapper[4672]: I0930 13:49:25.182750 4672 scope.go:117] "RemoveContainer" containerID="bad48423dd84ef618fb50b14346f14c1ae1019eff77694ab2ec75f0f9f1e7b3f" Sep 30 13:49:25 crc kubenswrapper[4672]: I0930 13:49:25.183401 4672 scope.go:117] "RemoveContainer" containerID="a0f4e52abdc31f3136a3c09d788135a4e897756a0e739d1de81796de70d2aa59" Sep 30 13:49:25 crc kubenswrapper[4672]: E0930 13:49:25.183696 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dpqrd_openshift-machine-config-operator(95794952-d817-48f2-8956-f7a310f8d1d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" Sep 30 13:49:37 crc kubenswrapper[4672]: I0930 13:49:37.417828 4672 scope.go:117] "RemoveContainer" containerID="a0f4e52abdc31f3136a3c09d788135a4e897756a0e739d1de81796de70d2aa59" Sep 30 13:49:37 crc kubenswrapper[4672]: E0930 13:49:37.418743 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dpqrd_openshift-machine-config-operator(95794952-d817-48f2-8956-f7a310f8d1d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" Sep 30 13:49:41 crc kubenswrapper[4672]: I0930 13:49:41.816101 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-dljhp_62422e21-39c6-4772-8f59-33be3d16c368/control-plane-machine-set-operator/0.log" Sep 30 13:49:41 crc kubenswrapper[4672]: I0930 13:49:41.978153 4672 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-qrbqf_a7555c95-5534-45dc-a212-4262554a0c0b/kube-rbac-proxy/0.log" Sep 30 13:49:41 crc kubenswrapper[4672]: I0930 13:49:41.980017 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-qrbqf_a7555c95-5534-45dc-a212-4262554a0c0b/machine-api-operator/0.log" Sep 30 13:49:52 crc kubenswrapper[4672]: I0930 13:49:52.418436 4672 scope.go:117] "RemoveContainer" containerID="a0f4e52abdc31f3136a3c09d788135a4e897756a0e739d1de81796de70d2aa59" Sep 30 13:49:52 crc kubenswrapper[4672]: E0930 13:49:52.419202 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dpqrd_openshift-machine-config-operator(95794952-d817-48f2-8956-f7a310f8d1d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" Sep 30 13:49:54 crc kubenswrapper[4672]: I0930 13:49:54.521131 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-46756_b8ab6541-957a-44c8-a773-788f725d7efb/cert-manager-controller/0.log" Sep 30 13:49:54 crc kubenswrapper[4672]: I0930 13:49:54.662584 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-zcp7d_3c4da14a-ed20-4c47-8147-2150a416c1c8/cert-manager-cainjector/0.log" Sep 30 13:49:54 crc kubenswrapper[4672]: I0930 13:49:54.738689 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-4r2wn_e53f49ff-ce7a-4699-977e-730d462910c8/cert-manager-webhook/0.log" Sep 30 13:50:06 crc kubenswrapper[4672]: I0930 13:50:06.834676 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-864bb6dfb5-fphxt_8394a3ed-db36-4579-ac2b-8e3f1ce579d1/nmstate-console-plugin/0.log" Sep 30 13:50:06 crc kubenswrapper[4672]: I0930 13:50:06.993953 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-d6nbx_fa1a3970-c37a-4cdf-ba19-b868c581c02e/nmstate-handler/0.log" Sep 30 13:50:07 crc kubenswrapper[4672]: I0930 13:50:07.048411 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-58fcddf996-tqkht_a767db76-9b74-4ab7-a541-5f5981850723/kube-rbac-proxy/0.log" Sep 30 13:50:07 crc kubenswrapper[4672]: I0930 13:50:07.060235 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-58fcddf996-tqkht_a767db76-9b74-4ab7-a541-5f5981850723/nmstate-metrics/0.log" Sep 30 13:50:07 crc kubenswrapper[4672]: I0930 13:50:07.192498 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-5d6f6cfd66-qn8f5_845f56f6-18db-419a-9901-e2c4c186ad88/nmstate-operator/0.log" Sep 30 13:50:07 crc kubenswrapper[4672]: I0930 13:50:07.270527 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-6d689559c5-4plc5_6573b5fd-c58a-4016-84ad-a21aa5622e2a/nmstate-webhook/0.log" Sep 30 13:50:07 crc kubenswrapper[4672]: I0930 13:50:07.418236 4672 scope.go:117] "RemoveContainer" containerID="a0f4e52abdc31f3136a3c09d788135a4e897756a0e739d1de81796de70d2aa59" Sep 30 13:50:07 crc kubenswrapper[4672]: E0930 13:50:07.418583 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dpqrd_openshift-machine-config-operator(95794952-d817-48f2-8956-f7a310f8d1d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" Sep 30 13:50:18 crc kubenswrapper[4672]: I0930 13:50:18.416660 4672 scope.go:117] "RemoveContainer" containerID="a0f4e52abdc31f3136a3c09d788135a4e897756a0e739d1de81796de70d2aa59" Sep 30 13:50:18 crc kubenswrapper[4672]: E0930 13:50:18.417443 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dpqrd_openshift-machine-config-operator(95794952-d817-48f2-8956-f7a310f8d1d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" Sep 30 13:50:21 crc kubenswrapper[4672]: I0930 13:50:21.361864 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-5d688f5ffc-9d8qs_56c8e0ed-01f5-4d61-84ad-78ce7c5a66d1/kube-rbac-proxy/0.log" Sep 30 13:50:21 crc kubenswrapper[4672]: I0930 13:50:21.598341 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-5478bdb765-bpwqj_8a1bda96-fd76-4372-bb9f-ae56e6602caf/frr-k8s-webhook-server/0.log" Sep 30 13:50:21 crc kubenswrapper[4672]: I0930 13:50:21.634108 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-5d688f5ffc-9d8qs_56c8e0ed-01f5-4d61-84ad-78ce7c5a66d1/controller/0.log" Sep 30 13:50:21 crc kubenswrapper[4672]: I0930 13:50:21.782127 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zchdp_d6349248-bcbd-486b-8143-90b66a52f017/cp-frr-files/0.log" Sep 30 13:50:21 crc kubenswrapper[4672]: I0930 13:50:21.942886 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zchdp_d6349248-bcbd-486b-8143-90b66a52f017/cp-frr-files/0.log" Sep 30 13:50:21 crc kubenswrapper[4672]: I0930 13:50:21.947898 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zchdp_d6349248-bcbd-486b-8143-90b66a52f017/cp-metrics/0.log" Sep 30 13:50:21 crc kubenswrapper[4672]: I0930 13:50:21.949051 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zchdp_d6349248-bcbd-486b-8143-90b66a52f017/cp-reloader/0.log" Sep 30 13:50:21 crc kubenswrapper[4672]: I0930 13:50:21.950626 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zchdp_d6349248-bcbd-486b-8143-90b66a52f017/cp-reloader/0.log" Sep 30 13:50:22 crc kubenswrapper[4672]: I0930 13:50:22.712074 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zchdp_d6349248-bcbd-486b-8143-90b66a52f017/cp-frr-files/0.log" Sep 30 13:50:22 crc kubenswrapper[4672]: I0930 13:50:22.713910 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zchdp_d6349248-bcbd-486b-8143-90b66a52f017/cp-metrics/0.log" Sep 30 13:50:22 crc kubenswrapper[4672]: I0930 13:50:22.747878 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zchdp_d6349248-bcbd-486b-8143-90b66a52f017/cp-metrics/0.log" Sep 30 13:50:22 crc kubenswrapper[4672]: I0930 13:50:22.749218 4672 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-zchdp_d6349248-bcbd-486b-8143-90b66a52f017/cp-reloader/0.log" Sep 30 13:50:22 crc kubenswrapper[4672]: I0930 13:50:22.896180 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zchdp_d6349248-bcbd-486b-8143-90b66a52f017/cp-frr-files/0.log" Sep 30 13:50:22 crc kubenswrapper[4672]: I0930 13:50:22.938291 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zchdp_d6349248-bcbd-486b-8143-90b66a52f017/cp-metrics/0.log" Sep 30 13:50:22 crc kubenswrapper[4672]: I0930 13:50:22.938302 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zchdp_d6349248-bcbd-486b-8143-90b66a52f017/cp-reloader/0.log" Sep 30 13:50:22 crc kubenswrapper[4672]: I0930 13:50:22.959040 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zchdp_d6349248-bcbd-486b-8143-90b66a52f017/controller/0.log" Sep 30 13:50:23 crc kubenswrapper[4672]: I0930 13:50:23.102341 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zchdp_d6349248-bcbd-486b-8143-90b66a52f017/frr-metrics/0.log" Sep 30 13:50:23 crc kubenswrapper[4672]: I0930 13:50:23.122774 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zchdp_d6349248-bcbd-486b-8143-90b66a52f017/kube-rbac-proxy-frr/0.log" Sep 30 13:50:23 crc kubenswrapper[4672]: I0930 13:50:23.155628 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zchdp_d6349248-bcbd-486b-8143-90b66a52f017/kube-rbac-proxy/0.log" Sep 30 13:50:23 crc kubenswrapper[4672]: I0930 13:50:23.324072 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zchdp_d6349248-bcbd-486b-8143-90b66a52f017/reloader/0.log" Sep 30 13:50:23 crc kubenswrapper[4672]: I0930 13:50:23.378106 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-879d84ff8-vhxdg_2831267a-a276-41c3-afaf-c262071b60c7/manager/0.log" Sep 30 13:50:23 crc kubenswrapper[4672]: I0930 13:50:23.536627 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-5dddf5dfdb-xtn27_bbcc3167-a28b-47c0-93a5-cab38ea7d13b/webhook-server/0.log" Sep 30 13:50:23 crc kubenswrapper[4672]: I0930 13:50:23.750701 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-v66pl_07eaf50f-6d5e-4e3e-8c3d-1e28769bae68/kube-rbac-proxy/0.log" Sep 30 13:50:24 crc kubenswrapper[4672]: I0930 13:50:24.283193 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-v66pl_07eaf50f-6d5e-4e3e-8c3d-1e28769bae68/speaker/0.log" Sep 30 13:50:24 crc kubenswrapper[4672]: I0930 13:50:24.806455 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zchdp_d6349248-bcbd-486b-8143-90b66a52f017/frr/0.log" Sep 30 13:50:29 crc kubenswrapper[4672]: I0930 13:50:29.423201 4672 scope.go:117] "RemoveContainer" containerID="a0f4e52abdc31f3136a3c09d788135a4e897756a0e739d1de81796de70d2aa59" Sep 30 13:50:29 crc kubenswrapper[4672]: E0930 13:50:29.423920 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dpqrd_openshift-machine-config-operator(95794952-d817-48f2-8956-f7a310f8d1d9)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" Sep 30 13:50:35 crc kubenswrapper[4672]: I0930 13:50:35.879640 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc7tdmb_138ae4d9-a29c-4679-b3fa-7953a95cee51/util/0.log" Sep 30 13:50:36 crc kubenswrapper[4672]: I0930 13:50:36.068076 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc7tdmb_138ae4d9-a29c-4679-b3fa-7953a95cee51/util/0.log" Sep 30 13:50:36 crc kubenswrapper[4672]: I0930 13:50:36.085900 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc7tdmb_138ae4d9-a29c-4679-b3fa-7953a95cee51/pull/0.log" Sep 30 13:50:36 crc kubenswrapper[4672]: I0930 13:50:36.115444 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc7tdmb_138ae4d9-a29c-4679-b3fa-7953a95cee51/pull/0.log" Sep 30 13:50:36 crc kubenswrapper[4672]: I0930 13:50:36.268536 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc7tdmb_138ae4d9-a29c-4679-b3fa-7953a95cee51/util/0.log" Sep 30 13:50:36 crc kubenswrapper[4672]: I0930 13:50:36.285870 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc7tdmb_138ae4d9-a29c-4679-b3fa-7953a95cee51/pull/0.log" Sep 30 13:50:36 crc kubenswrapper[4672]: I0930 13:50:36.294398 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc7tdmb_138ae4d9-a29c-4679-b3fa-7953a95cee51/extract/0.log" Sep 30 13:50:36 crc kubenswrapper[4672]: I0930 13:50:36.453919 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2ddnm8k_9d05223e-1c34-4132-92a1-1b96ef8c1a8b/util/0.log" Sep 30 13:50:36 crc kubenswrapper[4672]: I0930 13:50:36.603843 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2ddnm8k_9d05223e-1c34-4132-92a1-1b96ef8c1a8b/pull/0.log" Sep 30 13:50:36 crc kubenswrapper[4672]: I0930 13:50:36.620761 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2ddnm8k_9d05223e-1c34-4132-92a1-1b96ef8c1a8b/util/0.log" Sep 30 13:50:36 crc kubenswrapper[4672]: I0930 13:50:36.624641 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2ddnm8k_9d05223e-1c34-4132-92a1-1b96ef8c1a8b/pull/0.log" Sep 30 13:50:36 crc kubenswrapper[4672]: I0930 13:50:36.780829 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2ddnm8k_9d05223e-1c34-4132-92a1-1b96ef8c1a8b/extract/0.log" Sep 30 13:50:36 crc kubenswrapper[4672]: I0930 13:50:36.800213 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2ddnm8k_9d05223e-1c34-4132-92a1-1b96ef8c1a8b/pull/0.log" Sep 30 13:50:36 crc 
kubenswrapper[4672]: I0930 13:50:36.834209 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2ddnm8k_9d05223e-1c34-4132-92a1-1b96ef8c1a8b/util/0.log" Sep 30 13:50:36 crc kubenswrapper[4672]: I0930 13:50:36.964598 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-n27k6_a213a95b-eb38-4932-9c67-11f1b91d0202/extract-utilities/0.log" Sep 30 13:50:37 crc kubenswrapper[4672]: I0930 13:50:37.102820 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-n27k6_a213a95b-eb38-4932-9c67-11f1b91d0202/extract-utilities/0.log" Sep 30 13:50:37 crc kubenswrapper[4672]: I0930 13:50:37.125031 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-n27k6_a213a95b-eb38-4932-9c67-11f1b91d0202/extract-content/0.log" Sep 30 13:50:37 crc kubenswrapper[4672]: I0930 13:50:37.144833 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-n27k6_a213a95b-eb38-4932-9c67-11f1b91d0202/extract-content/0.log" Sep 30 13:50:37 crc kubenswrapper[4672]: I0930 13:50:37.332722 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-n27k6_a213a95b-eb38-4932-9c67-11f1b91d0202/extract-content/0.log" Sep 30 13:50:37 crc kubenswrapper[4672]: I0930 13:50:37.394059 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-n27k6_a213a95b-eb38-4932-9c67-11f1b91d0202/extract-utilities/0.log" Sep 30 13:50:38 crc kubenswrapper[4672]: I0930 13:50:38.150945 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-n27k6_a213a95b-eb38-4932-9c67-11f1b91d0202/registry-server/0.log" Sep 30 13:50:38 crc kubenswrapper[4672]: I0930 13:50:38.202687 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-b2lh4_7184a4fd-1911-473b-83c4-c5c224130bb3/extract-utilities/0.log" Sep 30 13:50:38 crc kubenswrapper[4672]: I0930 13:50:38.345152 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-b2lh4_7184a4fd-1911-473b-83c4-c5c224130bb3/extract-content/0.log" Sep 30 13:50:38 crc kubenswrapper[4672]: I0930 13:50:38.353698 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-b2lh4_7184a4fd-1911-473b-83c4-c5c224130bb3/extract-content/0.log" Sep 30 13:50:38 crc kubenswrapper[4672]: I0930 13:50:38.357661 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-b2lh4_7184a4fd-1911-473b-83c4-c5c224130bb3/extract-utilities/0.log" Sep 30 13:50:38 crc kubenswrapper[4672]: I0930 13:50:38.510634 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-b2lh4_7184a4fd-1911-473b-83c4-c5c224130bb3/extract-content/0.log" Sep 30 13:50:38 crc kubenswrapper[4672]: I0930 13:50:38.514336 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-b2lh4_7184a4fd-1911-473b-83c4-c5c224130bb3/extract-utilities/0.log" Sep 30 13:50:38 crc kubenswrapper[4672]: I0930 13:50:38.728361 4672 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96s4dl2_24166562-adf7-422d-abfa-b1b7176f0124/util/0.log" Sep 30 13:50:38 crc kubenswrapper[4672]: I0930 13:50:38.945677 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96s4dl2_24166562-adf7-422d-abfa-b1b7176f0124/pull/0.log" Sep 30 13:50:38 crc kubenswrapper[4672]: I0930 13:50:38.952183 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96s4dl2_24166562-adf7-422d-abfa-b1b7176f0124/pull/0.log" Sep 30 13:50:38 crc kubenswrapper[4672]: I0930 13:50:38.984133 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96s4dl2_24166562-adf7-422d-abfa-b1b7176f0124/util/0.log" Sep 30 13:50:39 crc kubenswrapper[4672]: I0930 13:50:39.066298 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-b2lh4_7184a4fd-1911-473b-83c4-c5c224130bb3/registry-server/0.log" Sep 30 13:50:39 crc kubenswrapper[4672]: I0930 13:50:39.196867 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96s4dl2_24166562-adf7-422d-abfa-b1b7176f0124/util/0.log" Sep 30 13:50:39 crc kubenswrapper[4672]: I0930 13:50:39.206150 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96s4dl2_24166562-adf7-422d-abfa-b1b7176f0124/pull/0.log" Sep 30 13:50:39 crc kubenswrapper[4672]: I0930 13:50:39.206182 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96s4dl2_24166562-adf7-422d-abfa-b1b7176f0124/extract/0.log" Sep 30 13:50:39 crc kubenswrapper[4672]: I0930 13:50:39.857670 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-kxjq6_08b96597-cb4d-4c38-9557-d60b937ab2c7/marketplace-operator/0.log" Sep 30 13:50:39 crc kubenswrapper[4672]: I0930 13:50:39.993855 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-xpjr4_9a3da1a5-1d7f-4d33-9245-55038dd253d3/extract-utilities/0.log" Sep 30 13:50:40 crc kubenswrapper[4672]: I0930 13:50:40.151162 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-xpjr4_9a3da1a5-1d7f-4d33-9245-55038dd253d3/extract-utilities/0.log" Sep 30 13:50:40 crc kubenswrapper[4672]: I0930 13:50:40.155852 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-xpjr4_9a3da1a5-1d7f-4d33-9245-55038dd253d3/extract-content/0.log" Sep 30 13:50:40 crc kubenswrapper[4672]: I0930 13:50:40.187436 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-xpjr4_9a3da1a5-1d7f-4d33-9245-55038dd253d3/extract-content/0.log" Sep 30 13:50:40 crc kubenswrapper[4672]: I0930 13:50:40.329235 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-xpjr4_9a3da1a5-1d7f-4d33-9245-55038dd253d3/extract-content/0.log" Sep 30 13:50:40 crc kubenswrapper[4672]: I0930 13:50:40.334071 4672 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-xpjr4_9a3da1a5-1d7f-4d33-9245-55038dd253d3/extract-utilities/0.log" Sep 30 13:50:40 crc kubenswrapper[4672]: I0930 13:50:40.366036 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-lkhn2_f337a53e-90b5-44a2-a033-bf26d3498158/extract-utilities/0.log" Sep 30 13:50:40 crc kubenswrapper[4672]: I0930 13:50:40.586603 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-lkhn2_f337a53e-90b5-44a2-a033-bf26d3498158/extract-content/0.log" Sep 30 13:50:40 crc kubenswrapper[4672]: I0930 13:50:40.623751 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-xpjr4_9a3da1a5-1d7f-4d33-9245-55038dd253d3/registry-server/0.log" Sep 30 13:50:40 crc kubenswrapper[4672]: I0930 13:50:40.624742 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-lkhn2_f337a53e-90b5-44a2-a033-bf26d3498158/extract-utilities/0.log" Sep 30 13:50:40 crc kubenswrapper[4672]: I0930 13:50:40.625876 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-lkhn2_f337a53e-90b5-44a2-a033-bf26d3498158/extract-content/0.log" Sep 30 13:50:40 crc kubenswrapper[4672]: I0930 13:50:40.765594 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-lkhn2_f337a53e-90b5-44a2-a033-bf26d3498158/extract-utilities/0.log" Sep 30 13:50:40 crc kubenswrapper[4672]: I0930 13:50:40.779646 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-lkhn2_f337a53e-90b5-44a2-a033-bf26d3498158/extract-content/0.log" Sep 30 13:50:41 crc kubenswrapper[4672]: I0930 13:50:41.460376 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-lkhn2_f337a53e-90b5-44a2-a033-bf26d3498158/registry-server/0.log" Sep 30 13:50:42 crc kubenswrapper[4672]: I0930 13:50:42.416899 4672 scope.go:117] "RemoveContainer" containerID="a0f4e52abdc31f3136a3c09d788135a4e897756a0e739d1de81796de70d2aa59" Sep 30 13:50:42 crc kubenswrapper[4672]: E0930 13:50:42.417440 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dpqrd_openshift-machine-config-operator(95794952-d817-48f2-8956-f7a310f8d1d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" Sep 30 13:50:52 crc kubenswrapper[4672]: I0930 13:50:52.015483 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-7c8cf85677-k7drs_cf46fa05-32de-4c26-82e7-769052afcaa1/prometheus-operator/0.log" Sep 30 13:50:52 crc kubenswrapper[4672]: I0930 13:50:52.171771 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-67cf9f7c94-vw6lq_95eba142-c439-4920-914e-af904642acc2/prometheus-operator-admission-webhook/0.log" Sep 30 13:50:52 crc kubenswrapper[4672]: I0930 13:50:52.203734 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-67cf9f7c94-kf944_eaec0e45-413d-4fad-a35b-68a28486053a/prometheus-operator-admission-webhook/0.log" Sep 30 13:50:52 crc kubenswrapper[4672]: I0930 
13:50:52.358435 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-cc5f78dfc-fzmsr_a4a3a18a-31ce-496c-b863-bdc8ff9774cb/operator/0.log" Sep 30 13:50:52 crc kubenswrapper[4672]: I0930 13:50:52.409511 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-54bc95c9fb-27rgt_961561c7-4ef8-4592-bb9a-53ef762e38ea/perses-operator/0.log" Sep 30 13:50:55 crc kubenswrapper[4672]: I0930 13:50:55.417612 4672 scope.go:117] "RemoveContainer" containerID="a0f4e52abdc31f3136a3c09d788135a4e897756a0e739d1de81796de70d2aa59" Sep 30 13:50:55 crc kubenswrapper[4672]: E0930 13:50:55.418521 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dpqrd_openshift-machine-config-operator(95794952-d817-48f2-8956-f7a310f8d1d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" Sep 30 13:51:08 crc kubenswrapper[4672]: I0930 13:51:08.416917 4672 scope.go:117] "RemoveContainer" containerID="a0f4e52abdc31f3136a3c09d788135a4e897756a0e739d1de81796de70d2aa59" Sep 30 13:51:08 crc kubenswrapper[4672]: E0930 13:51:08.417591 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dpqrd_openshift-machine-config-operator(95794952-d817-48f2-8956-f7a310f8d1d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" Sep 30 13:51:20 crc kubenswrapper[4672]: I0930 13:51:20.419323 4672 scope.go:117] "RemoveContainer" containerID="a0f4e52abdc31f3136a3c09d788135a4e897756a0e739d1de81796de70d2aa59" Sep 30 13:51:20 crc kubenswrapper[4672]: E0930 13:51:20.421785 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dpqrd_openshift-machine-config-operator(95794952-d817-48f2-8956-f7a310f8d1d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" Sep 30 13:51:32 crc kubenswrapper[4672]: I0930 13:51:32.419490 4672 scope.go:117] "RemoveContainer" containerID="a0f4e52abdc31f3136a3c09d788135a4e897756a0e739d1de81796de70d2aa59" Sep 30 13:51:32 crc kubenswrapper[4672]: E0930 13:51:32.420780 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dpqrd_openshift-machine-config-operator(95794952-d817-48f2-8956-f7a310f8d1d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" Sep 30 13:51:43 crc kubenswrapper[4672]: I0930 13:51:43.417420 4672 scope.go:117] "RemoveContainer" containerID="a0f4e52abdc31f3136a3c09d788135a4e897756a0e739d1de81796de70d2aa59" Sep 30 13:51:43 crc kubenswrapper[4672]: E0930 13:51:43.418368 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-dpqrd_openshift-machine-config-operator(95794952-d817-48f2-8956-f7a310f8d1d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" Sep 30 13:51:58 crc kubenswrapper[4672]: I0930 13:51:58.419448 4672 scope.go:117] "RemoveContainer" containerID="a0f4e52abdc31f3136a3c09d788135a4e897756a0e739d1de81796de70d2aa59" Sep 30 13:51:58 crc kubenswrapper[4672]: E0930 13:51:58.420194 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dpqrd_openshift-machine-config-operator(95794952-d817-48f2-8956-f7a310f8d1d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" Sep 30 13:52:12 crc kubenswrapper[4672]: I0930 13:52:12.418723 4672 scope.go:117] "RemoveContainer" containerID="a0f4e52abdc31f3136a3c09d788135a4e897756a0e739d1de81796de70d2aa59" Sep 30 13:52:12 crc kubenswrapper[4672]: E0930 13:52:12.419741 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dpqrd_openshift-machine-config-operator(95794952-d817-48f2-8956-f7a310f8d1d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" Sep 30 13:52:24 crc kubenswrapper[4672]: I0930 13:52:24.417635 4672 scope.go:117] "RemoveContainer" containerID="a0f4e52abdc31f3136a3c09d788135a4e897756a0e739d1de81796de70d2aa59" Sep 30 13:52:24 crc kubenswrapper[4672]: E0930 13:52:24.418468 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dpqrd_openshift-machine-config-operator(95794952-d817-48f2-8956-f7a310f8d1d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" Sep 30 13:52:35 crc kubenswrapper[4672]: I0930 13:52:35.417397 4672 scope.go:117] "RemoveContainer" containerID="a0f4e52abdc31f3136a3c09d788135a4e897756a0e739d1de81796de70d2aa59" Sep 30 13:52:35 crc kubenswrapper[4672]: E0930 13:52:35.418243 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dpqrd_openshift-machine-config-operator(95794952-d817-48f2-8956-f7a310f8d1d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" Sep 30 13:52:45 crc kubenswrapper[4672]: I0930 13:52:45.071413 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-4tld8"] Sep 30 13:52:45 crc kubenswrapper[4672]: E0930 13:52:45.072524 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6bda37ea-464e-48cb-98dd-342144aadb43" containerName="container-00" Sep 30 13:52:45 crc kubenswrapper[4672]: I0930 13:52:45.072542 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="6bda37ea-464e-48cb-98dd-342144aadb43" containerName="container-00" Sep 30 13:52:45 crc kubenswrapper[4672]: I0930 
13:52:45.072848 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="6bda37ea-464e-48cb-98dd-342144aadb43" containerName="container-00" Sep 30 13:52:45 crc kubenswrapper[4672]: I0930 13:52:45.075166 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4tld8" Sep 30 13:52:45 crc kubenswrapper[4672]: I0930 13:52:45.082821 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-4tld8"] Sep 30 13:52:45 crc kubenswrapper[4672]: I0930 13:52:45.244579 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p7lkf\" (UniqueName: \"kubernetes.io/projected/aa8affe9-0a0d-4bd9-8052-24ba3eccc8d7-kube-api-access-p7lkf\") pod \"redhat-operators-4tld8\" (UID: \"aa8affe9-0a0d-4bd9-8052-24ba3eccc8d7\") " pod="openshift-marketplace/redhat-operators-4tld8" Sep 30 13:52:45 crc kubenswrapper[4672]: I0930 13:52:45.244718 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aa8affe9-0a0d-4bd9-8052-24ba3eccc8d7-utilities\") pod \"redhat-operators-4tld8\" (UID: \"aa8affe9-0a0d-4bd9-8052-24ba3eccc8d7\") " pod="openshift-marketplace/redhat-operators-4tld8" Sep 30 13:52:45 crc kubenswrapper[4672]: I0930 13:52:45.244819 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aa8affe9-0a0d-4bd9-8052-24ba3eccc8d7-catalog-content\") pod \"redhat-operators-4tld8\" (UID: \"aa8affe9-0a0d-4bd9-8052-24ba3eccc8d7\") " pod="openshift-marketplace/redhat-operators-4tld8" Sep 30 13:52:45 crc kubenswrapper[4672]: I0930 13:52:45.346655 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aa8affe9-0a0d-4bd9-8052-24ba3eccc8d7-utilities\") pod \"redhat-operators-4tld8\" (UID: \"aa8affe9-0a0d-4bd9-8052-24ba3eccc8d7\") " pod="openshift-marketplace/redhat-operators-4tld8" Sep 30 13:52:45 crc kubenswrapper[4672]: I0930 13:52:45.346768 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aa8affe9-0a0d-4bd9-8052-24ba3eccc8d7-catalog-content\") pod \"redhat-operators-4tld8\" (UID: \"aa8affe9-0a0d-4bd9-8052-24ba3eccc8d7\") " pod="openshift-marketplace/redhat-operators-4tld8" Sep 30 13:52:45 crc kubenswrapper[4672]: I0930 13:52:45.346819 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p7lkf\" (UniqueName: \"kubernetes.io/projected/aa8affe9-0a0d-4bd9-8052-24ba3eccc8d7-kube-api-access-p7lkf\") pod \"redhat-operators-4tld8\" (UID: \"aa8affe9-0a0d-4bd9-8052-24ba3eccc8d7\") " pod="openshift-marketplace/redhat-operators-4tld8" Sep 30 13:52:45 crc kubenswrapper[4672]: I0930 13:52:45.347228 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aa8affe9-0a0d-4bd9-8052-24ba3eccc8d7-utilities\") pod \"redhat-operators-4tld8\" (UID: \"aa8affe9-0a0d-4bd9-8052-24ba3eccc8d7\") " pod="openshift-marketplace/redhat-operators-4tld8" Sep 30 13:52:45 crc kubenswrapper[4672]: I0930 13:52:45.347322 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aa8affe9-0a0d-4bd9-8052-24ba3eccc8d7-catalog-content\") pod 
\"redhat-operators-4tld8\" (UID: \"aa8affe9-0a0d-4bd9-8052-24ba3eccc8d7\") " pod="openshift-marketplace/redhat-operators-4tld8" Sep 30 13:52:45 crc kubenswrapper[4672]: I0930 13:52:45.368667 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p7lkf\" (UniqueName: \"kubernetes.io/projected/aa8affe9-0a0d-4bd9-8052-24ba3eccc8d7-kube-api-access-p7lkf\") pod \"redhat-operators-4tld8\" (UID: \"aa8affe9-0a0d-4bd9-8052-24ba3eccc8d7\") " pod="openshift-marketplace/redhat-operators-4tld8" Sep 30 13:52:45 crc kubenswrapper[4672]: I0930 13:52:45.404362 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4tld8" Sep 30 13:52:45 crc kubenswrapper[4672]: I0930 13:52:45.889871 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-4tld8"] Sep 30 13:52:46 crc kubenswrapper[4672]: I0930 13:52:46.207816 4672 generic.go:334] "Generic (PLEG): container finished" podID="aa8affe9-0a0d-4bd9-8052-24ba3eccc8d7" containerID="b1eafc63879b5dd3dbbd4dd289f2074012d145db49ed4d3902d763cd640c8b7a" exitCode=0 Sep 30 13:52:46 crc kubenswrapper[4672]: I0930 13:52:46.207860 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4tld8" event={"ID":"aa8affe9-0a0d-4bd9-8052-24ba3eccc8d7","Type":"ContainerDied","Data":"b1eafc63879b5dd3dbbd4dd289f2074012d145db49ed4d3902d763cd640c8b7a"} Sep 30 13:52:46 crc kubenswrapper[4672]: I0930 13:52:46.207891 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4tld8" event={"ID":"aa8affe9-0a0d-4bd9-8052-24ba3eccc8d7","Type":"ContainerStarted","Data":"68ddd8331b5ff12ea8d68363482d1b9d873bd21136729664bbe233d0d3ec4f3b"} Sep 30 13:52:47 crc kubenswrapper[4672]: I0930 13:52:47.416854 4672 scope.go:117] "RemoveContainer" containerID="a0f4e52abdc31f3136a3c09d788135a4e897756a0e739d1de81796de70d2aa59" Sep 30 13:52:47 crc kubenswrapper[4672]: E0930 13:52:47.417598 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dpqrd_openshift-machine-config-operator(95794952-d817-48f2-8956-f7a310f8d1d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" Sep 30 13:52:48 crc kubenswrapper[4672]: I0930 13:52:48.247349 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4tld8" event={"ID":"aa8affe9-0a0d-4bd9-8052-24ba3eccc8d7","Type":"ContainerDied","Data":"3aaa8d8139b27b2e2afad01edb82f4fc127d7613ac60e49b1d7a3aa9ba6113d1"} Sep 30 13:52:48 crc kubenswrapper[4672]: I0930 13:52:48.247207 4672 generic.go:334] "Generic (PLEG): container finished" podID="aa8affe9-0a0d-4bd9-8052-24ba3eccc8d7" containerID="3aaa8d8139b27b2e2afad01edb82f4fc127d7613ac60e49b1d7a3aa9ba6113d1" exitCode=0 Sep 30 13:52:50 crc kubenswrapper[4672]: I0930 13:52:50.274880 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4tld8" event={"ID":"aa8affe9-0a0d-4bd9-8052-24ba3eccc8d7","Type":"ContainerStarted","Data":"1d6916692968aebd8f57ac21eee3504ba6803ee0ed323ce0ca507401d54467f9"} Sep 30 13:52:55 crc kubenswrapper[4672]: I0930 13:52:55.405388 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-4tld8" Sep 30 13:52:55 crc 
kubenswrapper[4672]: I0930 13:52:55.406023 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-4tld8" Sep 30 13:52:55 crc kubenswrapper[4672]: I0930 13:52:55.472100 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-4tld8" Sep 30 13:52:55 crc kubenswrapper[4672]: I0930 13:52:55.490290 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-4tld8" podStartSLOduration=7.883909673 podStartE2EDuration="10.490256539s" podCreationTimestamp="2025-09-30 13:52:45 +0000 UTC" firstStartedPulling="2025-09-30 13:52:46.209832809 +0000 UTC m=+5457.479070455" lastFinishedPulling="2025-09-30 13:52:48.816179635 +0000 UTC m=+5460.085417321" observedRunningTime="2025-09-30 13:52:50.299046414 +0000 UTC m=+5461.568284060" watchObservedRunningTime="2025-09-30 13:52:55.490256539 +0000 UTC m=+5466.759494185" Sep 30 13:52:56 crc kubenswrapper[4672]: I0930 13:52:56.407852 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-4tld8" Sep 30 13:52:56 crc kubenswrapper[4672]: I0930 13:52:56.456287 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-4tld8"] Sep 30 13:52:58 crc kubenswrapper[4672]: I0930 13:52:58.372529 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-4tld8" podUID="aa8affe9-0a0d-4bd9-8052-24ba3eccc8d7" containerName="registry-server" containerID="cri-o://1d6916692968aebd8f57ac21eee3504ba6803ee0ed323ce0ca507401d54467f9" gracePeriod=2 Sep 30 13:52:58 crc kubenswrapper[4672]: I0930 13:52:58.818920 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4tld8" Sep 30 13:52:58 crc kubenswrapper[4672]: I0930 13:52:58.942099 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aa8affe9-0a0d-4bd9-8052-24ba3eccc8d7-utilities\") pod \"aa8affe9-0a0d-4bd9-8052-24ba3eccc8d7\" (UID: \"aa8affe9-0a0d-4bd9-8052-24ba3eccc8d7\") " Sep 30 13:52:58 crc kubenswrapper[4672]: I0930 13:52:58.942253 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p7lkf\" (UniqueName: \"kubernetes.io/projected/aa8affe9-0a0d-4bd9-8052-24ba3eccc8d7-kube-api-access-p7lkf\") pod \"aa8affe9-0a0d-4bd9-8052-24ba3eccc8d7\" (UID: \"aa8affe9-0a0d-4bd9-8052-24ba3eccc8d7\") " Sep 30 13:52:58 crc kubenswrapper[4672]: I0930 13:52:58.942348 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aa8affe9-0a0d-4bd9-8052-24ba3eccc8d7-catalog-content\") pod \"aa8affe9-0a0d-4bd9-8052-24ba3eccc8d7\" (UID: \"aa8affe9-0a0d-4bd9-8052-24ba3eccc8d7\") " Sep 30 13:52:58 crc kubenswrapper[4672]: I0930 13:52:58.944093 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aa8affe9-0a0d-4bd9-8052-24ba3eccc8d7-utilities" (OuterVolumeSpecName: "utilities") pod "aa8affe9-0a0d-4bd9-8052-24ba3eccc8d7" (UID: "aa8affe9-0a0d-4bd9-8052-24ba3eccc8d7"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 13:52:58 crc kubenswrapper[4672]: I0930 13:52:58.947891 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa8affe9-0a0d-4bd9-8052-24ba3eccc8d7-kube-api-access-p7lkf" (OuterVolumeSpecName: "kube-api-access-p7lkf") pod "aa8affe9-0a0d-4bd9-8052-24ba3eccc8d7" (UID: "aa8affe9-0a0d-4bd9-8052-24ba3eccc8d7"). InnerVolumeSpecName "kube-api-access-p7lkf". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:52:59 crc kubenswrapper[4672]: I0930 13:52:59.024603 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aa8affe9-0a0d-4bd9-8052-24ba3eccc8d7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "aa8affe9-0a0d-4bd9-8052-24ba3eccc8d7" (UID: "aa8affe9-0a0d-4bd9-8052-24ba3eccc8d7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 13:52:59 crc kubenswrapper[4672]: I0930 13:52:59.045942 4672 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aa8affe9-0a0d-4bd9-8052-24ba3eccc8d7-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 13:52:59 crc kubenswrapper[4672]: I0930 13:52:59.045986 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p7lkf\" (UniqueName: \"kubernetes.io/projected/aa8affe9-0a0d-4bd9-8052-24ba3eccc8d7-kube-api-access-p7lkf\") on node \"crc\" DevicePath \"\"" Sep 30 13:52:59 crc kubenswrapper[4672]: I0930 13:52:59.046000 4672 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aa8affe9-0a0d-4bd9-8052-24ba3eccc8d7-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 13:52:59 crc kubenswrapper[4672]: I0930 13:52:59.383351 4672 generic.go:334] "Generic (PLEG): container finished" podID="aa8affe9-0a0d-4bd9-8052-24ba3eccc8d7" containerID="1d6916692968aebd8f57ac21eee3504ba6803ee0ed323ce0ca507401d54467f9" exitCode=0 Sep 30 13:52:59 crc kubenswrapper[4672]: I0930 13:52:59.383419 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-4tld8" Sep 30 13:52:59 crc kubenswrapper[4672]: I0930 13:52:59.383461 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4tld8" event={"ID":"aa8affe9-0a0d-4bd9-8052-24ba3eccc8d7","Type":"ContainerDied","Data":"1d6916692968aebd8f57ac21eee3504ba6803ee0ed323ce0ca507401d54467f9"} Sep 30 13:52:59 crc kubenswrapper[4672]: I0930 13:52:59.383774 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4tld8" event={"ID":"aa8affe9-0a0d-4bd9-8052-24ba3eccc8d7","Type":"ContainerDied","Data":"68ddd8331b5ff12ea8d68363482d1b9d873bd21136729664bbe233d0d3ec4f3b"} Sep 30 13:52:59 crc kubenswrapper[4672]: I0930 13:52:59.383792 4672 scope.go:117] "RemoveContainer" containerID="1d6916692968aebd8f57ac21eee3504ba6803ee0ed323ce0ca507401d54467f9" Sep 30 13:52:59 crc kubenswrapper[4672]: I0930 13:52:59.412685 4672 scope.go:117] "RemoveContainer" containerID="3aaa8d8139b27b2e2afad01edb82f4fc127d7613ac60e49b1d7a3aa9ba6113d1" Sep 30 13:52:59 crc kubenswrapper[4672]: I0930 13:52:59.431431 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-4tld8"] Sep 30 13:52:59 crc kubenswrapper[4672]: I0930 13:52:59.431463 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-4tld8"] Sep 30 13:52:59 crc kubenswrapper[4672]: I0930 13:52:59.437037 4672 scope.go:117] "RemoveContainer" containerID="b1eafc63879b5dd3dbbd4dd289f2074012d145db49ed4d3902d763cd640c8b7a" Sep 30 13:52:59 crc kubenswrapper[4672]: I0930 13:52:59.478395 4672 scope.go:117] "RemoveContainer" containerID="1d6916692968aebd8f57ac21eee3504ba6803ee0ed323ce0ca507401d54467f9" Sep 30 13:52:59 crc kubenswrapper[4672]: E0930 13:52:59.478879 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1d6916692968aebd8f57ac21eee3504ba6803ee0ed323ce0ca507401d54467f9\": container with ID starting with 1d6916692968aebd8f57ac21eee3504ba6803ee0ed323ce0ca507401d54467f9 not found: ID does not exist" containerID="1d6916692968aebd8f57ac21eee3504ba6803ee0ed323ce0ca507401d54467f9" Sep 30 13:52:59 crc kubenswrapper[4672]: I0930 13:52:59.478910 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d6916692968aebd8f57ac21eee3504ba6803ee0ed323ce0ca507401d54467f9"} err="failed to get container status \"1d6916692968aebd8f57ac21eee3504ba6803ee0ed323ce0ca507401d54467f9\": rpc error: code = NotFound desc = could not find container \"1d6916692968aebd8f57ac21eee3504ba6803ee0ed323ce0ca507401d54467f9\": container with ID starting with 1d6916692968aebd8f57ac21eee3504ba6803ee0ed323ce0ca507401d54467f9 not found: ID does not exist" Sep 30 13:52:59 crc kubenswrapper[4672]: I0930 13:52:59.478928 4672 scope.go:117] "RemoveContainer" containerID="3aaa8d8139b27b2e2afad01edb82f4fc127d7613ac60e49b1d7a3aa9ba6113d1" Sep 30 13:52:59 crc kubenswrapper[4672]: E0930 13:52:59.479212 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3aaa8d8139b27b2e2afad01edb82f4fc127d7613ac60e49b1d7a3aa9ba6113d1\": container with ID starting with 3aaa8d8139b27b2e2afad01edb82f4fc127d7613ac60e49b1d7a3aa9ba6113d1 not found: ID does not exist" containerID="3aaa8d8139b27b2e2afad01edb82f4fc127d7613ac60e49b1d7a3aa9ba6113d1" Sep 30 13:52:59 crc kubenswrapper[4672]: I0930 13:52:59.479235 4672 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3aaa8d8139b27b2e2afad01edb82f4fc127d7613ac60e49b1d7a3aa9ba6113d1"} err="failed to get container status \"3aaa8d8139b27b2e2afad01edb82f4fc127d7613ac60e49b1d7a3aa9ba6113d1\": rpc error: code = NotFound desc = could not find container \"3aaa8d8139b27b2e2afad01edb82f4fc127d7613ac60e49b1d7a3aa9ba6113d1\": container with ID starting with 3aaa8d8139b27b2e2afad01edb82f4fc127d7613ac60e49b1d7a3aa9ba6113d1 not found: ID does not exist" Sep 30 13:52:59 crc kubenswrapper[4672]: I0930 13:52:59.479248 4672 scope.go:117] "RemoveContainer" containerID="b1eafc63879b5dd3dbbd4dd289f2074012d145db49ed4d3902d763cd640c8b7a" Sep 30 13:52:59 crc kubenswrapper[4672]: E0930 13:52:59.479507 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b1eafc63879b5dd3dbbd4dd289f2074012d145db49ed4d3902d763cd640c8b7a\": container with ID starting with b1eafc63879b5dd3dbbd4dd289f2074012d145db49ed4d3902d763cd640c8b7a not found: ID does not exist" containerID="b1eafc63879b5dd3dbbd4dd289f2074012d145db49ed4d3902d763cd640c8b7a" Sep 30 13:52:59 crc kubenswrapper[4672]: I0930 13:52:59.479547 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b1eafc63879b5dd3dbbd4dd289f2074012d145db49ed4d3902d763cd640c8b7a"} err="failed to get container status \"b1eafc63879b5dd3dbbd4dd289f2074012d145db49ed4d3902d763cd640c8b7a\": rpc error: code = NotFound desc = could not find container \"b1eafc63879b5dd3dbbd4dd289f2074012d145db49ed4d3902d763cd640c8b7a\": container with ID starting with b1eafc63879b5dd3dbbd4dd289f2074012d145db49ed4d3902d763cd640c8b7a not found: ID does not exist" Sep 30 13:53:01 crc kubenswrapper[4672]: I0930 13:53:01.418157 4672 scope.go:117] "RemoveContainer" containerID="a0f4e52abdc31f3136a3c09d788135a4e897756a0e739d1de81796de70d2aa59" Sep 30 13:53:01 crc kubenswrapper[4672]: E0930 13:53:01.418791 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dpqrd_openshift-machine-config-operator(95794952-d817-48f2-8956-f7a310f8d1d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" Sep 30 13:53:01 crc kubenswrapper[4672]: I0930 13:53:01.442355 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aa8affe9-0a0d-4bd9-8052-24ba3eccc8d7" path="/var/lib/kubelet/pods/aa8affe9-0a0d-4bd9-8052-24ba3eccc8d7/volumes" Sep 30 13:53:11 crc kubenswrapper[4672]: I0930 13:53:11.511813 4672 generic.go:334] "Generic (PLEG): container finished" podID="fac7df9f-5203-4273-85cb-fdfdbcb34766" containerID="fb817ff69bc93f7aee4d4cfe9c5f365f57016825dd1dca2b8b59d710518e5992" exitCode=0 Sep 30 13:53:11 crc kubenswrapper[4672]: I0930 13:53:11.511894 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jjqwv/must-gather-g8nrf" event={"ID":"fac7df9f-5203-4273-85cb-fdfdbcb34766","Type":"ContainerDied","Data":"fb817ff69bc93f7aee4d4cfe9c5f365f57016825dd1dca2b8b59d710518e5992"} Sep 30 13:53:11 crc kubenswrapper[4672]: I0930 13:53:11.512980 4672 scope.go:117] "RemoveContainer" containerID="fb817ff69bc93f7aee4d4cfe9c5f365f57016825dd1dca2b8b59d710518e5992" Sep 30 13:53:12 crc kubenswrapper[4672]: I0930 13:53:12.383468 4672 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-must-gather-jjqwv_must-gather-g8nrf_fac7df9f-5203-4273-85cb-fdfdbcb34766/gather/0.log" Sep 30 13:53:13 crc kubenswrapper[4672]: I0930 13:53:13.417013 4672 scope.go:117] "RemoveContainer" containerID="a0f4e52abdc31f3136a3c09d788135a4e897756a0e739d1de81796de70d2aa59" Sep 30 13:53:13 crc kubenswrapper[4672]: E0930 13:53:13.417601 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dpqrd_openshift-machine-config-operator(95794952-d817-48f2-8956-f7a310f8d1d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" Sep 30 13:53:21 crc kubenswrapper[4672]: I0930 13:53:21.784441 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-jjqwv/must-gather-g8nrf"] Sep 30 13:53:21 crc kubenswrapper[4672]: I0930 13:53:21.785188 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-jjqwv/must-gather-g8nrf" podUID="fac7df9f-5203-4273-85cb-fdfdbcb34766" containerName="copy" containerID="cri-o://d6896d120aa9f441fb789562392aed1b1c2ada215ee84bd1ccc9fb97781697f4" gracePeriod=2 Sep 30 13:53:21 crc kubenswrapper[4672]: I0930 13:53:21.794146 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-jjqwv/must-gather-g8nrf"] Sep 30 13:53:22 crc kubenswrapper[4672]: I0930 13:53:22.273870 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-jjqwv_must-gather-g8nrf_fac7df9f-5203-4273-85cb-fdfdbcb34766/copy/0.log" Sep 30 13:53:22 crc kubenswrapper[4672]: I0930 13:53:22.274820 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-jjqwv/must-gather-g8nrf" Sep 30 13:53:22 crc kubenswrapper[4672]: I0930 13:53:22.444367 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/fac7df9f-5203-4273-85cb-fdfdbcb34766-must-gather-output\") pod \"fac7df9f-5203-4273-85cb-fdfdbcb34766\" (UID: \"fac7df9f-5203-4273-85cb-fdfdbcb34766\") " Sep 30 13:53:22 crc kubenswrapper[4672]: I0930 13:53:22.444472 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cv5c4\" (UniqueName: \"kubernetes.io/projected/fac7df9f-5203-4273-85cb-fdfdbcb34766-kube-api-access-cv5c4\") pod \"fac7df9f-5203-4273-85cb-fdfdbcb34766\" (UID: \"fac7df9f-5203-4273-85cb-fdfdbcb34766\") " Sep 30 13:53:22 crc kubenswrapper[4672]: I0930 13:53:22.450025 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fac7df9f-5203-4273-85cb-fdfdbcb34766-kube-api-access-cv5c4" (OuterVolumeSpecName: "kube-api-access-cv5c4") pod "fac7df9f-5203-4273-85cb-fdfdbcb34766" (UID: "fac7df9f-5203-4273-85cb-fdfdbcb34766"). InnerVolumeSpecName "kube-api-access-cv5c4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:53:22 crc kubenswrapper[4672]: I0930 13:53:22.546595 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cv5c4\" (UniqueName: \"kubernetes.io/projected/fac7df9f-5203-4273-85cb-fdfdbcb34766-kube-api-access-cv5c4\") on node \"crc\" DevicePath \"\"" Sep 30 13:53:22 crc kubenswrapper[4672]: I0930 13:53:22.639347 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fac7df9f-5203-4273-85cb-fdfdbcb34766-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "fac7df9f-5203-4273-85cb-fdfdbcb34766" (UID: "fac7df9f-5203-4273-85cb-fdfdbcb34766"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 13:53:22 crc kubenswrapper[4672]: I0930 13:53:22.648443 4672 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/fac7df9f-5203-4273-85cb-fdfdbcb34766-must-gather-output\") on node \"crc\" DevicePath \"\"" Sep 30 13:53:22 crc kubenswrapper[4672]: I0930 13:53:22.651356 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-jjqwv_must-gather-g8nrf_fac7df9f-5203-4273-85cb-fdfdbcb34766/copy/0.log" Sep 30 13:53:22 crc kubenswrapper[4672]: I0930 13:53:22.653183 4672 generic.go:334] "Generic (PLEG): container finished" podID="fac7df9f-5203-4273-85cb-fdfdbcb34766" containerID="d6896d120aa9f441fb789562392aed1b1c2ada215ee84bd1ccc9fb97781697f4" exitCode=143 Sep 30 13:53:22 crc kubenswrapper[4672]: I0930 13:53:22.653248 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-jjqwv/must-gather-g8nrf" Sep 30 13:53:22 crc kubenswrapper[4672]: I0930 13:53:22.653290 4672 scope.go:117] "RemoveContainer" containerID="d6896d120aa9f441fb789562392aed1b1c2ada215ee84bd1ccc9fb97781697f4" Sep 30 13:53:22 crc kubenswrapper[4672]: I0930 13:53:22.692551 4672 scope.go:117] "RemoveContainer" containerID="fb817ff69bc93f7aee4d4cfe9c5f365f57016825dd1dca2b8b59d710518e5992" Sep 30 13:53:22 crc kubenswrapper[4672]: I0930 13:53:22.740238 4672 scope.go:117] "RemoveContainer" containerID="d6896d120aa9f441fb789562392aed1b1c2ada215ee84bd1ccc9fb97781697f4" Sep 30 13:53:22 crc kubenswrapper[4672]: E0930 13:53:22.741025 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d6896d120aa9f441fb789562392aed1b1c2ada215ee84bd1ccc9fb97781697f4\": container with ID starting with d6896d120aa9f441fb789562392aed1b1c2ada215ee84bd1ccc9fb97781697f4 not found: ID does not exist" containerID="d6896d120aa9f441fb789562392aed1b1c2ada215ee84bd1ccc9fb97781697f4" Sep 30 13:53:22 crc kubenswrapper[4672]: I0930 13:53:22.741075 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d6896d120aa9f441fb789562392aed1b1c2ada215ee84bd1ccc9fb97781697f4"} err="failed to get container status \"d6896d120aa9f441fb789562392aed1b1c2ada215ee84bd1ccc9fb97781697f4\": rpc error: code = NotFound desc = could not find container \"d6896d120aa9f441fb789562392aed1b1c2ada215ee84bd1ccc9fb97781697f4\": container with ID starting with d6896d120aa9f441fb789562392aed1b1c2ada215ee84bd1ccc9fb97781697f4 not found: ID does not exist" Sep 30 13:53:22 crc kubenswrapper[4672]: I0930 13:53:22.741102 4672 scope.go:117] "RemoveContainer" containerID="fb817ff69bc93f7aee4d4cfe9c5f365f57016825dd1dca2b8b59d710518e5992" Sep 30 13:53:22 crc 
kubenswrapper[4672]: E0930 13:53:22.741981 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fb817ff69bc93f7aee4d4cfe9c5f365f57016825dd1dca2b8b59d710518e5992\": container with ID starting with fb817ff69bc93f7aee4d4cfe9c5f365f57016825dd1dca2b8b59d710518e5992 not found: ID does not exist" containerID="fb817ff69bc93f7aee4d4cfe9c5f365f57016825dd1dca2b8b59d710518e5992" Sep 30 13:53:22 crc kubenswrapper[4672]: I0930 13:53:22.742020 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb817ff69bc93f7aee4d4cfe9c5f365f57016825dd1dca2b8b59d710518e5992"} err="failed to get container status \"fb817ff69bc93f7aee4d4cfe9c5f365f57016825dd1dca2b8b59d710518e5992\": rpc error: code = NotFound desc = could not find container \"fb817ff69bc93f7aee4d4cfe9c5f365f57016825dd1dca2b8b59d710518e5992\": container with ID starting with fb817ff69bc93f7aee4d4cfe9c5f365f57016825dd1dca2b8b59d710518e5992 not found: ID does not exist" Sep 30 13:53:23 crc kubenswrapper[4672]: I0930 13:53:23.431405 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fac7df9f-5203-4273-85cb-fdfdbcb34766" path="/var/lib/kubelet/pods/fac7df9f-5203-4273-85cb-fdfdbcb34766/volumes" Sep 30 13:53:28 crc kubenswrapper[4672]: I0930 13:53:28.417581 4672 scope.go:117] "RemoveContainer" containerID="a0f4e52abdc31f3136a3c09d788135a4e897756a0e739d1de81796de70d2aa59" Sep 30 13:53:28 crc kubenswrapper[4672]: E0930 13:53:28.418204 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dpqrd_openshift-machine-config-operator(95794952-d817-48f2-8956-f7a310f8d1d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" Sep 30 13:53:43 crc kubenswrapper[4672]: I0930 13:53:43.417987 4672 scope.go:117] "RemoveContainer" containerID="a0f4e52abdc31f3136a3c09d788135a4e897756a0e739d1de81796de70d2aa59" Sep 30 13:53:43 crc kubenswrapper[4672]: E0930 13:53:43.419357 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dpqrd_openshift-machine-config-operator(95794952-d817-48f2-8956-f7a310f8d1d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" Sep 30 13:53:55 crc kubenswrapper[4672]: I0930 13:53:55.417458 4672 scope.go:117] "RemoveContainer" containerID="a0f4e52abdc31f3136a3c09d788135a4e897756a0e739d1de81796de70d2aa59" Sep 30 13:53:55 crc kubenswrapper[4672]: E0930 13:53:55.418529 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dpqrd_openshift-machine-config-operator(95794952-d817-48f2-8956-f7a310f8d1d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" Sep 30 13:54:09 crc kubenswrapper[4672]: I0930 13:54:09.303409 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-b9cj4/must-gather-vcd5s"] Sep 30 13:54:09 crc kubenswrapper[4672]: E0930 13:54:09.304539 4672 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa8affe9-0a0d-4bd9-8052-24ba3eccc8d7" containerName="registry-server" Sep 30 13:54:09 crc kubenswrapper[4672]: I0930 13:54:09.304557 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa8affe9-0a0d-4bd9-8052-24ba3eccc8d7" containerName="registry-server" Sep 30 13:54:09 crc kubenswrapper[4672]: E0930 13:54:09.304580 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fac7df9f-5203-4273-85cb-fdfdbcb34766" containerName="copy" Sep 30 13:54:09 crc kubenswrapper[4672]: I0930 13:54:09.304587 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="fac7df9f-5203-4273-85cb-fdfdbcb34766" containerName="copy" Sep 30 13:54:09 crc kubenswrapper[4672]: E0930 13:54:09.304607 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa8affe9-0a0d-4bd9-8052-24ba3eccc8d7" containerName="extract-content" Sep 30 13:54:09 crc kubenswrapper[4672]: I0930 13:54:09.304615 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa8affe9-0a0d-4bd9-8052-24ba3eccc8d7" containerName="extract-content" Sep 30 13:54:09 crc kubenswrapper[4672]: E0930 13:54:09.304634 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fac7df9f-5203-4273-85cb-fdfdbcb34766" containerName="gather" Sep 30 13:54:09 crc kubenswrapper[4672]: I0930 13:54:09.304641 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="fac7df9f-5203-4273-85cb-fdfdbcb34766" containerName="gather" Sep 30 13:54:09 crc kubenswrapper[4672]: E0930 13:54:09.304676 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa8affe9-0a0d-4bd9-8052-24ba3eccc8d7" containerName="extract-utilities" Sep 30 13:54:09 crc kubenswrapper[4672]: I0930 13:54:09.304686 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa8affe9-0a0d-4bd9-8052-24ba3eccc8d7" containerName="extract-utilities" Sep 30 13:54:09 crc kubenswrapper[4672]: I0930 13:54:09.304912 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="fac7df9f-5203-4273-85cb-fdfdbcb34766" containerName="gather" Sep 30 13:54:09 crc kubenswrapper[4672]: I0930 13:54:09.304939 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa8affe9-0a0d-4bd9-8052-24ba3eccc8d7" containerName="registry-server" Sep 30 13:54:09 crc kubenswrapper[4672]: I0930 13:54:09.304956 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="fac7df9f-5203-4273-85cb-fdfdbcb34766" containerName="copy" Sep 30 13:54:09 crc kubenswrapper[4672]: I0930 13:54:09.306410 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-b9cj4/must-gather-vcd5s" Sep 30 13:54:09 crc kubenswrapper[4672]: I0930 13:54:09.309809 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-b9cj4"/"default-dockercfg-rdpzz" Sep 30 13:54:09 crc kubenswrapper[4672]: I0930 13:54:09.309847 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-b9cj4"/"openshift-service-ca.crt" Sep 30 13:54:09 crc kubenswrapper[4672]: I0930 13:54:09.310195 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-b9cj4"/"kube-root-ca.crt" Sep 30 13:54:09 crc kubenswrapper[4672]: I0930 13:54:09.344360 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-b9cj4/must-gather-vcd5s"] Sep 30 13:54:09 crc kubenswrapper[4672]: I0930 13:54:09.432687 4672 scope.go:117] "RemoveContainer" containerID="a0f4e52abdc31f3136a3c09d788135a4e897756a0e739d1de81796de70d2aa59" Sep 30 13:54:09 crc kubenswrapper[4672]: E0930 13:54:09.433671 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dpqrd_openshift-machine-config-operator(95794952-d817-48f2-8956-f7a310f8d1d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" Sep 30 13:54:09 crc kubenswrapper[4672]: I0930 13:54:09.468587 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/23d735d5-5887-469a-a9a7-973647e56895-must-gather-output\") pod \"must-gather-vcd5s\" (UID: \"23d735d5-5887-469a-a9a7-973647e56895\") " pod="openshift-must-gather-b9cj4/must-gather-vcd5s" Sep 30 13:54:09 crc kubenswrapper[4672]: I0930 13:54:09.468654 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-76sj4\" (UniqueName: \"kubernetes.io/projected/23d735d5-5887-469a-a9a7-973647e56895-kube-api-access-76sj4\") pod \"must-gather-vcd5s\" (UID: \"23d735d5-5887-469a-a9a7-973647e56895\") " pod="openshift-must-gather-b9cj4/must-gather-vcd5s" Sep 30 13:54:09 crc kubenswrapper[4672]: I0930 13:54:09.571202 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/23d735d5-5887-469a-a9a7-973647e56895-must-gather-output\") pod \"must-gather-vcd5s\" (UID: \"23d735d5-5887-469a-a9a7-973647e56895\") " pod="openshift-must-gather-b9cj4/must-gather-vcd5s" Sep 30 13:54:09 crc kubenswrapper[4672]: I0930 13:54:09.571309 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-76sj4\" (UniqueName: \"kubernetes.io/projected/23d735d5-5887-469a-a9a7-973647e56895-kube-api-access-76sj4\") pod \"must-gather-vcd5s\" (UID: \"23d735d5-5887-469a-a9a7-973647e56895\") " pod="openshift-must-gather-b9cj4/must-gather-vcd5s" Sep 30 13:54:09 crc kubenswrapper[4672]: I0930 13:54:09.571955 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/23d735d5-5887-469a-a9a7-973647e56895-must-gather-output\") pod \"must-gather-vcd5s\" (UID: \"23d735d5-5887-469a-a9a7-973647e56895\") " pod="openshift-must-gather-b9cj4/must-gather-vcd5s" Sep 30 13:54:09 crc kubenswrapper[4672]: I0930 13:54:09.592995 4672 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-76sj4\" (UniqueName: \"kubernetes.io/projected/23d735d5-5887-469a-a9a7-973647e56895-kube-api-access-76sj4\") pod \"must-gather-vcd5s\" (UID: \"23d735d5-5887-469a-a9a7-973647e56895\") " pod="openshift-must-gather-b9cj4/must-gather-vcd5s" Sep 30 13:54:09 crc kubenswrapper[4672]: I0930 13:54:09.639423 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-b9cj4/must-gather-vcd5s" Sep 30 13:54:10 crc kubenswrapper[4672]: I0930 13:54:10.160115 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-b9cj4/must-gather-vcd5s"] Sep 30 13:54:11 crc kubenswrapper[4672]: I0930 13:54:11.139409 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-b9cj4/must-gather-vcd5s" event={"ID":"23d735d5-5887-469a-a9a7-973647e56895","Type":"ContainerStarted","Data":"21fa94d1b1be4488474e38c75cb7d9b08aa62a6bf64addce0e81cf0b0cf7adc8"} Sep 30 13:54:11 crc kubenswrapper[4672]: I0930 13:54:11.139746 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-b9cj4/must-gather-vcd5s" event={"ID":"23d735d5-5887-469a-a9a7-973647e56895","Type":"ContainerStarted","Data":"ea28a837405ac8035545c0a1cdf20d8a6d61a0b2f996f21de521780e5a8fa3cb"} Sep 30 13:54:11 crc kubenswrapper[4672]: I0930 13:54:11.139761 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-b9cj4/must-gather-vcd5s" event={"ID":"23d735d5-5887-469a-a9a7-973647e56895","Type":"ContainerStarted","Data":"3e54914fd5f607f8b9952243056b94746a7f657a31916ed3c73953cb758602c3"} Sep 30 13:54:11 crc kubenswrapper[4672]: I0930 13:54:11.168529 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-b9cj4/must-gather-vcd5s" podStartSLOduration=2.168506042 podStartE2EDuration="2.168506042s" podCreationTimestamp="2025-09-30 13:54:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:54:11.15624604 +0000 UTC m=+5542.425483706" watchObservedRunningTime="2025-09-30 13:54:11.168506042 +0000 UTC m=+5542.437743708" Sep 30 13:54:14 crc kubenswrapper[4672]: I0930 13:54:14.251471 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-b9cj4/crc-debug-b4b99"] Sep 30 13:54:14 crc kubenswrapper[4672]: I0930 13:54:14.253699 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-b9cj4/crc-debug-b4b99" Sep 30 13:54:14 crc kubenswrapper[4672]: I0930 13:54:14.372204 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rhhlh\" (UniqueName: \"kubernetes.io/projected/7395867b-6ba9-498f-80a2-c10013643700-kube-api-access-rhhlh\") pod \"crc-debug-b4b99\" (UID: \"7395867b-6ba9-498f-80a2-c10013643700\") " pod="openshift-must-gather-b9cj4/crc-debug-b4b99" Sep 30 13:54:14 crc kubenswrapper[4672]: I0930 13:54:14.372297 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7395867b-6ba9-498f-80a2-c10013643700-host\") pod \"crc-debug-b4b99\" (UID: \"7395867b-6ba9-498f-80a2-c10013643700\") " pod="openshift-must-gather-b9cj4/crc-debug-b4b99" Sep 30 13:54:14 crc kubenswrapper[4672]: I0930 13:54:14.473567 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7395867b-6ba9-498f-80a2-c10013643700-host\") pod \"crc-debug-b4b99\" (UID: \"7395867b-6ba9-498f-80a2-c10013643700\") " pod="openshift-must-gather-b9cj4/crc-debug-b4b99" Sep 30 13:54:14 crc kubenswrapper[4672]: I0930 13:54:14.473764 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rhhlh\" (UniqueName: \"kubernetes.io/projected/7395867b-6ba9-498f-80a2-c10013643700-kube-api-access-rhhlh\") pod \"crc-debug-b4b99\" (UID: \"7395867b-6ba9-498f-80a2-c10013643700\") " pod="openshift-must-gather-b9cj4/crc-debug-b4b99" Sep 30 13:54:14 crc kubenswrapper[4672]: I0930 13:54:14.473789 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7395867b-6ba9-498f-80a2-c10013643700-host\") pod \"crc-debug-b4b99\" (UID: \"7395867b-6ba9-498f-80a2-c10013643700\") " pod="openshift-must-gather-b9cj4/crc-debug-b4b99" Sep 30 13:54:14 crc kubenswrapper[4672]: I0930 13:54:14.497182 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rhhlh\" (UniqueName: \"kubernetes.io/projected/7395867b-6ba9-498f-80a2-c10013643700-kube-api-access-rhhlh\") pod \"crc-debug-b4b99\" (UID: \"7395867b-6ba9-498f-80a2-c10013643700\") " pod="openshift-must-gather-b9cj4/crc-debug-b4b99" Sep 30 13:54:14 crc kubenswrapper[4672]: I0930 13:54:14.572323 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-b9cj4/crc-debug-b4b99" Sep 30 13:54:14 crc kubenswrapper[4672]: W0930 13:54:14.607542 4672 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7395867b_6ba9_498f_80a2_c10013643700.slice/crio-b0386d3f8b7d9172c203f374bf2f84ba91453302cdcf7e886d01cb22fc684bcf WatchSource:0}: Error finding container b0386d3f8b7d9172c203f374bf2f84ba91453302cdcf7e886d01cb22fc684bcf: Status 404 returned error can't find the container with id b0386d3f8b7d9172c203f374bf2f84ba91453302cdcf7e886d01cb22fc684bcf Sep 30 13:54:15 crc kubenswrapper[4672]: I0930 13:54:15.193001 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-b9cj4/crc-debug-b4b99" event={"ID":"7395867b-6ba9-498f-80a2-c10013643700","Type":"ContainerStarted","Data":"74bc5b5cac8663255bf1ba8eba84b26e1bf395650c324e35b6273373df14c147"} Sep 30 13:54:15 crc kubenswrapper[4672]: I0930 13:54:15.193539 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-b9cj4/crc-debug-b4b99" event={"ID":"7395867b-6ba9-498f-80a2-c10013643700","Type":"ContainerStarted","Data":"b0386d3f8b7d9172c203f374bf2f84ba91453302cdcf7e886d01cb22fc684bcf"} Sep 30 13:54:15 crc kubenswrapper[4672]: I0930 13:54:15.218999 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-b9cj4/crc-debug-b4b99" podStartSLOduration=1.21897893 podStartE2EDuration="1.21897893s" podCreationTimestamp="2025-09-30 13:54:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:54:15.212844794 +0000 UTC m=+5546.482082440" watchObservedRunningTime="2025-09-30 13:54:15.21897893 +0000 UTC m=+5546.488216576" Sep 30 13:54:24 crc kubenswrapper[4672]: I0930 13:54:24.417459 4672 scope.go:117] "RemoveContainer" containerID="a0f4e52abdc31f3136a3c09d788135a4e897756a0e739d1de81796de70d2aa59" Sep 30 13:54:24 crc kubenswrapper[4672]: E0930 13:54:24.418251 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dpqrd_openshift-machine-config-operator(95794952-d817-48f2-8956-f7a310f8d1d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" Sep 30 13:54:39 crc kubenswrapper[4672]: I0930 13:54:39.424187 4672 scope.go:117] "RemoveContainer" containerID="a0f4e52abdc31f3136a3c09d788135a4e897756a0e739d1de81796de70d2aa59" Sep 30 13:54:40 crc kubenswrapper[4672]: I0930 13:54:40.424529 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" event={"ID":"95794952-d817-48f2-8956-f7a310f8d1d9","Type":"ContainerStarted","Data":"966f5125052c74b4e7c550103d16e6fa385f49fb5552ce6b0019712558a5916e"} Sep 30 13:55:22 crc kubenswrapper[4672]: I0930 13:55:22.253630 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-ptvfl"] Sep 30 13:55:22 crc kubenswrapper[4672]: I0930 13:55:22.257097 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-ptvfl" Sep 30 13:55:22 crc kubenswrapper[4672]: I0930 13:55:22.271361 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-ptvfl"] Sep 30 13:55:22 crc kubenswrapper[4672]: I0930 13:55:22.314901 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6s7ll\" (UniqueName: \"kubernetes.io/projected/cc611d86-5c28-48fc-b837-2971f2056398-kube-api-access-6s7ll\") pod \"certified-operators-ptvfl\" (UID: \"cc611d86-5c28-48fc-b837-2971f2056398\") " pod="openshift-marketplace/certified-operators-ptvfl" Sep 30 13:55:22 crc kubenswrapper[4672]: I0930 13:55:22.314947 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc611d86-5c28-48fc-b837-2971f2056398-utilities\") pod \"certified-operators-ptvfl\" (UID: \"cc611d86-5c28-48fc-b837-2971f2056398\") " pod="openshift-marketplace/certified-operators-ptvfl" Sep 30 13:55:22 crc kubenswrapper[4672]: I0930 13:55:22.314976 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc611d86-5c28-48fc-b837-2971f2056398-catalog-content\") pod \"certified-operators-ptvfl\" (UID: \"cc611d86-5c28-48fc-b837-2971f2056398\") " pod="openshift-marketplace/certified-operators-ptvfl" Sep 30 13:55:22 crc kubenswrapper[4672]: I0930 13:55:22.418027 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6s7ll\" (UniqueName: \"kubernetes.io/projected/cc611d86-5c28-48fc-b837-2971f2056398-kube-api-access-6s7ll\") pod \"certified-operators-ptvfl\" (UID: \"cc611d86-5c28-48fc-b837-2971f2056398\") " pod="openshift-marketplace/certified-operators-ptvfl" Sep 30 13:55:22 crc kubenswrapper[4672]: I0930 13:55:22.418088 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc611d86-5c28-48fc-b837-2971f2056398-utilities\") pod \"certified-operators-ptvfl\" (UID: \"cc611d86-5c28-48fc-b837-2971f2056398\") " pod="openshift-marketplace/certified-operators-ptvfl" Sep 30 13:55:22 crc kubenswrapper[4672]: I0930 13:55:22.418120 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc611d86-5c28-48fc-b837-2971f2056398-catalog-content\") pod \"certified-operators-ptvfl\" (UID: \"cc611d86-5c28-48fc-b837-2971f2056398\") " pod="openshift-marketplace/certified-operators-ptvfl" Sep 30 13:55:22 crc kubenswrapper[4672]: I0930 13:55:22.418748 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc611d86-5c28-48fc-b837-2971f2056398-catalog-content\") pod \"certified-operators-ptvfl\" (UID: \"cc611d86-5c28-48fc-b837-2971f2056398\") " pod="openshift-marketplace/certified-operators-ptvfl" Sep 30 13:55:22 crc kubenswrapper[4672]: I0930 13:55:22.420014 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc611d86-5c28-48fc-b837-2971f2056398-utilities\") pod \"certified-operators-ptvfl\" (UID: \"cc611d86-5c28-48fc-b837-2971f2056398\") " pod="openshift-marketplace/certified-operators-ptvfl" Sep 30 13:55:22 crc kubenswrapper[4672]: I0930 13:55:22.446063 4672 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-6s7ll\" (UniqueName: \"kubernetes.io/projected/cc611d86-5c28-48fc-b837-2971f2056398-kube-api-access-6s7ll\") pod \"certified-operators-ptvfl\" (UID: \"cc611d86-5c28-48fc-b837-2971f2056398\") " pod="openshift-marketplace/certified-operators-ptvfl" Sep 30 13:55:22 crc kubenswrapper[4672]: I0930 13:55:22.643818 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ptvfl" Sep 30 13:55:23 crc kubenswrapper[4672]: I0930 13:55:23.306636 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-ptvfl"] Sep 30 13:55:23 crc kubenswrapper[4672]: I0930 13:55:23.873721 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ptvfl" event={"ID":"cc611d86-5c28-48fc-b837-2971f2056398","Type":"ContainerStarted","Data":"49bd91411ae9277c08a97b1ee3a5a5a999b442898881db75dd5b8dd27c440bab"} Sep 30 13:55:24 crc kubenswrapper[4672]: I0930 13:55:24.888162 4672 generic.go:334] "Generic (PLEG): container finished" podID="cc611d86-5c28-48fc-b837-2971f2056398" containerID="75825a39fcf305ff309bcf2868873db3034d5f3a01cea1af7bc62b2e40406560" exitCode=0 Sep 30 13:55:24 crc kubenswrapper[4672]: I0930 13:55:24.888308 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ptvfl" event={"ID":"cc611d86-5c28-48fc-b837-2971f2056398","Type":"ContainerDied","Data":"75825a39fcf305ff309bcf2868873db3034d5f3a01cea1af7bc62b2e40406560"} Sep 30 13:55:24 crc kubenswrapper[4672]: I0930 13:55:24.891734 4672 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Sep 30 13:55:27 crc kubenswrapper[4672]: I0930 13:55:27.931224 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ptvfl" event={"ID":"cc611d86-5c28-48fc-b837-2971f2056398","Type":"ContainerStarted","Data":"9fbaed5e0cd7a4f547e6f7ed9f6d4a00e978537ac6f5891193b4b28c828b7c1e"} Sep 30 13:55:30 crc kubenswrapper[4672]: I0930 13:55:30.976519 4672 generic.go:334] "Generic (PLEG): container finished" podID="cc611d86-5c28-48fc-b837-2971f2056398" containerID="9fbaed5e0cd7a4f547e6f7ed9f6d4a00e978537ac6f5891193b4b28c828b7c1e" exitCode=0 Sep 30 13:55:30 crc kubenswrapper[4672]: I0930 13:55:30.976703 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ptvfl" event={"ID":"cc611d86-5c28-48fc-b837-2971f2056398","Type":"ContainerDied","Data":"9fbaed5e0cd7a4f547e6f7ed9f6d4a00e978537ac6f5891193b4b28c828b7c1e"} Sep 30 13:55:31 crc kubenswrapper[4672]: I0930 13:55:31.989338 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ptvfl" event={"ID":"cc611d86-5c28-48fc-b837-2971f2056398","Type":"ContainerStarted","Data":"05278a6ce0cacb6db6f6f53f98ff99dc995d43edfd9ad820a2b2a983bfea0df5"} Sep 30 13:55:32 crc kubenswrapper[4672]: I0930 13:55:32.013879 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-ptvfl" podStartSLOduration=3.305810164 podStartE2EDuration="10.013857023s" podCreationTimestamp="2025-09-30 13:55:22 +0000 UTC" firstStartedPulling="2025-09-30 13:55:24.891424563 +0000 UTC m=+5616.160662209" lastFinishedPulling="2025-09-30 13:55:31.599471422 +0000 UTC m=+5622.868709068" observedRunningTime="2025-09-30 13:55:32.006606038 +0000 UTC m=+5623.275843694" watchObservedRunningTime="2025-09-30 
13:55:32.013857023 +0000 UTC m=+5623.283094679" Sep 30 13:55:32 crc kubenswrapper[4672]: I0930 13:55:32.662506 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-ptvfl" Sep 30 13:55:32 crc kubenswrapper[4672]: I0930 13:55:32.662551 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-ptvfl" Sep 30 13:55:33 crc kubenswrapper[4672]: I0930 13:55:33.738562 4672 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-ptvfl" podUID="cc611d86-5c28-48fc-b837-2971f2056398" containerName="registry-server" probeResult="failure" output=< Sep 30 13:55:33 crc kubenswrapper[4672]: timeout: failed to connect service ":50051" within 1s Sep 30 13:55:33 crc kubenswrapper[4672]: > Sep 30 13:55:35 crc kubenswrapper[4672]: I0930 13:55:35.233536 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-7c888d4d9d-4nr45_7b67355f-8081-4014-ad68-6e31faa794b1/barbican-api/0.log" Sep 30 13:55:35 crc kubenswrapper[4672]: I0930 13:55:35.242933 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-7c888d4d9d-4nr45_7b67355f-8081-4014-ad68-6e31faa794b1/barbican-api-log/0.log" Sep 30 13:55:35 crc kubenswrapper[4672]: I0930 13:55:35.446669 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-654fbcfdf6-vvhwm_7212b048-8992-4678-b985-05a5c1fc8818/barbican-keystone-listener/0.log" Sep 30 13:55:35 crc kubenswrapper[4672]: I0930 13:55:35.502903 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-654fbcfdf6-vvhwm_7212b048-8992-4678-b985-05a5c1fc8818/barbican-keystone-listener-log/0.log" Sep 30 13:55:35 crc kubenswrapper[4672]: I0930 13:55:35.620858 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-6bd7cdf79c-ddf9q_e338321e-04aa-4ed3-8b3c-0baf6888f64f/barbican-worker/0.log" Sep 30 13:55:35 crc kubenswrapper[4672]: I0930 13:55:35.721387 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-6bd7cdf79c-ddf9q_e338321e-04aa-4ed3-8b3c-0baf6888f64f/barbican-worker-log/0.log" Sep 30 13:55:35 crc kubenswrapper[4672]: I0930 13:55:35.953046 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-cksj8_91de1b76-2b84-4d21-9683-d7aee98fb876/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 13:55:36 crc kubenswrapper[4672]: I0930 13:55:36.119567 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_daa9e3a4-cb9a-428e-8a81-2ed6d9e1c461/ceilometer-central-agent/0.log" Sep 30 13:55:36 crc kubenswrapper[4672]: I0930 13:55:36.171915 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_daa9e3a4-cb9a-428e-8a81-2ed6d9e1c461/ceilometer-notification-agent/0.log" Sep 30 13:55:36 crc kubenswrapper[4672]: I0930 13:55:36.183450 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_daa9e3a4-cb9a-428e-8a81-2ed6d9e1c461/proxy-httpd/0.log" Sep 30 13:55:36 crc kubenswrapper[4672]: I0930 13:55:36.311190 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_daa9e3a4-cb9a-428e-8a81-2ed6d9e1c461/sg-core/0.log" Sep 30 13:55:36 crc kubenswrapper[4672]: I0930 13:55:36.592287 4672 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_cinder-api-0_dca6226f-d8f3-4d33-9d30-20bb6cf8a3fd/cinder-api-log/0.log" Sep 30 13:55:36 crc kubenswrapper[4672]: I0930 13:55:36.726516 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_dca6226f-d8f3-4d33-9d30-20bb6cf8a3fd/cinder-api/0.log" Sep 30 13:55:36 crc kubenswrapper[4672]: I0930 13:55:36.858832 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_3140cbea-70fb-4d82-90d3-fa12c43fcf76/cinder-scheduler/0.log" Sep 30 13:55:36 crc kubenswrapper[4672]: I0930 13:55:36.918024 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_3140cbea-70fb-4d82-90d3-fa12c43fcf76/probe/0.log" Sep 30 13:55:37 crc kubenswrapper[4672]: I0930 13:55:37.082751 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-kft7n_6354da04-65da-4562-9e78-563e1fb4f4fe/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 13:55:37 crc kubenswrapper[4672]: I0930 13:55:37.285525 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-29qlm_847b8779-d63c-4bbb-9b51-94a2c102e36d/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 13:55:37 crc kubenswrapper[4672]: I0930 13:55:37.395679 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-kxjhd_29628904-dd3c-4ce7-a114-552159673def/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 13:55:37 crc kubenswrapper[4672]: I0930 13:55:37.571598 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-64bcc76c55-fkj7b_cdfacf2c-b616-4c40-b16e-ec39de0d0e21/init/0.log" Sep 30 13:55:37 crc kubenswrapper[4672]: I0930 13:55:37.891367 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-64bcc76c55-fkj7b_cdfacf2c-b616-4c40-b16e-ec39de0d0e21/init/0.log" Sep 30 13:55:37 crc kubenswrapper[4672]: I0930 13:55:37.924438 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-64bcc76c55-fkj7b_cdfacf2c-b616-4c40-b16e-ec39de0d0e21/dnsmasq-dns/0.log" Sep 30 13:55:38 crc kubenswrapper[4672]: I0930 13:55:38.136865 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-vl4f8_a88c5cde-cba5-457b-8044-77ed9db4c080/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 13:55:38 crc kubenswrapper[4672]: I0930 13:55:38.160196 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_d7983763-9bc4-4528-9cf4-f2d693c42c5f/glance-httpd/0.log" Sep 30 13:55:38 crc kubenswrapper[4672]: I0930 13:55:38.314417 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_d7983763-9bc4-4528-9cf4-f2d693c42c5f/glance-log/0.log" Sep 30 13:55:38 crc kubenswrapper[4672]: I0930 13:55:38.364164 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_b221aeb1-8ec1-4805-9f6f-174a3a9ecd8a/glance-httpd/0.log" Sep 30 13:55:38 crc kubenswrapper[4672]: I0930 13:55:38.410059 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_b221aeb1-8ec1-4805-9f6f-174a3a9ecd8a/glance-log/0.log" Sep 30 13:55:38 crc kubenswrapper[4672]: I0930 13:55:38.693469 4672 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_horizon-844b6c9474-6tpzt_2659b35e-ecb1-416b-8a94-690759645536/horizon/0.log" Sep 30 13:55:38 crc kubenswrapper[4672]: I0930 13:55:38.770319 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-qjc69_e44e3b27-d209-49da-93b2-ed646da0650e/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 13:55:38 crc kubenswrapper[4672]: I0930 13:55:38.932092 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-669k5_5e4bf526-356e-4b1f-a69e-7da92365808d/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 13:55:39 crc kubenswrapper[4672]: I0930 13:55:39.253979 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-844b6c9474-6tpzt_2659b35e-ecb1-416b-8a94-690759645536/horizon-log/0.log" Sep 30 13:55:39 crc kubenswrapper[4672]: I0930 13:55:39.310760 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29320621-r2bv7_a0837a49-9f57-447b-8da5-feef49bf42f0/keystone-cron/0.log" Sep 30 13:55:39 crc kubenswrapper[4672]: I0930 13:55:39.468558 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_32acef5a-c440-4574-9a53-18754f15acc6/kube-state-metrics/0.log" Sep 30 13:55:39 crc kubenswrapper[4672]: I0930 13:55:39.600544 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-c5bf9886d-9nhb9_1871c14e-9602-478e-888f-31d273376456/keystone-api/0.log" Sep 30 13:55:39 crc kubenswrapper[4672]: I0930 13:55:39.749637 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-zq47z_883bcbaa-0233-4f4d-8463-f451155bc618/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 13:55:40 crc kubenswrapper[4672]: I0930 13:55:40.236976 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-6b7c4888f-v7vn2_f394ad91-f6fb-4d7a-8508-d8fede494686/neutron-httpd/0.log" Sep 30 13:55:40 crc kubenswrapper[4672]: I0930 13:55:40.290138 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-6b7c4888f-v7vn2_f394ad91-f6fb-4d7a-8508-d8fede494686/neutron-api/0.log" Sep 30 13:55:40 crc kubenswrapper[4672]: I0930 13:55:40.404749 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-4fh64_ebb7b5d2-78ee-459d-a8c2-4a5ffb193df6/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 13:55:40 crc kubenswrapper[4672]: I0930 13:55:40.973551 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_34e59a30-14b8-4736-87b1-9d9581094598/memcached/0.log" Sep 30 13:55:41 crc kubenswrapper[4672]: I0930 13:55:41.303190 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_fdadbc89-4050-4b7f-bf2b-70e405b18974/nova-cell0-conductor-conductor/0.log" Sep 30 13:55:41 crc kubenswrapper[4672]: I0930 13:55:41.641433 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_5017308f-acf6-406c-8c75-3f6b550f8190/nova-cell1-conductor-conductor/0.log" Sep 30 13:55:41 crc kubenswrapper[4672]: I0930 13:55:41.858134 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_77091ef9-bf9b-4b0b-aacd-c46a576974a8/nova-api-log/0.log" Sep 30 13:55:41 crc kubenswrapper[4672]: I0930 13:55:41.927893 4672 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-cell1-novncproxy-0_92487d22-391c-44e2-8179-1e523ab07026/nova-cell1-novncproxy-novncproxy/0.log" Sep 30 13:55:42 crc kubenswrapper[4672]: I0930 13:55:42.136917 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-qmh4q_727d1f8a-6b85-4184-b669-3fe8b94c608a/nova-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 13:55:42 crc kubenswrapper[4672]: I0930 13:55:42.236820 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_db8c9818-e7bc-471f-b7d6-b097f3657451/nova-metadata-log/0.log" Sep 30 13:55:42 crc kubenswrapper[4672]: I0930 13:55:42.280973 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_77091ef9-bf9b-4b0b-aacd-c46a576974a8/nova-api-api/0.log" Sep 30 13:55:42 crc kubenswrapper[4672]: I0930 13:55:42.771416 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_9159d76a-52b7-4262-a56a-ed28caec7f97/mysql-bootstrap/0.log" Sep 30 13:55:42 crc kubenswrapper[4672]: I0930 13:55:42.849759 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_e9e19e97-1725-4846-93ef-b00bf092908b/nova-scheduler-scheduler/0.log" Sep 30 13:55:43 crc kubenswrapper[4672]: I0930 13:55:43.015494 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_9159d76a-52b7-4262-a56a-ed28caec7f97/galera/0.log" Sep 30 13:55:43 crc kubenswrapper[4672]: I0930 13:55:43.029686 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_9159d76a-52b7-4262-a56a-ed28caec7f97/mysql-bootstrap/0.log" Sep 30 13:55:43 crc kubenswrapper[4672]: I0930 13:55:43.223363 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_056e0424-1faf-4d5a-8aea-e351214b3394/mysql-bootstrap/0.log" Sep 30 13:55:43 crc kubenswrapper[4672]: I0930 13:55:43.431018 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_056e0424-1faf-4d5a-8aea-e351214b3394/mysql-bootstrap/0.log" Sep 30 13:55:43 crc kubenswrapper[4672]: I0930 13:55:43.446815 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_056e0424-1faf-4d5a-8aea-e351214b3394/galera/0.log" Sep 30 13:55:43 crc kubenswrapper[4672]: I0930 13:55:43.667185 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_28f655e1-08b3-4618-8864-2020e883f99c/openstackclient/0.log" Sep 30 13:55:43 crc kubenswrapper[4672]: I0930 13:55:43.695719 4672 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-ptvfl" podUID="cc611d86-5c28-48fc-b837-2971f2056398" containerName="registry-server" probeResult="failure" output=< Sep 30 13:55:43 crc kubenswrapper[4672]: timeout: failed to connect service ":50051" within 1s Sep 30 13:55:43 crc kubenswrapper[4672]: > Sep 30 13:55:43 crc kubenswrapper[4672]: I0930 13:55:43.718661 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-gln7l_f11c1379-e576-40be-a37a-1d73f84cab81/openstack-network-exporter/0.log" Sep 30 13:55:43 crc kubenswrapper[4672]: I0930 13:55:43.913048 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-gb7pq_71bceb54-c562-417a-8897-525930836f44/ovsdb-server-init/0.log" Sep 30 13:55:44 crc kubenswrapper[4672]: I0930 13:55:44.035161 4672 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-metadata-0_db8c9818-e7bc-471f-b7d6-b097f3657451/nova-metadata-metadata/0.log" Sep 30 13:55:44 crc kubenswrapper[4672]: I0930 13:55:44.142782 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-gb7pq_71bceb54-c562-417a-8897-525930836f44/ovsdb-server-init/0.log" Sep 30 13:55:44 crc kubenswrapper[4672]: I0930 13:55:44.172073 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-gb7pq_71bceb54-c562-417a-8897-525930836f44/ovsdb-server/0.log" Sep 30 13:55:44 crc kubenswrapper[4672]: I0930 13:55:44.375165 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-vhs7r_9f35be26-490e-49db-bd31-32ce35c84fab/ovn-controller/0.log" Sep 30 13:55:44 crc kubenswrapper[4672]: I0930 13:55:44.379131 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-gb7pq_71bceb54-c562-417a-8897-525930836f44/ovs-vswitchd/0.log" Sep 30 13:55:44 crc kubenswrapper[4672]: I0930 13:55:44.540016 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-mfrgw_de8ad7f9-8f0f-46fb-8649-9dd7bb8abc16/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 13:55:44 crc kubenswrapper[4672]: I0930 13:55:44.577004 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_879d3e20-6df1-45be-bebd-b7e990e0aa5f/openstack-network-exporter/0.log" Sep 30 13:55:44 crc kubenswrapper[4672]: I0930 13:55:44.665839 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_879d3e20-6df1-45be-bebd-b7e990e0aa5f/ovn-northd/0.log" Sep 30 13:55:44 crc kubenswrapper[4672]: I0930 13:55:44.749943 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_d69b4d7c-be99-4405-aae9-8a11b85632b8/openstack-network-exporter/0.log" Sep 30 13:55:44 crc kubenswrapper[4672]: I0930 13:55:44.793282 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_d69b4d7c-be99-4405-aae9-8a11b85632b8/ovsdbserver-nb/0.log" Sep 30 13:55:44 crc kubenswrapper[4672]: I0930 13:55:44.905680 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_4e0bc671-11e7-442d-b5f3-4a901b0a0a80/openstack-network-exporter/0.log" Sep 30 13:55:44 crc kubenswrapper[4672]: I0930 13:55:44.941073 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_4e0bc671-11e7-442d-b5f3-4a901b0a0a80/ovsdbserver-sb/0.log" Sep 30 13:55:45 crc kubenswrapper[4672]: I0930 13:55:45.170025 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-6b77669d6-hlcjq_1834109c-113d-4231-94e6-0796ef06015d/placement-api/0.log" Sep 30 13:55:45 crc kubenswrapper[4672]: I0930 13:55:45.265760 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_5a41d16c-3326-4d4f-a01a-0f8c436aa9b0/init-config-reloader/0.log" Sep 30 13:55:45 crc kubenswrapper[4672]: I0930 13:55:45.293733 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-6b77669d6-hlcjq_1834109c-113d-4231-94e6-0796ef06015d/placement-log/0.log" Sep 30 13:55:45 crc kubenswrapper[4672]: I0930 13:55:45.477214 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_5a41d16c-3326-4d4f-a01a-0f8c436aa9b0/config-reloader/0.log" Sep 30 13:55:45 crc kubenswrapper[4672]: I0930 13:55:45.477486 4672 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_5a41d16c-3326-4d4f-a01a-0f8c436aa9b0/thanos-sidecar/0.log" Sep 30 13:55:45 crc kubenswrapper[4672]: I0930 13:55:45.477976 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_5a41d16c-3326-4d4f-a01a-0f8c436aa9b0/prometheus/0.log" Sep 30 13:55:45 crc kubenswrapper[4672]: I0930 13:55:45.509473 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_5a41d16c-3326-4d4f-a01a-0f8c436aa9b0/init-config-reloader/0.log" Sep 30 13:55:45 crc kubenswrapper[4672]: I0930 13:55:45.645784 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_00b691a7-21bd-4661-9b19-cae31a79f18e/setup-container/0.log" Sep 30 13:55:45 crc kubenswrapper[4672]: I0930 13:55:45.799986 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_00b691a7-21bd-4661-9b19-cae31a79f18e/setup-container/0.log" Sep 30 13:55:45 crc kubenswrapper[4672]: I0930 13:55:45.828873 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_00b691a7-21bd-4661-9b19-cae31a79f18e/rabbitmq/0.log" Sep 30 13:55:45 crc kubenswrapper[4672]: I0930 13:55:45.835477 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-notifications-server-0_d165a3a8-6809-46e5-bd35-895200ab5bfc/setup-container/0.log" Sep 30 13:55:46 crc kubenswrapper[4672]: I0930 13:55:46.027291 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-notifications-server-0_d165a3a8-6809-46e5-bd35-895200ab5bfc/setup-container/0.log" Sep 30 13:55:46 crc kubenswrapper[4672]: I0930 13:55:46.043878 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-notifications-server-0_d165a3a8-6809-46e5-bd35-895200ab5bfc/rabbitmq/0.log" Sep 30 13:55:46 crc kubenswrapper[4672]: I0930 13:55:46.063241 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_6cfe6bf3-4d65-49c0-a45b-484e53a12f80/setup-container/0.log" Sep 30 13:55:46 crc kubenswrapper[4672]: I0930 13:55:46.275700 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_6cfe6bf3-4d65-49c0-a45b-484e53a12f80/setup-container/0.log" Sep 30 13:55:46 crc kubenswrapper[4672]: I0930 13:55:46.313489 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-f2mbl_c3a5c5b9-3aa7-4fef-9450-0e82ab08bb3a/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 13:55:46 crc kubenswrapper[4672]: I0930 13:55:46.315174 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_6cfe6bf3-4d65-49c0-a45b-484e53a12f80/rabbitmq/0.log" Sep 30 13:55:46 crc kubenswrapper[4672]: I0930 13:55:46.490497 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-kz9kk_c67e6191-fd96-4caf-a6fd-6d5a7013f069/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 13:55:46 crc kubenswrapper[4672]: I0930 13:55:46.537342 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-bht9w_9ac7c321-e380-4baa-8233-0ec24fa6496f/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 13:55:46 crc kubenswrapper[4672]: I0930 13:55:46.691880 4672 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-bh9j6_754f9f15-2c0e-4279-aec1-589d1b23eb75/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 13:55:46 crc kubenswrapper[4672]: I0930 13:55:46.736914 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-9csw2_c303b53d-3c71-498b-99fb-432610f75b61/ssh-known-hosts-edpm-deployment/0.log" Sep 30 13:55:46 crc kubenswrapper[4672]: I0930 13:55:46.893519 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-76bc57698f-2dqjq_eb422aba-f5c2-4822-bd48-bba56e4dc451/proxy-server/0.log" Sep 30 13:55:47 crc kubenswrapper[4672]: I0930 13:55:47.093968 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-76bc57698f-2dqjq_eb422aba-f5c2-4822-bd48-bba56e4dc451/proxy-httpd/0.log" Sep 30 13:55:47 crc kubenswrapper[4672]: I0930 13:55:47.115118 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-d7dm7_4b567440-2a47-4032-bd02-6d7d53ea35b8/swift-ring-rebalance/0.log" Sep 30 13:55:47 crc kubenswrapper[4672]: I0930 13:55:47.189348 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_3cc34662-d100-4436-9067-c615b7b3f83f/account-auditor/0.log" Sep 30 13:55:47 crc kubenswrapper[4672]: I0930 13:55:47.447179 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_3cc34662-d100-4436-9067-c615b7b3f83f/account-reaper/0.log" Sep 30 13:55:47 crc kubenswrapper[4672]: I0930 13:55:47.482581 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_3cc34662-d100-4436-9067-c615b7b3f83f/account-replicator/0.log" Sep 30 13:55:47 crc kubenswrapper[4672]: I0930 13:55:47.545282 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_3cc34662-d100-4436-9067-c615b7b3f83f/account-server/0.log" Sep 30 13:55:47 crc kubenswrapper[4672]: I0930 13:55:47.546846 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_3cc34662-d100-4436-9067-c615b7b3f83f/container-auditor/0.log" Sep 30 13:55:47 crc kubenswrapper[4672]: I0930 13:55:47.661924 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_3cc34662-d100-4436-9067-c615b7b3f83f/container-replicator/0.log" Sep 30 13:55:47 crc kubenswrapper[4672]: I0930 13:55:47.676193 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_3cc34662-d100-4436-9067-c615b7b3f83f/container-server/0.log" Sep 30 13:55:47 crc kubenswrapper[4672]: I0930 13:55:47.773443 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_3cc34662-d100-4436-9067-c615b7b3f83f/container-updater/0.log" Sep 30 13:55:47 crc kubenswrapper[4672]: I0930 13:55:47.775500 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_3cc34662-d100-4436-9067-c615b7b3f83f/object-auditor/0.log" Sep 30 13:55:47 crc kubenswrapper[4672]: I0930 13:55:47.844440 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_3cc34662-d100-4436-9067-c615b7b3f83f/object-expirer/0.log" Sep 30 13:55:47 crc kubenswrapper[4672]: I0930 13:55:47.866003 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_3cc34662-d100-4436-9067-c615b7b3f83f/object-replicator/0.log" Sep 30 13:55:47 crc kubenswrapper[4672]: I0930 13:55:47.960904 4672 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_3cc34662-d100-4436-9067-c615b7b3f83f/object-updater/0.log" Sep 30 13:55:47 crc kubenswrapper[4672]: I0930 13:55:47.983327 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_3cc34662-d100-4436-9067-c615b7b3f83f/object-server/0.log" Sep 30 13:55:48 crc kubenswrapper[4672]: I0930 13:55:48.012493 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_3cc34662-d100-4436-9067-c615b7b3f83f/rsync/0.log" Sep 30 13:55:48 crc kubenswrapper[4672]: I0930 13:55:48.047003 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_3cc34662-d100-4436-9067-c615b7b3f83f/swift-recon-cron/0.log" Sep 30 13:55:48 crc kubenswrapper[4672]: I0930 13:55:48.181586 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-kqdc5_38c0d8da-6872-4108-aaa8-1b8fa2611fe5/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 13:55:48 crc kubenswrapper[4672]: I0930 13:55:48.281322 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_42b7a077-06bd-4f39-a1b7-e4692592ae68/tempest-tests-tempest-tests-runner/0.log" Sep 30 13:55:48 crc kubenswrapper[4672]: I0930 13:55:48.408883 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_44cf0b0d-200e-475c-b8df-965a362a13b9/test-operator-logs-container/0.log" Sep 30 13:55:48 crc kubenswrapper[4672]: I0930 13:55:48.479734 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-sjlq2_beb806bb-fd09-449f-939b-cccb4ffe11de/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 13:55:49 crc kubenswrapper[4672]: I0930 13:55:49.307492 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-applier-0_0944504c-77dc-42f3-a981-723fea76118c/watcher-applier/0.log" Sep 30 13:55:49 crc kubenswrapper[4672]: I0930 13:55:49.553573 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-api-0_179ce1e6-946c-4e3c-97d6-38764daf4214/watcher-api-log/0.log" Sep 30 13:55:49 crc kubenswrapper[4672]: I0930 13:55:49.570279 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-decision-engine-0_4f1bee84-650b-4f0b-a657-e6701ee51823/watcher-decision-engine/2.log" Sep 30 13:55:51 crc kubenswrapper[4672]: I0930 13:55:51.621732 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-decision-engine-0_4f1bee84-650b-4f0b-a657-e6701ee51823/watcher-decision-engine/3.log" Sep 30 13:55:52 crc kubenswrapper[4672]: I0930 13:55:52.683091 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-api-0_179ce1e6-946c-4e3c-97d6-38764daf4214/watcher-api/0.log" Sep 30 13:55:52 crc kubenswrapper[4672]: I0930 13:55:52.720277 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-ptvfl" Sep 30 13:55:52 crc kubenswrapper[4672]: I0930 13:55:52.771207 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-ptvfl" Sep 30 13:55:53 crc kubenswrapper[4672]: I0930 13:55:53.445994 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-ptvfl"] Sep 30 13:55:54 crc kubenswrapper[4672]: I0930 13:55:54.212630 4672 kuberuntime_container.go:808] 
"Killing container with a grace period" pod="openshift-marketplace/certified-operators-ptvfl" podUID="cc611d86-5c28-48fc-b837-2971f2056398" containerName="registry-server" containerID="cri-o://05278a6ce0cacb6db6f6f53f98ff99dc995d43edfd9ad820a2b2a983bfea0df5" gracePeriod=2 Sep 30 13:55:55 crc kubenswrapper[4672]: I0930 13:55:55.223815 4672 generic.go:334] "Generic (PLEG): container finished" podID="cc611d86-5c28-48fc-b837-2971f2056398" containerID="05278a6ce0cacb6db6f6f53f98ff99dc995d43edfd9ad820a2b2a983bfea0df5" exitCode=0 Sep 30 13:55:55 crc kubenswrapper[4672]: I0930 13:55:55.223896 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ptvfl" event={"ID":"cc611d86-5c28-48fc-b837-2971f2056398","Type":"ContainerDied","Data":"05278a6ce0cacb6db6f6f53f98ff99dc995d43edfd9ad820a2b2a983bfea0df5"} Sep 30 13:55:55 crc kubenswrapper[4672]: I0930 13:55:55.224089 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ptvfl" event={"ID":"cc611d86-5c28-48fc-b837-2971f2056398","Type":"ContainerDied","Data":"49bd91411ae9277c08a97b1ee3a5a5a999b442898881db75dd5b8dd27c440bab"} Sep 30 13:55:55 crc kubenswrapper[4672]: I0930 13:55:55.224107 4672 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="49bd91411ae9277c08a97b1ee3a5a5a999b442898881db75dd5b8dd27c440bab" Sep 30 13:55:55 crc kubenswrapper[4672]: I0930 13:55:55.261673 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ptvfl" Sep 30 13:55:55 crc kubenswrapper[4672]: I0930 13:55:55.375517 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc611d86-5c28-48fc-b837-2971f2056398-utilities\") pod \"cc611d86-5c28-48fc-b837-2971f2056398\" (UID: \"cc611d86-5c28-48fc-b837-2971f2056398\") " Sep 30 13:55:55 crc kubenswrapper[4672]: I0930 13:55:55.375625 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc611d86-5c28-48fc-b837-2971f2056398-catalog-content\") pod \"cc611d86-5c28-48fc-b837-2971f2056398\" (UID: \"cc611d86-5c28-48fc-b837-2971f2056398\") " Sep 30 13:55:55 crc kubenswrapper[4672]: I0930 13:55:55.375779 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6s7ll\" (UniqueName: \"kubernetes.io/projected/cc611d86-5c28-48fc-b837-2971f2056398-kube-api-access-6s7ll\") pod \"cc611d86-5c28-48fc-b837-2971f2056398\" (UID: \"cc611d86-5c28-48fc-b837-2971f2056398\") " Sep 30 13:55:55 crc kubenswrapper[4672]: I0930 13:55:55.376635 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cc611d86-5c28-48fc-b837-2971f2056398-utilities" (OuterVolumeSpecName: "utilities") pod "cc611d86-5c28-48fc-b837-2971f2056398" (UID: "cc611d86-5c28-48fc-b837-2971f2056398"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 13:55:55 crc kubenswrapper[4672]: I0930 13:55:55.388535 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cc611d86-5c28-48fc-b837-2971f2056398-kube-api-access-6s7ll" (OuterVolumeSpecName: "kube-api-access-6s7ll") pod "cc611d86-5c28-48fc-b837-2971f2056398" (UID: "cc611d86-5c28-48fc-b837-2971f2056398"). InnerVolumeSpecName "kube-api-access-6s7ll". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:55:55 crc kubenswrapper[4672]: I0930 13:55:55.431765 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cc611d86-5c28-48fc-b837-2971f2056398-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cc611d86-5c28-48fc-b837-2971f2056398" (UID: "cc611d86-5c28-48fc-b837-2971f2056398"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 13:55:55 crc kubenswrapper[4672]: I0930 13:55:55.487482 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6s7ll\" (UniqueName: \"kubernetes.io/projected/cc611d86-5c28-48fc-b837-2971f2056398-kube-api-access-6s7ll\") on node \"crc\" DevicePath \"\"" Sep 30 13:55:55 crc kubenswrapper[4672]: I0930 13:55:55.487827 4672 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc611d86-5c28-48fc-b837-2971f2056398-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 13:55:55 crc kubenswrapper[4672]: I0930 13:55:55.487839 4672 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc611d86-5c28-48fc-b837-2971f2056398-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 13:55:56 crc kubenswrapper[4672]: I0930 13:55:56.231600 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ptvfl" Sep 30 13:55:56 crc kubenswrapper[4672]: I0930 13:55:56.252659 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-ptvfl"] Sep 30 13:55:56 crc kubenswrapper[4672]: I0930 13:55:56.263504 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-ptvfl"] Sep 30 13:55:57 crc kubenswrapper[4672]: I0930 13:55:57.428538 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cc611d86-5c28-48fc-b837-2971f2056398" path="/var/lib/kubelet/pods/cc611d86-5c28-48fc-b837-2971f2056398/volumes" Sep 30 13:56:03 crc kubenswrapper[4672]: I0930 13:56:03.458218 4672 scope.go:117] "RemoveContainer" containerID="77c2527175aa60c1a25804ffda0b881f674398466a5e16a4092b83d5ebe1f6eb" Sep 30 13:56:25 crc kubenswrapper[4672]: I0930 13:56:25.535525 4672 generic.go:334] "Generic (PLEG): container finished" podID="7395867b-6ba9-498f-80a2-c10013643700" containerID="74bc5b5cac8663255bf1ba8eba84b26e1bf395650c324e35b6273373df14c147" exitCode=0 Sep 30 13:56:25 crc kubenswrapper[4672]: I0930 13:56:25.535633 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-b9cj4/crc-debug-b4b99" event={"ID":"7395867b-6ba9-498f-80a2-c10013643700","Type":"ContainerDied","Data":"74bc5b5cac8663255bf1ba8eba84b26e1bf395650c324e35b6273373df14c147"} Sep 30 13:56:26 crc kubenswrapper[4672]: I0930 13:56:26.660708 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-b9cj4/crc-debug-b4b99" Sep 30 13:56:26 crc kubenswrapper[4672]: I0930 13:56:26.713181 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-b9cj4/crc-debug-b4b99"] Sep 30 13:56:26 crc kubenswrapper[4672]: I0930 13:56:26.721233 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-b9cj4/crc-debug-b4b99"] Sep 30 13:56:26 crc kubenswrapper[4672]: I0930 13:56:26.740097 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rhhlh\" (UniqueName: \"kubernetes.io/projected/7395867b-6ba9-498f-80a2-c10013643700-kube-api-access-rhhlh\") pod \"7395867b-6ba9-498f-80a2-c10013643700\" (UID: \"7395867b-6ba9-498f-80a2-c10013643700\") " Sep 30 13:56:26 crc kubenswrapper[4672]: I0930 13:56:26.740855 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7395867b-6ba9-498f-80a2-c10013643700-host\") pod \"7395867b-6ba9-498f-80a2-c10013643700\" (UID: \"7395867b-6ba9-498f-80a2-c10013643700\") " Sep 30 13:56:26 crc kubenswrapper[4672]: I0930 13:56:26.740907 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7395867b-6ba9-498f-80a2-c10013643700-host" (OuterVolumeSpecName: "host") pod "7395867b-6ba9-498f-80a2-c10013643700" (UID: "7395867b-6ba9-498f-80a2-c10013643700"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 13:56:26 crc kubenswrapper[4672]: I0930 13:56:26.741598 4672 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7395867b-6ba9-498f-80a2-c10013643700-host\") on node \"crc\" DevicePath \"\"" Sep 30 13:56:26 crc kubenswrapper[4672]: I0930 13:56:26.747127 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7395867b-6ba9-498f-80a2-c10013643700-kube-api-access-rhhlh" (OuterVolumeSpecName: "kube-api-access-rhhlh") pod "7395867b-6ba9-498f-80a2-c10013643700" (UID: "7395867b-6ba9-498f-80a2-c10013643700"). InnerVolumeSpecName "kube-api-access-rhhlh". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:56:26 crc kubenswrapper[4672]: I0930 13:56:26.843992 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rhhlh\" (UniqueName: \"kubernetes.io/projected/7395867b-6ba9-498f-80a2-c10013643700-kube-api-access-rhhlh\") on node \"crc\" DevicePath \"\"" Sep 30 13:56:27 crc kubenswrapper[4672]: I0930 13:56:27.428187 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7395867b-6ba9-498f-80a2-c10013643700" path="/var/lib/kubelet/pods/7395867b-6ba9-498f-80a2-c10013643700/volumes" Sep 30 13:56:27 crc kubenswrapper[4672]: I0930 13:56:27.564608 4672 scope.go:117] "RemoveContainer" containerID="74bc5b5cac8663255bf1ba8eba84b26e1bf395650c324e35b6273373df14c147" Sep 30 13:56:27 crc kubenswrapper[4672]: I0930 13:56:27.564710 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-b9cj4/crc-debug-b4b99" Sep 30 13:56:27 crc kubenswrapper[4672]: I0930 13:56:27.859482 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-b9cj4/crc-debug-7bwkh"] Sep 30 13:56:27 crc kubenswrapper[4672]: E0930 13:56:27.860016 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7395867b-6ba9-498f-80a2-c10013643700" containerName="container-00" Sep 30 13:56:27 crc kubenswrapper[4672]: I0930 13:56:27.860040 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="7395867b-6ba9-498f-80a2-c10013643700" containerName="container-00" Sep 30 13:56:27 crc kubenswrapper[4672]: E0930 13:56:27.860080 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc611d86-5c28-48fc-b837-2971f2056398" containerName="extract-utilities" Sep 30 13:56:27 crc kubenswrapper[4672]: I0930 13:56:27.860093 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc611d86-5c28-48fc-b837-2971f2056398" containerName="extract-utilities" Sep 30 13:56:27 crc kubenswrapper[4672]: E0930 13:56:27.860112 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc611d86-5c28-48fc-b837-2971f2056398" containerName="extract-content" Sep 30 13:56:27 crc kubenswrapper[4672]: I0930 13:56:27.860123 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc611d86-5c28-48fc-b837-2971f2056398" containerName="extract-content" Sep 30 13:56:27 crc kubenswrapper[4672]: E0930 13:56:27.860174 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc611d86-5c28-48fc-b837-2971f2056398" containerName="registry-server" Sep 30 13:56:27 crc kubenswrapper[4672]: I0930 13:56:27.860183 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc611d86-5c28-48fc-b837-2971f2056398" containerName="registry-server" Sep 30 13:56:27 crc kubenswrapper[4672]: I0930 13:56:27.860500 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="cc611d86-5c28-48fc-b837-2971f2056398" containerName="registry-server" Sep 30 13:56:27 crc kubenswrapper[4672]: I0930 13:56:27.860527 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="7395867b-6ba9-498f-80a2-c10013643700" containerName="container-00" Sep 30 13:56:27 crc kubenswrapper[4672]: I0930 13:56:27.861561 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-b9cj4/crc-debug-7bwkh" Sep 30 13:56:27 crc kubenswrapper[4672]: I0930 13:56:27.968505 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/de0abdf9-ae00-4a62-a15a-7b8e00e0e1bf-host\") pod \"crc-debug-7bwkh\" (UID: \"de0abdf9-ae00-4a62-a15a-7b8e00e0e1bf\") " pod="openshift-must-gather-b9cj4/crc-debug-7bwkh" Sep 30 13:56:27 crc kubenswrapper[4672]: I0930 13:56:27.968851 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c7p49\" (UniqueName: \"kubernetes.io/projected/de0abdf9-ae00-4a62-a15a-7b8e00e0e1bf-kube-api-access-c7p49\") pod \"crc-debug-7bwkh\" (UID: \"de0abdf9-ae00-4a62-a15a-7b8e00e0e1bf\") " pod="openshift-must-gather-b9cj4/crc-debug-7bwkh" Sep 30 13:56:28 crc kubenswrapper[4672]: I0930 13:56:28.071451 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/de0abdf9-ae00-4a62-a15a-7b8e00e0e1bf-host\") pod \"crc-debug-7bwkh\" (UID: \"de0abdf9-ae00-4a62-a15a-7b8e00e0e1bf\") " pod="openshift-must-gather-b9cj4/crc-debug-7bwkh" Sep 30 13:56:28 crc kubenswrapper[4672]: I0930 13:56:28.071584 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c7p49\" (UniqueName: \"kubernetes.io/projected/de0abdf9-ae00-4a62-a15a-7b8e00e0e1bf-kube-api-access-c7p49\") pod \"crc-debug-7bwkh\" (UID: \"de0abdf9-ae00-4a62-a15a-7b8e00e0e1bf\") " pod="openshift-must-gather-b9cj4/crc-debug-7bwkh" Sep 30 13:56:28 crc kubenswrapper[4672]: I0930 13:56:28.072415 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/de0abdf9-ae00-4a62-a15a-7b8e00e0e1bf-host\") pod \"crc-debug-7bwkh\" (UID: \"de0abdf9-ae00-4a62-a15a-7b8e00e0e1bf\") " pod="openshift-must-gather-b9cj4/crc-debug-7bwkh" Sep 30 13:56:28 crc kubenswrapper[4672]: I0930 13:56:28.098224 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c7p49\" (UniqueName: \"kubernetes.io/projected/de0abdf9-ae00-4a62-a15a-7b8e00e0e1bf-kube-api-access-c7p49\") pod \"crc-debug-7bwkh\" (UID: \"de0abdf9-ae00-4a62-a15a-7b8e00e0e1bf\") " pod="openshift-must-gather-b9cj4/crc-debug-7bwkh" Sep 30 13:56:28 crc kubenswrapper[4672]: I0930 13:56:28.187625 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-b9cj4/crc-debug-7bwkh" Sep 30 13:56:28 crc kubenswrapper[4672]: I0930 13:56:28.579499 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-b9cj4/crc-debug-7bwkh" event={"ID":"de0abdf9-ae00-4a62-a15a-7b8e00e0e1bf","Type":"ContainerStarted","Data":"fcac87bb575babffd9c306a2fe97963995f6c70e994ac924c4194fbaf039f8c0"} Sep 30 13:56:28 crc kubenswrapper[4672]: I0930 13:56:28.580021 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-b9cj4/crc-debug-7bwkh" event={"ID":"de0abdf9-ae00-4a62-a15a-7b8e00e0e1bf","Type":"ContainerStarted","Data":"a5c44126b3d9545361f516e9353307309ac804c6f3b88db8b3bc91fb654f4775"} Sep 30 13:56:28 crc kubenswrapper[4672]: I0930 13:56:28.604495 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-b9cj4/crc-debug-7bwkh" podStartSLOduration=1.604472974 podStartE2EDuration="1.604472974s" podCreationTimestamp="2025-09-30 13:56:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:56:28.597936157 +0000 UTC m=+5679.867173813" watchObservedRunningTime="2025-09-30 13:56:28.604472974 +0000 UTC m=+5679.873710620" Sep 30 13:56:29 crc kubenswrapper[4672]: I0930 13:56:29.596052 4672 generic.go:334] "Generic (PLEG): container finished" podID="de0abdf9-ae00-4a62-a15a-7b8e00e0e1bf" containerID="fcac87bb575babffd9c306a2fe97963995f6c70e994ac924c4194fbaf039f8c0" exitCode=0 Sep 30 13:56:29 crc kubenswrapper[4672]: I0930 13:56:29.596105 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-b9cj4/crc-debug-7bwkh" event={"ID":"de0abdf9-ae00-4a62-a15a-7b8e00e0e1bf","Type":"ContainerDied","Data":"fcac87bb575babffd9c306a2fe97963995f6c70e994ac924c4194fbaf039f8c0"} Sep 30 13:56:30 crc kubenswrapper[4672]: I0930 13:56:30.722635 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-b9cj4/crc-debug-7bwkh" Sep 30 13:56:30 crc kubenswrapper[4672]: I0930 13:56:30.821616 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c7p49\" (UniqueName: \"kubernetes.io/projected/de0abdf9-ae00-4a62-a15a-7b8e00e0e1bf-kube-api-access-c7p49\") pod \"de0abdf9-ae00-4a62-a15a-7b8e00e0e1bf\" (UID: \"de0abdf9-ae00-4a62-a15a-7b8e00e0e1bf\") " Sep 30 13:56:30 crc kubenswrapper[4672]: I0930 13:56:30.821927 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/de0abdf9-ae00-4a62-a15a-7b8e00e0e1bf-host\") pod \"de0abdf9-ae00-4a62-a15a-7b8e00e0e1bf\" (UID: \"de0abdf9-ae00-4a62-a15a-7b8e00e0e1bf\") " Sep 30 13:56:30 crc kubenswrapper[4672]: I0930 13:56:30.821990 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/de0abdf9-ae00-4a62-a15a-7b8e00e0e1bf-host" (OuterVolumeSpecName: "host") pod "de0abdf9-ae00-4a62-a15a-7b8e00e0e1bf" (UID: "de0abdf9-ae00-4a62-a15a-7b8e00e0e1bf"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 13:56:30 crc kubenswrapper[4672]: I0930 13:56:30.822902 4672 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/de0abdf9-ae00-4a62-a15a-7b8e00e0e1bf-host\") on node \"crc\" DevicePath \"\"" Sep 30 13:56:30 crc kubenswrapper[4672]: I0930 13:56:30.829535 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de0abdf9-ae00-4a62-a15a-7b8e00e0e1bf-kube-api-access-c7p49" (OuterVolumeSpecName: "kube-api-access-c7p49") pod "de0abdf9-ae00-4a62-a15a-7b8e00e0e1bf" (UID: "de0abdf9-ae00-4a62-a15a-7b8e00e0e1bf"). InnerVolumeSpecName "kube-api-access-c7p49". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:56:30 crc kubenswrapper[4672]: I0930 13:56:30.924240 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c7p49\" (UniqueName: \"kubernetes.io/projected/de0abdf9-ae00-4a62-a15a-7b8e00e0e1bf-kube-api-access-c7p49\") on node \"crc\" DevicePath \"\"" Sep 30 13:56:31 crc kubenswrapper[4672]: I0930 13:56:31.615714 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-b9cj4/crc-debug-7bwkh" event={"ID":"de0abdf9-ae00-4a62-a15a-7b8e00e0e1bf","Type":"ContainerDied","Data":"a5c44126b3d9545361f516e9353307309ac804c6f3b88db8b3bc91fb654f4775"} Sep 30 13:56:31 crc kubenswrapper[4672]: I0930 13:56:31.615762 4672 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a5c44126b3d9545361f516e9353307309ac804c6f3b88db8b3bc91fb654f4775" Sep 30 13:56:31 crc kubenswrapper[4672]: I0930 13:56:31.615834 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-b9cj4/crc-debug-7bwkh" Sep 30 13:56:38 crc kubenswrapper[4672]: I0930 13:56:38.079969 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-b9cj4/crc-debug-7bwkh"] Sep 30 13:56:38 crc kubenswrapper[4672]: I0930 13:56:38.089929 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-b9cj4/crc-debug-7bwkh"] Sep 30 13:56:39 crc kubenswrapper[4672]: I0930 13:56:39.243865 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-b9cj4/crc-debug-66mdh"] Sep 30 13:56:39 crc kubenswrapper[4672]: E0930 13:56:39.244585 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de0abdf9-ae00-4a62-a15a-7b8e00e0e1bf" containerName="container-00" Sep 30 13:56:39 crc kubenswrapper[4672]: I0930 13:56:39.244598 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="de0abdf9-ae00-4a62-a15a-7b8e00e0e1bf" containerName="container-00" Sep 30 13:56:39 crc kubenswrapper[4672]: I0930 13:56:39.244819 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="de0abdf9-ae00-4a62-a15a-7b8e00e0e1bf" containerName="container-00" Sep 30 13:56:39 crc kubenswrapper[4672]: I0930 13:56:39.245534 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-b9cj4/crc-debug-66mdh" Sep 30 13:56:39 crc kubenswrapper[4672]: I0930 13:56:39.272627 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xlshw\" (UniqueName: \"kubernetes.io/projected/8871bf22-42a7-4b1e-b9ec-d4276431c6b0-kube-api-access-xlshw\") pod \"crc-debug-66mdh\" (UID: \"8871bf22-42a7-4b1e-b9ec-d4276431c6b0\") " pod="openshift-must-gather-b9cj4/crc-debug-66mdh" Sep 30 13:56:39 crc kubenswrapper[4672]: I0930 13:56:39.272790 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8871bf22-42a7-4b1e-b9ec-d4276431c6b0-host\") pod \"crc-debug-66mdh\" (UID: \"8871bf22-42a7-4b1e-b9ec-d4276431c6b0\") " pod="openshift-must-gather-b9cj4/crc-debug-66mdh" Sep 30 13:56:39 crc kubenswrapper[4672]: I0930 13:56:39.374925 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xlshw\" (UniqueName: \"kubernetes.io/projected/8871bf22-42a7-4b1e-b9ec-d4276431c6b0-kube-api-access-xlshw\") pod \"crc-debug-66mdh\" (UID: \"8871bf22-42a7-4b1e-b9ec-d4276431c6b0\") " pod="openshift-must-gather-b9cj4/crc-debug-66mdh" Sep 30 13:56:39 crc kubenswrapper[4672]: I0930 13:56:39.375027 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8871bf22-42a7-4b1e-b9ec-d4276431c6b0-host\") pod \"crc-debug-66mdh\" (UID: \"8871bf22-42a7-4b1e-b9ec-d4276431c6b0\") " pod="openshift-must-gather-b9cj4/crc-debug-66mdh" Sep 30 13:56:39 crc kubenswrapper[4672]: I0930 13:56:39.375135 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8871bf22-42a7-4b1e-b9ec-d4276431c6b0-host\") pod \"crc-debug-66mdh\" (UID: \"8871bf22-42a7-4b1e-b9ec-d4276431c6b0\") " pod="openshift-must-gather-b9cj4/crc-debug-66mdh" Sep 30 13:56:39 crc kubenswrapper[4672]: I0930 13:56:39.394198 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xlshw\" (UniqueName: \"kubernetes.io/projected/8871bf22-42a7-4b1e-b9ec-d4276431c6b0-kube-api-access-xlshw\") pod \"crc-debug-66mdh\" (UID: \"8871bf22-42a7-4b1e-b9ec-d4276431c6b0\") " pod="openshift-must-gather-b9cj4/crc-debug-66mdh" Sep 30 13:56:39 crc kubenswrapper[4672]: I0930 13:56:39.430968 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de0abdf9-ae00-4a62-a15a-7b8e00e0e1bf" path="/var/lib/kubelet/pods/de0abdf9-ae00-4a62-a15a-7b8e00e0e1bf/volumes" Sep 30 13:56:39 crc kubenswrapper[4672]: I0930 13:56:39.563070 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-b9cj4/crc-debug-66mdh" Sep 30 13:56:39 crc kubenswrapper[4672]: I0930 13:56:39.698462 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-b9cj4/crc-debug-66mdh" event={"ID":"8871bf22-42a7-4b1e-b9ec-d4276431c6b0","Type":"ContainerStarted","Data":"ac734dd6669b61277347cc88f753769376ea39d67dd0ce17f658e10834ec8e2f"} Sep 30 13:56:40 crc kubenswrapper[4672]: I0930 13:56:40.711095 4672 generic.go:334] "Generic (PLEG): container finished" podID="8871bf22-42a7-4b1e-b9ec-d4276431c6b0" containerID="22261d9a13ab537ab61c9a4e176e76d37e9c45bd4d7aa326ef4ba15aecf16564" exitCode=0 Sep 30 13:56:40 crc kubenswrapper[4672]: I0930 13:56:40.711136 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-b9cj4/crc-debug-66mdh" event={"ID":"8871bf22-42a7-4b1e-b9ec-d4276431c6b0","Type":"ContainerDied","Data":"22261d9a13ab537ab61c9a4e176e76d37e9c45bd4d7aa326ef4ba15aecf16564"} Sep 30 13:56:40 crc kubenswrapper[4672]: I0930 13:56:40.752586 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-b9cj4/crc-debug-66mdh"] Sep 30 13:56:40 crc kubenswrapper[4672]: I0930 13:56:40.760791 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-b9cj4/crc-debug-66mdh"] Sep 30 13:56:41 crc kubenswrapper[4672]: I0930 13:56:41.854367 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-b9cj4/crc-debug-66mdh" Sep 30 13:56:41 crc kubenswrapper[4672]: I0930 13:56:41.930168 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8871bf22-42a7-4b1e-b9ec-d4276431c6b0-host\") pod \"8871bf22-42a7-4b1e-b9ec-d4276431c6b0\" (UID: \"8871bf22-42a7-4b1e-b9ec-d4276431c6b0\") " Sep 30 13:56:41 crc kubenswrapper[4672]: I0930 13:56:41.930219 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xlshw\" (UniqueName: \"kubernetes.io/projected/8871bf22-42a7-4b1e-b9ec-d4276431c6b0-kube-api-access-xlshw\") pod \"8871bf22-42a7-4b1e-b9ec-d4276431c6b0\" (UID: \"8871bf22-42a7-4b1e-b9ec-d4276431c6b0\") " Sep 30 13:56:41 crc kubenswrapper[4672]: I0930 13:56:41.931515 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8871bf22-42a7-4b1e-b9ec-d4276431c6b0-host" (OuterVolumeSpecName: "host") pod "8871bf22-42a7-4b1e-b9ec-d4276431c6b0" (UID: "8871bf22-42a7-4b1e-b9ec-d4276431c6b0"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 13:56:41 crc kubenswrapper[4672]: I0930 13:56:41.936170 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8871bf22-42a7-4b1e-b9ec-d4276431c6b0-kube-api-access-xlshw" (OuterVolumeSpecName: "kube-api-access-xlshw") pod "8871bf22-42a7-4b1e-b9ec-d4276431c6b0" (UID: "8871bf22-42a7-4b1e-b9ec-d4276431c6b0"). InnerVolumeSpecName "kube-api-access-xlshw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:56:42 crc kubenswrapper[4672]: I0930 13:56:42.032647 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xlshw\" (UniqueName: \"kubernetes.io/projected/8871bf22-42a7-4b1e-b9ec-d4276431c6b0-kube-api-access-xlshw\") on node \"crc\" DevicePath \"\"" Sep 30 13:56:42 crc kubenswrapper[4672]: I0930 13:56:42.032694 4672 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8871bf22-42a7-4b1e-b9ec-d4276431c6b0-host\") on node \"crc\" DevicePath \"\"" Sep 30 13:56:42 crc kubenswrapper[4672]: I0930 13:56:42.534293 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_3ba63d1583e0abeae0ee70583752a42c45e81151d65ed21cfdbdcd61d5wnrn9_47ed0e84-3ade-4314-b9fc-6e2f3e77c7b9/util/0.log" Sep 30 13:56:42 crc kubenswrapper[4672]: I0930 13:56:42.746670 4672 scope.go:117] "RemoveContainer" containerID="22261d9a13ab537ab61c9a4e176e76d37e9c45bd4d7aa326ef4ba15aecf16564" Sep 30 13:56:42 crc kubenswrapper[4672]: I0930 13:56:42.746752 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-b9cj4/crc-debug-66mdh" Sep 30 13:56:42 crc kubenswrapper[4672]: I0930 13:56:42.769518 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_3ba63d1583e0abeae0ee70583752a42c45e81151d65ed21cfdbdcd61d5wnrn9_47ed0e84-3ade-4314-b9fc-6e2f3e77c7b9/pull/0.log" Sep 30 13:56:42 crc kubenswrapper[4672]: I0930 13:56:42.812886 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_3ba63d1583e0abeae0ee70583752a42c45e81151d65ed21cfdbdcd61d5wnrn9_47ed0e84-3ade-4314-b9fc-6e2f3e77c7b9/pull/0.log" Sep 30 13:56:42 crc kubenswrapper[4672]: I0930 13:56:42.839178 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_3ba63d1583e0abeae0ee70583752a42c45e81151d65ed21cfdbdcd61d5wnrn9_47ed0e84-3ade-4314-b9fc-6e2f3e77c7b9/util/0.log" Sep 30 13:56:42 crc kubenswrapper[4672]: I0930 13:56:42.959714 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_3ba63d1583e0abeae0ee70583752a42c45e81151d65ed21cfdbdcd61d5wnrn9_47ed0e84-3ade-4314-b9fc-6e2f3e77c7b9/util/0.log" Sep 30 13:56:42 crc kubenswrapper[4672]: I0930 13:56:42.961864 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_3ba63d1583e0abeae0ee70583752a42c45e81151d65ed21cfdbdcd61d5wnrn9_47ed0e84-3ade-4314-b9fc-6e2f3e77c7b9/pull/0.log" Sep 30 13:56:42 crc kubenswrapper[4672]: I0930 13:56:42.989211 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_3ba63d1583e0abeae0ee70583752a42c45e81151d65ed21cfdbdcd61d5wnrn9_47ed0e84-3ade-4314-b9fc-6e2f3e77c7b9/extract/0.log" Sep 30 13:56:43 crc kubenswrapper[4672]: I0930 13:56:43.129964 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-6ff8b75857-df89s_c70c8b28-1f45-4c79-af69-3197c7f66fa0/kube-rbac-proxy/0.log" Sep 30 13:56:43 crc kubenswrapper[4672]: I0930 13:56:43.221053 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-6ff8b75857-df89s_c70c8b28-1f45-4c79-af69-3197c7f66fa0/manager/0.log" Sep 30 13:56:43 crc kubenswrapper[4672]: I0930 13:56:43.249959 4672 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-644bddb6d8-jmkcr_892875f4-bce3-47cb-8478-9d6bbc819bb1/kube-rbac-proxy/0.log" Sep 30 13:56:43 crc kubenswrapper[4672]: I0930 13:56:43.373119 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-644bddb6d8-jmkcr_892875f4-bce3-47cb-8478-9d6bbc819bb1/manager/0.log" Sep 30 13:56:43 crc kubenswrapper[4672]: I0930 13:56:43.407731 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-84f4f7b77b-gthb7_6cc4cd4e-abd0-4318-bfd4-e2df45940139/kube-rbac-proxy/0.log" Sep 30 13:56:43 crc kubenswrapper[4672]: I0930 13:56:43.445598 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8871bf22-42a7-4b1e-b9ec-d4276431c6b0" path="/var/lib/kubelet/pods/8871bf22-42a7-4b1e-b9ec-d4276431c6b0/volumes" Sep 30 13:56:43 crc kubenswrapper[4672]: I0930 13:56:43.459801 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-84f4f7b77b-gthb7_6cc4cd4e-abd0-4318-bfd4-e2df45940139/manager/0.log" Sep 30 13:56:43 crc kubenswrapper[4672]: I0930 13:56:43.605221 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-84958c4d49-fn2cb_0e2c3398-4a1f-4a82-a95c-89e73d9a4485/kube-rbac-proxy/0.log" Sep 30 13:56:43 crc kubenswrapper[4672]: I0930 13:56:43.647677 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-84958c4d49-fn2cb_0e2c3398-4a1f-4a82-a95c-89e73d9a4485/manager/0.log" Sep 30 13:56:43 crc kubenswrapper[4672]: I0930 13:56:43.798087 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5d889d78cf-vs4mr_b12c1847-2238-4a91-a2a0-4de492556fe7/manager/0.log" Sep 30 13:56:43 crc kubenswrapper[4672]: I0930 13:56:43.798149 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5d889d78cf-vs4mr_b12c1847-2238-4a91-a2a0-4de492556fe7/kube-rbac-proxy/0.log" Sep 30 13:56:43 crc kubenswrapper[4672]: I0930 13:56:43.872408 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-9f4696d94-z9tmm_ed25b409-1ca7-4fc4-95b5-55b4239233f3/kube-rbac-proxy/0.log" Sep 30 13:56:43 crc kubenswrapper[4672]: I0930 13:56:43.992631 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-9f4696d94-z9tmm_ed25b409-1ca7-4fc4-95b5-55b4239233f3/manager/0.log" Sep 30 13:56:44 crc kubenswrapper[4672]: I0930 13:56:44.064121 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-7d857cc749-8qftk_6fa26cab-ae65-4e21-af16-2628c86be254/kube-rbac-proxy/0.log" Sep 30 13:56:44 crc kubenswrapper[4672]: I0930 13:56:44.239784 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-7d857cc749-8qftk_6fa26cab-ae65-4e21-af16-2628c86be254/manager/0.log" Sep 30 13:56:44 crc kubenswrapper[4672]: I0930 13:56:44.262805 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-7975b88857-g8lll_904a2d6e-693a-4c5e-926e-2c5fd47d6bea/kube-rbac-proxy/0.log" Sep 30 13:56:44 crc kubenswrapper[4672]: I0930 13:56:44.274702 4672 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-7975b88857-g8lll_904a2d6e-693a-4c5e-926e-2c5fd47d6bea/manager/0.log" Sep 30 13:56:44 crc kubenswrapper[4672]: I0930 13:56:44.398178 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-5bd55b4bff-vqxhh_ed04bd4c-39dd-45fa-a2e0-bde94ff9deb0/kube-rbac-proxy/0.log" Sep 30 13:56:44 crc kubenswrapper[4672]: I0930 13:56:44.523309 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-5bd55b4bff-vqxhh_ed04bd4c-39dd-45fa-a2e0-bde94ff9deb0/manager/0.log" Sep 30 13:56:44 crc kubenswrapper[4672]: I0930 13:56:44.584522 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-6d68dbc695-2hvxs_32830807-0fb2-4545-a629-af52b20e0b0f/kube-rbac-proxy/0.log" Sep 30 13:56:44 crc kubenswrapper[4672]: I0930 13:56:44.590730 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-6d68dbc695-2hvxs_32830807-0fb2-4545-a629-af52b20e0b0f/manager/0.log" Sep 30 13:56:44 crc kubenswrapper[4672]: I0930 13:56:44.718218 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-88c7-xq89z_3f0b65a1-c4dd-4ca1-a2cf-feea808b1f06/kube-rbac-proxy/0.log" Sep 30 13:56:44 crc kubenswrapper[4672]: I0930 13:56:44.799570 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-88c7-xq89z_3f0b65a1-c4dd-4ca1-a2cf-feea808b1f06/manager/0.log" Sep 30 13:56:44 crc kubenswrapper[4672]: I0930 13:56:44.878071 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-64d7b59854-lpv98_9aef7bd6-dab2-4333-b248-a40c44bc3743/kube-rbac-proxy/0.log" Sep 30 13:56:44 crc kubenswrapper[4672]: I0930 13:56:44.952050 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-64d7b59854-lpv98_9aef7bd6-dab2-4333-b248-a40c44bc3743/manager/0.log" Sep 30 13:56:44 crc kubenswrapper[4672]: I0930 13:56:44.981940 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-c7c776c96-2kvn7_d4c88a65-e12f-4872-baf2-f210ee1b0c9a/kube-rbac-proxy/0.log" Sep 30 13:56:45 crc kubenswrapper[4672]: I0930 13:56:45.139761 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-c7c776c96-2kvn7_d4c88a65-e12f-4872-baf2-f210ee1b0c9a/manager/0.log" Sep 30 13:56:45 crc kubenswrapper[4672]: I0930 13:56:45.151214 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-76fcc6dc7c-hjn2l_601fcd4a-dc2f-468d-9ad6-6b173320c317/kube-rbac-proxy/0.log" Sep 30 13:56:45 crc kubenswrapper[4672]: I0930 13:56:45.189620 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-76fcc6dc7c-hjn2l_601fcd4a-dc2f-468d-9ad6-6b173320c317/manager/0.log" Sep 30 13:56:45 crc kubenswrapper[4672]: I0930 13:56:45.305021 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-6d776955-x6x67_1b6d7bf0-00ff-41df-8873-cab7f6e5eeea/kube-rbac-proxy/0.log" Sep 30 13:56:45 crc kubenswrapper[4672]: I0930 13:56:45.328817 4672 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-6d776955-x6x67_1b6d7bf0-00ff-41df-8873-cab7f6e5eeea/manager/0.log" Sep 30 13:56:45 crc kubenswrapper[4672]: I0930 13:56:45.513319 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-6bd96d9cc5-cf4sl_711b218e-6a25-4d4f-b657-e621e9d1d658/kube-rbac-proxy/0.log" Sep 30 13:56:45 crc kubenswrapper[4672]: I0930 13:56:45.607852 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-b6f46bc96-v9ff6_fc7ec117-7036-452f-9b2d-894e0dd29a8f/kube-rbac-proxy/0.log" Sep 30 13:56:45 crc kubenswrapper[4672]: I0930 13:56:45.936180 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-6rjvm_63d18a29-8d30-437c-af1b-f9cd9fa99b6b/registry-server/0.log" Sep 30 13:56:45 crc kubenswrapper[4672]: I0930 13:56:45.979062 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-b6f46bc96-v9ff6_fc7ec117-7036-452f-9b2d-894e0dd29a8f/operator/0.log" Sep 30 13:56:46 crc kubenswrapper[4672]: I0930 13:56:46.271087 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-9976ff44c-zxb4j_5a756e54-c5bf-480b-aa89-57ca440d1ddc/kube-rbac-proxy/0.log" Sep 30 13:56:46 crc kubenswrapper[4672]: I0930 13:56:46.313579 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-589c58c6c-r9sqd_42bf0afd-961a-4353-9499-a185b16b8a02/kube-rbac-proxy/0.log" Sep 30 13:56:46 crc kubenswrapper[4672]: I0930 13:56:46.326827 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-9976ff44c-zxb4j_5a756e54-c5bf-480b-aa89-57ca440d1ddc/manager/0.log" Sep 30 13:56:46 crc kubenswrapper[4672]: I0930 13:56:46.477113 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-589c58c6c-r9sqd_42bf0afd-961a-4353-9499-a185b16b8a02/manager/0.log" Sep 30 13:56:46 crc kubenswrapper[4672]: I0930 13:56:46.571515 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-79d8469568-qxpkk_4f1099dc-e100-44d5-8d17-255dbe0edf63/operator/0.log" Sep 30 13:56:46 crc kubenswrapper[4672]: I0930 13:56:46.671659 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-6bd96d9cc5-cf4sl_711b218e-6a25-4d4f-b657-e621e9d1d658/manager/0.log" Sep 30 13:56:46 crc kubenswrapper[4672]: I0930 13:56:46.736775 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-bc7dc7bd9-8mf7d_cf21b89f-fcd8-4854-954b-06927bc7c6ea/manager/0.log" Sep 30 13:56:46 crc kubenswrapper[4672]: I0930 13:56:46.767645 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-bc7dc7bd9-8mf7d_cf21b89f-fcd8-4854-954b-06927bc7c6ea/kube-rbac-proxy/0.log" Sep 30 13:56:46 crc kubenswrapper[4672]: I0930 13:56:46.793484 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-b8d54b5d7-qxq7h_e6b8eb11-36d8-45c1-b600-76ffff076b78/kube-rbac-proxy/0.log" Sep 30 13:56:47 crc kubenswrapper[4672]: I0930 
13:56:47.019525 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-f66b554c6-d7gbf_74b0a0e9-a7e6-43e6-a15d-86e6a20c5aa7/manager/0.log" Sep 30 13:56:47 crc kubenswrapper[4672]: I0930 13:56:47.030091 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-f66b554c6-d7gbf_74b0a0e9-a7e6-43e6-a15d-86e6a20c5aa7/kube-rbac-proxy/0.log" Sep 30 13:56:47 crc kubenswrapper[4672]: I0930 13:56:47.048589 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-b8d54b5d7-qxq7h_e6b8eb11-36d8-45c1-b600-76ffff076b78/manager/0.log" Sep 30 13:56:47 crc kubenswrapper[4672]: I0930 13:56:47.177210 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-58675bf858-qg9s4_4d10ceb0-c730-4fb8-b81c-a87e33890f84/kube-rbac-proxy/0.log" Sep 30 13:56:47 crc kubenswrapper[4672]: I0930 13:56:47.236331 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-58675bf858-qg9s4_4d10ceb0-c730-4fb8-b81c-a87e33890f84/manager/0.log" Sep 30 13:56:54 crc kubenswrapper[4672]: I0930 13:56:54.739799 4672 patch_prober.go:28] interesting pod/machine-config-daemon-dpqrd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 13:56:54 crc kubenswrapper[4672]: I0930 13:56:54.740321 4672 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 13:57:03 crc kubenswrapper[4672]: I0930 13:57:03.032822 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-dljhp_62422e21-39c6-4772-8f59-33be3d16c368/control-plane-machine-set-operator/0.log" Sep 30 13:57:03 crc kubenswrapper[4672]: I0930 13:57:03.221788 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-qrbqf_a7555c95-5534-45dc-a212-4262554a0c0b/kube-rbac-proxy/0.log" Sep 30 13:57:03 crc kubenswrapper[4672]: I0930 13:57:03.229623 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-qrbqf_a7555c95-5534-45dc-a212-4262554a0c0b/machine-api-operator/0.log" Sep 30 13:57:15 crc kubenswrapper[4672]: I0930 13:57:15.441753 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-46756_b8ab6541-957a-44c8-a773-788f725d7efb/cert-manager-controller/0.log" Sep 30 13:57:15 crc kubenswrapper[4672]: I0930 13:57:15.691757 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-zcp7d_3c4da14a-ed20-4c47-8147-2150a416c1c8/cert-manager-cainjector/0.log" Sep 30 13:57:15 crc kubenswrapper[4672]: I0930 13:57:15.761058 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-4r2wn_e53f49ff-ce7a-4699-977e-730d462910c8/cert-manager-webhook/0.log" Sep 30 13:57:24 crc kubenswrapper[4672]: I0930 13:57:24.739722 4672 
patch_prober.go:28] interesting pod/machine-config-daemon-dpqrd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 13:57:24 crc kubenswrapper[4672]: I0930 13:57:24.740575 4672 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 13:57:27 crc kubenswrapper[4672]: I0930 13:57:27.779859 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-864bb6dfb5-fphxt_8394a3ed-db36-4579-ac2b-8e3f1ce579d1/nmstate-console-plugin/0.log" Sep 30 13:57:28 crc kubenswrapper[4672]: I0930 13:57:28.001567 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-58fcddf996-tqkht_a767db76-9b74-4ab7-a541-5f5981850723/kube-rbac-proxy/0.log" Sep 30 13:57:28 crc kubenswrapper[4672]: I0930 13:57:28.025173 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-d6nbx_fa1a3970-c37a-4cdf-ba19-b868c581c02e/nmstate-handler/0.log" Sep 30 13:57:28 crc kubenswrapper[4672]: I0930 13:57:28.076462 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-58fcddf996-tqkht_a767db76-9b74-4ab7-a541-5f5981850723/nmstate-metrics/0.log" Sep 30 13:57:28 crc kubenswrapper[4672]: I0930 13:57:28.274949 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-5d6f6cfd66-qn8f5_845f56f6-18db-419a-9901-e2c4c186ad88/nmstate-operator/0.log" Sep 30 13:57:28 crc kubenswrapper[4672]: I0930 13:57:28.300978 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-6d689559c5-4plc5_6573b5fd-c58a-4016-84ad-a21aa5622e2a/nmstate-webhook/0.log" Sep 30 13:57:42 crc kubenswrapper[4672]: I0930 13:57:42.331597 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-5d688f5ffc-9d8qs_56c8e0ed-01f5-4d61-84ad-78ce7c5a66d1/kube-rbac-proxy/0.log" Sep 30 13:57:42 crc kubenswrapper[4672]: I0930 13:57:42.431795 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-5d688f5ffc-9d8qs_56c8e0ed-01f5-4d61-84ad-78ce7c5a66d1/controller/0.log" Sep 30 13:57:42 crc kubenswrapper[4672]: I0930 13:57:42.551149 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-5478bdb765-bpwqj_8a1bda96-fd76-4372-bb9f-ae56e6602caf/frr-k8s-webhook-server/0.log" Sep 30 13:57:42 crc kubenswrapper[4672]: I0930 13:57:42.638195 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zchdp_d6349248-bcbd-486b-8143-90b66a52f017/cp-frr-files/0.log" Sep 30 13:57:42 crc kubenswrapper[4672]: I0930 13:57:42.829077 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zchdp_d6349248-bcbd-486b-8143-90b66a52f017/cp-frr-files/0.log" Sep 30 13:57:42 crc kubenswrapper[4672]: I0930 13:57:42.841760 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zchdp_d6349248-bcbd-486b-8143-90b66a52f017/cp-reloader/0.log" Sep 30 13:57:42 crc kubenswrapper[4672]: I0930 13:57:42.900333 4672 log.go:25] "Finished parsing log 
file" path="/var/log/pods/metallb-system_frr-k8s-zchdp_d6349248-bcbd-486b-8143-90b66a52f017/cp-reloader/0.log" Sep 30 13:57:42 crc kubenswrapper[4672]: I0930 13:57:42.905070 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zchdp_d6349248-bcbd-486b-8143-90b66a52f017/cp-metrics/0.log" Sep 30 13:57:43 crc kubenswrapper[4672]: I0930 13:57:43.003305 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zchdp_d6349248-bcbd-486b-8143-90b66a52f017/cp-frr-files/0.log" Sep 30 13:57:43 crc kubenswrapper[4672]: I0930 13:57:43.068534 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zchdp_d6349248-bcbd-486b-8143-90b66a52f017/cp-metrics/0.log" Sep 30 13:57:43 crc kubenswrapper[4672]: I0930 13:57:43.069701 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zchdp_d6349248-bcbd-486b-8143-90b66a52f017/cp-reloader/0.log" Sep 30 13:57:43 crc kubenswrapper[4672]: I0930 13:57:43.084129 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zchdp_d6349248-bcbd-486b-8143-90b66a52f017/cp-metrics/0.log" Sep 30 13:57:43 crc kubenswrapper[4672]: I0930 13:57:43.279808 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zchdp_d6349248-bcbd-486b-8143-90b66a52f017/cp-frr-files/0.log" Sep 30 13:57:43 crc kubenswrapper[4672]: I0930 13:57:43.280457 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zchdp_d6349248-bcbd-486b-8143-90b66a52f017/cp-metrics/0.log" Sep 30 13:57:43 crc kubenswrapper[4672]: I0930 13:57:43.288046 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zchdp_d6349248-bcbd-486b-8143-90b66a52f017/cp-reloader/0.log" Sep 30 13:57:43 crc kubenswrapper[4672]: I0930 13:57:43.324727 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zchdp_d6349248-bcbd-486b-8143-90b66a52f017/controller/0.log" Sep 30 13:57:43 crc kubenswrapper[4672]: I0930 13:57:43.460813 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zchdp_d6349248-bcbd-486b-8143-90b66a52f017/kube-rbac-proxy/0.log" Sep 30 13:57:43 crc kubenswrapper[4672]: I0930 13:57:43.525762 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zchdp_d6349248-bcbd-486b-8143-90b66a52f017/kube-rbac-proxy-frr/0.log" Sep 30 13:57:43 crc kubenswrapper[4672]: I0930 13:57:43.537672 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zchdp_d6349248-bcbd-486b-8143-90b66a52f017/frr-metrics/0.log" Sep 30 13:57:43 crc kubenswrapper[4672]: I0930 13:57:43.672385 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zchdp_d6349248-bcbd-486b-8143-90b66a52f017/reloader/0.log" Sep 30 13:57:43 crc kubenswrapper[4672]: I0930 13:57:43.828491 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-879d84ff8-vhxdg_2831267a-a276-41c3-afaf-c262071b60c7/manager/0.log" Sep 30 13:57:43 crc kubenswrapper[4672]: I0930 13:57:43.974330 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-5dddf5dfdb-xtn27_bbcc3167-a28b-47c0-93a5-cab38ea7d13b/webhook-server/0.log" Sep 30 13:57:44 crc kubenswrapper[4672]: I0930 13:57:44.107241 4672 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_speaker-v66pl_07eaf50f-6d5e-4e3e-8c3d-1e28769bae68/kube-rbac-proxy/0.log" Sep 30 13:57:44 crc kubenswrapper[4672]: I0930 13:57:44.739305 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-v66pl_07eaf50f-6d5e-4e3e-8c3d-1e28769bae68/speaker/0.log" Sep 30 13:57:45 crc kubenswrapper[4672]: I0930 13:57:45.255185 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zchdp_d6349248-bcbd-486b-8143-90b66a52f017/frr/0.log" Sep 30 13:57:54 crc kubenswrapper[4672]: I0930 13:57:54.739887 4672 patch_prober.go:28] interesting pod/machine-config-daemon-dpqrd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 13:57:54 crc kubenswrapper[4672]: I0930 13:57:54.740523 4672 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 13:57:54 crc kubenswrapper[4672]: I0930 13:57:54.740573 4672 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" Sep 30 13:57:54 crc kubenswrapper[4672]: I0930 13:57:54.741171 4672 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"966f5125052c74b4e7c550103d16e6fa385f49fb5552ce6b0019712558a5916e"} pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 30 13:57:54 crc kubenswrapper[4672]: I0930 13:57:54.741234 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" containerName="machine-config-daemon" containerID="cri-o://966f5125052c74b4e7c550103d16e6fa385f49fb5552ce6b0019712558a5916e" gracePeriod=600 Sep 30 13:57:55 crc kubenswrapper[4672]: I0930 13:57:55.409087 4672 generic.go:334] "Generic (PLEG): container finished" podID="95794952-d817-48f2-8956-f7a310f8d1d9" containerID="966f5125052c74b4e7c550103d16e6fa385f49fb5552ce6b0019712558a5916e" exitCode=0 Sep 30 13:57:55 crc kubenswrapper[4672]: I0930 13:57:55.409151 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" event={"ID":"95794952-d817-48f2-8956-f7a310f8d1d9","Type":"ContainerDied","Data":"966f5125052c74b4e7c550103d16e6fa385f49fb5552ce6b0019712558a5916e"} Sep 30 13:57:55 crc kubenswrapper[4672]: I0930 13:57:55.409581 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" event={"ID":"95794952-d817-48f2-8956-f7a310f8d1d9","Type":"ContainerStarted","Data":"c5c9370020a0ccb145ea0d983d80cc8ce5cbf67232141de015f21c882d0da81d"} Sep 30 13:57:55 crc kubenswrapper[4672]: I0930 13:57:55.409600 4672 scope.go:117] "RemoveContainer" containerID="a0f4e52abdc31f3136a3c09d788135a4e897756a0e739d1de81796de70d2aa59" Sep 30 13:57:56 crc kubenswrapper[4672]: I0930 13:57:56.733577 4672 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc7tdmb_138ae4d9-a29c-4679-b3fa-7953a95cee51/util/0.log" Sep 30 13:57:56 crc kubenswrapper[4672]: I0930 13:57:56.895259 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc7tdmb_138ae4d9-a29c-4679-b3fa-7953a95cee51/util/0.log" Sep 30 13:57:56 crc kubenswrapper[4672]: I0930 13:57:56.922111 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc7tdmb_138ae4d9-a29c-4679-b3fa-7953a95cee51/pull/0.log" Sep 30 13:57:56 crc kubenswrapper[4672]: I0930 13:57:56.937021 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc7tdmb_138ae4d9-a29c-4679-b3fa-7953a95cee51/pull/0.log" Sep 30 13:57:57 crc kubenswrapper[4672]: I0930 13:57:57.123218 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc7tdmb_138ae4d9-a29c-4679-b3fa-7953a95cee51/util/0.log" Sep 30 13:57:57 crc kubenswrapper[4672]: I0930 13:57:57.123800 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc7tdmb_138ae4d9-a29c-4679-b3fa-7953a95cee51/extract/0.log" Sep 30 13:57:57 crc kubenswrapper[4672]: I0930 13:57:57.182469 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc7tdmb_138ae4d9-a29c-4679-b3fa-7953a95cee51/pull/0.log" Sep 30 13:57:57 crc kubenswrapper[4672]: I0930 13:57:57.335084 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2ddnm8k_9d05223e-1c34-4132-92a1-1b96ef8c1a8b/util/0.log" Sep 30 13:57:57 crc kubenswrapper[4672]: I0930 13:57:57.501833 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2ddnm8k_9d05223e-1c34-4132-92a1-1b96ef8c1a8b/pull/0.log" Sep 30 13:57:57 crc kubenswrapper[4672]: I0930 13:57:57.508299 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2ddnm8k_9d05223e-1c34-4132-92a1-1b96ef8c1a8b/util/0.log" Sep 30 13:57:57 crc kubenswrapper[4672]: I0930 13:57:57.527843 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2ddnm8k_9d05223e-1c34-4132-92a1-1b96ef8c1a8b/pull/0.log" Sep 30 13:57:57 crc kubenswrapper[4672]: I0930 13:57:57.673446 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2ddnm8k_9d05223e-1c34-4132-92a1-1b96ef8c1a8b/extract/0.log" Sep 30 13:57:57 crc kubenswrapper[4672]: I0930 13:57:57.698138 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2ddnm8k_9d05223e-1c34-4132-92a1-1b96ef8c1a8b/util/0.log" Sep 30 13:57:57 crc kubenswrapper[4672]: I0930 13:57:57.711586 4672 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2ddnm8k_9d05223e-1c34-4132-92a1-1b96ef8c1a8b/pull/0.log" Sep 30 13:57:57 crc kubenswrapper[4672]: I0930 13:57:57.853254 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-n27k6_a213a95b-eb38-4932-9c67-11f1b91d0202/extract-utilities/0.log" Sep 30 13:57:58 crc kubenswrapper[4672]: I0930 13:57:58.035377 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-n27k6_a213a95b-eb38-4932-9c67-11f1b91d0202/extract-utilities/0.log" Sep 30 13:57:58 crc kubenswrapper[4672]: I0930 13:57:58.045590 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-n27k6_a213a95b-eb38-4932-9c67-11f1b91d0202/extract-content/0.log" Sep 30 13:57:58 crc kubenswrapper[4672]: I0930 13:57:58.052362 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-n27k6_a213a95b-eb38-4932-9c67-11f1b91d0202/extract-content/0.log" Sep 30 13:57:58 crc kubenswrapper[4672]: I0930 13:57:58.221367 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-n27k6_a213a95b-eb38-4932-9c67-11f1b91d0202/extract-utilities/0.log" Sep 30 13:57:58 crc kubenswrapper[4672]: I0930 13:57:58.276023 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-n27k6_a213a95b-eb38-4932-9c67-11f1b91d0202/extract-content/0.log" Sep 30 13:57:58 crc kubenswrapper[4672]: I0930 13:57:58.440140 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-b2lh4_7184a4fd-1911-473b-83c4-c5c224130bb3/extract-utilities/0.log" Sep 30 13:57:58 crc kubenswrapper[4672]: I0930 13:57:58.706354 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-b2lh4_7184a4fd-1911-473b-83c4-c5c224130bb3/extract-content/0.log" Sep 30 13:57:58 crc kubenswrapper[4672]: I0930 13:57:58.723108 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-b2lh4_7184a4fd-1911-473b-83c4-c5c224130bb3/extract-utilities/0.log" Sep 30 13:57:58 crc kubenswrapper[4672]: I0930 13:57:58.799086 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-b2lh4_7184a4fd-1911-473b-83c4-c5c224130bb3/extract-content/0.log" Sep 30 13:57:58 crc kubenswrapper[4672]: I0930 13:57:58.950072 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-b2lh4_7184a4fd-1911-473b-83c4-c5c224130bb3/extract-utilities/0.log" Sep 30 13:57:59 crc kubenswrapper[4672]: I0930 13:57:59.026430 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-b2lh4_7184a4fd-1911-473b-83c4-c5c224130bb3/extract-content/0.log" Sep 30 13:57:59 crc kubenswrapper[4672]: I0930 13:57:59.160841 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-n27k6_a213a95b-eb38-4932-9c67-11f1b91d0202/registry-server/0.log" Sep 30 13:57:59 crc kubenswrapper[4672]: I0930 13:57:59.252342 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96s4dl2_24166562-adf7-422d-abfa-b1b7176f0124/util/0.log" Sep 30 13:57:59 crc kubenswrapper[4672]: I0930 13:57:59.592156 4672 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-b2lh4_7184a4fd-1911-473b-83c4-c5c224130bb3/registry-server/0.log" Sep 30 13:57:59 crc kubenswrapper[4672]: I0930 13:57:59.594297 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96s4dl2_24166562-adf7-422d-abfa-b1b7176f0124/util/0.log" Sep 30 13:57:59 crc kubenswrapper[4672]: I0930 13:57:59.608424 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96s4dl2_24166562-adf7-422d-abfa-b1b7176f0124/pull/0.log" Sep 30 13:57:59 crc kubenswrapper[4672]: I0930 13:57:59.651035 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96s4dl2_24166562-adf7-422d-abfa-b1b7176f0124/pull/0.log" Sep 30 13:57:59 crc kubenswrapper[4672]: I0930 13:57:59.834868 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96s4dl2_24166562-adf7-422d-abfa-b1b7176f0124/extract/0.log" Sep 30 13:57:59 crc kubenswrapper[4672]: I0930 13:57:59.862886 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96s4dl2_24166562-adf7-422d-abfa-b1b7176f0124/util/0.log" Sep 30 13:57:59 crc kubenswrapper[4672]: I0930 13:57:59.871698 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96s4dl2_24166562-adf7-422d-abfa-b1b7176f0124/pull/0.log" Sep 30 13:58:00 crc kubenswrapper[4672]: I0930 13:58:00.073826 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-kxjq6_08b96597-cb4d-4c38-9557-d60b937ab2c7/marketplace-operator/0.log" Sep 30 13:58:00 crc kubenswrapper[4672]: I0930 13:58:00.123555 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-xpjr4_9a3da1a5-1d7f-4d33-9245-55038dd253d3/extract-utilities/0.log" Sep 30 13:58:00 crc kubenswrapper[4672]: I0930 13:58:00.295972 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-xpjr4_9a3da1a5-1d7f-4d33-9245-55038dd253d3/extract-utilities/0.log" Sep 30 13:58:00 crc kubenswrapper[4672]: I0930 13:58:00.310900 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-xpjr4_9a3da1a5-1d7f-4d33-9245-55038dd253d3/extract-content/0.log" Sep 30 13:58:00 crc kubenswrapper[4672]: I0930 13:58:00.315841 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-xpjr4_9a3da1a5-1d7f-4d33-9245-55038dd253d3/extract-content/0.log" Sep 30 13:58:00 crc kubenswrapper[4672]: I0930 13:58:00.488940 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-lkhn2_f337a53e-90b5-44a2-a033-bf26d3498158/extract-utilities/0.log" Sep 30 13:58:00 crc kubenswrapper[4672]: I0930 13:58:00.520937 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-xpjr4_9a3da1a5-1d7f-4d33-9245-55038dd253d3/extract-content/0.log" Sep 30 13:58:00 crc kubenswrapper[4672]: I0930 13:58:00.547173 4672 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-xpjr4_9a3da1a5-1d7f-4d33-9245-55038dd253d3/extract-utilities/0.log" Sep 30 13:58:00 crc kubenswrapper[4672]: I0930 13:58:00.729591 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-lkhn2_f337a53e-90b5-44a2-a033-bf26d3498158/extract-utilities/0.log" Sep 30 13:58:00 crc kubenswrapper[4672]: I0930 13:58:00.739545 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-xpjr4_9a3da1a5-1d7f-4d33-9245-55038dd253d3/registry-server/0.log" Sep 30 13:58:00 crc kubenswrapper[4672]: I0930 13:58:00.893604 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-lkhn2_f337a53e-90b5-44a2-a033-bf26d3498158/extract-content/0.log" Sep 30 13:58:00 crc kubenswrapper[4672]: I0930 13:58:00.960151 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-lkhn2_f337a53e-90b5-44a2-a033-bf26d3498158/extract-content/0.log" Sep 30 13:58:01 crc kubenswrapper[4672]: I0930 13:58:01.141699 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-lkhn2_f337a53e-90b5-44a2-a033-bf26d3498158/extract-content/0.log" Sep 30 13:58:01 crc kubenswrapper[4672]: I0930 13:58:01.154181 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-lkhn2_f337a53e-90b5-44a2-a033-bf26d3498158/extract-utilities/0.log" Sep 30 13:58:01 crc kubenswrapper[4672]: I0930 13:58:01.888201 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-lkhn2_f337a53e-90b5-44a2-a033-bf26d3498158/registry-server/0.log" Sep 30 13:58:12 crc kubenswrapper[4672]: I0930 13:58:12.769182 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-7c8cf85677-k7drs_cf46fa05-32de-4c26-82e7-769052afcaa1/prometheus-operator/0.log" Sep 30 13:58:12 crc kubenswrapper[4672]: I0930 13:58:12.881215 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-67cf9f7c94-kf944_eaec0e45-413d-4fad-a35b-68a28486053a/prometheus-operator-admission-webhook/0.log" Sep 30 13:58:12 crc kubenswrapper[4672]: I0930 13:58:12.974168 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-67cf9f7c94-vw6lq_95eba142-c439-4920-914e-af904642acc2/prometheus-operator-admission-webhook/0.log" Sep 30 13:58:13 crc kubenswrapper[4672]: I0930 13:58:13.084802 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-cc5f78dfc-fzmsr_a4a3a18a-31ce-496c-b863-bdc8ff9774cb/operator/0.log" Sep 30 13:58:13 crc kubenswrapper[4672]: I0930 13:58:13.163972 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-54bc95c9fb-27rgt_961561c7-4ef8-4592-bb9a-53ef762e38ea/perses-operator/0.log" Sep 30 13:58:35 crc kubenswrapper[4672]: I0930 13:58:35.448208 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-nzggh"] Sep 30 13:58:35 crc kubenswrapper[4672]: E0930 13:58:35.449160 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8871bf22-42a7-4b1e-b9ec-d4276431c6b0" containerName="container-00" Sep 30 13:58:35 crc kubenswrapper[4672]: I0930 13:58:35.449173 4672 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="8871bf22-42a7-4b1e-b9ec-d4276431c6b0" containerName="container-00" Sep 30 13:58:35 crc kubenswrapper[4672]: I0930 13:58:35.449419 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="8871bf22-42a7-4b1e-b9ec-d4276431c6b0" containerName="container-00" Sep 30 13:58:35 crc kubenswrapper[4672]: I0930 13:58:35.451336 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-nzggh" Sep 30 13:58:35 crc kubenswrapper[4672]: I0930 13:58:35.467864 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-nzggh"] Sep 30 13:58:35 crc kubenswrapper[4672]: I0930 13:58:35.590907 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9edd6b96-420e-4333-806b-efefaeb1fe95-utilities\") pod \"community-operators-nzggh\" (UID: \"9edd6b96-420e-4333-806b-efefaeb1fe95\") " pod="openshift-marketplace/community-operators-nzggh" Sep 30 13:58:35 crc kubenswrapper[4672]: I0930 13:58:35.591130 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9edd6b96-420e-4333-806b-efefaeb1fe95-catalog-content\") pod \"community-operators-nzggh\" (UID: \"9edd6b96-420e-4333-806b-efefaeb1fe95\") " pod="openshift-marketplace/community-operators-nzggh" Sep 30 13:58:35 crc kubenswrapper[4672]: I0930 13:58:35.591172 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-chx5c\" (UniqueName: \"kubernetes.io/projected/9edd6b96-420e-4333-806b-efefaeb1fe95-kube-api-access-chx5c\") pod \"community-operators-nzggh\" (UID: \"9edd6b96-420e-4333-806b-efefaeb1fe95\") " pod="openshift-marketplace/community-operators-nzggh" Sep 30 13:58:35 crc kubenswrapper[4672]: I0930 13:58:35.692929 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9edd6b96-420e-4333-806b-efefaeb1fe95-utilities\") pod \"community-operators-nzggh\" (UID: \"9edd6b96-420e-4333-806b-efefaeb1fe95\") " pod="openshift-marketplace/community-operators-nzggh" Sep 30 13:58:35 crc kubenswrapper[4672]: I0930 13:58:35.693098 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9edd6b96-420e-4333-806b-efefaeb1fe95-catalog-content\") pod \"community-operators-nzggh\" (UID: \"9edd6b96-420e-4333-806b-efefaeb1fe95\") " pod="openshift-marketplace/community-operators-nzggh" Sep 30 13:58:35 crc kubenswrapper[4672]: I0930 13:58:35.693127 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-chx5c\" (UniqueName: \"kubernetes.io/projected/9edd6b96-420e-4333-806b-efefaeb1fe95-kube-api-access-chx5c\") pod \"community-operators-nzggh\" (UID: \"9edd6b96-420e-4333-806b-efefaeb1fe95\") " pod="openshift-marketplace/community-operators-nzggh" Sep 30 13:58:35 crc kubenswrapper[4672]: I0930 13:58:35.693542 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9edd6b96-420e-4333-806b-efefaeb1fe95-utilities\") pod \"community-operators-nzggh\" (UID: \"9edd6b96-420e-4333-806b-efefaeb1fe95\") " pod="openshift-marketplace/community-operators-nzggh" Sep 30 13:58:35 crc kubenswrapper[4672]: I0930 13:58:35.693619 4672 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9edd6b96-420e-4333-806b-efefaeb1fe95-catalog-content\") pod \"community-operators-nzggh\" (UID: \"9edd6b96-420e-4333-806b-efefaeb1fe95\") " pod="openshift-marketplace/community-operators-nzggh" Sep 30 13:58:35 crc kubenswrapper[4672]: I0930 13:58:35.726245 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-chx5c\" (UniqueName: \"kubernetes.io/projected/9edd6b96-420e-4333-806b-efefaeb1fe95-kube-api-access-chx5c\") pod \"community-operators-nzggh\" (UID: \"9edd6b96-420e-4333-806b-efefaeb1fe95\") " pod="openshift-marketplace/community-operators-nzggh" Sep 30 13:58:35 crc kubenswrapper[4672]: I0930 13:58:35.777623 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-nzggh" Sep 30 13:58:36 crc kubenswrapper[4672]: I0930 13:58:36.503190 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-nzggh"] Sep 30 13:58:36 crc kubenswrapper[4672]: I0930 13:58:36.880719 4672 generic.go:334] "Generic (PLEG): container finished" podID="9edd6b96-420e-4333-806b-efefaeb1fe95" containerID="79d6f4b632594593ed1bfc8b765a1ca10ece67cf4b3756071b4d1e529002d7aa" exitCode=0 Sep 30 13:58:36 crc kubenswrapper[4672]: I0930 13:58:36.881141 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nzggh" event={"ID":"9edd6b96-420e-4333-806b-efefaeb1fe95","Type":"ContainerDied","Data":"79d6f4b632594593ed1bfc8b765a1ca10ece67cf4b3756071b4d1e529002d7aa"} Sep 30 13:58:36 crc kubenswrapper[4672]: I0930 13:58:36.881172 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nzggh" event={"ID":"9edd6b96-420e-4333-806b-efefaeb1fe95","Type":"ContainerStarted","Data":"97795b42cdff91fcf843763a79fa243851ca5f9a39a8e978841a3294a27bed3c"} Sep 30 13:58:37 crc kubenswrapper[4672]: I0930 13:58:37.894668 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nzggh" event={"ID":"9edd6b96-420e-4333-806b-efefaeb1fe95","Type":"ContainerStarted","Data":"7fb3de26370cfb0665adc0c43d218897ed4380a63af976b7f7ac7e62d515561a"} Sep 30 13:58:38 crc kubenswrapper[4672]: I0930 13:58:38.906496 4672 generic.go:334] "Generic (PLEG): container finished" podID="9edd6b96-420e-4333-806b-efefaeb1fe95" containerID="7fb3de26370cfb0665adc0c43d218897ed4380a63af976b7f7ac7e62d515561a" exitCode=0 Sep 30 13:58:38 crc kubenswrapper[4672]: I0930 13:58:38.906582 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nzggh" event={"ID":"9edd6b96-420e-4333-806b-efefaeb1fe95","Type":"ContainerDied","Data":"7fb3de26370cfb0665adc0c43d218897ed4380a63af976b7f7ac7e62d515561a"} Sep 30 13:58:39 crc kubenswrapper[4672]: I0930 13:58:39.918916 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nzggh" event={"ID":"9edd6b96-420e-4333-806b-efefaeb1fe95","Type":"ContainerStarted","Data":"a9310d7a387902b25e32171ace3f1c58e668e778adecbf8b69920dd85cd3fbef"} Sep 30 13:58:39 crc kubenswrapper[4672]: I0930 13:58:39.944844 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-nzggh" podStartSLOduration=2.465274795 podStartE2EDuration="4.944826492s" podCreationTimestamp="2025-09-30 13:58:35 +0000 UTC" firstStartedPulling="2025-09-30 13:58:36.882540976 +0000 
UTC m=+5808.151778622" lastFinishedPulling="2025-09-30 13:58:39.362092673 +0000 UTC m=+5810.631330319" observedRunningTime="2025-09-30 13:58:39.935762161 +0000 UTC m=+5811.204999827" watchObservedRunningTime="2025-09-30 13:58:39.944826492 +0000 UTC m=+5811.214064138" Sep 30 13:58:45 crc kubenswrapper[4672]: I0930 13:58:45.778102 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-nzggh" Sep 30 13:58:45 crc kubenswrapper[4672]: I0930 13:58:45.779953 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-nzggh" Sep 30 13:58:45 crc kubenswrapper[4672]: I0930 13:58:45.829688 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-nzggh" Sep 30 13:58:46 crc kubenswrapper[4672]: I0930 13:58:46.060883 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-nzggh" Sep 30 13:58:46 crc kubenswrapper[4672]: I0930 13:58:46.250957 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-ngjpv"] Sep 30 13:58:46 crc kubenswrapper[4672]: I0930 13:58:46.253964 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ngjpv" Sep 30 13:58:46 crc kubenswrapper[4672]: I0930 13:58:46.277352 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ngjpv"] Sep 30 13:58:46 crc kubenswrapper[4672]: I0930 13:58:46.417331 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7492b910-4eb4-4e55-9c32-5d5a69513dc0-catalog-content\") pod \"redhat-marketplace-ngjpv\" (UID: \"7492b910-4eb4-4e55-9c32-5d5a69513dc0\") " pod="openshift-marketplace/redhat-marketplace-ngjpv" Sep 30 13:58:46 crc kubenswrapper[4672]: I0930 13:58:46.417446 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nt88x\" (UniqueName: \"kubernetes.io/projected/7492b910-4eb4-4e55-9c32-5d5a69513dc0-kube-api-access-nt88x\") pod \"redhat-marketplace-ngjpv\" (UID: \"7492b910-4eb4-4e55-9c32-5d5a69513dc0\") " pod="openshift-marketplace/redhat-marketplace-ngjpv" Sep 30 13:58:46 crc kubenswrapper[4672]: I0930 13:58:46.417668 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7492b910-4eb4-4e55-9c32-5d5a69513dc0-utilities\") pod \"redhat-marketplace-ngjpv\" (UID: \"7492b910-4eb4-4e55-9c32-5d5a69513dc0\") " pod="openshift-marketplace/redhat-marketplace-ngjpv" Sep 30 13:58:46 crc kubenswrapper[4672]: I0930 13:58:46.520302 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7492b910-4eb4-4e55-9c32-5d5a69513dc0-utilities\") pod \"redhat-marketplace-ngjpv\" (UID: \"7492b910-4eb4-4e55-9c32-5d5a69513dc0\") " pod="openshift-marketplace/redhat-marketplace-ngjpv" Sep 30 13:58:46 crc kubenswrapper[4672]: I0930 13:58:46.520809 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7492b910-4eb4-4e55-9c32-5d5a69513dc0-catalog-content\") pod \"redhat-marketplace-ngjpv\" (UID: \"7492b910-4eb4-4e55-9c32-5d5a69513dc0\") " 
pod="openshift-marketplace/redhat-marketplace-ngjpv" Sep 30 13:58:46 crc kubenswrapper[4672]: I0930 13:58:46.520920 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7492b910-4eb4-4e55-9c32-5d5a69513dc0-utilities\") pod \"redhat-marketplace-ngjpv\" (UID: \"7492b910-4eb4-4e55-9c32-5d5a69513dc0\") " pod="openshift-marketplace/redhat-marketplace-ngjpv" Sep 30 13:58:46 crc kubenswrapper[4672]: I0930 13:58:46.520959 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nt88x\" (UniqueName: \"kubernetes.io/projected/7492b910-4eb4-4e55-9c32-5d5a69513dc0-kube-api-access-nt88x\") pod \"redhat-marketplace-ngjpv\" (UID: \"7492b910-4eb4-4e55-9c32-5d5a69513dc0\") " pod="openshift-marketplace/redhat-marketplace-ngjpv" Sep 30 13:58:46 crc kubenswrapper[4672]: I0930 13:58:46.521439 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7492b910-4eb4-4e55-9c32-5d5a69513dc0-catalog-content\") pod \"redhat-marketplace-ngjpv\" (UID: \"7492b910-4eb4-4e55-9c32-5d5a69513dc0\") " pod="openshift-marketplace/redhat-marketplace-ngjpv" Sep 30 13:58:46 crc kubenswrapper[4672]: I0930 13:58:46.550013 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nt88x\" (UniqueName: \"kubernetes.io/projected/7492b910-4eb4-4e55-9c32-5d5a69513dc0-kube-api-access-nt88x\") pod \"redhat-marketplace-ngjpv\" (UID: \"7492b910-4eb4-4e55-9c32-5d5a69513dc0\") " pod="openshift-marketplace/redhat-marketplace-ngjpv" Sep 30 13:58:46 crc kubenswrapper[4672]: I0930 13:58:46.576834 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ngjpv" Sep 30 13:58:47 crc kubenswrapper[4672]: I0930 13:58:47.061455 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ngjpv"] Sep 30 13:58:47 crc kubenswrapper[4672]: I0930 13:58:47.242471 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-nzggh"] Sep 30 13:58:48 crc kubenswrapper[4672]: I0930 13:58:48.035022 4672 generic.go:334] "Generic (PLEG): container finished" podID="7492b910-4eb4-4e55-9c32-5d5a69513dc0" containerID="46f011ea021a93c4649ee983f08ad119602baa2c5862b9ab848e350832eb65ae" exitCode=0 Sep 30 13:58:48 crc kubenswrapper[4672]: I0930 13:58:48.035089 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ngjpv" event={"ID":"7492b910-4eb4-4e55-9c32-5d5a69513dc0","Type":"ContainerDied","Data":"46f011ea021a93c4649ee983f08ad119602baa2c5862b9ab848e350832eb65ae"} Sep 30 13:58:48 crc kubenswrapper[4672]: I0930 13:58:48.035465 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ngjpv" event={"ID":"7492b910-4eb4-4e55-9c32-5d5a69513dc0","Type":"ContainerStarted","Data":"4cd7ddbd09300ac39026107b70572e49e4e5247ca8b4f6c3fcb47fe2d58bee6a"} Sep 30 13:58:48 crc kubenswrapper[4672]: I0930 13:58:48.035853 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-nzggh" podUID="9edd6b96-420e-4333-806b-efefaeb1fe95" containerName="registry-server" containerID="cri-o://a9310d7a387902b25e32171ace3f1c58e668e778adecbf8b69920dd85cd3fbef" gracePeriod=2 Sep 30 13:58:48 crc kubenswrapper[4672]: I0930 13:58:48.644905 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-nzggh" Sep 30 13:58:48 crc kubenswrapper[4672]: I0930 13:58:48.781948 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9edd6b96-420e-4333-806b-efefaeb1fe95-catalog-content\") pod \"9edd6b96-420e-4333-806b-efefaeb1fe95\" (UID: \"9edd6b96-420e-4333-806b-efefaeb1fe95\") " Sep 30 13:58:48 crc kubenswrapper[4672]: I0930 13:58:48.782006 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-chx5c\" (UniqueName: \"kubernetes.io/projected/9edd6b96-420e-4333-806b-efefaeb1fe95-kube-api-access-chx5c\") pod \"9edd6b96-420e-4333-806b-efefaeb1fe95\" (UID: \"9edd6b96-420e-4333-806b-efefaeb1fe95\") " Sep 30 13:58:48 crc kubenswrapper[4672]: I0930 13:58:48.782281 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9edd6b96-420e-4333-806b-efefaeb1fe95-utilities\") pod \"9edd6b96-420e-4333-806b-efefaeb1fe95\" (UID: \"9edd6b96-420e-4333-806b-efefaeb1fe95\") " Sep 30 13:58:48 crc kubenswrapper[4672]: I0930 13:58:48.783661 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9edd6b96-420e-4333-806b-efefaeb1fe95-utilities" (OuterVolumeSpecName: "utilities") pod "9edd6b96-420e-4333-806b-efefaeb1fe95" (UID: "9edd6b96-420e-4333-806b-efefaeb1fe95"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 13:58:48 crc kubenswrapper[4672]: I0930 13:58:48.789576 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9edd6b96-420e-4333-806b-efefaeb1fe95-kube-api-access-chx5c" (OuterVolumeSpecName: "kube-api-access-chx5c") pod "9edd6b96-420e-4333-806b-efefaeb1fe95" (UID: "9edd6b96-420e-4333-806b-efefaeb1fe95"). InnerVolumeSpecName "kube-api-access-chx5c". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:58:48 crc kubenswrapper[4672]: I0930 13:58:48.836355 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9edd6b96-420e-4333-806b-efefaeb1fe95-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9edd6b96-420e-4333-806b-efefaeb1fe95" (UID: "9edd6b96-420e-4333-806b-efefaeb1fe95"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 13:58:48 crc kubenswrapper[4672]: I0930 13:58:48.884887 4672 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9edd6b96-420e-4333-806b-efefaeb1fe95-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:48 crc kubenswrapper[4672]: I0930 13:58:48.884936 4672 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9edd6b96-420e-4333-806b-efefaeb1fe95-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:48 crc kubenswrapper[4672]: I0930 13:58:48.884947 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-chx5c\" (UniqueName: \"kubernetes.io/projected/9edd6b96-420e-4333-806b-efefaeb1fe95-kube-api-access-chx5c\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:49 crc kubenswrapper[4672]: I0930 13:58:49.049813 4672 generic.go:334] "Generic (PLEG): container finished" podID="9edd6b96-420e-4333-806b-efefaeb1fe95" containerID="a9310d7a387902b25e32171ace3f1c58e668e778adecbf8b69920dd85cd3fbef" exitCode=0 Sep 30 13:58:49 crc kubenswrapper[4672]: I0930 13:58:49.049865 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nzggh" event={"ID":"9edd6b96-420e-4333-806b-efefaeb1fe95","Type":"ContainerDied","Data":"a9310d7a387902b25e32171ace3f1c58e668e778adecbf8b69920dd85cd3fbef"} Sep 30 13:58:49 crc kubenswrapper[4672]: I0930 13:58:49.049899 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nzggh" event={"ID":"9edd6b96-420e-4333-806b-efefaeb1fe95","Type":"ContainerDied","Data":"97795b42cdff91fcf843763a79fa243851ca5f9a39a8e978841a3294a27bed3c"} Sep 30 13:58:49 crc kubenswrapper[4672]: I0930 13:58:49.049919 4672 scope.go:117] "RemoveContainer" containerID="a9310d7a387902b25e32171ace3f1c58e668e778adecbf8b69920dd85cd3fbef" Sep 30 13:58:49 crc kubenswrapper[4672]: I0930 13:58:49.050079 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-nzggh" Sep 30 13:58:49 crc kubenswrapper[4672]: I0930 13:58:49.091207 4672 scope.go:117] "RemoveContainer" containerID="7fb3de26370cfb0665adc0c43d218897ed4380a63af976b7f7ac7e62d515561a" Sep 30 13:58:49 crc kubenswrapper[4672]: I0930 13:58:49.095297 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-nzggh"] Sep 30 13:58:49 crc kubenswrapper[4672]: I0930 13:58:49.103473 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-nzggh"] Sep 30 13:58:49 crc kubenswrapper[4672]: I0930 13:58:49.138020 4672 scope.go:117] "RemoveContainer" containerID="79d6f4b632594593ed1bfc8b765a1ca10ece67cf4b3756071b4d1e529002d7aa" Sep 30 13:58:49 crc kubenswrapper[4672]: I0930 13:58:49.177474 4672 scope.go:117] "RemoveContainer" containerID="a9310d7a387902b25e32171ace3f1c58e668e778adecbf8b69920dd85cd3fbef" Sep 30 13:58:49 crc kubenswrapper[4672]: E0930 13:58:49.178671 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a9310d7a387902b25e32171ace3f1c58e668e778adecbf8b69920dd85cd3fbef\": container with ID starting with a9310d7a387902b25e32171ace3f1c58e668e778adecbf8b69920dd85cd3fbef not found: ID does not exist" containerID="a9310d7a387902b25e32171ace3f1c58e668e778adecbf8b69920dd85cd3fbef" Sep 30 13:58:49 crc kubenswrapper[4672]: I0930 13:58:49.178713 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a9310d7a387902b25e32171ace3f1c58e668e778adecbf8b69920dd85cd3fbef"} err="failed to get container status \"a9310d7a387902b25e32171ace3f1c58e668e778adecbf8b69920dd85cd3fbef\": rpc error: code = NotFound desc = could not find container \"a9310d7a387902b25e32171ace3f1c58e668e778adecbf8b69920dd85cd3fbef\": container with ID starting with a9310d7a387902b25e32171ace3f1c58e668e778adecbf8b69920dd85cd3fbef not found: ID does not exist" Sep 30 13:58:49 crc kubenswrapper[4672]: I0930 13:58:49.178739 4672 scope.go:117] "RemoveContainer" containerID="7fb3de26370cfb0665adc0c43d218897ed4380a63af976b7f7ac7e62d515561a" Sep 30 13:58:49 crc kubenswrapper[4672]: E0930 13:58:49.179228 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7fb3de26370cfb0665adc0c43d218897ed4380a63af976b7f7ac7e62d515561a\": container with ID starting with 7fb3de26370cfb0665adc0c43d218897ed4380a63af976b7f7ac7e62d515561a not found: ID does not exist" containerID="7fb3de26370cfb0665adc0c43d218897ed4380a63af976b7f7ac7e62d515561a" Sep 30 13:58:49 crc kubenswrapper[4672]: I0930 13:58:49.179316 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7fb3de26370cfb0665adc0c43d218897ed4380a63af976b7f7ac7e62d515561a"} err="failed to get container status \"7fb3de26370cfb0665adc0c43d218897ed4380a63af976b7f7ac7e62d515561a\": rpc error: code = NotFound desc = could not find container \"7fb3de26370cfb0665adc0c43d218897ed4380a63af976b7f7ac7e62d515561a\": container with ID starting with 7fb3de26370cfb0665adc0c43d218897ed4380a63af976b7f7ac7e62d515561a not found: ID does not exist" Sep 30 13:58:49 crc kubenswrapper[4672]: I0930 13:58:49.179353 4672 scope.go:117] "RemoveContainer" containerID="79d6f4b632594593ed1bfc8b765a1ca10ece67cf4b3756071b4d1e529002d7aa" Sep 30 13:58:49 crc kubenswrapper[4672]: E0930 13:58:49.179823 4672 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"79d6f4b632594593ed1bfc8b765a1ca10ece67cf4b3756071b4d1e529002d7aa\": container with ID starting with 79d6f4b632594593ed1bfc8b765a1ca10ece67cf4b3756071b4d1e529002d7aa not found: ID does not exist" containerID="79d6f4b632594593ed1bfc8b765a1ca10ece67cf4b3756071b4d1e529002d7aa" Sep 30 13:58:49 crc kubenswrapper[4672]: I0930 13:58:49.179877 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"79d6f4b632594593ed1bfc8b765a1ca10ece67cf4b3756071b4d1e529002d7aa"} err="failed to get container status \"79d6f4b632594593ed1bfc8b765a1ca10ece67cf4b3756071b4d1e529002d7aa\": rpc error: code = NotFound desc = could not find container \"79d6f4b632594593ed1bfc8b765a1ca10ece67cf4b3756071b4d1e529002d7aa\": container with ID starting with 79d6f4b632594593ed1bfc8b765a1ca10ece67cf4b3756071b4d1e529002d7aa not found: ID does not exist" Sep 30 13:58:49 crc kubenswrapper[4672]: I0930 13:58:49.455386 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9edd6b96-420e-4333-806b-efefaeb1fe95" path="/var/lib/kubelet/pods/9edd6b96-420e-4333-806b-efefaeb1fe95/volumes" Sep 30 13:58:50 crc kubenswrapper[4672]: I0930 13:58:50.064312 4672 generic.go:334] "Generic (PLEG): container finished" podID="7492b910-4eb4-4e55-9c32-5d5a69513dc0" containerID="9075cfebe76a889a3e84ddc190aea77d6de4257402b8357bf21166434a505df3" exitCode=0 Sep 30 13:58:50 crc kubenswrapper[4672]: I0930 13:58:50.064496 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ngjpv" event={"ID":"7492b910-4eb4-4e55-9c32-5d5a69513dc0","Type":"ContainerDied","Data":"9075cfebe76a889a3e84ddc190aea77d6de4257402b8357bf21166434a505df3"} Sep 30 13:58:51 crc kubenswrapper[4672]: I0930 13:58:51.077311 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ngjpv" event={"ID":"7492b910-4eb4-4e55-9c32-5d5a69513dc0","Type":"ContainerStarted","Data":"d79c34c299d082657028b405ceeb90271c48bb752e7e002186afd5d6699538e0"} Sep 30 13:58:51 crc kubenswrapper[4672]: I0930 13:58:51.102487 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-ngjpv" podStartSLOduration=2.502878987 podStartE2EDuration="5.102470971s" podCreationTimestamp="2025-09-30 13:58:46 +0000 UTC" firstStartedPulling="2025-09-30 13:58:48.037882587 +0000 UTC m=+5819.307120243" lastFinishedPulling="2025-09-30 13:58:50.637474571 +0000 UTC m=+5821.906712227" observedRunningTime="2025-09-30 13:58:51.094394015 +0000 UTC m=+5822.363631671" watchObservedRunningTime="2025-09-30 13:58:51.102470971 +0000 UTC m=+5822.371708617" Sep 30 13:58:54 crc kubenswrapper[4672]: E0930 13:58:54.706970 4672 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7492b910_4eb4_4e55_9c32_5d5a69513dc0.slice/crio-9075cfebe76a889a3e84ddc190aea77d6de4257402b8357bf21166434a505df3.scope\": RecentStats: unable to find data in memory cache]" Sep 30 13:58:56 crc kubenswrapper[4672]: I0930 13:58:56.577929 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-ngjpv" Sep 30 13:58:56 crc kubenswrapper[4672]: I0930 13:58:56.578345 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-ngjpv" Sep 30 13:58:56 crc kubenswrapper[4672]: 
Sep 30 13:58:56 crc kubenswrapper[4672]: I0930 13:58:56.629537 4672 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-ngjpv"
Sep 30 13:58:57 crc kubenswrapper[4672]: I0930 13:58:57.179371 4672 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-ngjpv"
Sep 30 13:58:57 crc kubenswrapper[4672]: I0930 13:58:57.222684 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-ngjpv"]
Sep 30 13:58:59 crc kubenswrapper[4672]: I0930 13:58:59.150223 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-ngjpv" podUID="7492b910-4eb4-4e55-9c32-5d5a69513dc0" containerName="registry-server" containerID="cri-o://d79c34c299d082657028b405ceeb90271c48bb752e7e002186afd5d6699538e0" gracePeriod=2
Sep 30 13:58:59 crc kubenswrapper[4672]: I0930 13:58:59.627638 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ngjpv"
Sep 30 13:58:59 crc kubenswrapper[4672]: I0930 13:58:59.812383 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nt88x\" (UniqueName: \"kubernetes.io/projected/7492b910-4eb4-4e55-9c32-5d5a69513dc0-kube-api-access-nt88x\") pod \"7492b910-4eb4-4e55-9c32-5d5a69513dc0\" (UID: \"7492b910-4eb4-4e55-9c32-5d5a69513dc0\") "
Sep 30 13:58:59 crc kubenswrapper[4672]: I0930 13:58:59.812915 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7492b910-4eb4-4e55-9c32-5d5a69513dc0-utilities\") pod \"7492b910-4eb4-4e55-9c32-5d5a69513dc0\" (UID: \"7492b910-4eb4-4e55-9c32-5d5a69513dc0\") "
Sep 30 13:58:59 crc kubenswrapper[4672]: I0930 13:58:59.813034 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7492b910-4eb4-4e55-9c32-5d5a69513dc0-catalog-content\") pod \"7492b910-4eb4-4e55-9c32-5d5a69513dc0\" (UID: \"7492b910-4eb4-4e55-9c32-5d5a69513dc0\") "
Sep 30 13:58:59 crc kubenswrapper[4672]: I0930 13:58:59.813825 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7492b910-4eb4-4e55-9c32-5d5a69513dc0-utilities" (OuterVolumeSpecName: "utilities") pod "7492b910-4eb4-4e55-9c32-5d5a69513dc0" (UID: "7492b910-4eb4-4e55-9c32-5d5a69513dc0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 13:58:59 crc kubenswrapper[4672]: I0930 13:58:59.819447 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7492b910-4eb4-4e55-9c32-5d5a69513dc0-kube-api-access-nt88x" (OuterVolumeSpecName: "kube-api-access-nt88x") pod "7492b910-4eb4-4e55-9c32-5d5a69513dc0" (UID: "7492b910-4eb4-4e55-9c32-5d5a69513dc0"). InnerVolumeSpecName "kube-api-access-nt88x". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 13:58:59 crc kubenswrapper[4672]: I0930 13:58:59.829657 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7492b910-4eb4-4e55-9c32-5d5a69513dc0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7492b910-4eb4-4e55-9c32-5d5a69513dc0" (UID: "7492b910-4eb4-4e55-9c32-5d5a69513dc0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 13:58:59 crc kubenswrapper[4672]: I0930 13:58:59.915201 4672 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7492b910-4eb4-4e55-9c32-5d5a69513dc0-catalog-content\") on node \"crc\" DevicePath \"\""
Sep 30 13:58:59 crc kubenswrapper[4672]: I0930 13:58:59.915309 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nt88x\" (UniqueName: \"kubernetes.io/projected/7492b910-4eb4-4e55-9c32-5d5a69513dc0-kube-api-access-nt88x\") on node \"crc\" DevicePath \"\""
Sep 30 13:58:59 crc kubenswrapper[4672]: I0930 13:58:59.915324 4672 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7492b910-4eb4-4e55-9c32-5d5a69513dc0-utilities\") on node \"crc\" DevicePath \"\""
Sep 30 13:59:00 crc kubenswrapper[4672]: I0930 13:59:00.164221 4672 generic.go:334] "Generic (PLEG): container finished" podID="7492b910-4eb4-4e55-9c32-5d5a69513dc0" containerID="d79c34c299d082657028b405ceeb90271c48bb752e7e002186afd5d6699538e0" exitCode=0
Sep 30 13:59:00 crc kubenswrapper[4672]: I0930 13:59:00.164295 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ngjpv" event={"ID":"7492b910-4eb4-4e55-9c32-5d5a69513dc0","Type":"ContainerDied","Data":"d79c34c299d082657028b405ceeb90271c48bb752e7e002186afd5d6699538e0"}
Sep 30 13:59:00 crc kubenswrapper[4672]: I0930 13:59:00.164338 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ngjpv" event={"ID":"7492b910-4eb4-4e55-9c32-5d5a69513dc0","Type":"ContainerDied","Data":"4cd7ddbd09300ac39026107b70572e49e4e5247ca8b4f6c3fcb47fe2d58bee6a"}
Sep 30 13:59:00 crc kubenswrapper[4672]: I0930 13:59:00.164365 4672 scope.go:117] "RemoveContainer" containerID="d79c34c299d082657028b405ceeb90271c48bb752e7e002186afd5d6699538e0"
Sep 30 13:59:00 crc kubenswrapper[4672]: I0930 13:59:00.164341 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ngjpv"
Sep 30 13:59:00 crc kubenswrapper[4672]: I0930 13:59:00.194790 4672 scope.go:117] "RemoveContainer" containerID="9075cfebe76a889a3e84ddc190aea77d6de4257402b8357bf21166434a505df3"
Sep 30 13:59:00 crc kubenswrapper[4672]: I0930 13:59:00.208387 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-ngjpv"]
Sep 30 13:59:00 crc kubenswrapper[4672]: I0930 13:59:00.220581 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-ngjpv"]
Sep 30 13:59:00 crc kubenswrapper[4672]: I0930 13:59:00.229892 4672 scope.go:117] "RemoveContainer" containerID="46f011ea021a93c4649ee983f08ad119602baa2c5862b9ab848e350832eb65ae"
Sep 30 13:59:00 crc kubenswrapper[4672]: I0930 13:59:00.267475 4672 scope.go:117] "RemoveContainer" containerID="d79c34c299d082657028b405ceeb90271c48bb752e7e002186afd5d6699538e0"
Sep 30 13:59:00 crc kubenswrapper[4672]: E0930 13:59:00.268183 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d79c34c299d082657028b405ceeb90271c48bb752e7e002186afd5d6699538e0\": container with ID starting with d79c34c299d082657028b405ceeb90271c48bb752e7e002186afd5d6699538e0 not found: ID does not exist" containerID="d79c34c299d082657028b405ceeb90271c48bb752e7e002186afd5d6699538e0"
Sep 30 13:59:00 crc kubenswrapper[4672]: I0930 13:59:00.268224 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d79c34c299d082657028b405ceeb90271c48bb752e7e002186afd5d6699538e0"} err="failed to get container status \"d79c34c299d082657028b405ceeb90271c48bb752e7e002186afd5d6699538e0\": rpc error: code = NotFound desc = could not find container \"d79c34c299d082657028b405ceeb90271c48bb752e7e002186afd5d6699538e0\": container with ID starting with d79c34c299d082657028b405ceeb90271c48bb752e7e002186afd5d6699538e0 not found: ID does not exist"
Sep 30 13:59:00 crc kubenswrapper[4672]: I0930 13:59:00.268248 4672 scope.go:117] "RemoveContainer" containerID="9075cfebe76a889a3e84ddc190aea77d6de4257402b8357bf21166434a505df3"
Sep 30 13:59:00 crc kubenswrapper[4672]: E0930 13:59:00.268758 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9075cfebe76a889a3e84ddc190aea77d6de4257402b8357bf21166434a505df3\": container with ID starting with 9075cfebe76a889a3e84ddc190aea77d6de4257402b8357bf21166434a505df3 not found: ID does not exist" containerID="9075cfebe76a889a3e84ddc190aea77d6de4257402b8357bf21166434a505df3"
Sep 30 13:59:00 crc kubenswrapper[4672]: I0930 13:59:00.268804 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9075cfebe76a889a3e84ddc190aea77d6de4257402b8357bf21166434a505df3"} err="failed to get container status \"9075cfebe76a889a3e84ddc190aea77d6de4257402b8357bf21166434a505df3\": rpc error: code = NotFound desc = could not find container \"9075cfebe76a889a3e84ddc190aea77d6de4257402b8357bf21166434a505df3\": container with ID starting with 9075cfebe76a889a3e84ddc190aea77d6de4257402b8357bf21166434a505df3 not found: ID does not exist"
Sep 30 13:59:00 crc kubenswrapper[4672]: I0930 13:59:00.268830 4672 scope.go:117] "RemoveContainer" containerID="46f011ea021a93c4649ee983f08ad119602baa2c5862b9ab848e350832eb65ae"
Sep 30 13:59:00 crc kubenswrapper[4672]: E0930 13:59:00.269347 4672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"46f011ea021a93c4649ee983f08ad119602baa2c5862b9ab848e350832eb65ae\": container with ID starting with 46f011ea021a93c4649ee983f08ad119602baa2c5862b9ab848e350832eb65ae not found: ID does not exist" containerID="46f011ea021a93c4649ee983f08ad119602baa2c5862b9ab848e350832eb65ae"
Sep 30 13:59:00 crc kubenswrapper[4672]: I0930 13:59:00.269378 4672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"46f011ea021a93c4649ee983f08ad119602baa2c5862b9ab848e350832eb65ae"} err="failed to get container status \"46f011ea021a93c4649ee983f08ad119602baa2c5862b9ab848e350832eb65ae\": rpc error: code = NotFound desc = could not find container \"46f011ea021a93c4649ee983f08ad119602baa2c5862b9ab848e350832eb65ae\": container with ID starting with 46f011ea021a93c4649ee983f08ad119602baa2c5862b9ab848e350832eb65ae not found: ID does not exist"
Sep 30 13:59:01 crc kubenswrapper[4672]: I0930 13:59:01.431406 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7492b910-4eb4-4e55-9c32-5d5a69513dc0" path="/var/lib/kubelet/pods/7492b910-4eb4-4e55-9c32-5d5a69513dc0/volumes"
Sep 30 13:59:04 crc kubenswrapper[4672]: E0930 13:59:04.966859 4672 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7492b910_4eb4_4e55_9c32_5d5a69513dc0.slice/crio-9075cfebe76a889a3e84ddc190aea77d6de4257402b8357bf21166434a505df3.scope\": RecentStats: unable to find data in memory cache]"
Sep 30 13:59:15 crc kubenswrapper[4672]: E0930 13:59:15.268844 4672 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7492b910_4eb4_4e55_9c32_5d5a69513dc0.slice/crio-9075cfebe76a889a3e84ddc190aea77d6de4257402b8357bf21166434a505df3.scope\": RecentStats: unable to find data in memory cache]"
Sep 30 13:59:25 crc kubenswrapper[4672]: E0930 13:59:25.548978 4672 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7492b910_4eb4_4e55_9c32_5d5a69513dc0.slice/crio-9075cfebe76a889a3e84ddc190aea77d6de4257402b8357bf21166434a505df3.scope\": RecentStats: unable to find data in memory cache]"
Sep 30 13:59:35 crc kubenswrapper[4672]: E0930 13:59:35.828752 4672 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7492b910_4eb4_4e55_9c32_5d5a69513dc0.slice/crio-9075cfebe76a889a3e84ddc190aea77d6de4257402b8357bf21166434a505df3.scope\": RecentStats: unable to find data in memory cache]"
Sep 30 13:59:46 crc kubenswrapper[4672]: E0930 13:59:46.089913 4672 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7492b910_4eb4_4e55_9c32_5d5a69513dc0.slice/crio-9075cfebe76a889a3e84ddc190aea77d6de4257402b8357bf21166434a505df3.scope\": RecentStats: unable to find data in memory cache]"
Sep 30 14:00:00 crc kubenswrapper[4672]: I0930 14:00:00.154842 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320680-xd4tm"]
removing container" podUID="7492b910-4eb4-4e55-9c32-5d5a69513dc0" containerName="registry-server" Sep 30 14:00:00 crc kubenswrapper[4672]: I0930 14:00:00.155883 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="7492b910-4eb4-4e55-9c32-5d5a69513dc0" containerName="registry-server" Sep 30 14:00:00 crc kubenswrapper[4672]: E0930 14:00:00.155902 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7492b910-4eb4-4e55-9c32-5d5a69513dc0" containerName="extract-utilities" Sep 30 14:00:00 crc kubenswrapper[4672]: I0930 14:00:00.155908 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="7492b910-4eb4-4e55-9c32-5d5a69513dc0" containerName="extract-utilities" Sep 30 14:00:00 crc kubenswrapper[4672]: E0930 14:00:00.155926 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9edd6b96-420e-4333-806b-efefaeb1fe95" containerName="extract-utilities" Sep 30 14:00:00 crc kubenswrapper[4672]: I0930 14:00:00.155934 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="9edd6b96-420e-4333-806b-efefaeb1fe95" containerName="extract-utilities" Sep 30 14:00:00 crc kubenswrapper[4672]: E0930 14:00:00.155950 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9edd6b96-420e-4333-806b-efefaeb1fe95" containerName="extract-content" Sep 30 14:00:00 crc kubenswrapper[4672]: I0930 14:00:00.155957 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="9edd6b96-420e-4333-806b-efefaeb1fe95" containerName="extract-content" Sep 30 14:00:00 crc kubenswrapper[4672]: E0930 14:00:00.155969 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7492b910-4eb4-4e55-9c32-5d5a69513dc0" containerName="extract-content" Sep 30 14:00:00 crc kubenswrapper[4672]: I0930 14:00:00.155975 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="7492b910-4eb4-4e55-9c32-5d5a69513dc0" containerName="extract-content" Sep 30 14:00:00 crc kubenswrapper[4672]: E0930 14:00:00.155988 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9edd6b96-420e-4333-806b-efefaeb1fe95" containerName="registry-server" Sep 30 14:00:00 crc kubenswrapper[4672]: I0930 14:00:00.155994 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="9edd6b96-420e-4333-806b-efefaeb1fe95" containerName="registry-server" Sep 30 14:00:00 crc kubenswrapper[4672]: I0930 14:00:00.156161 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="9edd6b96-420e-4333-806b-efefaeb1fe95" containerName="registry-server" Sep 30 14:00:00 crc kubenswrapper[4672]: I0930 14:00:00.156205 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="7492b910-4eb4-4e55-9c32-5d5a69513dc0" containerName="registry-server" Sep 30 14:00:00 crc kubenswrapper[4672]: I0930 14:00:00.156915 4672 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320680-xd4tm" Sep 30 14:00:00 crc kubenswrapper[4672]: I0930 14:00:00.165210 4672 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Sep 30 14:00:00 crc kubenswrapper[4672]: I0930 14:00:00.165318 4672 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Sep 30 14:00:00 crc kubenswrapper[4672]: I0930 14:00:00.170333 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320680-xd4tm"] Sep 30 14:00:00 crc kubenswrapper[4672]: I0930 14:00:00.215851 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/db4228a8-a6f8-41c4-834b-03ed5d257543-secret-volume\") pod \"collect-profiles-29320680-xd4tm\" (UID: \"db4228a8-a6f8-41c4-834b-03ed5d257543\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320680-xd4tm" Sep 30 14:00:00 crc kubenswrapper[4672]: I0930 14:00:00.215944 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mndg7\" (UniqueName: \"kubernetes.io/projected/db4228a8-a6f8-41c4-834b-03ed5d257543-kube-api-access-mndg7\") pod \"collect-profiles-29320680-xd4tm\" (UID: \"db4228a8-a6f8-41c4-834b-03ed5d257543\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320680-xd4tm" Sep 30 14:00:00 crc kubenswrapper[4672]: I0930 14:00:00.216344 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/db4228a8-a6f8-41c4-834b-03ed5d257543-config-volume\") pod \"collect-profiles-29320680-xd4tm\" (UID: \"db4228a8-a6f8-41c4-834b-03ed5d257543\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320680-xd4tm" Sep 30 14:00:00 crc kubenswrapper[4672]: I0930 14:00:00.318688 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/db4228a8-a6f8-41c4-834b-03ed5d257543-config-volume\") pod \"collect-profiles-29320680-xd4tm\" (UID: \"db4228a8-a6f8-41c4-834b-03ed5d257543\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320680-xd4tm" Sep 30 14:00:00 crc kubenswrapper[4672]: I0930 14:00:00.318886 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/db4228a8-a6f8-41c4-834b-03ed5d257543-secret-volume\") pod \"collect-profiles-29320680-xd4tm\" (UID: \"db4228a8-a6f8-41c4-834b-03ed5d257543\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320680-xd4tm" Sep 30 14:00:00 crc kubenswrapper[4672]: I0930 14:00:00.318950 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mndg7\" (UniqueName: \"kubernetes.io/projected/db4228a8-a6f8-41c4-834b-03ed5d257543-kube-api-access-mndg7\") pod \"collect-profiles-29320680-xd4tm\" (UID: \"db4228a8-a6f8-41c4-834b-03ed5d257543\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320680-xd4tm" Sep 30 14:00:00 crc kubenswrapper[4672]: I0930 14:00:00.319768 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/db4228a8-a6f8-41c4-834b-03ed5d257543-config-volume\") pod 
\"collect-profiles-29320680-xd4tm\" (UID: \"db4228a8-a6f8-41c4-834b-03ed5d257543\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320680-xd4tm" Sep 30 14:00:00 crc kubenswrapper[4672]: I0930 14:00:00.324799 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/db4228a8-a6f8-41c4-834b-03ed5d257543-secret-volume\") pod \"collect-profiles-29320680-xd4tm\" (UID: \"db4228a8-a6f8-41c4-834b-03ed5d257543\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320680-xd4tm" Sep 30 14:00:00 crc kubenswrapper[4672]: I0930 14:00:00.338689 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mndg7\" (UniqueName: \"kubernetes.io/projected/db4228a8-a6f8-41c4-834b-03ed5d257543-kube-api-access-mndg7\") pod \"collect-profiles-29320680-xd4tm\" (UID: \"db4228a8-a6f8-41c4-834b-03ed5d257543\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320680-xd4tm" Sep 30 14:00:00 crc kubenswrapper[4672]: I0930 14:00:00.478497 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320680-xd4tm" Sep 30 14:00:01 crc kubenswrapper[4672]: I0930 14:00:01.029257 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320680-xd4tm"] Sep 30 14:00:01 crc kubenswrapper[4672]: I0930 14:00:01.810308 4672 generic.go:334] "Generic (PLEG): container finished" podID="db4228a8-a6f8-41c4-834b-03ed5d257543" containerID="3b9319b756777e5c5c48cb7fe6c362fb8fe1c2b7a08ab9b4a9dfa25037c6c587" exitCode=0 Sep 30 14:00:01 crc kubenswrapper[4672]: I0930 14:00:01.810847 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320680-xd4tm" event={"ID":"db4228a8-a6f8-41c4-834b-03ed5d257543","Type":"ContainerDied","Data":"3b9319b756777e5c5c48cb7fe6c362fb8fe1c2b7a08ab9b4a9dfa25037c6c587"} Sep 30 14:00:01 crc kubenswrapper[4672]: I0930 14:00:01.810912 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320680-xd4tm" event={"ID":"db4228a8-a6f8-41c4-834b-03ed5d257543","Type":"ContainerStarted","Data":"8f02f2b3896aa43cee8d9b1851f9722d7cdfa3714a9f6b4d903de9824d217bc0"} Sep 30 14:00:03 crc kubenswrapper[4672]: I0930 14:00:03.232658 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320680-xd4tm" Sep 30 14:00:03 crc kubenswrapper[4672]: I0930 14:00:03.321392 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/db4228a8-a6f8-41c4-834b-03ed5d257543-config-volume\") pod \"db4228a8-a6f8-41c4-834b-03ed5d257543\" (UID: \"db4228a8-a6f8-41c4-834b-03ed5d257543\") " Sep 30 14:00:03 crc kubenswrapper[4672]: I0930 14:00:03.321854 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mndg7\" (UniqueName: \"kubernetes.io/projected/db4228a8-a6f8-41c4-834b-03ed5d257543-kube-api-access-mndg7\") pod \"db4228a8-a6f8-41c4-834b-03ed5d257543\" (UID: \"db4228a8-a6f8-41c4-834b-03ed5d257543\") " Sep 30 14:00:03 crc kubenswrapper[4672]: I0930 14:00:03.322011 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/db4228a8-a6f8-41c4-834b-03ed5d257543-secret-volume\") pod \"db4228a8-a6f8-41c4-834b-03ed5d257543\" (UID: \"db4228a8-a6f8-41c4-834b-03ed5d257543\") " Sep 30 14:00:03 crc kubenswrapper[4672]: I0930 14:00:03.322689 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/db4228a8-a6f8-41c4-834b-03ed5d257543-config-volume" (OuterVolumeSpecName: "config-volume") pod "db4228a8-a6f8-41c4-834b-03ed5d257543" (UID: "db4228a8-a6f8-41c4-834b-03ed5d257543"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 14:00:03 crc kubenswrapper[4672]: I0930 14:00:03.329182 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db4228a8-a6f8-41c4-834b-03ed5d257543-kube-api-access-mndg7" (OuterVolumeSpecName: "kube-api-access-mndg7") pod "db4228a8-a6f8-41c4-834b-03ed5d257543" (UID: "db4228a8-a6f8-41c4-834b-03ed5d257543"). InnerVolumeSpecName "kube-api-access-mndg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:00:03 crc kubenswrapper[4672]: I0930 14:00:03.329216 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db4228a8-a6f8-41c4-834b-03ed5d257543-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "db4228a8-a6f8-41c4-834b-03ed5d257543" (UID: "db4228a8-a6f8-41c4-834b-03ed5d257543"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:00:03 crc kubenswrapper[4672]: I0930 14:00:03.427756 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mndg7\" (UniqueName: \"kubernetes.io/projected/db4228a8-a6f8-41c4-834b-03ed5d257543-kube-api-access-mndg7\") on node \"crc\" DevicePath \"\"" Sep 30 14:00:03 crc kubenswrapper[4672]: I0930 14:00:03.427790 4672 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/db4228a8-a6f8-41c4-834b-03ed5d257543-secret-volume\") on node \"crc\" DevicePath \"\"" Sep 30 14:00:03 crc kubenswrapper[4672]: I0930 14:00:03.427799 4672 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/db4228a8-a6f8-41c4-834b-03ed5d257543-config-volume\") on node \"crc\" DevicePath \"\"" Sep 30 14:00:03 crc kubenswrapper[4672]: I0930 14:00:03.828673 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320680-xd4tm" event={"ID":"db4228a8-a6f8-41c4-834b-03ed5d257543","Type":"ContainerDied","Data":"8f02f2b3896aa43cee8d9b1851f9722d7cdfa3714a9f6b4d903de9824d217bc0"} Sep 30 14:00:03 crc kubenswrapper[4672]: I0930 14:00:03.828713 4672 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8f02f2b3896aa43cee8d9b1851f9722d7cdfa3714a9f6b4d903de9824d217bc0" Sep 30 14:00:03 crc kubenswrapper[4672]: I0930 14:00:03.828728 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320680-xd4tm" Sep 30 14:00:04 crc kubenswrapper[4672]: I0930 14:00:04.309237 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320635-2hctc"] Sep 30 14:00:04 crc kubenswrapper[4672]: I0930 14:00:04.319838 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320635-2hctc"] Sep 30 14:00:05 crc kubenswrapper[4672]: I0930 14:00:05.427715 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ca3926b3-5caf-4f28-90c4-6f0f00210b19" path="/var/lib/kubelet/pods/ca3926b3-5caf-4f28-90c4-6f0f00210b19/volumes" Sep 30 14:00:24 crc kubenswrapper[4672]: I0930 14:00:24.739707 4672 patch_prober.go:28] interesting pod/machine-config-daemon-dpqrd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 14:00:24 crc kubenswrapper[4672]: I0930 14:00:24.740331 4672 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 14:00:27 crc kubenswrapper[4672]: I0930 14:00:27.076438 4672 generic.go:334] "Generic (PLEG): container finished" podID="23d735d5-5887-469a-a9a7-973647e56895" containerID="ea28a837405ac8035545c0a1cdf20d8a6d61a0b2f996f21de521780e5a8fa3cb" exitCode=0 Sep 30 14:00:27 crc kubenswrapper[4672]: I0930 14:00:27.076533 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-b9cj4/must-gather-vcd5s" 
event={"ID":"23d735d5-5887-469a-a9a7-973647e56895","Type":"ContainerDied","Data":"ea28a837405ac8035545c0a1cdf20d8a6d61a0b2f996f21de521780e5a8fa3cb"} Sep 30 14:00:27 crc kubenswrapper[4672]: I0930 14:00:27.077475 4672 scope.go:117] "RemoveContainer" containerID="ea28a837405ac8035545c0a1cdf20d8a6d61a0b2f996f21de521780e5a8fa3cb" Sep 30 14:00:27 crc kubenswrapper[4672]: E0930 14:00:27.174675 4672 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod23d735d5_5887_469a_a9a7_973647e56895.slice/crio-conmon-ea28a837405ac8035545c0a1cdf20d8a6d61a0b2f996f21de521780e5a8fa3cb.scope\": RecentStats: unable to find data in memory cache]" Sep 30 14:00:28 crc kubenswrapper[4672]: I0930 14:00:28.036964 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-b9cj4_must-gather-vcd5s_23d735d5-5887-469a-a9a7-973647e56895/gather/0.log" Sep 30 14:00:41 crc kubenswrapper[4672]: I0930 14:00:41.915695 4672 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-b9cj4/must-gather-vcd5s"] Sep 30 14:00:41 crc kubenswrapper[4672]: I0930 14:00:41.916798 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-b9cj4/must-gather-vcd5s" podUID="23d735d5-5887-469a-a9a7-973647e56895" containerName="copy" containerID="cri-o://21fa94d1b1be4488474e38c75cb7d9b08aa62a6bf64addce0e81cf0b0cf7adc8" gracePeriod=2 Sep 30 14:00:41 crc kubenswrapper[4672]: I0930 14:00:41.927391 4672 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-b9cj4/must-gather-vcd5s"] Sep 30 14:00:42 crc kubenswrapper[4672]: I0930 14:00:42.265552 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-b9cj4_must-gather-vcd5s_23d735d5-5887-469a-a9a7-973647e56895/copy/0.log" Sep 30 14:00:42 crc kubenswrapper[4672]: I0930 14:00:42.267625 4672 generic.go:334] "Generic (PLEG): container finished" podID="23d735d5-5887-469a-a9a7-973647e56895" containerID="21fa94d1b1be4488474e38c75cb7d9b08aa62a6bf64addce0e81cf0b0cf7adc8" exitCode=143 Sep 30 14:00:42 crc kubenswrapper[4672]: I0930 14:00:42.478615 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-b9cj4_must-gather-vcd5s_23d735d5-5887-469a-a9a7-973647e56895/copy/0.log" Sep 30 14:00:42 crc kubenswrapper[4672]: I0930 14:00:42.479470 4672 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-b9cj4/must-gather-vcd5s" Sep 30 14:00:42 crc kubenswrapper[4672]: I0930 14:00:42.672556 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/23d735d5-5887-469a-a9a7-973647e56895-must-gather-output\") pod \"23d735d5-5887-469a-a9a7-973647e56895\" (UID: \"23d735d5-5887-469a-a9a7-973647e56895\") " Sep 30 14:00:42 crc kubenswrapper[4672]: I0930 14:00:42.672743 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-76sj4\" (UniqueName: \"kubernetes.io/projected/23d735d5-5887-469a-a9a7-973647e56895-kube-api-access-76sj4\") pod \"23d735d5-5887-469a-a9a7-973647e56895\" (UID: \"23d735d5-5887-469a-a9a7-973647e56895\") " Sep 30 14:00:42 crc kubenswrapper[4672]: I0930 14:00:42.678462 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23d735d5-5887-469a-a9a7-973647e56895-kube-api-access-76sj4" (OuterVolumeSpecName: "kube-api-access-76sj4") pod "23d735d5-5887-469a-a9a7-973647e56895" (UID: "23d735d5-5887-469a-a9a7-973647e56895"). InnerVolumeSpecName "kube-api-access-76sj4". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:00:42 crc kubenswrapper[4672]: I0930 14:00:42.774665 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-76sj4\" (UniqueName: \"kubernetes.io/projected/23d735d5-5887-469a-a9a7-973647e56895-kube-api-access-76sj4\") on node \"crc\" DevicePath \"\"" Sep 30 14:00:42 crc kubenswrapper[4672]: I0930 14:00:42.869110 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/23d735d5-5887-469a-a9a7-973647e56895-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "23d735d5-5887-469a-a9a7-973647e56895" (UID: "23d735d5-5887-469a-a9a7-973647e56895"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 14:00:42 crc kubenswrapper[4672]: I0930 14:00:42.876155 4672 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/23d735d5-5887-469a-a9a7-973647e56895-must-gather-output\") on node \"crc\" DevicePath \"\"" Sep 30 14:00:43 crc kubenswrapper[4672]: I0930 14:00:43.278447 4672 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-b9cj4_must-gather-vcd5s_23d735d5-5887-469a-a9a7-973647e56895/copy/0.log" Sep 30 14:00:43 crc kubenswrapper[4672]: I0930 14:00:43.278855 4672 scope.go:117] "RemoveContainer" containerID="21fa94d1b1be4488474e38c75cb7d9b08aa62a6bf64addce0e81cf0b0cf7adc8" Sep 30 14:00:43 crc kubenswrapper[4672]: I0930 14:00:43.279009 4672 util.go:48] "No ready sandbox for pod can be found. 
Sep 30 14:00:43 crc kubenswrapper[4672]: I0930 14:00:43.301046 4672 scope.go:117] "RemoveContainer" containerID="ea28a837405ac8035545c0a1cdf20d8a6d61a0b2f996f21de521780e5a8fa3cb"
Sep 30 14:00:43 crc kubenswrapper[4672]: I0930 14:00:43.432974 4672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="23d735d5-5887-469a-a9a7-973647e56895" path="/var/lib/kubelet/pods/23d735d5-5887-469a-a9a7-973647e56895/volumes"
Sep 30 14:00:54 crc kubenswrapper[4672]: I0930 14:00:54.739546 4672 patch_prober.go:28] interesting pod/machine-config-daemon-dpqrd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Sep 30 14:00:54 crc kubenswrapper[4672]: I0930 14:00:54.740163 4672 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Sep 30 14:01:00 crc kubenswrapper[4672]: I0930 14:01:00.165226 4672 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29320681-tdmfr"]
Sep 30 14:01:00 crc kubenswrapper[4672]: E0930 14:01:00.166211 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db4228a8-a6f8-41c4-834b-03ed5d257543" containerName="collect-profiles"
Sep 30 14:01:00 crc kubenswrapper[4672]: I0930 14:01:00.166233 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="db4228a8-a6f8-41c4-834b-03ed5d257543" containerName="collect-profiles"
Sep 30 14:01:00 crc kubenswrapper[4672]: E0930 14:01:00.166268 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23d735d5-5887-469a-a9a7-973647e56895" containerName="gather"
Sep 30 14:01:00 crc kubenswrapper[4672]: I0930 14:01:00.166294 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="23d735d5-5887-469a-a9a7-973647e56895" containerName="gather"
Sep 30 14:01:00 crc kubenswrapper[4672]: E0930 14:01:00.166306 4672 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23d735d5-5887-469a-a9a7-973647e56895" containerName="copy"
Sep 30 14:01:00 crc kubenswrapper[4672]: I0930 14:01:00.166316 4672 state_mem.go:107] "Deleted CPUSet assignment" podUID="23d735d5-5887-469a-a9a7-973647e56895" containerName="copy"
Sep 30 14:01:00 crc kubenswrapper[4672]: I0930 14:01:00.166553 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="23d735d5-5887-469a-a9a7-973647e56895" containerName="gather"
Sep 30 14:01:00 crc kubenswrapper[4672]: I0930 14:01:00.166577 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="23d735d5-5887-469a-a9a7-973647e56895" containerName="copy"
Sep 30 14:01:00 crc kubenswrapper[4672]: I0930 14:01:00.166594 4672 memory_manager.go:354] "RemoveStaleState removing state" podUID="db4228a8-a6f8-41c4-834b-03ed5d257543" containerName="collect-profiles"
Sep 30 14:01:00 crc kubenswrapper[4672]: I0930 14:01:00.167264 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29320681-tdmfr"
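
The run above is the kubelet's complete pod-removal path for must-gather-vcd5s: a "SyncLoop DELETE" arrives from the API server, the remaining "copy" container is killed with a 2-second grace period, its volumes are unmounted and torn down, the dead containers are removed, the orphaned volumes directory is cleaned up, and the cpu/memory managers drop their stale per-container state when the next pod is admitted. A minimal client-go sketch of the API-side DELETE that kicks this off follows; it is an illustration, not taken from the log, and assumes in-cluster access.

    // Hedged sketch: issues the graceful pod DELETE that produces the
    // "SyncLoop DELETE" -> "Killing container with a grace period" -> volume
    // teardown sequence recorded above. Namespace, pod name, and the 2s grace
    // period are copied from the log entries; everything else is assumed.
    package main

    import (
    	"context"
    	"fmt"

    	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
    	"k8s.io/client-go/kubernetes"
    	"k8s.io/client-go/rest"
    )

    func main() {
    	cfg, err := rest.InClusterConfig() // assumption: running on the cluster
    	if err != nil {
    		panic(err)
    	}
    	client := kubernetes.NewForConfigOrDie(cfg)

    	grace := int64(2) // matches gracePeriod=2 seen for the "copy" container
    	err = client.CoreV1().Pods("openshift-must-gather-b9cj4").Delete(
    		context.TODO(), "must-gather-vcd5s",
    		metav1.DeleteOptions{GracePeriodSeconds: &grace},
    	)
    	fmt.Println("delete requested, err:", err)
    }
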
Sep 30 14:01:00 crc kubenswrapper[4672]: I0930 14:01:00.181062 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29320681-tdmfr"]
Sep 30 14:01:00 crc kubenswrapper[4672]: I0930 14:01:00.309172 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/240f0f28-3ccd-4baf-8171-9778e15b036d-combined-ca-bundle\") pod \"keystone-cron-29320681-tdmfr\" (UID: \"240f0f28-3ccd-4baf-8171-9778e15b036d\") " pod="openstack/keystone-cron-29320681-tdmfr"
Sep 30 14:01:00 crc kubenswrapper[4672]: I0930 14:01:00.309343 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jkv8c\" (UniqueName: \"kubernetes.io/projected/240f0f28-3ccd-4baf-8171-9778e15b036d-kube-api-access-jkv8c\") pod \"keystone-cron-29320681-tdmfr\" (UID: \"240f0f28-3ccd-4baf-8171-9778e15b036d\") " pod="openstack/keystone-cron-29320681-tdmfr"
Sep 30 14:01:00 crc kubenswrapper[4672]: I0930 14:01:00.309409 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/240f0f28-3ccd-4baf-8171-9778e15b036d-config-data\") pod \"keystone-cron-29320681-tdmfr\" (UID: \"240f0f28-3ccd-4baf-8171-9778e15b036d\") " pod="openstack/keystone-cron-29320681-tdmfr"
Sep 30 14:01:00 crc kubenswrapper[4672]: I0930 14:01:00.309481 4672 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/240f0f28-3ccd-4baf-8171-9778e15b036d-fernet-keys\") pod \"keystone-cron-29320681-tdmfr\" (UID: \"240f0f28-3ccd-4baf-8171-9778e15b036d\") " pod="openstack/keystone-cron-29320681-tdmfr"
Sep 30 14:01:00 crc kubenswrapper[4672]: I0930 14:01:00.410897 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jkv8c\" (UniqueName: \"kubernetes.io/projected/240f0f28-3ccd-4baf-8171-9778e15b036d-kube-api-access-jkv8c\") pod \"keystone-cron-29320681-tdmfr\" (UID: \"240f0f28-3ccd-4baf-8171-9778e15b036d\") " pod="openstack/keystone-cron-29320681-tdmfr"
Sep 30 14:01:00 crc kubenswrapper[4672]: I0930 14:01:00.411014 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/240f0f28-3ccd-4baf-8171-9778e15b036d-config-data\") pod \"keystone-cron-29320681-tdmfr\" (UID: \"240f0f28-3ccd-4baf-8171-9778e15b036d\") " pod="openstack/keystone-cron-29320681-tdmfr"
Sep 30 14:01:00 crc kubenswrapper[4672]: I0930 14:01:00.411033 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/240f0f28-3ccd-4baf-8171-9778e15b036d-fernet-keys\") pod \"keystone-cron-29320681-tdmfr\" (UID: \"240f0f28-3ccd-4baf-8171-9778e15b036d\") " pod="openstack/keystone-cron-29320681-tdmfr"
Sep 30 14:01:00 crc kubenswrapper[4672]: I0930 14:01:00.411973 4672 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/240f0f28-3ccd-4baf-8171-9778e15b036d-combined-ca-bundle\") pod \"keystone-cron-29320681-tdmfr\" (UID: \"240f0f28-3ccd-4baf-8171-9778e15b036d\") " pod="openstack/keystone-cron-29320681-tdmfr"
Sep 30 14:01:00 crc kubenswrapper[4672]: I0930 14:01:00.417232 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/240f0f28-3ccd-4baf-8171-9778e15b036d-config-data\") pod \"keystone-cron-29320681-tdmfr\" (UID: \"240f0f28-3ccd-4baf-8171-9778e15b036d\") " pod="openstack/keystone-cron-29320681-tdmfr"
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/240f0f28-3ccd-4baf-8171-9778e15b036d-config-data\") pod \"keystone-cron-29320681-tdmfr\" (UID: \"240f0f28-3ccd-4baf-8171-9778e15b036d\") " pod="openstack/keystone-cron-29320681-tdmfr" Sep 30 14:01:00 crc kubenswrapper[4672]: I0930 14:01:00.418140 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/240f0f28-3ccd-4baf-8171-9778e15b036d-combined-ca-bundle\") pod \"keystone-cron-29320681-tdmfr\" (UID: \"240f0f28-3ccd-4baf-8171-9778e15b036d\") " pod="openstack/keystone-cron-29320681-tdmfr" Sep 30 14:01:00 crc kubenswrapper[4672]: I0930 14:01:00.423004 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/240f0f28-3ccd-4baf-8171-9778e15b036d-fernet-keys\") pod \"keystone-cron-29320681-tdmfr\" (UID: \"240f0f28-3ccd-4baf-8171-9778e15b036d\") " pod="openstack/keystone-cron-29320681-tdmfr" Sep 30 14:01:00 crc kubenswrapper[4672]: I0930 14:01:00.458979 4672 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jkv8c\" (UniqueName: \"kubernetes.io/projected/240f0f28-3ccd-4baf-8171-9778e15b036d-kube-api-access-jkv8c\") pod \"keystone-cron-29320681-tdmfr\" (UID: \"240f0f28-3ccd-4baf-8171-9778e15b036d\") " pod="openstack/keystone-cron-29320681-tdmfr" Sep 30 14:01:00 crc kubenswrapper[4672]: I0930 14:01:00.488682 4672 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29320681-tdmfr" Sep 30 14:01:01 crc kubenswrapper[4672]: I0930 14:01:01.014365 4672 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29320681-tdmfr"] Sep 30 14:01:01 crc kubenswrapper[4672]: I0930 14:01:01.456991 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29320681-tdmfr" event={"ID":"240f0f28-3ccd-4baf-8171-9778e15b036d","Type":"ContainerStarted","Data":"038d747dd7e4522ca7d8647c4652e4829b1c46afb84eedbf82a4354b45d99213"} Sep 30 14:01:01 crc kubenswrapper[4672]: I0930 14:01:01.457292 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29320681-tdmfr" event={"ID":"240f0f28-3ccd-4baf-8171-9778e15b036d","Type":"ContainerStarted","Data":"e96652f814998ad705fd51ab207b32804cf92b13bfd3a8f7ddd5c7e15c91c58a"} Sep 30 14:01:01 crc kubenswrapper[4672]: I0930 14:01:01.473740 4672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29320681-tdmfr" podStartSLOduration=1.473722494 podStartE2EDuration="1.473722494s" podCreationTimestamp="2025-09-30 14:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 14:01:01.472152214 +0000 UTC m=+5952.741389900" watchObservedRunningTime="2025-09-30 14:01:01.473722494 +0000 UTC m=+5952.742960140" Sep 30 14:01:03 crc kubenswrapper[4672]: I0930 14:01:03.660536 4672 scope.go:117] "RemoveContainer" containerID="9dad776a5c2f5b8a01afd8d87ed1e1d9d22f35d0eb853ff6adb4c92b7541abcd" Sep 30 14:01:05 crc kubenswrapper[4672]: I0930 14:01:05.494060 4672 generic.go:334] "Generic (PLEG): container finished" podID="240f0f28-3ccd-4baf-8171-9778e15b036d" containerID="038d747dd7e4522ca7d8647c4652e4829b1c46afb84eedbf82a4354b45d99213" exitCode=0 Sep 30 14:01:05 crc kubenswrapper[4672]: I0930 14:01:05.494437 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29320681-tdmfr" 
event={"ID":"240f0f28-3ccd-4baf-8171-9778e15b036d","Type":"ContainerDied","Data":"038d747dd7e4522ca7d8647c4652e4829b1c46afb84eedbf82a4354b45d99213"} Sep 30 14:01:06 crc kubenswrapper[4672]: I0930 14:01:06.853798 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29320681-tdmfr" Sep 30 14:01:06 crc kubenswrapper[4672]: I0930 14:01:06.944199 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/240f0f28-3ccd-4baf-8171-9778e15b036d-combined-ca-bundle\") pod \"240f0f28-3ccd-4baf-8171-9778e15b036d\" (UID: \"240f0f28-3ccd-4baf-8171-9778e15b036d\") " Sep 30 14:01:06 crc kubenswrapper[4672]: I0930 14:01:06.944537 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/240f0f28-3ccd-4baf-8171-9778e15b036d-fernet-keys\") pod \"240f0f28-3ccd-4baf-8171-9778e15b036d\" (UID: \"240f0f28-3ccd-4baf-8171-9778e15b036d\") " Sep 30 14:01:06 crc kubenswrapper[4672]: I0930 14:01:06.944680 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkv8c\" (UniqueName: \"kubernetes.io/projected/240f0f28-3ccd-4baf-8171-9778e15b036d-kube-api-access-jkv8c\") pod \"240f0f28-3ccd-4baf-8171-9778e15b036d\" (UID: \"240f0f28-3ccd-4baf-8171-9778e15b036d\") " Sep 30 14:01:06 crc kubenswrapper[4672]: I0930 14:01:06.944799 4672 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/240f0f28-3ccd-4baf-8171-9778e15b036d-config-data\") pod \"240f0f28-3ccd-4baf-8171-9778e15b036d\" (UID: \"240f0f28-3ccd-4baf-8171-9778e15b036d\") " Sep 30 14:01:06 crc kubenswrapper[4672]: I0930 14:01:06.951302 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/240f0f28-3ccd-4baf-8171-9778e15b036d-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "240f0f28-3ccd-4baf-8171-9778e15b036d" (UID: "240f0f28-3ccd-4baf-8171-9778e15b036d"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:01:06 crc kubenswrapper[4672]: I0930 14:01:06.952678 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/240f0f28-3ccd-4baf-8171-9778e15b036d-kube-api-access-jkv8c" (OuterVolumeSpecName: "kube-api-access-jkv8c") pod "240f0f28-3ccd-4baf-8171-9778e15b036d" (UID: "240f0f28-3ccd-4baf-8171-9778e15b036d"). InnerVolumeSpecName "kube-api-access-jkv8c". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:01:06 crc kubenswrapper[4672]: I0930 14:01:06.985056 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/240f0f28-3ccd-4baf-8171-9778e15b036d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "240f0f28-3ccd-4baf-8171-9778e15b036d" (UID: "240f0f28-3ccd-4baf-8171-9778e15b036d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:01:06 crc kubenswrapper[4672]: I0930 14:01:06.995889 4672 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/240f0f28-3ccd-4baf-8171-9778e15b036d-config-data" (OuterVolumeSpecName: "config-data") pod "240f0f28-3ccd-4baf-8171-9778e15b036d" (UID: "240f0f28-3ccd-4baf-8171-9778e15b036d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:01:07 crc kubenswrapper[4672]: I0930 14:01:07.048574 4672 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/240f0f28-3ccd-4baf-8171-9778e15b036d-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 14:01:07 crc kubenswrapper[4672]: I0930 14:01:07.048847 4672 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/240f0f28-3ccd-4baf-8171-9778e15b036d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 14:01:07 crc kubenswrapper[4672]: I0930 14:01:07.048938 4672 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/240f0f28-3ccd-4baf-8171-9778e15b036d-fernet-keys\") on node \"crc\" DevicePath \"\"" Sep 30 14:01:07 crc kubenswrapper[4672]: I0930 14:01:07.049024 4672 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkv8c\" (UniqueName: \"kubernetes.io/projected/240f0f28-3ccd-4baf-8171-9778e15b036d-kube-api-access-jkv8c\") on node \"crc\" DevicePath \"\"" Sep 30 14:01:07 crc kubenswrapper[4672]: I0930 14:01:07.513908 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29320681-tdmfr" event={"ID":"240f0f28-3ccd-4baf-8171-9778e15b036d","Type":"ContainerDied","Data":"e96652f814998ad705fd51ab207b32804cf92b13bfd3a8f7ddd5c7e15c91c58a"} Sep 30 14:01:07 crc kubenswrapper[4672]: I0930 14:01:07.514382 4672 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e96652f814998ad705fd51ab207b32804cf92b13bfd3a8f7ddd5c7e15c91c58a" Sep 30 14:01:07 crc kubenswrapper[4672]: I0930 14:01:07.514005 4672 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29320681-tdmfr" Sep 30 14:01:24 crc kubenswrapper[4672]: I0930 14:01:24.739366 4672 patch_prober.go:28] interesting pod/machine-config-daemon-dpqrd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 14:01:24 crc kubenswrapper[4672]: I0930 14:01:24.739925 4672 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 14:01:24 crc kubenswrapper[4672]: I0930 14:01:24.739978 4672 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" Sep 30 14:01:24 crc kubenswrapper[4672]: I0930 14:01:24.740803 4672 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c5c9370020a0ccb145ea0d983d80cc8ce5cbf67232141de015f21c882d0da81d"} pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 30 14:01:24 crc kubenswrapper[4672]: I0930 14:01:24.740865 4672 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" containerName="machine-config-daemon" 
containerID="cri-o://c5c9370020a0ccb145ea0d983d80cc8ce5cbf67232141de015f21c882d0da81d" gracePeriod=600 Sep 30 14:01:24 crc kubenswrapper[4672]: E0930 14:01:24.945256 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dpqrd_openshift-machine-config-operator(95794952-d817-48f2-8956-f7a310f8d1d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" Sep 30 14:01:25 crc kubenswrapper[4672]: I0930 14:01:25.723881 4672 generic.go:334] "Generic (PLEG): container finished" podID="95794952-d817-48f2-8956-f7a310f8d1d9" containerID="c5c9370020a0ccb145ea0d983d80cc8ce5cbf67232141de015f21c882d0da81d" exitCode=0 Sep 30 14:01:25 crc kubenswrapper[4672]: I0930 14:01:25.723985 4672 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" event={"ID":"95794952-d817-48f2-8956-f7a310f8d1d9","Type":"ContainerDied","Data":"c5c9370020a0ccb145ea0d983d80cc8ce5cbf67232141de015f21c882d0da81d"} Sep 30 14:01:25 crc kubenswrapper[4672]: I0930 14:01:25.724304 4672 scope.go:117] "RemoveContainer" containerID="966f5125052c74b4e7c550103d16e6fa385f49fb5552ce6b0019712558a5916e" Sep 30 14:01:25 crc kubenswrapper[4672]: I0930 14:01:25.725008 4672 scope.go:117] "RemoveContainer" containerID="c5c9370020a0ccb145ea0d983d80cc8ce5cbf67232141de015f21c882d0da81d" Sep 30 14:01:25 crc kubenswrapper[4672]: E0930 14:01:25.725326 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dpqrd_openshift-machine-config-operator(95794952-d817-48f2-8956-f7a310f8d1d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" Sep 30 14:01:39 crc kubenswrapper[4672]: I0930 14:01:39.423677 4672 scope.go:117] "RemoveContainer" containerID="c5c9370020a0ccb145ea0d983d80cc8ce5cbf67232141de015f21c882d0da81d" Sep 30 14:01:39 crc kubenswrapper[4672]: E0930 14:01:39.424514 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dpqrd_openshift-machine-config-operator(95794952-d817-48f2-8956-f7a310f8d1d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" Sep 30 14:01:53 crc kubenswrapper[4672]: I0930 14:01:53.417248 4672 scope.go:117] "RemoveContainer" containerID="c5c9370020a0ccb145ea0d983d80cc8ce5cbf67232141de015f21c882d0da81d" Sep 30 14:01:53 crc kubenswrapper[4672]: E0930 14:01:53.418161 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dpqrd_openshift-machine-config-operator(95794952-d817-48f2-8956-f7a310f8d1d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" Sep 30 14:02:03 crc kubenswrapper[4672]: I0930 14:02:03.723173 4672 scope.go:117] "RemoveContainer" 
containerID="9fbaed5e0cd7a4f547e6f7ed9f6d4a00e978537ac6f5891193b4b28c828b7c1e" Sep 30 14:02:03 crc kubenswrapper[4672]: I0930 14:02:03.748094 4672 scope.go:117] "RemoveContainer" containerID="75825a39fcf305ff309bcf2868873db3034d5f3a01cea1af7bc62b2e40406560" Sep 30 14:02:03 crc kubenswrapper[4672]: I0930 14:02:03.800999 4672 scope.go:117] "RemoveContainer" containerID="05278a6ce0cacb6db6f6f53f98ff99dc995d43edfd9ad820a2b2a983bfea0df5" Sep 30 14:02:04 crc kubenswrapper[4672]: I0930 14:02:04.417396 4672 scope.go:117] "RemoveContainer" containerID="c5c9370020a0ccb145ea0d983d80cc8ce5cbf67232141de015f21c882d0da81d" Sep 30 14:02:04 crc kubenswrapper[4672]: E0930 14:02:04.417953 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dpqrd_openshift-machine-config-operator(95794952-d817-48f2-8956-f7a310f8d1d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" Sep 30 14:02:15 crc kubenswrapper[4672]: I0930 14:02:15.417765 4672 scope.go:117] "RemoveContainer" containerID="c5c9370020a0ccb145ea0d983d80cc8ce5cbf67232141de015f21c882d0da81d" Sep 30 14:02:15 crc kubenswrapper[4672]: E0930 14:02:15.419467 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dpqrd_openshift-machine-config-operator(95794952-d817-48f2-8956-f7a310f8d1d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" Sep 30 14:02:28 crc kubenswrapper[4672]: I0930 14:02:28.417580 4672 scope.go:117] "RemoveContainer" containerID="c5c9370020a0ccb145ea0d983d80cc8ce5cbf67232141de015f21c882d0da81d" Sep 30 14:02:28 crc kubenswrapper[4672]: E0930 14:02:28.418219 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dpqrd_openshift-machine-config-operator(95794952-d817-48f2-8956-f7a310f8d1d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9" Sep 30 14:02:42 crc kubenswrapper[4672]: I0930 14:02:42.416790 4672 scope.go:117] "RemoveContainer" containerID="c5c9370020a0ccb145ea0d983d80cc8ce5cbf67232141de015f21c882d0da81d" Sep 30 14:02:42 crc kubenswrapper[4672]: E0930 14:02:42.417607 4672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dpqrd_openshift-machine-config-operator(95794952-d817-48f2-8956-f7a310f8d1d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-dpqrd" podUID="95794952-d817-48f2-8956-f7a310f8d1d9"